
Llama 3 Finetuned Historical Model (1910 - 1940)

This model was finetuned from the Llama 3 8B model using DoRA (weight-decomposed low-rank adaptation) adapters.

It was finetuned on 10M words from the Gutenberg Corpus dated to the period 1910-1940.


Downloading the Model

Load the model like this:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the finetuned weights in half precision to reduce memory use.
model = AutoModelForCausalLM.from_pretrained("Hplm/dora_llama_model_1910_1940", torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained("Hplm/dora_llama_model_1910_1940")
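Once loaded, the model can be used like any other causal language model in `transformers`. The prompt and sampling settings below are illustrative, not part of the model card:

```python
# Illustrative generation example; prompt and sampling settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Hplm/dora_llama_model_1910_1940"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# A prompt in the register of the 1910-1940 training period (illustrative).
prompt = "The motor-car drew up before the house, and"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,
        temperature=0.8,
        top_p=0.95,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```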

License

Built with Meta Llama 3, and distributed under the Meta Llama 3 Community License.

Citation

@article{fittschen_diachroniclanguagemodels_2025,
  title = {Pretraining Language Models for Diachronic Linguistic Change Discovery},
  author = {Fittschen, Elisabeth and Li, Sabrina and Lippincott, Tom and Choshen, Leshem and Messner, Craig},
  year = {2025},
  month = apr,
  eprint = {2504.05523},
  primaryclass = {cs.CL},
  publisher = {arXiv},
  doi = {10.48550/arXiv.2504.05523},
  url = {https://arxiv.org/abs/2504.05523},
  urldate = {2025-04-14},
  archiveprefix = {arXiv}
}
