Finetuned Historical Models
Collection of the finetuned historical models.
This model was finetuned from the Llama 3 8B model using DoRA adapters.
It was finetuned on 10M words from the Gutenberg corpus attributed to the time period 1910 to 1940.
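As a rough illustration of what a DoRA adapter does (this is a conceptual sketch, not the training code used for this model): DoRA decomposes each pretrained weight matrix into a per-column magnitude vector and a direction, then learns a low-rank update to the direction while keeping the magnitude as a separate trainable parameter. A minimal NumPy sketch, with illustrative shapes chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 8, 6, 2  # illustrative dimensions, far smaller than Llama 3 8B

W0 = rng.normal(size=(d_out, d_in))              # frozen pretrained weight
m = np.linalg.norm(W0, axis=0, keepdims=True)    # trainable magnitude, initialized to column norms
B = np.zeros((d_out, r))                         # low-rank factors; B starts at zero so the
A = rng.normal(size=(r, d_in))                   # initial update B @ A is zero

def dora_weight(W0, m, B, A):
    # Direction gets the low-rank update, then is renormalized column-wise
    # and rescaled by the learned magnitude vector.
    V = W0 + B @ A
    return m * V / np.linalg.norm(V, axis=0, keepdims=True)

W = dora_weight(W0, m, B, A)
# With B = 0, the merged weight equals the pretrained weight exactly.
assert np.allclose(W, W0)
```

After training, the adapter can be merged back into a single dense weight this way, which is why the finetuned checkpoint loads like an ordinary causal LM below.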
Load the model like this:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "Hplm/dora_llama_model_1910_1940",
    torch_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained("Hplm/dora_llama_model_1910_1940")
```
Built with Meta Llama 3 and released under the Meta Llama 3 license.
```bibtex
@article{fittschen_diachroniclanguagemodels_2025,
  title = {Pretraining Language Models for Diachronic Linguistic Change Discovery},
  author = {Fittschen, Elisabeth and Li, Sabrina and Lippincott, Tom and Choshen, Leshem and Messner, Craig},
  year = {2025},
  month = apr,
  eprint = {2504.05523},
  primaryclass = {cs.CL},
  archiveprefix = {arXiv},
  publisher = {arXiv},
  doi = {10.48550/arXiv.2504.05523},
  url = {https://arxiv.org/abs/2504.05523},
  urldate = {2025-04-14}
}
```