---
base_model: meta-llama/Llama-3.1-8B
library_name: transformers
model_name: inter-play-sim-assistant-sft
tags:
- generated_from_trainer
- trl
- sft
license: license
---

# Model Card for inter-play-sim-assistant-sft

This model is a fine-tuned version of [meta-llama/Llama-3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B).
It has been trained using [TRL](https://github.com/huggingface/trl).

## Quick start

```python
from transformers import pipeline

question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"

# Build a text-generation pipeline on the GPU; the model is downloaded from the Hub.
generator = pipeline("text-generation", model="Sim4Rec/inter-play-sim-assistant-sft", device="cuda")

# Pass the prompt in chat format and return only the newly generated text.
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
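
If you need lower-level control over generation, you can also load the model and tokenizer directly. The snippet below is a minimal sketch; the prompt and sampling parameters are illustrative, not tuned values from this repository:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sim4Rec/inter-play-sim-assistant-sft"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

# Format the conversation with the tokenizer's chat template.
messages = [{"role": "user", "content": "Recommend a sci-fi novel and explain your pick."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

# Generate a reply and decode only the newly generated tokens.
output_ids = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0, inputs.shape[-1]:], skip_special_tokens=True))
```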

## Training procedure

[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/jerome-ramos-20/huggingface/runs/lzvwgcqb) 


This model was trained with supervised fine-tuning (SFT).
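
The exact training script and dataset are not included in this card. As a rough illustration of the setup, a minimal TRL SFT run looks like the following; the dataset name, output directory, and hyperparameters are placeholders, not the values used to produce this model:

```python
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Placeholder dataset in conversational format; the actual training data is not published here.
dataset = load_dataset("your-org/your-sft-dataset", split="train")

training_args = SFTConfig(
    output_dir="inter-play-sim-assistant-sft",
    per_device_train_batch_size=2,
    gradient_accumulation_steps=8,
    learning_rate=2e-5,
    num_train_epochs=1,
    logging_steps=10,
    report_to="wandb",  # matches the W&B run linked above
)

trainer = SFTTrainer(
    model="meta-llama/Llama-3.1-8B",
    args=training_args,
    train_dataset=dataset,
)
trainer.train()
```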

### Framework versions

- TRL: 0.14.0
- Transformers: 4.51.3
- Pytorch: 2.6.0
- Datasets: 3.0.1
- Tokenizers: 0.21.0
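
To reproduce this environment, you can pin the matching versions when installing, e.g. `pip install trl==0.14.0 transformers==4.51.3 torch==2.6.0 datasets==3.0.1 tokenizers==0.21.0`.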

## Citations

Cite TRL as:

```bibtex
@misc{vonwerra2022trl,
	title        = {{TRL: Transformer Reinforcement Learning}},
	author       = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
	year         = 2020,
	journal      = {GitHub repository},
	publisher    = {GitHub},
	howpublished = {\url{https://github.com/huggingface/trl}}
}
```