---
license: apache-2.0
base_model:
- google/flan-t5-small
pipeline_tag: text2text-generation
---
# dafilab/chat-title-generator
A `flan-t5-small` model fine-tuned to generate short titles from chat conversations.
## Model Details
- **Base model**: google/flan-t5-small
- **Training examples**: 10,000
- **Epochs**: 2
- **Final training loss**: 0.778800
- **Train batch size per device**: 4
- **Total optimization steps**: 500
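
The training script is not published with this model, so the following is only a hypothetical sketch of a fine-tuning setup consistent with the hyperparameters listed above (2 epochs, per-device batch size 4, a `short title:` task prefix). The dataset contents, preprocessing details, and how the reported 500 optimization steps were reached (e.g. gradient accumulation) are not specified in the card, so those parts are assumptions.

```python
# Hypothetical fine-tuning sketch; not the actual training script for this model.
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-small")

# Placeholder examples; the real training set contained ~10,000 chat/title pairs.
train_data = Dataset.from_dict({
    "chat": ["How do I set up SSH access to a remote GPU machine?"],
    "title": ["Remote GPU Access Setup"],
})

def preprocess(batch):
    # Same task prefix as the inference example in the Usage section
    model_inputs = tokenizer(
        ["short title: " + c for c in batch["chat"]], truncation=True, max_length=512
    )
    labels = tokenizer(text_target=batch["title"], truncation=True, max_length=64)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = train_data.map(preprocess, batched=True, remove_columns=["chat", "title"])

args = Seq2SeqTrainingArguments(
    output_dir="chat-title-generator",
    num_train_epochs=2,             # matches "Epochs: 2"
    per_device_train_batch_size=4,  # matches "Train batch size per device: 4"
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```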
## Usage
```python
from transformers import T5ForConditionalGeneration, T5Tokenizer
model = T5ForConditionalGeneration.from_pretrained("dafilab/chat-title-generator")
tokenizer = T5Tokenizer.from_pretrained("dafilab/chat-title-generator", legacy=False)
def generate_chat_title(text):
    # Task prefix the model expects, as used during fine-tuning
    input_text = "short title: " + text
    inputs = tokenizer(input_text, return_tensors="pt", truncation=True, max_length=512)
    outputs = model.generate(
        input_ids=inputs.input_ids,
        max_length=64,
        num_beams=4,
        early_stopping=True,
        pad_token_id=tokenizer.pad_token_id,
        eos_token_id=tokenizer.eos_token_id,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
text = """How can I access the GPU of my other computer remotely for ML training?
To access your other computer's GPU remotely for machine learning (ML) training,
you need to set up remote access to the machine and ensure that it can properly leverage the GPU for computations.
There are several ways to do this, depending on your operating system and the tools you prefer to use."""
print(generate_chat_title(text))
```
## Output
```
Remote Access for Machine Learning
```
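The model can also be called through the `text2text-generation` pipeline. The sketch below assumes the same `short title:` prefix and generation settings as the Usage example, reusing the `text` variable defined there.

```python
from transformers import pipeline

# Equivalent usage via the text2text-generation pipeline (same prompt prefix as above)
titler = pipeline("text2text-generation", model="dafilab/chat-title-generator")
result = titler("short title: " + text, max_length=64, num_beams=4)
print(result[0]["generated_text"])
```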