Mixtral 8x7B - Holodeck

Model Description

Mixtral 8x7B-Holodeck is a finetune of Mistral AI's Mixtral 8x7B model.
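
As a usage sketch (not part of the original card), the model can be loaded with the Hugging Face transformers library like any other Mixtral checkpoint; the generation settings below are illustrative only:

```python
# Minimal loading sketch; assumes enough GPU memory for a
# 46.7B-parameter model in FP16 (about 93 GB of weights).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "KoboldAI/Mixtral-8x7B-Holodeck-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # weights are distributed as FP16 safetensors
    device_map="auto",          # shard across available GPUs
)

prompt = "The ship drifted silently through the asteroid field"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```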

Training data

The training data contains around 3,000 ebooks across various genres. Most entries in the dataset are prepended with the following text: [Genre: <genre1>, <genre2>]
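
Since the model saw this tag during training, prompts can mirror the same format. Below is a hypothetical helper for building such prompts (the function name and the exact newline placement after the tag are assumptions, not from the original card):

```python
# Hypothetical helper: build a prompt in the training data's genre-tag format.
def make_prompt(genres: list[str], opening_text: str) -> str:
    tag = "[Genre: " + ", ".join(genres) + "]"
    return f"{tag}\n{opening_text}"

print(make_prompt(["horror", "mystery"], "The old house had been empty for years."))
# [Genre: horror, mystery]
# The old house had been empty for years.
```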


Limitations and Biases

Based on known problems with NLP technology, potential relevant factors include bias with respect to gender, profession, race, and religion.

Model size

46.7B parameters, FP16 (safetensors)

Model tree for KoboldAI/Mixtral-8x7B-Holodeck-v1

Merges: 4 models
Quantizations: 2 models