---
base_model:
- bond005/meno-tiny-0.1
- fblgit/miniclaus-qw1.5B-UNAMGS-GRPO
- godlikehhd/alpaca_data_ifd_min_2600
- Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2
- lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3
- lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4
- godlikehhd/alpaca_data_score_max_2500
- Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1
- Sakalti/Saba1.5-1.5B
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
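
Once the merge has been produced and saved, it can be loaded like any other causal language model with the Transformers library. The sketch below is a minimal usage example; the model path is a placeholder, not a published repository ID.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/merged-model"  # placeholder: local output directory or repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Simple generation smoke test.
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
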
## Merge Details
### Merge Method

This model was merged using the [SCE](https://arxiv.org/abs/2408.07990) merge method, with [bond005/meno-tiny-0.1](https://huggingface.co/bond005/meno-tiny-0.1) as the base model.
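
For intuition, the sketch below illustrates the Select-Calculate-Erase idea on a single parameter tensor: select the highest-variance elements across the models' task vectors, weight each model by the squared magnitude of its selected delta, and erase sign-conflicting elements before summing. This is a simplified, hypothetical rendering of the procedure described in the linked paper, not mergekit's actual implementation; `select_topk` mirrors the method's top-variance selection fraction.

```python
import torch

def sce_merge_tensor(base: torch.Tensor,
                     tuned: list[torch.Tensor],
                     select_topk: float = 0.1) -> torch.Tensor:
    # Task vectors: each fine-tuned model's delta from the shared base.
    deltas = torch.stack([t - base for t in tuned])  # shape: (n_models, *param)

    # Select: keep only the elements with the highest variance across models.
    var = deltas.var(dim=0, unbiased=False)
    k = max(1, int(select_topk * var.numel()))
    threshold = var.flatten().topk(k).values.min()
    deltas = deltas * (var >= threshold)

    # Calculate: one coefficient per model, proportional to the squared
    # magnitude of its selected task vector.
    sq = deltas.pow(2).flatten(1).sum(dim=1)
    weights = sq / sq.sum().clamp_min(1e-12)
    shaped = weights.view(-1, *([1] * (deltas.dim() - 1)))

    # Erase: zero out elements whose sign disagrees with the weighted sum.
    majority = (shaped * deltas).sum(dim=0).sign()
    deltas = deltas * (deltas.sign() == majority)

    # Merge: apply the weighted sum of surviving deltas to the base weights.
    return base + (shaped * deltas).sum(dim=0)
```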

### Models Merged

The following models were included in the merge:
* [fblgit/miniclaus-qw1.5B-UNAMGS-GRPO](https://huggingface.co/fblgit/miniclaus-qw1.5B-UNAMGS-GRPO)
* [godlikehhd/alpaca_data_ifd_min_2600](https://huggingface.co/godlikehhd/alpaca_data_ifd_min_2600)
* [Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2](https://huggingface.co/Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2)
* [lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3](https://huggingface.co/lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3)
* [lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4](https://huggingface.co/lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4)
* [godlikehhd/alpaca_data_score_max_2500](https://huggingface.co/godlikehhd/alpaca_data_score_max_2500)
* [Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1](https://huggingface.co/Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1)
* [Sakalti/Saba1.5-1.5B](https://huggingface.co/Sakalti/Saba1.5-1.5B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: bond005/meno-tiny-0.1
  - model: fblgit/miniclaus-qw1.5B-UNAMGS-GRPO
  - model: Sakalti/Saba1.5-1.5B
  - model: Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1
  - model: Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2
  - model: godlikehhd/alpaca_data_ifd_min_2600
  - model: godlikehhd/alpaca_data_score_max_2500
  - model: lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3
  - model: lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4
tokenizer:
  source: base
merge_method: sce
base_model: bond005/meno-tiny-0.1
dtype: bfloat16
parameters:
  int8_mask: true
```
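
As a rough guide to reproducing the merge, the sketch below follows the Python entry points shown in mergekit's README; the config filename and output directory are placeholders.

```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (saved here as "sce-merge.yaml").
with open("sce-merge.yaml", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge; the merged weights and tokenizer land in ./merged-model.
run_merge(
    config,
    "./merged-model",
    options=MergeOptions(copy_tokenizer=True, lazy_unpickle=True),
)
```

Equivalently, the `mergekit-yaml` command-line tool accepts the same configuration file.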