---
base_model:
- zhiyuanhucs/sequence-200-5
- zhiyuanhucs/formula-200-2
- zhiyuanhucs/backward-short-level1-epoch2-step224
library_name: transformers
tags:
- mergekit
- merge
---

# merge-formula-1-sequence-0.1-backward-old-0.2

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Linear](https://arxiv.org/abs/2203.05482) merge method.

### Models Merged

The following models were included in the merge:

* [zhiyuanhucs/sequence-200-5](https://huggingface.co/zhiyuanhucs/sequence-200-5)
* [zhiyuanhucs/formula-200-2](https://huggingface.co/zhiyuanhucs/formula-200-2)
* [zhiyuanhucs/backward-short-level1-epoch2-step224](https://huggingface.co/zhiyuanhucs/backward-short-level1-epoch2-step224)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: zhiyuanhucs/formula-200-2
    parameters:
      weight: 1
  - model: zhiyuanhucs/sequence-200-5
    parameters:
      weight: 0.1
  - model: zhiyuanhucs/backward-short-level1-epoch2-step224
    parameters:
      weight: 0.2
merge_method: linear
dtype: float32
```
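For intuition, the linear merge above amounts to a weighted average of corresponding parameter tensors across the three models. The sketch below is illustrative only, not mergekit's actual implementation, and it assumes the weights (1, 0.1, 0.2) are normalized to sum to 1, which is mergekit's default behavior for the linear method; plain Python lists stand in for real weight tensors.

```python
# Illustrative sketch of linear merging: each merged parameter is a
# normalized weighted average of the corresponding parameters from the
# source models. This is NOT mergekit's code; it only shows the math.

def linear_merge(tensors, weights):
    """Merge same-shaped parameter 'tensors' (lists of floats here)
    using per-model weights, normalized to sum to 1."""
    total = sum(weights)
    norm = [w / total for w in weights]  # e.g. 1, 0.1, 0.2 -> ~0.77, ~0.08, ~0.15
    return [
        sum(w * t[i] for w, t in zip(norm, tensors))
        for i in range(len(tensors[0]))
    ]

# Toy example using the weights from the YAML config (1, 0.1, 0.2):
merged = linear_merge(
    [[1.0, 2.0], [10.0, 20.0], [100.0, 200.0]],
    [1.0, 0.1, 0.2],
)
```

In practice, a config like the one above is typically applied with mergekit's CLI, e.g. `mergekit-yaml config.yaml ./output-dir`, which writes the merged model to the output directory.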