---
language:
- en
- es
tags:
- moe
- merge
base_model:
- yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
- TomGrc/FusionNet_7Bx2_MoE_14B
---

# LogoS-7Bx2-MoE-13B-v0.1

A model fine-tuned on English and Spanish using the MoE (Mixture of Experts) method.

## Model description

LogoS is an experiment with the MoE method, which can significantly improve performance over the original models. The model has 12.9B parameters and is fine-tuned on The Pile.
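As a minimal usage sketch, the model can be loaded with the Hugging Face `transformers` library. The repository id below is an assumption (the card does not state the hub path), and the import is deferred so the snippet only downloads weights when `generate` is actually called.

```python
# Hypothetical hub repository id -- replace with the actual published path.
MODEL_ID = "LogoS-7Bx2-MoE-13B-v0.1"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Lazily load the model and generate a completion for `prompt`.

    Note: the first call downloads the full ~12.9B-parameter checkpoint.
    """
    # Import inside the function so the module can be inspected without
    # transformers installed or any weights downloaded.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Calling `generate("Hola, ¿cómo estás?")` would return a Spanish completion, assuming sufficient GPU memory for a ~13B-parameter model.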