---
language:
- en
- es
tags:
- moe
- merge
base_model:
- yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
- TomGrc/FusionNet_7Bx2_MoE_14B
---

# LogoS-7Bx2-MoE-13B-v0.1

Model built by @RubielLabarta using the SLERP merge method. The model is released for research purposes only; commercial use is not allowed.

LogoS is an experiment with the MoE merging approach, which can significantly improve on the performance of the original models. The model has 12.9B parameters.
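SLERP (spherical linear interpolation) blends two parent models' weights along the arc between them on the unit sphere rather than along a straight line, which better preserves the magnitude structure of the tensors. As a rough illustration of the interpolation itself (not the actual merge script used for this model), a minimal NumPy sketch:

```python
import numpy as np

def slerp(v0: np.ndarray, v1: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the great-circle
    arc between the (normalized) directions of v0 and v1.
    """
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Vectors nearly parallel: SLERP degenerates to linear interpolation.
        return (1.0 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

# Toy example with two orthogonal directions:
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(a, b, 0.5)
print(np.round(mid, 4))  # midpoint on the arc: [0.7071 0.7071]
```

In a real merge this interpolation is applied per weight tensor (often with a different `t` per layer); tools such as mergekit implement this in full.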