# LogoS-7Bx2-MoE-13B-v0.2

Model built by @RubielLabarta using the SLERP merge method. The model is released for research purposes only; commercial use is not allowed.
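For reference, SLERP (spherical linear interpolation) blends two checkpoints along the arc between their weight vectors rather than along a straight line, which tends to preserve the scale of the weights better than plain averaging. The snippet below is a minimal per-tensor sketch of the interpolation formula only; it is not the exact merge pipeline used to build LogoS (tools such as mergekit add per-layer configuration and edge-case handling on top of this).

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors at fraction t."""
    # Normalize flattened copies to measure the angle between the two tensors.
    u0 = v0.flatten() / (v0.norm() + eps)
    u1 = v1.flatten() / (v1.norm() + eps)
    dot = torch.clamp(u0 @ u1, -1.0, 1.0)
    omega = torch.arccos(dot)  # angle between the two weight directions

    # Nearly parallel tensors: fall back to ordinary linear interpolation.
    if omega.abs() < 1e-4:
        return (1.0 - t) * v0 + t * v1

    so = torch.sin(omega)
    return (torch.sin((1.0 - t) * omega) / so) * v0 + (torch.sin(t * omega) / so) * v1
```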

LogoS is an experiment with the Mixture-of-Experts (MoE) approach, which can significantly improve performance over the original 7B models. The merged model has 12.9B parameters, stored as BF16 safetensors.
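A minimal sketch of running the model with the Hugging Face transformers library is shown below, assuming the repo id RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2; the prompt and generation settings are illustrative defaults, not tuned values.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RubielLabarta/LogoS-7Bx2-MoE-13B-v0.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",
)

prompt = "Explain mixture-of-experts models in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```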

## Open LLM Leaderboard Evaluation Results

Detailed results can be found here

| Metric                            | Value |
|-----------------------------------|------:|
| Avg.                              | 77.14 |
| AI2 Reasoning Challenge (25-shot) | 74.49 |
| HellaSwag (10-shot)               | 89.07 |
| MMLU (5-shot)                     | 64.74 |
| TruthfulQA (0-shot)               | 74.57 |
| Winogrande (5-shot)               | 88.32 |
| GSM8K (5-shot)                    | 71.65 |
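The reported average is the arithmetic mean of the six benchmark scores, which can be verified directly:

```python
# Values copied from the table above.
scores = [74.49, 89.07, 64.74, 74.57, 88.32, 71.65]
avg = sum(scores) / len(scores)
print(f"{avg:.2f}")  # 77.14
```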
