Mistral-nemo-3b-unhealed / mergekit_config.yml
Uploaded by Alignment-Lab-AI via huggingface_hub (commit 312dc82)
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 7] # layers 0-6: first 7 layers (mergekit's layer_range is end-exclusive)
    model: ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.3
- sources:
  - layer_range: [18, 19] # layer 18 only: a single mid-stack layer from the base model
    model: mistralai/Mistral-Nemo-Base-2407
- sources:
  - layer_range: [32, 39] # layers 32-38: late layers from the base model for output stability
    model: mistralai/Mistral-Nemo-Base-2407
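As a sanity check on the slice arithmetic, the following sketch tallies the layers each slice contributes, assuming mergekit's end-exclusive `layer_range` convention (so `[0, 7]` yields layers 0-6). The model names are copied from the config; the helper itself is illustrative, not part of mergekit.

```python
# Slices as declared in the config above: (model, (start, end)),
# where layer_range is treated as a half-open interval [start, end).
slices = [
    ("ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.3", (0, 7)),
    ("mistralai/Mistral-Nemo-Base-2407", (18, 19)),
    ("mistralai/Mistral-Nemo-Base-2407", (32, 39)),
]

total = 0
for model, (start, end) in slices:
    count = end - start  # end-exclusive, so this is the layer count
    total += count
    print(f"{model}: layers {start}-{end - 1} ({count} layers)")

print(f"merged model depth: {total} layers")  # 7 + 1 + 7 = 15
```

Note that a passthrough merge simply stacks these slices in order, so the resulting model has 15 transformer layers, far fewer than the 40 in the 12B source models.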