dtype: bfloat16
merge_method: passthrough
slices:
  - sources:
      - layer_range: [0, 7]    # early layers from the RPMax fine-tune
        model: ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.3
  - sources:
      - layer_range: [18, 19]  # middle slice from the base model
        model: mistralai/Mistral-Nemo-Base-2407
  - sources:
      - layer_range: [32, 39]  # final layers from the base model, kept for output stability
        model: mistralai/Mistral-Nemo-Base-2407
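
A minimal sketch of applying a passthrough config like the one above with mergekit's Python entry points, assuming the mergekit package is installed; the paths "merge-config.yaml" and "./rpmax-nemo-passthrough" are placeholders, not names from this repository:

```python
# Sketch: run a mergekit merge from a YAML config (placeholder paths, assumed setup).
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML config shown above from disk (placeholder filename).
with open("merge-config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge and write the result to a placeholder output directory.
run_merge(
    merge_config,
    out_path="./rpmax-nemo-passthrough",
    options=MergeOptions(
        copy_tokenizer=True,   # carry a tokenizer over into the merged model
        lazy_unpickle=True,    # load shards lazily to reduce peak memory
        low_cpu_memory=True,
    ),
)
```

The same config can also be run with mergekit's command-line tool instead of the Python API; either route produces a standard Hugging Face model directory that can be loaded with transformers.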