# 🧠 Violet-Magcap-12B

So here’s the lore...

I might have taken Mag-Mell-12B-R1 and jacked it up on SFT reasoning data like it was pre-workout for logic bros. Then, for chaos, I slapped it together with Captain_Eris_Violet-GRPO like some twisted AI Voltron.

Did I stop there? No.
Double-tapped the merge with SFT on fresh reasoning data. Now it's solving problems like Bill Nye on a meme bender and hoarding cursed philosophy shitposts.


## 🛠️ Model Details

| Feature | Description |
|---|---|
| Base Models | Mag-Mell-R1 + Captain-Eris |
| Size | 12B parameters (12.2B total, FP16 safetensors) |
| Architecture | Mag-Mell + Reasoning → Captain Eris Merge + post-SFT |
| Post-Merge Tuning | Double SFT on new reasoning data |
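
Want to run the full-fat FP16 weights? Below is a minimal 🤗 Transformers sketch. The repo id comes from this page; the prompt and sampling settings are placeholders, and it assumes the repo ships a chat template (otherwise build the prompt by hand using the formats further down).

```python
# Minimal sketch: load the FP16 weights with Transformers (assumes a GPU with enough VRAM for ~12B FP16).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nitral-AI/Violet_Magcap-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the FP16 safetensors
    device_map="auto",
)

# Assumes the repo includes a chat template; prompt and sampling values are placeholders.
messages = [{"role": "user", "content": "Walk me through the Monty Hall problem step by step."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=512, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```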

## ⚙️ Usage Presets

### 🎛️ SillyTavern (ST) Prompt Presets


## 💾 Quantized Versions

- 🧠 Lewdiculus Imatrix (GGUF)
- 🧠 Nitral 4bpw (ExL2)
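
If you grab one of the GGUF quants, a rough llama-cpp-python sketch looks like this. The filename is a placeholder for whichever quant you actually downloaded, and the context size / GPU offload values are assumptions to tune for your hardware.

```python
# Minimal sketch: run a GGUF quant with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="violet-magcap-12b-q4_k_m.gguf",  # placeholder; use your downloaded quant
    n_ctx=8192,        # context window (assumption)
    n_gpu_layers=-1,   # offload all layers if built with CUDA/Metal
)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain the trolley problem, then make it weird."}],
    max_tokens=512,
    temperature=0.8,
)
print(result["choices"][0]["message"]["content"])
```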


## 📦 Prompt Formats

### Reasoning Format (Block + Prefix)
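
As a rough sketch, assuming the usual R1-style `<think>` block (an assumption on my part, not confirmed above; match your frontend's reasoning prefix/suffix to the presets linked earlier), the output looks like:

```text
<think>
Model works through the problem here, step by step.
</think>
Final, user-facing answer.
```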

### ChatML Format

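ChatML is a standard template; a typical layout (the system prompt and `{prompt}` are placeholders, not this repo's exact preset) looks like:

```text
<|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```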

### Mistral Format

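And the usual Mistral-style layout over a couple of turns (again a generic sketch, not this repo's exact preset):

```text
<s>[INST] {first user message} [/INST] {assistant reply}</s>[INST] {next user message} [/INST]
```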

## 🌀 Vibe Check

It will help you solve problems. It will also make you question your existence.
Use wisely—or don’t.

🧬 Created by: Nitral-AI
