# 🌌 Abyssal-Seraph-12B

*Where the light of the divine meets the poetry of the abyss.*

## 🜂 Overview
Abyssal-Seraph-12B is a multi-stage creative merge designed for expressive storytelling, emotional depth, and lyrical dialogue.
It was crafted through a layered fusion using MergeKit:
- 🌙 **LunaMaid × Vermilion-Sage** — merged via NearSwap (`t = 0.0008`) to unify LunaMaid's balanced composure with Vermilion-Sage's radiant prose.
- 🕯️ **Dark-Quill × Mag-Mell-R1** — merged via NearSwap (`t = 0.0008`) to draw forth mysticism, poetic darkness, and a sense of dreamlike gravity.
- ✨ Both intermediate results combined with the **Karcher mean** — a geometric blend ensuring harmony between light and shadow.
## 🩶 Model Essence
| Trait | Description |
|---|---|
| 🧠 Core Nature | Philosophical, poetic, emotionally resonant |
| 💬 Style | Fluid prose, vivid imagery, articulate reflection |
| 💫 Tone | Dreamlike, balanced between divine warmth and abyssal calm |
| 🎭 Best For | Roleplay, character dialogue, introspection, lore writing, creative prose |
## 🧬 Merge Overview
Abyssal-Seraph-12B was created through a multi-stage, precision merge designed to blend expressive prose with poetic balance while maintaining model stability.
### 🌙 Stage 1

- ✨ Method: NearSwap (`t = 0.0008`)
- 🩵 Base: Vortex5/LunaMaid-12B
- 💮 Secondary: Vortex5/Vermilion-Sage-12B
**Stage 1 Configuration**

```yaml
name: First
models:
  - model: Vortex5/Vermilion-Sage-12B
merge_method: nearswap
base_model: Vortex5/LunaMaid-12B
parameters:
  t: 0.0008
dtype: bfloat16
tokenizer:
  source: Vortex5/LunaMaid-12B
```
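NearSwap's behavior can be sketched in a few lines of NumPy. The idea commonly attributed to the method: for each weight element, interpolate from the base toward the secondary model with strength `t / |delta|`, clamped to 1 — so elements where the two models already nearly agree swap fully toward the secondary, while elements that differ strongly stay close to the base. This is an illustrative sketch of that formula, not MergeKit's internal implementation; the function name `nearswap` is chosen here for clarity.

```python
import numpy as np

def nearswap(base: np.ndarray, secondary: np.ndarray, t: float) -> np.ndarray:
    """Illustrative NearSwap: interpolation strength t / |delta|, clamped to 1."""
    delta = np.abs(secondary - base)
    with np.errstate(divide="ignore", invalid="ignore"):
        # Where delta is tiny, t / delta blows up and the clamp pins it at 1
        # (a full swap to the secondary value); where delta is large, the
        # weight is near 0 and the base value barely moves.
        lweight = np.minimum(t / delta, 1.0)
    # 0 / 0 (identical elements) yields NaN; treat it as a no-op full swap.
    lweight = np.nan_to_num(lweight, nan=1.0)
    return base + lweight * (secondary - base)
```

With the tiny `t = 0.0008` used here, the merge is deliberately conservative: only near-identical weights are pulled over, which is why NearSwap tends to preserve the base model's stability.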
### 🩶 Stage 2

- ⚙️ Method: NearSwap (`t = 0.0008`)
- 🖤 Base: Vortex5/Dark-Quill-12B
- 💫 Secondary: inflatebot/MN-12B-Mag-Mell-R1
**Stage 2 Configuration**

```yaml
name: Second
models:
  - model: inflatebot/MN-12B-Mag-Mell-R1
merge_method: nearswap
base_model: Vortex5/Dark-Quill-12B
parameters:
  t: 0.0008
dtype: bfloat16
```
### 🌌 Stage 3 — Final Merge

- ⚖️ Method: Karcher mean (`tol = 1e-9`, `max_iter = 20000`)
- 🜂 Inputs: First + Second
- 💎 Purpose: to geometrically fuse both intermediates into a single coherent model.
**Final Merge Configuration**

```yaml
models:
  - model: First
  - model: Second
merge_method: karcher
dtype: bfloat16
parameters:
  tol: 1e-9
  max_iter: 20000
tokenizer:
  source: First
```
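The Karcher (Riemannian) mean generalizes averaging to curved spaces: instead of a plain element-wise mean, it iteratively finds the point minimizing squared geodesic distance to all inputs, which is where the `tol` and `max_iter` parameters come in. Below is a minimal sketch of that iteration for vectors treated as directions on the unit hypersphere, with magnitudes averaged separately — an assumption made here for illustration, not a description of MergeKit's exact tensor handling.

```python
import numpy as np

def karcher_mean(tensors, tol=1e-9, max_iter=20000):
    """Illustrative Karcher mean of vectors, averaged as directions on the
    unit hypersphere (magnitudes averaged separately, as a simplification)."""
    vecs = [t / np.linalg.norm(t) for t in tensors]
    mean = vecs[0].copy()
    for _ in range(max_iter):
        # Log map: express each point as a tangent vector at the current mean.
        tangent = np.zeros_like(mean)
        for v in vecs:
            cos = np.clip(np.dot(mean, v), -1.0, 1.0)
            theta = np.arccos(cos)
            if theta > 1e-12:
                tangent += theta * (v - cos * mean) / np.sin(theta)
        tangent /= len(vecs)
        step = np.linalg.norm(tangent)
        if step < tol:  # converged: tangent average is (near) zero
            break
        # Exp map: walk along the averaged tangent back onto the sphere.
        mean = np.cos(step) * mean + np.sin(step) * (tangent / step)
        mean /= np.linalg.norm(mean)
    scale = np.mean([np.linalg.norm(t) for t in tensors])
    return scale * mean
```

For two inputs, this converges to the geodesic midpoint — the "geometric blend" described above — rather than the chord midpoint a linear average would give, which is why the method is credited with keeping the fused weights well-conditioned.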
## 🌑🜂 Acknowledgements 🜂🌑
- ⚙️ mradermacher — for static and imatrix quantization
- 🜛 DeathGodlike — for EXL3 quants
- 🩶 All original model authors and contributors whose work made this model possible.