ZeroXClem/Astral-Fusion-Neural-Happy-L3.1-8B
Astral-Fusion-Neural-Happy-L3.1-8B is a hybrid model blending magic, creativity, and dynamic storytelling. It excels at instruction following, immersive roleplaying, and magical narrative generation, fusing the finest qualities of Astral-Fusion (Llama 3.0), NIHAPPY (Llama 3.1), and NeuralMahou (Llama 3.0).
Family Tree
This model is a merge of the following models:
- ProdeusUnity/Astral-Fusion-8b-v0.0: A celestial, multi-purpose model that excels in instruction-following and role-playing. It merges storytelling and instructive content with a cosmic, mysterious flair.
- Arkana08/NIHAPPY-L3.1-8B-v0.09: A dynamic role-playing model, ideal for interactive storytelling, balancing creativity and intricate reasoning to produce compelling narratives.
- lemon07r/llama-3-NeuralMahou-8b: Infuses magical storytelling and creative imagination, perfect for generating fantasy-rich, immersive experiences with precision.
Model Lineage
A: ProdeusUnity/Astral-Fusion-8b-v0.0
- A fusion of several instructive and storytelling models like Celeste-Stable and Sao10K L3-8B-Stheno, this model provides a well-rounded approach to instruction-following while also being capable of generating cosmic-inspired narratives.
- Strong in roleplaying and narrative creation, with a focus on maintaining coherence in long-form outputs.
B: Arkana08/NIHAPPY-L3.1-8B-v0.09
- Merged with a focus on dynamic storytelling and rule-following. NIHAPPY excels in interactive roleplay situations where logical reasoning meets creative imagination.
- A perfect balance between storytelling and structured reasoning, making it ideal for immersive game worlds, character-driven dialogues, and dynamic experiences.
C: lemon07r/llama-3-NeuralMahou-8b
- This model brings a magical essence with its creative, fantasy-driven approach to language generation. Infused with fantasy-themed elements, it's perfect for magical narratives and imaginative storytelling.
- Trained to handle both technical and creative tasks, making it a versatile addition to this fusion.
Merge Details
This model was merged using the Model Stock merge method, carefully integrating the unique strengths of all three models to create a cohesive, powerful system. Here's the configuration used:
base_model: ProdeusUnity/Astral-Fusion-8b-v0.0
dtype: bfloat16
merge_method: model_stock
slices:
  - sources:
      - layer_range: [0, 32]
        model: Arkana08/NIHAPPY-L3.1-8B-v0.09
      - layer_range: [0, 32]
        model: lemon07r/llama-3-NeuralMahou-8b
      - layer_range: [0, 32]
        model: ProdeusUnity/Astral-Fusion-8b-v0.0
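The configuration above is a standard mergekit YAML. As a rough sketch (assuming mergekit is installed and the three source repos are accessible), the merge can be reproduced by saving the config to a file and invoking mergekit's `mergekit-yaml` CLI; the output directory name is illustrative:

```python
# Sketch: reproduce the merge locally with mergekit (pip install mergekit).
# The config mirrors the one above; the output path is illustrative.
import subprocess

MERGE_CONFIG = """\
base_model: ProdeusUnity/Astral-Fusion-8b-v0.0
dtype: bfloat16
merge_method: model_stock
slices:
  - sources:
      - layer_range: [0, 32]
        model: Arkana08/NIHAPPY-L3.1-8B-v0.09
      - layer_range: [0, 32]
        model: lemon07r/llama-3-NeuralMahou-8b
      - layer_range: [0, 32]
        model: ProdeusUnity/Astral-Fusion-8b-v0.0
"""

with open("merge-config.yaml", "w") as f:
    f.write(MERGE_CONFIG)

# mergekit-yaml <config> <output-dir> is mergekit's standard CLI entry point.
subprocess.run(
    ["mergekit-yaml", "merge-config.yaml", "./Astral-Fusion-Neural-Happy-L3.1-8B"],
    check=True,
)
```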
Key Features & Capabilities
1. Instruction Following & Reasoning:
Rooted in Astral-Fusion, this model excels in precise instruction-following and handling logical tasks. It maintains a celestial vibe while delivering clear, structured outputs.
2. Roleplay & Storytelling:
With the NIHAPPY model integrated, this model is perfect for crafting dynamic narratives, handling complex character dialogues, and maintaining immersive storylines in roleplaying contexts.
3. Magic & Imagination:
Thanks to NeuralMahou, this model shines in creative, magical storytelling. Whether you're crafting fantasy worlds or seeking magical responses, this model delivers imagination-rich outputs that blend logic and creativity seamlessly.
Use Cases & Applications
- Immersive Storytelling: Create fantasy-rich narratives, generate dynamic roleplaying characters, and write cosmic-inspired tales that blend magic, science, and imagination.
- Interactive Roleplay: With NIHAPPY's dynamic capabilities, the model is perfect for AI-based NPC dialogues, interactive worlds, and even game development.
- Instructional & Long-Form Texts: Well suited to long-form instructional content, the model switches effortlessly between creative and instructive modes (see the inference sketch below).
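A minimal inference sketch for these use cases, assuming the merged weights are published on the Hugging Face Hub under the repo id above and that the standard Llama 3.1 chat template ships with the tokenizer; the prompt and sampling settings are illustrative:

```python
# Minimal inference sketch with Hugging Face transformers (assumes a
# CUDA-capable GPU and that the repo id below hosts these merged weights).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ZeroXClem/Astral-Fusion-Neural-Happy-L3.1-8B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a cosmic storyteller who weaves magic into every tale."},
    {"role": "user", "content": "Tell a short story about a star that learns to sing."},
]

# The Llama 3.1 tokenizer carries a chat template, so this builds the full prompt.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```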
License
This model is open-sourced under the Apache-2.0 License, allowing free use, modification, and distribution with proper attribution.
Tags
- merge
- model_stock
- AstralFusion
- NIHAPPY
- NeuralMahou
- Llama3
- roleplaying
- storytelling
- instruction-following
- long-form-generation