Domain-Fusion-L3-8B
Recommended ST Presets: Domain Fusion Presets
Models Merged
Lineage of internal models: Hathor 0.1 x Poppy_0.72 = Hathor_Variant-X (slerp) | T-900 x BioLLM = T-900xBioLLM (slerp).
Configuration
The following YAML configuration was used to produce this model:
slices:
  - sources:
      - model: ./Hathor_Variant-X
        layer_range: [0, 32]
      - model: ./T-900xBioLLM
        layer_range: [0, 32]
merge_method: slerp
base_model: ./Hathor_Variant-X
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
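This configuration follows the mergekit SLERP format, so the merge can typically be reproduced with something like `mergekit-yaml config.yaml ./Domain-Fusion-L3-8B` (assuming mergekit is installed and the two source models are available locally). For intuition only, below is a minimal sketch of how spherical linear interpolation (SLERP) blends a pair of weight tensors at a given t; it is not mergekit's actual implementation, which handles edge cases, sharding, and dtype casting more carefully.

import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Illustrative SLERP between two weight tensors of the same shape."""
    v0_flat, v1_flat = v0.flatten().float(), v1.flatten().float()
    # Normalize to compute the angle between the two weight vectors.
    v0_n = v0_flat / (v0_flat.norm() + eps)
    v1_n = v1_flat / (v1_flat.norm() + eps)
    dot = torch.clamp(torch.dot(v0_n, v1_n), -1.0, 1.0)
    theta = torch.arccos(dot)
    if theta.abs() < 1e-4:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    sin_theta = torch.sin(theta)
    w0 = torch.sin((1 - t) * theta) / sin_theta
    w1 = torch.sin(t * theta) / sin_theta
    return (w0 * v0_flat + w1 * v1_flat).reshape(v0.shape).to(v0.dtype)

In the YAML above, the list of t values acts as a per-layer gradient: self_attn tensors shift from the base model (t near 0) toward T-900xBioLLM (t near 1) across the layer stack, mlp tensors follow the reverse schedule, and all remaining tensors use a fixed t of 0.5.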