# BigWeave v29 122b
The BigWeave models are experiments in finding merge settings that increase model performance. The version number is simply a counter for the attempts and not an indicator of quality. Only merges that demonstrate good performance are kept and shared.
## Prompting Format
ChatML, Mistral, Vicuna.
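For reference, the three prompt styles listed above can be sketched as simple template functions. The token conventions below are the commonly used ones for these formats, not taken from this card, so verify them against your inference stack's chat template:

```python
# Minimal sketches of the three prompt formats named in the card.
# Exact whitespace/token conventions may differ per loader; treat as illustrative.

def chatml_prompt(system: str, user: str) -> str:
    """ChatML: turns delimited by <|im_start|>/<|im_end|> tokens."""
    return (f"<|im_start|>system\n{system}<|im_end|>\n"
            f"<|im_start|>user\n{user}<|im_end|>\n"
            f"<|im_start|>assistant\n")

def mistral_prompt(user: str) -> str:
    """Mistral instruct: user turn wrapped in [INST] ... [/INST]."""
    return f"[INST] {user} [/INST]"

def vicuna_prompt(user: str) -> str:
    """Vicuna: plain USER/ASSISTANT role labels."""
    return f"USER: {user}\nASSISTANT:"
```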
## Merge process
This is a self-merge of 152334H/miqu-1-70b-sf. Layers are repeated in groups of 4 with a 2-layer overlap. The first and last 9 layers are not repeated.
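The slice pattern above can be sketched as a small generator, assuming mergekit-style half-open `[start, end)` layer ranges over the base model's 80 layers (the function name and parameters are illustrative, not from the card):

```python
def make_slices(n_layers=80, edge=11, group=4, overlap=2):
    """Build the passthrough slice list: an 11-layer head slice, then
    4-layer groups stepping by (group - overlap) = 2 layers, then an
    11-layer tail slice. Only layers inside the overlapping region are
    duplicated; the outermost layers appear once."""
    step = group - overlap          # 2: each group re-uses 2 layers of the previous one
    slices = [[0, edge]]            # head: layers 0..10
    start = edge - overlap          # 9: first repeated group begins inside the head slice
    last_start = n_layers - edge    # 69: where the tail slice begins
    while start < last_start:
        slices.append([start, start + group])
        start += step
    slices.append([last_start, n_layers])  # tail: layers 69..79
    return slices
```

Running `make_slices()` reproduces the 32 slices in the configuration below, from `[0, 11]` through `[69, 80]`.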
Merge configuration:

```yaml
slices:
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [0, 11]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [9, 13]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [11, 15]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [13, 17]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [15, 19]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [17, 21]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [19, 23]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [21, 25]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [23, 27]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [25, 29]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [27, 31]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [29, 33]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [31, 35]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [33, 37]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [35, 39]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [37, 41]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [39, 43]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [41, 45]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [43, 47]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [45, 49]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [47, 51]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [49, 53]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [51, 55]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [53, 57]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [55, 59]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [57, 61]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [59, 63]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [61, 65]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [63, 67]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [65, 69]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [67, 71]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [69, 80]
merge_method: passthrough
dtype: float16
```
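As a sanity check on the configuration, summing the slice lengths gives the merged model's layer count. The slice list below is transcribed from the config above; the parameter-count remark at the end is a rough estimate, not a figure from the card:

```python
# Count decoder layers produced by the passthrough merge:
# an 11-layer head slice, thirty 4-layer slices stepping by 2,
# and an 11-layer tail slice (half-open [start, end) ranges).
slices = [[0, 11]] + [[s, s + 4] for s in range(9, 69, 2)] + [[69, 80]]
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 142 layers, vs. 80 in the 70B base model
```

At roughly 0.86B parameters per Llama-70B layer plus embeddings, 142 layers lands in the ~122B range, consistent with the model's name.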