---
base_model:
- MSL7/INEX12-7b
- yam-peleg/Experiment26-7B
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
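
The card declares `library_name: transformers`, so the merged model loads like any causal LM. A minimal usage sketch; the repo id `your-name/merged-7B` is a placeholder for wherever this merge is published:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; substitute the actual location of this merge.
model_id = "your-name/merged-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the `dtype: bfloat16` used for the merge
    device_map="auto",
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
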
## Merge Details
### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method using [MSL7/INEX12-7b](https://huggingface.co/MSL7/INEX12-7b) as a base.
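
In brief: DARE randomly drops each delta-from-base parameter with probability `1 - density` and rescales the survivors by `1 / density`; TIES then elects a majority sign per parameter before the weighted deltas are added back to the base. A toy sketch of the idea on a single tensor, not mergekit's implementation (the function name is made up; the defaults mirror the `density`/`weight` values from the configuration below):

```python
import torch

def dare_ties_tensor(base, finetuned, density=0.53, weight=0.6, seed=0):
    """Toy DARE TIES merge for one weight tensor and a single donor model."""
    torch.manual_seed(seed)
    delta = finetuned - base  # task vector: what fine-tuning changed
    # DARE: keep each entry with probability `density`, rescale to preserve expectation.
    mask = torch.bernoulli(torch.full_like(delta, density))
    delta = delta * mask / density
    # TIES sign election is a no-op with one donor; with several donors, entries whose
    # sign disagrees with the elected majority sign would be zeroed before summing.
    return base + weight * delta
```
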
### Models Merged

The following models were included in the merge:
* [yam-peleg/Experiment26-7B](https://huggingface.co/yam-peleg/Experiment26-7B)
### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: MSL7/INEX12-7b
    # No parameters necessary for base model
  - model: yam-peleg/Experiment26-7B
    parameters:
      density: 0.53
      weight: 0.6
merge_method: dare_ties
base_model: MSL7/INEX12-7b
parameters:
  int8_mask: true
dtype: bfloat16
random_seed: 0
```
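
To reproduce the merge, this configuration can be saved as `config.yml` and passed to mergekit's `mergekit-yaml` CLI (e.g. `mergekit-yaml config.yml ./merged`). The Python sketch below shows the same flow; the `MergeConfiguration`, `run_merge`, and `MergeOptions` entry points and their option names are assumptions from mergekit's Python API and may differ across versions:

```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Assumed mergekit entry points; verify against your installed version.
with open("config.yml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./merged",  # output directory for the merged weights
    options=MergeOptions(cuda=False, copy_tokenizer=True),
)
```
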