---
base_model: []
tags:
- mergekit
- merge
---
# mr-v2.0.3-miqu
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the SLERP merge method.
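
SLERP (spherical linear interpolation) blends two checkpoints by interpolating each pair of weight tensors along the arc between them rather than along a straight line, which tends to preserve parameter norms better than a plain weighted average. Below is a minimal NumPy sketch of the idea for a single pair of flattened tensors; the function and its details are illustrative, not mergekit's internal implementation.

```python
import numpy as np

def slerp(v0: np.ndarray, v1: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors."""
    # Angle between the two parameter directions (computed on normalized copies)
    v0_u = v0 / (np.linalg.norm(v0) + eps)
    v1_u = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.sum(v0_u * v1_u), -1.0, 1.0)
    omega = np.arccos(dot)

    # Nearly parallel tensors: fall back to ordinary linear interpolation
    if omega < eps:
        return (1.0 - t) * v0 + t * v1

    sin_omega = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 + \
           (np.sin(t * omega) / sin_omega) * v1
```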
### Models Merged
The following models were included in the merge:
* /home/llm/mergequant/models/midnight-rose-70b-v2.0.1
* /home/llm/mergequant/models/wizard-tulu-dolphin-70b-v1.0-slerp
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: /home/llm/mergequant/models/midnight-rose-70b-v2.0.1
  - model: /home/llm/mergequant/models/wizard-tulu-dolphin-70b-v1.0-slerp
merge_method: slerp
base_model: /home/llm/mergequant/models/wizard-tulu-dolphin-70b-v1.0-slerp
parameters:
  t:
    - value: [0.4, 0.6, 0.5]
dtype: float16
```
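
The `t` parameter controls how far each layer of the merge moves from the `base_model` (wizard-tulu-dolphin) toward midnight-rose; when given as a list, mergekit treats the values as a gradient spread across the layer stack, so the blend here shifts from roughly 0.4 near the input layers to 0.6 in the middle and back to 0.5 near the output. The sketch below is only an approximation of how such a gradient could expand into one weight per layer, assuming an 80-layer 70B model for illustration:

```python
import numpy as np

# Illustrative only: expand a t gradient like [0.4, 0.6, 0.5] into one
# interpolation weight per layer (80 layers assumed for a 70B model).
anchors = [0.4, 0.6, 0.5]
num_layers = 80

positions = np.linspace(0.0, 1.0, num=len(anchors))      # relative depth of each anchor
layer_frac = np.linspace(0.0, 1.0, num=num_layers)       # relative depth of each layer
t_per_layer = np.interp(layer_frac, positions, anchors)  # piecewise-linear gradient

print(t_per_layer[0], t_per_layer[num_layers // 2], t_per_layer[-1])  # ~0.4, ~0.6, 0.5
```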