---
library_name: transformers
license: mit
---
# LLM Model Merging
## YouTube Tutorial
<div align="center">
<a href="https://youtu.be/gNXBp3wttFU">Model Merging: Merge LLMs to Create Frankestein Models - Python, HuggingFace, Mergekit</a>
<br>
<br>
<a href="https://youtu.be/gNXBp3wttFU">
<img src="./thumbnail1-button.png" height="85%" width="85%"/>
</a>
</div>
## GitHub
You can find the GitHub repository here: https://github.com/uygarkurt/Model-Merge
In this specific case, I typed `llama-3` into the Open LLM Leaderboard, took the three best models, merged them, and created
a higher-ranking model without any training.
As the main library, we will be using [mergekit](https://github.com/arcee-ai/mergekit).
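
Mergekit drives the whole merge from a single YAML configuration file. The snippet below is a minimal sketch of such a config, assuming a DARE-TIES merge of three Llama-3-based checkpoints; the model names, merge method, and parameter values are illustrative placeholders, not the exact setup used in the video.

```yaml
# Hypothetical mergekit config (sketch): model names, merge method,
# and parameter values are placeholders for illustration only.
models:
  - model: example-org/llama-3-8b-finetune-a   # placeholder leaderboard checkpoint
    parameters:
      density: 0.5
      weight: 0.4
  - model: example-org/llama-3-8b-finetune-b   # placeholder leaderboard checkpoint
    parameters:
      density: 0.5
      weight: 0.3
  - model: example-org/llama-3-8b-finetune-c   # placeholder leaderboard checkpoint
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: meta-llama/Meta-Llama-3-8B   # assumed common base the finetunes share
dtype: bfloat16
```

With mergekit installed, running the merge is typically a single command such as `mergekit-yaml config.yaml ./merged-model --copy-tokenizer`, and the resulting folder can be loaded with `transformers` or pushed to the Hugging Face Hub like any other model.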
<br/>
<div align="center">
<a href="">
<img alt="open-source-image"
src="https://img.shields.io/badge/%E2%9D%A4%EF%B8%8F_Open_Source-%2350C878?style=for-the-badge"/>
</a>
<a href="https://youtu.be/gNXBp3wttFU">
<img alt="youtube-tutorial"
src="https://img.shields.io/badge/YouTube_Tutorial-grey?style=for-the-badge&logo=YouTube&logoColor=%23FF0000"/>
</a>
<a href="https://github.com/uygarkurt/Model-Merge">
<img alt="github-image"
src="https://img.shields.io/badge/github-%23121011.svg?style=for-the-badge&logo=github&logoColor=white"/>
</a>
</div>