## Introduction
Yi-32b-x2-v2.0 is a Mixture-of-Experts (MoE) model created with mergekit and custom prompts. It is built from the following base models:
- Weyaxi/Bagel-Hermes-34B-Slerp
- one-man-army/UNA-34Beagles-32K-bf16-v1
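For reference, a two-expert mergekit-moe merge of models like these is typically described by a YAML configuration along the lines of the sketch below. The actual base model choice, gate prompts, and dtype used for Yi-32b-x2-v2.0 are not published here, so the values shown are placeholders.

```yaml
# Hypothetical mergekit-moe configuration; base model, gate prompts, and dtype are placeholders.
base_model: one-man-army/UNA-34Beagles-32K-bf16-v1
gate_mode: hidden        # route tokens by hidden-state similarity to the positive prompts
dtype: bfloat16
experts:
  - source_model: Weyaxi/Bagel-Hermes-34B-Slerp
    positive_prompts:
      - "placeholder prompt describing what this expert should handle"
  - source_model: one-man-army/UNA-34Beagles-32K-bf16-v1
    positive_prompts:
      - "placeholder prompt describing what this expert should handle"
```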
## Details
### Used Libraries
- mergekit
- transformers
## How to use
```python
# pip install transformers==4.35.2
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("sumo43/Yi-32b-x2-v2.0")
model = AutoModelForCausalLM.from_pretrained(
    "sumo43/Yi-32b-x2-v2.0"
)
```
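Once loaded, the model behaves like any other Transformers causal language model. A minimal generation sketch is shown below; the prompt text and generation settings are illustrative and not part of the original card.

```python
# Illustrative generation example; the prompt and settings are placeholders.
prompt = "Explain mixture-of-experts models in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```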