Multilingual Pruning Research
Description
A collection of models pruned with different pruning methods, sparsity types and ratios, calibration sets, and calibration languages. All model names follow the format:
`pruned-<original model name>-<pruning method>-<sparsity ratio>-<sparsity type>-<calibration dataset>-<calibration languages>`
Usage
Load the model using the `transformers` library, which can be installed with `pip install transformers`:
```python
# main.py
from transformers import AutoModelForCausalLM

# Replace <model name> with one of the models listed below.
model = AutoModelForCausalLM.from_pretrained("multilingual-pruning/<model name>")
```
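The naming scheme above can also be unpacked programmatically. Below is a minimal sketch, assuming the method, ratio, type, dataset, and language fields contain no hyphens; the trailing numeric suffix (e.g. `-0`) is not explained on this card, so it is kept as an opaque field rather than interpreted.

```python
# Sketch: unpack a repo name of the form
# pruned-<model>-<method>-<ratio>-<type>-<dataset>-<languages>
# (assumption: only the base model name may itself contain hyphens;
# the trailing "-0" suffix is undocumented and kept as-is).
def parse_pruned_name(repo_id: str) -> dict:
    name = repo_id.split("/")[-1]  # drop the org prefix, if present
    head, ratio, stype, dataset, lang, suffix = name.rsplit("-", 5)
    base, method = head.removeprefix("pruned-").rsplit("-", 1)
    return {
        "model": base,
        "method": method,
        "ratio": float(ratio),
        "sparsity_type": stype,
        "calibration_dataset": dataset,
        "calibration_language": lang,
        "suffix": suffix,
    }

info = parse_pruned_name(
    "multilingual-pruning/pruned-llama3-70b-instruct-sparsegpt-0.5-unstructured-mc4-sw-0"
)
# e.g. info["method"] is "sparsegpt", info["calibration_language"] is "sw"
```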
Dependencies
transformers==4.42.3
numpy==1.26.4
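To match the environment the models were prepared with, the pinned versions above can be installed directly (a setup sketch, assuming pip):

```shell
pip install transformers==4.42.3 numpy==1.26.4
```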
Models
72 models total; a selection:
- multilingual-pruning/pruned-llama3-70b-instruct-sparsegpt-0.5-unstructured-mc4-sw-0
- multilingual-pruning/pruned-llama3-70b-instruct-sparsegpt-0.5-unstructured-mc4-es-0
- multilingual-pruning/pruned-aya-23-35b-sparsegpt-0.5-unstructured-mc4-de-0
- multilingual-pruning/pruned-llama3-70b-instruct-sparsegpt-0.5-unstructured-mc4-en-0
- multilingual-pruning/pruned-llama3-70b-instruct-sparsegpt-0.5-unstructured-mc4-de-0
- multilingual-pruning/pruned-llama3-70b-instruct-sparsegpt-0.5-unstructured-mc4-ar-0
- multilingual-pruning/pruned-llama3-70b-instruct-wanda-0.5-unstructured-mc4-ar-0
- multilingual-pruning/pruned-llama3-70b-instruct-wanda-0.5-unstructured-mc4-es-0
- multilingual-pruning/pruned-llama3-70b-instruct-wanda-0.5-unstructured-mc4-zh-0
- multilingual-pruning/pruned-llama3-70b-instruct-wanda-0.5-unstructured-mc4-sw-0
Datasets
None public yet.