RichardErkhov committed on
Commit 50a91d2 • 1 parent: bc8f459

uploaded readme

Files changed (1): README.md (+104)
Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

PersianLLaMA-13B - GGUF
- Model creator: https://huggingface.co/ViraIntelligentDataMining/
- Original model: https://huggingface.co/ViraIntelligentDataMining/PersianLLaMA-13B/


| Name | Quant method | Size |
| ---- | ---- | ---- |
| [PersianLLaMA-13B.Q2_K.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q2_K.gguf) | Q2_K | 4.68GB |
| [PersianLLaMA-13B.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.IQ3_XS.gguf) | IQ3_XS | 5.17GB |
| [PersianLLaMA-13B.IQ3_S.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.IQ3_S.gguf) | IQ3_S | 5.45GB |
| [PersianLLaMA-13B.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q3_K_S.gguf) | Q3_K_S | 5.45GB |
| [PersianLLaMA-13B.IQ3_M.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.IQ3_M.gguf) | IQ3_M | 5.75GB |
| [PersianLLaMA-13B.Q3_K.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q3_K.gguf) | Q3_K | 6.08GB |
| [PersianLLaMA-13B.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q3_K_M.gguf) | Q3_K_M | 6.08GB |
| [PersianLLaMA-13B.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q3_K_L.gguf) | Q3_K_L | 6.63GB |
| [PersianLLaMA-13B.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.IQ4_XS.gguf) | IQ4_XS | 6.73GB |
| [PersianLLaMA-13B.Q4_0.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q4_0.gguf) | Q4_0 | 7.06GB |
| [PersianLLaMA-13B.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.IQ4_NL.gguf) | IQ4_NL | 7.1GB |
| [PersianLLaMA-13B.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q4_K_S.gguf) | Q4_K_S | 7.11GB |
| [PersianLLaMA-13B.Q4_K.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q4_K.gguf) | Q4_K | 7.52GB |
| [PersianLLaMA-13B.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q4_K_M.gguf) | Q4_K_M | 7.52GB |
| [PersianLLaMA-13B.Q4_1.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q4_1.gguf) | Q4_1 | 7.81GB |
| [PersianLLaMA-13B.Q5_0.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q5_0.gguf) | Q5_0 | 8.57GB |
| [PersianLLaMA-13B.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q5_K_S.gguf) | Q5_K_S | 8.57GB |
| [PersianLLaMA-13B.Q5_K.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q5_K.gguf) | Q5_K | 8.81GB |
| [PersianLLaMA-13B.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q5_K_M.gguf) | Q5_K_M | 8.81GB |
| [PersianLLaMA-13B.Q5_1.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q5_1.gguf) | Q5_1 | 9.33GB |
| [PersianLLaMA-13B.Q6_K.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q6_K.gguf) | Q6_K | 10.18GB |
| [PersianLLaMA-13B.Q8_0.gguf](https://huggingface.co/RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf/blob/main/PersianLLaMA-13B.Q8_0.gguf) | Q8_0 | 13.18GB |

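The sizes above are raw GGUF file sizes; at inference time you also need headroom for the KV cache and runtime buffers. As a rough, illustrative sketch (the quant subset and the ~2 GB overhead figure are assumptions, not measurements), you could pick the largest quant that fits a given memory budget like this:

```python
# Illustrative helper: choose the largest quant from the table above that
# fits a memory budget. Sizes (GB) are copied from the table; the 2 GB
# overhead for KV cache/runtime buffers is an assumption, not a measurement.
QUANTS = {
    "Q2_K": 4.68,
    "Q3_K_M": 6.08,
    "Q4_K_M": 7.52,
    "Q5_K_M": 8.81,
    "Q6_K": 10.18,
    "Q8_0": 13.18,
}

def pick_quant(budget_gb, overhead_gb=2.0):
    """Return the name of the largest quant whose file plus overhead fits."""
    fitting = {name: size for name, size in QUANTS.items()
               if size + overhead_gb <= budget_gb}
    if not fitting:
        return None  # nothing fits; consider a smaller model
    return max(fitting, key=fitting.get)

print(pick_quant(16))  # -> Q8_0 (a 16 GB budget fits every quant listed)
```

Lower quants trade quality for memory, so when several fit, the largest is usually the better default.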



Original model description:
---
license: cc-by-nc-4.0
language:
- fa
library_name: transformers
tags:
- text-generation-inference
inference: false
pipeline_tag: text-generation
---

55
+ # PersianLLaMA: Towards Building First Persian Large Language Model
56
+
57
+ <img src="https://huggingface.co/ViraIntelligentDataMining/PersianLLaMA-2-13B/resolve/main/persianllama.png" alt="PersianLLaMA" width=400/>
58
+
59
+ ## 🌟 Introduction
60
+ Welcome to the home of PersianLLaMA, the pioneering large language model for the Persian language. With 13 billion parameters, this model is trained on Persian Wikipedia corpus and designed to excel in multiple NLP tasks, setting a new benchmark for Persian language understanding and generation.
61
+
62
+ ## πŸ›  Model Description
63
+ PersianLLaMA is not just a model but a comprehensive tool for:
64
+ - πŸ“ **Text Generation**: Crafting coherent and contextually appropriate text.
65
+ - 🎯 **Instruct Tuning**: Executing tasks based on detailed instructions, ideal for scenarios where the model needs to adhere to specific guidelines or produce outputs tailored to particular requirements.
66
+ - ❓ **Question Answering**: Providing accurate answers to Persian queries.
67
+ - πŸ“Š **Text Summarization**: Condensing Persian texts into precise summaries.
68
+
69
+ This model has been collaboratively developed by a team of experts, including Mohammad Amin Abbasi, Arash Ghafouri, Mahdi Firouzmandi, Hassan Naderi, Behrouz Minaei Bidgoli.
70
+ ## πŸš€ Quick Start
71
+ To integrate PersianLLaMA into your project, follow these steps:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "ViraIntelligentDataMining/PersianLLaMA-13B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Persian for "This text is in Persian".
prompt = "این متن به فارسی است"
inputs = tokenizer(prompt, return_tensors="pt")
# Cap the generation length explicitly rather than relying on the default.
outputs = model.generate(inputs["input_ids"], max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
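The snippet above loads the original full-precision checkpoint with `transformers`. To run one of the GGUF quants from this repo instead, one common route is llama.cpp; the following is a hypothetical sketch (the chosen quant file is just an example from the table, and it assumes `huggingface_hub` is installed and llama.cpp is built locally):

```shell
# Download a single quant file from this repo (requires huggingface_hub).
huggingface-cli download RichardErkhov/ViraIntelligentDataMining_-_PersianLLaMA-13B-gguf \
  PersianLLaMA-13B.Q4_K_M.gguf --local-dir .

# Generate from a Persian prompt with llama.cpp's CLI.
./llama-cli -m PersianLLaMA-13B.Q4_K_M.gguf -p "سلام" -n 128
```

Any file from the table works the same way; only the filename changes.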

## 📈 Evaluation and Benchmarks
PersianLLaMA outperforms existing Persian models on the evaluation benchmarks reported in the paper cited below, highlighting its capabilities in natural language understanding and generation.


## 📜 Citing PersianLLaMA
If you find PersianLLaMA useful in your research, please consider citing:

```bibtex
@article{abbasi2023persianllama,
  title={PersianLLaMA: Towards Building First Persian Large Language Model},
  author={Abbasi, Mohammad Amin and others},
  journal={arXiv preprint arXiv:2312.15713},
  year={2023}
}
```


## 📄 License
PersianLLaMA is open-sourced under the CC BY-NC 4.0 license.