Update README.md
README.md CHANGED
@@ -1,3 +1,20 @@
+---
+license: apache-2.0
+datasets:
+- stingning/ultrachat
+- TIGER-Lab/MathInstruct
+- ise-uiuc/Magicoder-Evol-Instruct-110K
+- OpenAssistant/oasst2
+- teknium/openhermes
+- bigcode/commitpackft
+- Open-Orca/SlimOrca
+- ise-uiuc/Magicoder-OSS-Instruct-75K
+language:
+- en
+library_name: transformers
+base_model:
+- mllmTeam/PhoneLM-0.5B
+---
 PhoneLM-0.5B-Instruct is a 0.5 billion parameter decoder-only language model.
 
 ## Usage
@@ -49,4 +66,4 @@ The model is a decoder-only transformer architecture with the following modifica
 * **Tokenizer**: We use the SmolLM ([Allal et al., 2024](https://huggingface.co/blog/smollm)) tokenizer with a vocabulary size of 49,152.
 
 ## LICENSE
-* This repository is released under the [Apache-2.0](https://huggingface.co/mllmTeam/PhoneLM-0.5B-Instruct/blob/main/LICENSE) License.
+* This repository is released under the [Apache-2.0](https://huggingface.co/mllmTeam/PhoneLM-0.5B-Instruct/blob/main/LICENSE) License.
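The README's Usage section is unchanged by this commit and its body is not shown in the hunks above. For context only, a model card published with `library_name: transformers` is typically loaded along the lines of the sketch below. The model id comes from the card's metadata; `trust_remote_code`, the dtype, and the chat-template call are assumptions and may differ from the repository's actual Usage instructions.

```python
# Minimal sketch, not the repository's official example.
# Assumptions: the model loads through the standard Auto classes, may need
# trust_remote_code=True for a custom architecture, and ships a chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mllmTeam/PhoneLM-0.5B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    trust_remote_code=True,
)

# Build an instruct-style prompt from a chat message and generate a reply.
messages = [{"role": "user", "content": "What is a decoder-only language model?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```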