Update README.md
README.md
CHANGED
@@ -1,3 +1,65 @@
---
license: apache-2.0
tags:
- snowflake
- arctic
- moe
---

## Model Details

Arctic is a dense-MoE hybrid transformer architecture pre-trained from scratch by the Snowflake AI
Research Team. We are releasing model checkpoints for both the base and instruct-tuned versions of
Arctic under an Apache-2.0 license. This means you can use them freely in your own research,
prototypes, and products. Please see our blog post
[Snowflake Arctic: The Best LLM for Enterprise AI — Efficiently Intelligent, Truly Open](https://www.snowflake.com/blog/arctic-open-and-efficient-foundation-language-models-snowflake)
for more information on Arctic and links to other relevant resources, such as our series of cookbooks
covering topics like training your own custom MoE models, producing high-quality training data,
and much more.

* [Arctic-Base](https://huggingface.co/Snowflake/snowflake-arctic-base/)
* [Arctic-Instruct](https://huggingface.co/Snowflake/snowflake-arctic-instruct/)
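
If you want to pull one of these checkpoints locally, one option is `huggingface_hub`. This is only a
minimal sketch: the `local_dir` path is an example, and the checkpoints are very large, so make sure
you have the disk space and bandwidth for them.

```python
# Sketch: download the Arctic-Instruct weights with huggingface_hub.
# Assumes `pip install huggingface_hub`; adjust local_dir as needed.
# NOTE: the full checkpoint is very large; ensure sufficient disk space.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="Snowflake/snowflake-arctic-instruct",
    local_dir="snowflake-arctic-instruct",  # example destination directory
)
```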

For the latest details about Snowflake Arctic, including tutorials, please refer to our GitHub repo:
* https://github.com/Snowflake-Labs/snowflake-arctic

**Model developers** Snowflake AI Research Team

**License** Apache-2.0

**Input** Models input text only.

**Output** Models generate text and code only.

**Model Release Date** April 24th, 2024.

## Model Architecture

Arctic combines a 10B dense transformer model with a residual 128x3.66B MoE MLP, resulting in 480B
total and 17B active parameters chosen using top-2 gating. For more details about Arctic's model
architecture, training process, data, etc., [see our series of cookbooks](https://www.snowflake.com/en/data-cloud/arctic/cookbook/).
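
As a rough sanity check on those numbers (a back-of-the-envelope sketch, not an official parameter
breakdown), the totals follow directly from the dense and expert components:

```python
# Approximate parameter count for the dense-MoE hybrid described above
# (ignores embeddings and other comparatively small terms).
dense_params = 10e9      # 10B dense transformer
expert_params = 3.66e9   # each MoE MLP expert
num_experts = 128
top_k = 2                # top-2 gating activates 2 experts per token

total = dense_params + num_experts * expert_params  # ~478B, i.e. ~480B total
active = dense_params + top_k * expert_params       # ~17.3B, i.e. ~17B active
print(f"total ≈ {total / 1e9:.0f}B, active ≈ {active / 1e9:.0f}B")
```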

## Usage

As of 4/24/2024, we are actively working with the maintainers of `transformers` to include the Arctic
model implementation. Until this support is released, please follow these instructions to get the
required dependencies for using Arctic:

```bash
pip install git+https://github.com/Snowflake-Labs/transformers.git@arctic
```

Arctic leverages several features from [DeepSpeed](https://github.com/microsoft/DeepSpeed); you will need to
install the latest version of DeepSpeed to get all of the required features:

```bash
pip install "deepspeed>=0.14.2"
```
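
Before moving on, you can quickly confirm the environment. This is only a small sketch; it checks
installed package versions and assumes `packaging` is available (it usually ships alongside `pip`):

```python
# Sanity-check the installed dependencies for Arctic.
from importlib.metadata import version
from packaging.version import Version

print("transformers:", version("transformers"))  # should come from the Arctic fork
assert Version(version("deepspeed")) >= Version("0.14.2"), "DeepSpeed >= 0.14.2 required"
```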

### Inference

The Arctic GitHub page has several code snippets and examples around running inference:

* Example with pure HF (see the sketch below): https://github.com/Snowflake-Labs/snowflake-arctic/blob/main/inference
* Tutorial using vLLM: https://github.com/Snowflake-Labs/snowflake-arctic/tree/main/inference/vllm
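
For orientation, here is a minimal sketch of what the pure-HF route can look like. It is not the
official example (the linked pages are authoritative), and the full 480B model needs substantial
multi-GPU memory plus the quantization and sharding setup described there:

```python
# Minimal sketch: Hugging Face-style generation with Arctic-Instruct.
# Assumes the Arctic `transformers` fork and DeepSpeed installed as above,
# and that the model repo ships its custom modeling code (trust_remote_code).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Snowflake/snowflake-arctic-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # shard across available GPUs
)

messages = [{"role": "user", "content": "What is 1 + 1?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```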