Update README.md
README.md CHANGED
@@ -31,7 +31,7 @@ It is important to note that the primary intended use case of this model is to c
 # pip install -q transformers
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
-model = "HuggingFaceTB/finemath-ablation-
+model = "HuggingFaceTB/finemath-ablation-3plus-160B"
 device = "cuda" # for GPU usage or "cpu" for CPU usage
 
 tokenizer = AutoTokenizer.from_pretrained(model)
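The hunk above only shows the setup portion of the quickstart; the loading and generation lines fall outside the diff context. As a hedged sketch, assuming the README continues with the standard `transformers` generation pattern (the prompt, variable names, and `max_new_tokens` value below are illustrative, not quoted from the file):

```python
# pip install -q transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

model = "HuggingFaceTB/finemath-ablation-3plus-160B"  # repo name as updated in the hunk above
device = "cuda"  # for GPU usage or "cpu" for CPU usage

tokenizer = AutoTokenizer.from_pretrained(model)
# Everything below is outside the hunk, so it is an assumption based on the
# usual transformers quickstart rather than a quote from the README.
lm = AutoModelForCausalLM.from_pretrained(model).to(device)
inputs = tokenizer.encode("Solve: 12 * 7 =", return_tensors="pt").to(device)
outputs = lm.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0]))
```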
@@ -48,12 +48,12 @@ We are releasing intermediate checkpoints for this model at intervals of every 1
 
 You can load a specific model revision with `transformers` using the argument `revision`:
 ```python
-model = AutoModelForCausalLM.from_pretrained("HuggingFaceTB/finemath-ablation-
+model = AutoModelForCausalLM.from_pretrained("HuggingFaceTB/finemath-ablation-3plus-160B", revision="10B")
 ```
 You can access all the revisions for the models via the following code:
 ```python
 from huggingface_hub import list_repo_refs
-out = list_repo_refs("HuggingFaceTB/finemath-ablation-
+out = list_repo_refs("HuggingFaceTB/finemath-ablation-3plus-160B")
 print([b.name for b in out.branches])
 ```
 
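Taken together, the two changed snippets point at the same repo, so a small combined sketch (an illustration, not part of the README diff) is: enumerate the intermediate-checkpoint branches with `list_repo_refs`, then load a chosen revision via `revision=`. Only the `"10B"` branch name appears in the diff; any other names printed are whatever branches the repo actually exposes.

```python
from huggingface_hub import list_repo_refs
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "HuggingFaceTB/finemath-ablation-3plus-160B"

# List every released revision (branch) of the ablation model.
out = list_repo_refs(repo)
revisions = [b.name for b in out.branches]
print(revisions)  # exact branch names beyond "main" and "10B" are an assumption

# Load one intermediate checkpoint by branch name ("10B" comes from the diff).
revision = "10B"
tokenizer = AutoTokenizer.from_pretrained(repo, revision=revision)
model = AutoModelForCausalLM.from_pretrained(repo, revision=revision)
```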