legolasyiu committed
Commit e453c92
1 Parent(s): 5b6e541

Update README.md

Files changed (1)
  1. README.md +16 -11
README.md CHANGED
@@ -16,8 +16,23 @@ pipeline_tag: text-classification
 
 
 
-# code
+# Uploaded model
+
+- **Developed by:** EpistemeAI
+- **License:** apache-2.0
+- **Finetuned from model:** unsloth/gemma-2-9b-bnb-4bit
+
+This gemma2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
+
+[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
+
+## How to use
+This repository contains two versions of Meta-Llama-3.1-8B-Instruct, for use with transformers and with the original llama codebase.
+
+## Use with transformers
+Starting with transformers >= 4.43.0, you can run conversational inference using the Transformers pipeline abstraction or by leveraging the Auto classes with the generate() function.
 
+Make sure to update your transformers installation via `pip install --upgrade transformers`.
 
 ```python
 from unsloth import FastLanguageModel
@@ -45,16 +60,6 @@ outputs = model.generate(**inputs, max_new_tokens = 64, use_cache = True)
 tokenizer.batch_decode(outputs)
 ```
 
-# Uploaded model
-
-- **Developed by:** EpistemeAI
-- **License:** apache-2.0
-- **Finetuned from model:** unsloth/gemma-2-9b-bnb-4bit
-
-This gemma2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
-
-[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
-
 --
 
 ### Inputs and outputs
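The diff view elides the unchanged middle of the README's Unsloth snippet, so only its first and last lines appear above. For reference, a minimal runnable sketch of what such a snippet typically looks like, assuming the unsloth/gemma-2-9b-bnb-4bit checkpoint named in the model card; the prompt string, max_seq_length, and CUDA device are illustrative assumptions, not values taken from the original README:

```python
from unsloth import FastLanguageModel

# Load the 4-bit base checkpoint named in the model card (an assumption;
# substitute this repository's own checkpoint as appropriate).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "unsloth/gemma-2-9b-bnb-4bit",
    max_seq_length = 2048,  # illustrative value
    load_in_4bit = True,
)
FastLanguageModel.for_inference(model)  # switch to Unsloth's faster generation mode

# Placeholder prompt for illustration.
inputs = tokenizer(["Classify the sentiment: I loved this movie!"], return_tensors = "pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens = 64, use_cache = True)
print(tokenizer.batch_decode(outputs))
```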
 
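The added "Use with transformers" section mentions the pipeline abstraction and the Auto classes with generate() but stops short of showing the pipeline route. A short sketch, assuming the meta-llama/Meta-Llama-3.1-8B-Instruct id that the README text itself names (substitute this repository's id as needed); the messages and generation settings are illustrative:

```python
import torch
from transformers import pipeline

# Model id as named in the README text; an assumption for illustration.
model_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"

pipe = pipeline(
    "text-generation",
    model = model_id,
    model_kwargs = {"torch_dtype": torch.bfloat16},
    device_map = "auto",
)

# With transformers >= 4.43.0, the text-generation pipeline accepts
# chat-style message lists and applies the model's chat template.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who are you?"},
]
outputs = pipe(messages, max_new_tokens = 64)
print(outputs[0]["generated_text"][-1])  # the assistant's reply
```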