Sandiago21 committed on
Commit
225d66b
1 Parent(s): 63caac3

Update README.md

Files changed (1):
  1. README.md +8 -5
README.md CHANGED
@@ -96,7 +96,7 @@ def generate_prompt(instruction: str, input_ctxt: str = None) -> str:
 
 Use the code below to get started with the model.
 
-1. You can directly call the model from HuggingFace using the following code snippet:
+1. You can git clone the repo, which contains also the artifacts for the base model for simplicity and completeness, and run the following code snippet to load the mode:
 
 ```python
 import torch
@@ -104,12 +104,14 @@ from peft import PeftConfig, PeftModel
 from transformers import GenerationConfig, LlamaTokenizer, LlamaForCausalLM
 
 MODEL_NAME = "Sandiago21/llama-13b-hf-prompt-answering"
-BASE_MODEL = "decapoda-research/llama-13b-hf"
 
 config = PeftConfig.from_pretrained(MODEL_NAME)
 
+# Setting the path to look at your repo directory, assuming that you are at that directory when running this script
+config.base_model_name_or_path = "decapoda-research/llama-7b-hf/"
+
 model = LlamaForCausalLM.from_pretrained(
-    BASE_MODEL,
+    config.base_model_name_or_path,
     load_in_8bit=True,
     torch_dtype=torch.float16,
     device_map="auto",
@@ -155,7 +157,7 @@ print(response)
 >>> The capital city of Greece is Athens and it borders Turkey, Bulgaria, Macedonia, Albania, and the Aegean Sea.
 ```
 
-1. You can git clone the repo, which contains also the artifacts for the base model for simplicity and completeness, and run the following code snippet to load the mode:
+2. You can directly call the model from HuggingFace using the following code snippet:
 
 ```python
 import torch
@@ -163,11 +165,12 @@ from peft import PeftConfig, PeftModel
 from transformers import GenerationConfig, LlamaTokenizer, LlamaForCausalLM
 
 MODEL_NAME = "Sandiago21/llama-13b-hf-prompt-answering"
+BASE_MODEL = "decapoda-research/llama-13b-hf"
 
 config = PeftConfig.from_pretrained(MODEL_NAME)
 
 model = LlamaForCausalLM.from_pretrained(
-    config.base_model_name_or_path,
+    BASE_MODEL,
     load_in_8bit=True,
     torch_dtype=torch.float16,
     device_map="auto",
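The hunk headers above reference a `generate_prompt(instruction, input_ctxt)` helper from the same README, whose body is not shown in this diff. As a rough guide to what such a helper typically does for instruction-tuned LLaMA adapters, here is a minimal sketch using the common Alpaca-style template; the exact wording of the template is an assumption, not taken from the repo:

```python
# Hypothetical sketch of a generate_prompt helper with the signature shown
# in the hunk headers. The Alpaca-style template below is an assumption;
# the actual template used by the repo is not visible in this diff.

def generate_prompt(instruction: str, input_ctxt: str = None) -> str:
    """Build an instruction-following prompt, optionally with input context."""
    if input_ctxt:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_ctxt}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = generate_prompt("What is the capital city of Greece?")
```

The resulting string is what gets tokenized and passed to `model.generate(...)` in the snippets above; the model's answer is whatever it produces after the `### Response:` marker.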