ChloeAuYeung committed: Update README.md

README.md CHANGED
@@ -174,6 +174,7 @@ inputs = tokenizer('北京的景点:故宫、天坛、万里长城等。\n深
 inputs = inputs.cuda()
 generated_ids = model.generate(inputs, max_new_tokens=70, eos_token_id=tokenizer.eos_token_id, repetition_penalty=1.1)
 print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True))
+```
 
 ## 局限性与免责申明
 
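For context on the snippet in the diff: `repetition_penalty=1.1` in `model.generate` discourages tokens that have already been generated. A minimal sketch of the penalty rule in pure Python (the rescaling scheme used by the transformers processor, shown here on a toy logit list; `apply_repetition_penalty` is an illustrative helper, not a transformers API):

```python
def apply_repetition_penalty(logits, generated_ids, penalty=1.1):
    """Rescale logits of tokens already generated.

    Positive logits are divided by the penalty and negative logits are
    multiplied by it, so previously seen tokens become less likely.
    """
    out = list(logits)
    for tok in set(generated_ids):
        if out[tok] > 0:
            out[tok] /= penalty
        else:
            out[tok] *= penalty
    return out

# Toy vocabulary of 3 tokens; tokens 0 and 1 were already generated.
logits = [2.0, -1.0, 0.5]
penalized = apply_repetition_penalty(logits, generated_ids=[0, 1], penalty=1.1)
# Token 0: 2.0 / 1.1; token 1: -1.0 * 1.1; token 2 is untouched.
```

A `penalty` of 1.0 leaves the logits unchanged; values above 1.0 (such as the 1.1 used in the README) progressively suppress repetition.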