---
library_name: peft
---
## Training
Check out our GitHub repository for the full training pipeline: https://github.com/AI4Finance-Foundation/FinGPT/tree/master/fingpt/FinGPT_Forecaster
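The data preparation, prompt construction, and fine-tuning scripts all live in the repo above. For orientation only, the sketch below shows how a LoRA adapter is typically attached to the base model with PEFT; the rank, target modules, and other hyperparameters here are illustrative assumptions, not the values used to train this checkpoint.

```python
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

# Illustrative LoRA setup only -- see the FinGPT_Forecaster repo for the
# actual configuration and training scripts used for this adapter.
base_model = AutoModelForCausalLM.from_pretrained(
    'meta-llama/Llama-2-7b-chat-hf',
    torch_dtype=torch.float16,
    device_map="auto",
)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                   # assumed rank, not from this model card
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # assumed attention projections
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # sanity check: only the LoRA weights are trainable
```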
## Inference
```python
import torch

from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

# Load the base Llama-2-7b-chat model
base_model = AutoModelForCausalLM.from_pretrained(
    'meta-llama/Llama-2-7b-chat-hf',
    trust_remote_code=True,
    device_map="auto",
    torch_dtype=torch.float16,   # optional if you have enough VRAM
)
tokenizer = AutoTokenizer.from_pretrained('meta-llama/Llama-2-7b-chat-hf')

# Attach the FinGPT-Forecaster LoRA adapter on top of the base model
model = PeftModel.from_pretrained(base_model, 'FinGPT/fingpt-forecaster_dow30_llama2-7b_lora')
model = model.eval()
```
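With the adapter loaded, the model generates like any other causal LM. The snippet below is a minimal sketch: the prompt string is a placeholder rather than the exact prompt template FinGPT-Forecaster was trained on (see the GitHub repo above for that), and `max_new_tokens` is an arbitrary choice.

```python
# Minimal generation sketch -- the prompt format here is illustrative only;
# use the prompt template from the FinGPT_Forecaster repo for best results.
prompt = "[INST] Analyze the recent news and price movement of AAPL and forecast next week's direction. [/INST]"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```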
### Framework versions

- PEFT 0.5.0