---
library_name: transformers
tags: []
---
|
|
|
# Model Card for Meta-Llama-3-8B-for-bank
|
|
|
This model, **Meta-Llama-3-8B-for-bank**, is a fine-tuned version of the `meta-llama/Meta-Llama-3-8B-Instruct` model. It is optimized for financial service-related tasks, enabling users to interact with the model using natural language for common financial operations such as balance inquiries, retrieving stock lists, buying stocks, and performing deposit/withdrawal transactions.
|
|
|
## Model Details

### Model Description
|
|
|
- **Model Name**: Meta-Llama-3-8B-for-bank
- **Base Model**: `meta-llama/Meta-Llama-3-8B-Instruct`
- **Fine-tuning Data**: Custom financial chat examples
- **Version**: 1.0
- **License**: [Model License (if any)]
- **Language**: English
|
|
|
### Model Type

- **Architecture**: LLaMA-3
- **Type**: Instruction-tuned language model
|
|
|
### Model Usage

This model is designed for financial service tasks such as the following (see the message-formatting sketch after this list):

- **Balance Inquiry**:
  - *Example*: "Can you provide the current balance for my account?"
- **Stock List Retrieval**:
  - *Example*: "Can you provide me with a list of my stocks?"
- **Stock Purchase**:
  - *Example*: "I'd like to buy stocks worth $1,000.00 in Tesla."
- **Deposit Transactions**:
  - *Example*: "I'd like to deposit $500.00 into my account."
- **Withdrawal Transactions**:
  - *Example*: "I'd like to withdraw $200.00 from my account."
- **Transaction History**:
  - *Example*: "I would like to view my transactions. Can you provide them?"
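
These example queries map directly onto the chat format used in the "How to Use" section. As a quick illustration, here is a minimal sketch of how the stock-purchase request might be expressed as chat messages; the system-prompt wording and the helper variables (`name`, `company`, `amount`) are illustrative assumptions, not a documented prompt format.

```python
# Sketch only: the exact system-prompt format expected by this fine-tune is not
# documented here; these messages mirror the usage example further below.
name = "Walter Sensei"  # illustrative user name (assumption)
company = "Tesla"       # illustrative company (assumption)
amount = 1000.00        # illustrative purchase amount in USD (assumption)

messages = [
    {"role": "system", "content": f"Hi {name}, I'm your assistant. How can I help you?"},
    {"role": "user", "content": f"I'd like to buy stocks worth ${amount:,.2f} in {company}."},
]
```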
|
|
|
### Inputs and Outputs

- **Inputs**: Natural language queries related to financial services.
- **Outputs**: Textual responses or actions based on the input query.
|
|
|
### Fine-tuning

This model has been fine-tuned with a dataset specifically created to simulate financial service interactions, covering a variety of questions related to account management and stock trading.
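
The exact training data and procedure are not published with this card. Purely as a hedged sketch, chat-style fine-tuning examples for a Llama-3 Instruct model are commonly stored as message lists and rendered to training text with the tokenizer's chat template; the sample below is an assumption about what such an example could look like, not the actual dataset.

```python
from transformers import AutoTokenizer

# Hypothetical training example in chat format (assumption, not the real dataset)
example = {
    "messages": [
        {"role": "system", "content": "Hi Walter Sensei, I'm your assistant. How can I help you?"},
        {"role": "user", "content": "I'd like to deposit $500.00 into my account."},
        {"role": "assistant", "content": "Sure, I have deposited $500.00 into your account."},
    ]
}

tokenizer = AutoTokenizer.from_pretrained("jeromecondere/Meta-Llama-3-8B-for-bank")

# Render the example into a single training string using the model's chat template
text = tokenizer.apply_chat_template(example["messages"], tokenize=False)
print(text)
```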
|
|
|
## Intended Use

This model is intended for integration into financial chatbots, virtual assistants, or other systems requiring automated handling of financial queries.
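
For such integrations, one straightforward pattern is to keep a running message history and call `model.generate` on the templated conversation. The helper below is a minimal sketch under that assumption; it reuses the loading and sampling settings from the "How to Use" section and is not an official serving recipe.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jeromecondere/Meta-Llama-3-8B-for-bank"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to("cuda")

def chat(history, user_message, max_new_tokens=100):
    """Append a user turn, generate a reply, and return (reply, updated history)."""
    history = history + [{"role": "user", "content": user_message}]
    input_ids = tokenizer.apply_chat_template(
        history, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(
        input_ids=input_ids,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.1,
        top_k=50,
        top_p=0.95,
    )
    # Decode only the newly generated tokens (the assistant reply)
    reply = tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True)
    return reply, history + [{"role": "assistant", "content": reply}]

history = [{"role": "system", "content": "Hi, I'm your banking assistant. How can I help you?"}]
reply, history = chat(history, "Can you provide the current balance for my account?")
print(reply)
```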
|
|
|
## Limitations

- **Domain Specificity**: The model may not perform well outside finance-related tasks.
- **Misinterpretation Risks**: There is a potential risk of misunderstanding complex or ambiguous queries.
|
|
|
## Ethical Considerations

- **Bias**: Because the model was trained on synthetic data, it may not represent all user demographics.
- **Privacy**: The model should be used in compliance with applicable financial privacy regulations.
|
|
|
## How to Use
|
|
|
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("jeromecondere/Meta-Llama-3-8B-for-bank")
model = AutoModelForCausalLM.from_pretrained("jeromecondere/Meta-Llama-3-8B-for-bank").to("cuda")

# Example usage
name = "Walter Sensei"
company = "Amazon Inc."  # illustrative value, unused in this particular query
stock_value = 42.24      # illustrative value, unused in this particular query
messages = [
    {"role": "system", "content": f"Hi {name}, I'm your assistant. How can I help you?"},
    {"role": "user", "content": "Yo, can you just give me the balance of my account?"},
]

# Render the conversation with the chat template (for inspection)
res1 = tokenizer.apply_chat_template(messages, tokenize=False)
print(res1 + '\n\n')

# Tokenize the conversation for the model
input_ids = tokenizer.apply_chat_template(
    messages,
    truncation=True,
    add_generation_prompt=True,
    return_tensors="pt",
).to("cuda")

# Inference
outputs = model.generate(
    input_ids=input_ids,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.1,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```
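
In full precision, the 8B checkpoint needs roughly 32 GB of GPU memory. If that is tight, a common alternative is to load the weights in `bfloat16` and let `accelerate` place them with `device_map="auto"`. The sketch below assumes the `accelerate` package is installed and is otherwise interchangeable with the loading code above.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jeromecondere/Meta-Llama-3-8B-for-bank"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Half-precision loading to reduce GPU memory; device_map="auto" requires `accelerate`.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
```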