---
base_model: unsloth/llama-3-8b-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- sft
---
# Still in beta!

This model will serve as the base model for J.O.S.I.E.v4o and will be trained further.
# Uploaded model
- Developed by: Isaak-Carter
- License: apache-2.0
- Finetuned from model: unsloth/llama-3-8b-bnb-4bit
This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
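As a point of reference, here is a minimal sketch of how the 4-bit base model can be loaded with Unsloth and wrapped with LoRA adapters before fine-tuning. The `max_seq_length`, LoRA rank, `lora_alpha`, and `target_modules` values below are illustrative assumptions, not settings taken from this card.

```python
# Sketch: load the 4-bit base model with Unsloth and attach LoRA adapters.
# max_seq_length, r, lora_alpha and target_modules are assumed example values.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",  # base model listed above
    max_seq_length=2048,                       # assumed context length
    load_in_4bit=True,
)

# Attach LoRA adapters for parameter-efficient SFT.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,                                      # assumed LoRA rank
    lora_alpha=16,                             # assumed scaling factor
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)
```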
"per_device_train_batch_size": 2,
"gradient_accumulation_steps": 8,
"max_steps": 100,
"learning_rate": 2e-4,
"optim": "adamw_8bit",
"weight_decay": 0.01,
"lr_scheduler_type": "cosine"