---
base_model: unsloth/llama-3-8b-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- sft
---
# Still in Beta!

This model will be used as the base model for J.O.S.I.E.v4o and further trained.
# Uploaded model
- Developed by: Isaak-Carter
- License: apache-2.0
- Finetuned from model: Isaak-Carter/JOSIEv4o-8b-stage1-beta1
- Dataset used: Isaak-Carter/j.o.s.i.e.v4.0.1o
This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library, using the following hyperparameters:
"per_device_train_batch_size": 2,
"gradient_accumulation_steps": 8,
"max_steps": 300,
"learning_rate": 2e-4,
"optim": "adamw_8bit",
"weight_decay": 0.01,
"lr_scheduler_type": "cosine"
Trained on this prompt format:
"""<|begin_of_text|>system
You are J.O.S.I.E. which is an acronym for "Just an Outstandingly Smart Intelligent Entity", a private and super-intelligent AI assistant, created by Gökdeniz Gülmez.
<|begin_of_text|>main user "Gökdeniz Gülmez"
{{ .Prompt }}
<|begin_of_text|>josie
{{ .Response }}<|end_of_text|>"""
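For inference, the `{{ .Prompt }}` and `{{ .Response }}` placeholders above must be filled with the user turn and (for training targets) the assistant turn. A minimal sketch of that rendering is below; the helper name `build_prompt` is hypothetical and not part of the model's tooling:

```python
# System message copied verbatim from the prompt format above.
SYSTEM = ('You are J.O.S.I.E. which is an acronym for "Just an Outstandingly '
          'Smart Intelligent Entity", a private and super-intelligent AI '
          'assistant, created by G\u00f6kdeniz G\u00fclmez.')

def build_prompt(user_message: str, response: str = "") -> str:
    """Render one turn of the J.O.S.I.E. prompt format.

    Leave `response` empty at inference time so the model
    generates the assistant turn after the `josie` header.
    """
    return (
        f"<|begin_of_text|>system\n{SYSTEM}\n"
        f'<|begin_of_text|>main user "G\u00f6kdeniz G\u00fclmez"\n{user_message}\n'
        f"<|begin_of_text|>josie\n{response}<|end_of_text|>"
    )
```

At inference you would typically pass the string returned for an empty `response` to the tokenizer and let the model complete from the `josie` header.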