---
tags:
- generated_from_trainer
model-index:
- name: phi-600M-mix
  results: []
---

[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)

<details><summary>See axolotl config</summary>

axolotl version: `0.3.0`

```yaml
base_model: phi-600M-cont/checkpoint-5000
model_type: AutoModelForCausalLM
tokenizer_type: AutoTokenizer
trust_remote_code: true

load_in_8bit: false
load_in_4bit: false
strict: false

# max_steps: 8000
# pretraining_dataset: nampdn-ai/tiny-strange-textbooks
datasets:
  - path: math-ai/StackMathQA
    name: stackmathqa100k
    type:
      system_prompt: ""
      field_system: system
      field_instruction: Q
      field_output: A
      format: "[INST] {instruction} [/INST]"
      no_input_format: "[INST] {instruction} [/INST]"
    train_on_split: train[:10%]
  - path: SciPhi/textbooks-are-all-you-need-lite
    type: completion
    field: completion
    train_on_split: train[:10%]

dataset_prepared_path:
val_set_size: 0.001
output_dir: ./phi-600M-mix

sequence_len: 2048
sample_packing: true # currently unsupported
pad_to_sequence_len:

adapter:
lora_model_dir:
lora_r:
lora_alpha:
lora_dropout:
lora_target_linear:
lora_fan_in_fan_out:
lora_modules_to_save:

wandb_project: phine
wandb_entity: willfulbytes
wandb_watch:
wandb_name:
wandb_log_model:

gradient_accumulation_steps: 4
micro_batch_size: 1
num_epochs: 1
optimizer: paged_adamw_8bit
adam_beta2: 0.98
adam_epsilon: 0.0000001
max_grad_norm: 1.0
lr_scheduler: cosine
learning_rate: 1e-4
cosine_min_lr_ratio: 0.2

train_on_inputs: false
group_by_length: false
bf16: true
fp16: false
tf32: true

gradient_checkpointing: true
early_stopping_patience: false
resume_from_checkpoint:
local_rank:
logging_steps: 1
xformers_attention:
flash_attention: true

warmup_steps: 0
evals_per_epoch: 100
saves_per_epoch: 10
save_steps:
debug:
deepspeed:
weight_decay: 0.1
fsdp:
fsdp_config:
resize_token_embeddings_to_32x: true
special_tokens:
  pad_token: "<|endoftext|>"
```

</details><br>

# phi-600M-mix

This model was trained from the `phi-600M-cont/checkpoint-5000` checkpoint (not from scratch) on a mix of math-ai/StackMathQA and SciPhi/textbooks-are-all-you-need-lite; see the Axolotl config above for details.

It achieves the following results on the evaluation set:

- Loss: 1.6549
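
This corresponds to a token-level perplexity of exp(1.6549) ≈ 5.23 on the held-out split.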

## Model description

Judging by its name and the config above, this appears to be a ~600M-parameter phi-style causal language model, continued from a `phi-600M-cont` pretraining checkpoint and loaded with `trust_remote_code: true`.
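
The snippet below is a minimal usage sketch, not a verified recipe: the repo id `willfulbytes/phi-600M-mix` is a hypothetical placeholder inferred from the config's `output_dir` and wandb entity, and the `[INST]` template mirrors the StackMathQA formatting shown above.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id inferred from the config; substitute the real one.
repo_id = "willfulbytes/phi-600M-mix"

# trust_remote_code matches the Axolotl config above.
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

# The StackMathQA portion was trained with this instruction template.
prompt = "[INST] What is the derivative of x^2? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```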

## Intended uses & limitations

More information needed

## Training and evaluation data

Per the config above, training used two Hugging Face datasets: the first 10% of the `stackmathqa100k` configuration of [math-ai/StackMathQA](https://huggingface.co/datasets/math-ai/StackMathQA), with questions and answers rendered into an `[INST] {instruction} [/INST]` template, and the first 10% of [SciPhi/textbooks-are-all-you-need-lite](https://huggingface.co/datasets/SciPhi/textbooks-are-all-you-need-lite), used as raw completion text. A held-out 0.1% of the data (`val_set_size: 0.001`) served as the evaluation set.
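
For reference, the same slices can be reproduced with the `datasets` library; the field names `Q`, `A`, and `completion` come from the config above.

```python
from datasets import load_dataset

# The 10% training slices named in the config's train_on_split fields.
stackmathqa = load_dataset("math-ai/StackMathQA", "stackmathqa100k", split="train[:10%]")
textbooks = load_dataset("SciPhi/textbooks-are-all-you-need-lite", split="train[:10%]")

example = stackmathqa[0]
print(example["Q"])  # used as the instruction
print(example["A"])  # used as the output
print(textbooks[0]["completion"][:200])  # used as raw completion text
```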

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: paged 8-bit AdamW (`paged_adamw_8bit`) with betas=(0.9, 0.98) and epsilon=1e-07
- lr_scheduler_type: cosine, decaying to 20% of the peak rate (`cosine_min_lr_ratio: 0.2`; see the sketch below)
- num_epochs: 1
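
The `cosine_min_lr_ratio` setting is not captured by the auto-generated list. Below is a minimal sketch of the schedule it implies, assuming Axolotl's cosine decay bottoms out at the given ratio; plain `torch.optim.AdamW` stands in for the paged 8-bit optimizer actually used.

```python
import math

import torch
from torch.optim.lr_scheduler import LambdaLR

# Any module works for illustration; the real model is the phi checkpoint.
model = torch.nn.Linear(8, 8)

peak_lr = 1e-4          # learning_rate
min_lr_ratio = 0.2      # cosine_min_lr_ratio
total_steps = 8232      # last step in the results table below

# Plain AdamW stands in for bitsandbytes' paged_adamw_8bit here.
optimizer = torch.optim.AdamW(
    model.parameters(), lr=peak_lr, betas=(0.9, 0.98), eps=1e-7, weight_decay=0.1
)

def cosine_with_floor(step: int) -> float:
    """Cosine decay from 1.0 down to min_lr_ratio over total_steps (warmup_steps: 0)."""
    progress = min(step / total_steps, 1.0)
    return min_lr_ratio + (1.0 - min_lr_ratio) * 0.5 * (1.0 + math.cos(math.pi * progress))

scheduler = LambdaLR(optimizer, lr_lambda=cosine_with_floor)
```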

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.366         | 0.0   | 1    | 3.3037          |
| 2.5809        | 0.01  | 84   | 2.5172          |
| 2.5684        | 0.02  | 168  | 2.3902          |
| 2.6054        | 0.03  | 252  | 2.3144          |
| 2.2944        | 0.04  | 336  | 2.2658          |
| 2.2836        | 0.05  | 420  | 2.2178          |
| 2.4438        | 0.06  | 504  | 2.1837          |
| 2.1093        | 0.07  | 588  | 2.1460          |
| 2.1831        | 0.08  | 672  | 2.1220          |
| 2.3081        | 0.09  | 756  | 2.0990          |
| 1.9909        | 0.1   | 840  | 2.0850          |
| 2.114         | 0.11  | 924  | 2.0550          |
| 1.8529        | 0.12  | 1008 | 2.0410          |
| 2.1594        | 0.13  | 1092 | 2.0215          |
| 2.0632        | 0.14  | 1176 | 2.0035          |
| 1.9221        | 0.15  | 1260 | 1.9906          |
| 2.0664        | 0.16  | 1344 | 1.9861          |
| 1.931         | 0.17  | 1428 | 1.9708          |
| 1.9948        | 0.18  | 1512 | 1.9533          |
| 1.9229        | 0.19  | 1596 | 1.9464          |
| 2.0231        | 0.2   | 1680 | 1.9332          |
| 2.2535        | 0.21  | 1764 | 1.9232          |
| 1.8994        | 0.22  | 1848 | 1.9140          |
| 1.9913        | 0.23  | 1932 | 1.8935          |
| 1.8613        | 0.24  | 2016 | 1.8916          |
| 1.9724        | 0.25  | 2100 | 1.8790          |
| 1.9965        | 0.26  | 2184 | 1.8653          |
| 2.0012        | 0.27  | 2268 | 1.8648          |
| 1.9752        | 0.28  | 2352 | 1.8572          |
| 1.9709        | 0.29  | 2436 | 1.8504          |
| 1.7314        | 0.3   | 2520 | 1.8432          |
| 1.7373        | 0.31  | 2604 | 1.8470          |
| 1.93          | 0.32  | 2688 | 1.8353          |
| 1.7185        | 0.33  | 2772 | 1.8210          |
| 1.8435        | 0.34  | 2856 | 1.8201          |
| 1.8117        | 0.35  | 2940 | 1.8118          |
| 2.1292        | 0.36  | 3024 | 1.8095          |
| 1.7536        | 0.37  | 3108 | 1.8023          |
| 1.7596        | 0.38  | 3192 | 1.7956          |
| 1.9481        | 0.39  | 3276 | 1.7890          |
| 1.7915        | 0.4   | 3360 | 1.7872          |
| 1.8639        | 0.41  | 3444 | 1.7782          |
| 1.6688        | 0.42  | 3528 | 1.7754          |
| 1.6312        | 0.43  | 3612 | 1.7669          |
| 1.8053        | 0.45  | 3696 | 1.7602          |
| 1.8867        | 0.46  | 3780 | 1.7544          |
| 1.9305        | 0.47  | 3864 | 1.7546          |
| 1.7926        | 0.48  | 3948 | 1.7496          |
| 1.8326        | 0.49  | 4032 | 1.7436          |
| 1.7334        | 0.5   | 4116 | 1.7437          |
| 1.6552        | 0.51  | 4200 | 1.7348          |
| 1.6622        | 0.52  | 4284 | 1.7330          |
| 1.9858        | 0.53  | 4368 | 1.7303          |
| 1.7784        | 0.54  | 4452 | 1.7271          |
| 1.8752        | 0.55  | 4536 | 1.7222          |
| 1.5931        | 0.56  | 4620 | 1.7186          |
| 1.6785        | 0.57  | 4704 | 1.7131          |
| 1.8382        | 0.58  | 4788 | 1.7101          |
| 1.5888        | 0.59  | 4872 | 1.7081          |
| 1.8055        | 0.6   | 4956 | 1.7062          |
| 1.6869        | 0.61  | 5040 | 1.7021          |
| 1.8096        | 0.62  | 5124 | 1.6999          |
| 1.9318        | 0.63  | 5208 | 1.6980          |
| 1.6153        | 0.64  | 5292 | 1.6963          |
| 1.6556        | 0.65  | 5376 | 1.6924          |
| 1.4087        | 0.66  | 5460 | 1.6908          |
| 1.7946        | 0.67  | 5544 | 1.6881          |
| 1.6097        | 0.68  | 5628 | 1.6867          |
| 1.6397        | 0.69  | 5712 | 1.6847          |
| 1.7799        | 0.7   | 5796 | 1.6828          |
| 1.6216        | 0.71  | 5880 | 1.6809          |
| 1.5052        | 0.72  | 5964 | 1.6790          |
| 1.6931        | 0.73  | 6048 | 1.6773          |
| 1.5936        | 0.74  | 6132 | 1.6762          |
| 1.803         | 0.75  | 6216 | 1.6737          |
| 1.5175        | 0.76  | 6300 | 1.6719          |
| 1.6305        | 0.77  | 6384 | 1.6711          |
| 1.715         | 0.78  | 6468 | 1.6698          |
| 1.8779        | 0.79  | 6552 | 1.6686          |
| 1.6844        | 0.8   | 6636 | 1.6669          |
| 1.3624        | 0.81  | 6720 | 1.6658          |
| 1.5534        | 0.82  | 6804 | 1.6650          |
| 1.8579        | 0.83  | 6888 | 1.6648          |
| 1.6093        | 0.84  | 6972 | 1.6632          |
| 1.5325        | 0.85  | 7056 | 1.6618          |
| 1.6753        | 0.86  | 7140 | 1.6619          |
| 1.3612        | 0.87  | 7224 | 1.6611          |
| 1.4817        | 0.88  | 7308 | 1.6606          |
| 1.7252        | 0.89  | 7392 | 1.6599          |
| 1.7463        | 0.9   | 7476 | 1.6586          |
| 1.8894        | 0.91  | 7560 | 1.6581          |
| 1.545         | 0.92  | 7644 | 1.6575          |
| 1.7251        | 0.93  | 7728 | 1.6572          |
| 1.7265        | 0.94  | 7812 | 1.6572          |
| 1.7813        | 0.95  | 7896 | 1.6564          |
| 1.7005        | 0.96  | 7980 | 1.6560          |
| 1.6444        | 0.97  | 8064 | 1.6555          |
| 1.5202        | 0.98  | 8148 | 1.6552          |
| 1.8648        | 0.99  | 8232 | 1.6549          |

### Framework versions

- Transformers 4.37.0.dev0
- Pytorch 2.0.1
- Datasets 2.16.1
- Tokenizers 0.15.0