Mists-7B-v01-simpleQA

This model is a fine-tuned version of HachiML/Mists-7B-v01-simple-projector-trained on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0972

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 128
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 1
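The scheduler settings above (linear decay with a 5% warmup ratio over a single epoch) can be sketched in plain Python. This is a minimal illustration, not code from the training run; the total step count (~563) is an estimate read off the training log below, where epoch 0.9769 corresponds to step 550.

```python
def linear_schedule_with_warmup(step, total_steps, warmup_ratio=0.05, base_lr=2e-5):
    """Linear warmup from 0 to base_lr, then linear decay back to 0.

    Mirrors the card's hyperparameters (lr=2e-05, warmup_ratio=0.05,
    lr_scheduler_type=linear); total_steps is an assumption.
    """
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # warmup phase: ramp linearly up to the base learning rate
        return base_lr * step / max(1, warmup_steps)
    # decay phase: fall linearly from base_lr to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

total = 563  # estimated from the log: epoch 0.9769 at step 550
print(linear_schedule_with_warmup(0, total))    # start of warmup: lr = 0.0
print(linear_schedule_with_warmup(28, total))   # end of warmup: lr = base_lr
print(linear_schedule_with_warmup(300, total))  # mid-training: partially decayed
```

In the Transformers `Trainer`, this shape corresponds to passing `lr_scheduler_type="linear"` with `warmup_ratio=0.05` in the training arguments.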

Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.1811        | 0.0888 | 50   | 5.1410          |
| 1.2662        | 0.1776 | 100  | 4.0209          |
| 0.5676        | 0.2664 | 150  | 3.6247          |
| 0.6122        | 0.3552 | 200  | 2.5723          |
| 0.3855        | 0.4440 | 250  | 1.8618          |
| 0.5471        | 0.5329 | 300  | 1.5969          |
| 0.5238        | 0.6217 | 350  | 0.9045          |
| 0.3799        | 0.7105 | 400  | 0.5475          |
| 0.2406        | 0.7993 | 450  | 0.1614          |
| 0.1381        | 0.8881 | 500  | 0.1055          |
| 0.0992        | 0.9769 | 550  | 0.0972          |
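As a rough sanity check on the log above, the dataset size implied by the batch size and epoch fraction can be back-computed (an estimate only, assuming no gradient accumulation; the card does not state the dataset size):

```python
train_batch_size = 128   # from the hyperparameters above
step, epoch = 550, 0.9769  # last logged row of the training table

examples_seen = step * train_batch_size        # examples processed by step 550
approx_dataset_size = round(examples_seen / epoch)
print(examples_seen, approx_dataset_size)      # ~70k seen, ~72k total
```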

Framework versions

  • Transformers 4.42.3
  • PyTorch 2.0.1
  • Datasets 2.20.0
  • Tokenizers 0.19.1
Model size: 7.62B params (Safetensors; tensor types BF16 and F32)
Note: the serverless Inference API does not yet support model repos that contain custom code.
