---
base_model: yihongLiu/furina
tags:
- generated_from_trainer
model-index:
- name: furina_seed42_eng_amh_esp
  results: []
---


# furina_seed42_eng_amh_esp

This model is a fine-tuned version of [yihongLiu/furina](https://huggingface.co/yihongLiu/furina) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0131
- Spearman Corr: 0.8481
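
Spearman correlation is rank-based, so it rewards getting the ordering of the predicted scores right rather than their absolute values. A hedged sketch of how such a metric is typically wired into the Trainer (the exact `compute_metrics` used here was not recorded):

```python
import numpy as np
from scipy.stats import spearmanr

def compute_metrics(eval_pred):
    """Rank correlation between predicted and gold scores."""
    predictions, labels = eval_pred
    predictions = np.squeeze(predictions)  # (n, 1) regression logits -> (n,)
    return {"spearman_corr": spearmanr(predictions, labels).correlation}
```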

## Model description

Furina ([yihongLiu/furina](https://huggingface.co/yihongLiu/furina)) is a multilingual encoder; this checkpoint fine-tunes it with random seed 42. Judging by the model name, the training mix covers English (`eng`), Amharic (`amh`), and Spanish (`esp`), and the regression-style loss paired with a Spearman-correlation metric suggests a sentence-pair relatedness scoring task. Further details were not recorded by the Trainer.

## Intended uses & limitations

The model presumably scores the semantic relatedness of sentence pairs in the languages above. Because the training data is unknown, domain coverage, biases, and out-of-distribution behavior are undocumented; validate on your own data before relying on the scores. A hedged usage sketch follows.
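
A minimal inference sketch, assuming the checkpoint exposes a single-label regression head (`num_labels=1`) on top of the furina encoder and is published under the hypothetical repo id below:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "furina_seed42_eng_amh_esp"  # hypothetical; substitute the actual Hub path

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Score a sentence pair; the Spearman-correlation metric suggests the
# logit is a continuous relatedness score, not a class probability.
inputs = tokenizer(
    "A man is playing a guitar.",
    "Someone strums an instrument.",
    return_tensors="pt",
    truncation=True,
)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```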

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
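
A hedged sketch of how these settings map onto `TrainingArguments` in Transformers 4.37.2 (model, dataset, and metric wiring omitted; the Adam betas and epsilon above are the optimizer defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="furina_seed42_eng_amh_esp",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=2,  # effective train batch size: 32 * 2 = 64
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=200,                 # matches the 200-step cadence in the results table
)
```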

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Spearman Corr |
|:-------------:|:-----:|:-----:|:---------------:|:-------------:|
| No log        | 0.59  | 200   | 0.0247          | 0.6933        |
| No log        | 1.18  | 400   | 0.0227          | 0.7395        |
| No log        | 1.76  | 600   | 0.0217          | 0.7604        |
| 0.0471        | 2.35  | 800   | 0.0192          | 0.7685        |
| 0.0471        | 2.94  | 1000  | 0.0186          | 0.7805        |
| 0.0471        | 3.53  | 1200  | 0.0178          | 0.7884        |
| 0.0196        | 4.12  | 1400  | 0.0179          | 0.7961        |
| 0.0196        | 4.71  | 1600  | 0.0177          | 0.7990        |
| 0.0196        | 5.29  | 1800  | 0.0178          | 0.8035        |
| 0.0196        | 5.88  | 2000  | 0.0164          | 0.8057        |
| 0.0145        | 6.47  | 2200  | 0.0167          | 0.8070        |
| 0.0145        | 7.06  | 2400  | 0.0159          | 0.8110        |
| 0.0145        | 7.65  | 2600  | 0.0165          | 0.8121        |
| 0.0106        | 8.24  | 2800  | 0.0161          | 0.8110        |
| 0.0106        | 8.82  | 3000  | 0.0159          | 0.8148        |
| 0.0106        | 9.41  | 3200  | 0.0155          | 0.8195        |
| 0.008         | 10.0  | 3400  | 0.0151          | 0.8227        |
| 0.008         | 10.59 | 3600  | 0.0149          | 0.8253        |
| 0.008         | 11.18 | 3800  | 0.0154          | 0.8244        |
| 0.008         | 11.76 | 4000  | 0.0147          | 0.8251        |
| 0.0064        | 12.35 | 4200  | 0.0148          | 0.8249        |
| 0.0064        | 12.94 | 4400  | 0.0149          | 0.8287        |
| 0.0064        | 13.53 | 4600  | 0.0147          | 0.8297        |
| 0.0052        | 14.12 | 4800  | 0.0142          | 0.8347        |
| 0.0052        | 14.71 | 5000  | 0.0148          | 0.8314        |
| 0.0052        | 15.29 | 5200  | 0.0141          | 0.8341        |
| 0.0052        | 15.88 | 5400  | 0.0139          | 0.8386        |
| 0.0045        | 16.47 | 5600  | 0.0139          | 0.8350        |
| 0.0045        | 17.06 | 5800  | 0.0137          | 0.8389        |
| 0.0045        | 17.65 | 6000  | 0.0136          | 0.8402        |
| 0.004         | 18.24 | 6200  | 0.0139          | 0.8400        |
| 0.004         | 18.82 | 6400  | 0.0138          | 0.8414        |
| 0.004         | 19.41 | 6600  | 0.0136          | 0.8433        |
| 0.0036        | 20.0  | 6800  | 0.0140          | 0.8420        |
| 0.0036        | 20.59 | 7000  | 0.0136          | 0.8434        |
| 0.0036        | 21.18 | 7200  | 0.0137          | 0.8451        |
| 0.0036        | 21.76 | 7400  | 0.0133          | 0.8445        |
| 0.0032        | 22.35 | 7600  | 0.0135          | 0.8451        |
| 0.0032        | 22.94 | 7800  | 0.0136          | 0.8447        |
| 0.0032        | 23.53 | 8000  | 0.0136          | 0.8449        |
| 0.003         | 24.12 | 8200  | 0.0132          | 0.8463        |
| 0.003         | 24.71 | 8400  | 0.0131          | 0.8472        |
| 0.003         | 25.29 | 8600  | 0.0133          | 0.8477        |
| 0.003         | 25.88 | 8800  | 0.0135          | 0.8472        |
| 0.0028        | 26.47 | 9000  | 0.0134          | 0.8478        |
| 0.0028        | 27.06 | 9200  | 0.0131          | 0.8477        |
| 0.0028        | 27.65 | 9400  | 0.0131          | 0.8478        |
| 0.0026        | 28.24 | 9600  | 0.0130          | 0.8482        |
| 0.0026        | 28.82 | 9800  | 0.0131          | 0.8479        |
| 0.0026        | 29.41 | 10000 | 0.0131          | 0.8481        |


### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1