kaytoo2022 committed on
Commit
3affe93
1 Parent(s): b77d078

Training in progress epoch 14

Files changed (2)
  1. README.md +17 -3
  2. tf_model.h5 +1 -1
README.md CHANGED
@@ -15,9 +15,9 @@ probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Train Loss: 3.0296
-- Validation Loss: 2.3499
-- Epoch: 0
+- Train Loss: 1.5907
+- Validation Loss: 1.4118
+- Epoch: 14
 
 ## Model description
 
@@ -44,6 +44,20 @@ The following hyperparameters were used during training:
 | Train Loss | Validation Loss | Epoch |
 |:----------:|:---------------:|:-----:|
 | 3.0296     | 2.3499          | 0     |
+| 2.5467     | 2.1372          | 1     |
+| 2.3870     | 2.0202          | 2     |
+| 2.2760     | 1.9289          | 3     |
+| 2.1699     | 1.8520          | 4     |
+| 2.1014     | 1.7799          | 5     |
+| 2.0080     | 1.7177          | 6     |
+| 1.9476     | 1.6605          | 7     |
+| 1.8703     | 1.6072          | 8     |
+| 1.8273     | 1.5629          | 9     |
+| 1.7563     | 1.5233          | 10    |
+| 1.7263     | 1.4896          | 11    |
+| 1.6807     | 1.4582          | 12    |
+| 1.6361     | 1.4327          | 13    |
+| 1.5907     | 1.4118          | 14    |
 
 
 ### Framework versions
tf_model.h5 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:b46e7a57e7d17fa861a4d0d8b1494be551218ce18d984db5f5500dd4be040a02
+oid sha256:d2d83def71d070974d71b5ddaa61de33ca12aaa0ebc78eead2c5eb94043ccc6f
 size 1188285040
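
The tf_model.h5 entries above are Git LFS v1 pointer files rather than the weights themselves: `oid` records the SHA-256 digest of the actual file and `size` its byte count, so each new checkpoint changes only the `oid` line. A minimal sketch of recomputing that digest to verify a downloaded file against its pointer (the helper name `lfs_oid` is hypothetical, not part of any tool):

```python
import hashlib

def lfs_oid(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest a Git LFS v1 pointer stores as `oid`.

    Reads the file in chunks so large weight files (here ~1.1 GB)
    are hashed without loading them fully into memory.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Comparing `lfs_oid("tf_model.h5")` against the `oid sha256:…` line of the pointer confirms the checkout actually received the epoch-14 weights and not a stale or truncated blob.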