---
license: other
base_model: nvidia/mit-b0
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-100by100PNG-50epochs-attempt2-removeNAN
  results: []
---
|
|
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
|
|
|
# segformer-b0-finetuned-100by100PNG-50epochs-attempt2-removeNAN
|
|
|
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the JCAI2000/100By100BranchPNG dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1268
- Mean Iou: 0.8754
- Mean Accuracy: 1.0
- Overall Accuracy: 1.0
- Accuracy Branch: 1.0
- Iou Branch: 0.8754
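For reference, the per-class accuracy and IoU above are standard pixel metrics. A minimal illustrative sketch of how they are computed for a single positive ("Branch") class follows; this is not the exact evaluation code the Trainer used:

```python
import numpy as np

def branch_metrics(pred: np.ndarray, target: np.ndarray):
    """Pixel accuracy and IoU for the positive ('Branch') class.

    pred, target: boolean arrays of identical shape, True = branch pixel.
    """
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    iou = inter / union if union else float("nan")
    # class accuracy: fraction of true branch pixels that were recovered
    acc = inter / target.sum() if target.sum() else float("nan")
    return acc, iou

# Toy example: every true branch pixel is found, plus one false positive.
pred = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)
target = np.array([[1, 1, 0], [0, 0, 0]], dtype=bool)
acc, iou = branch_metrics(pred, target)  # acc = 1.0, iou = 2/3
```

This mirrors the pattern in the reported numbers: Accuracy Branch can reach 1.0 (all branch pixels recovered) while Iou Branch stays below 1.0 because false-positive pixels enlarge the union.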
|
|
|
## Model description

This is a SegFormer model with a MiT-b0 backbone, fine-tuned for semantic segmentation of a single labeled class ("Branch") on the JCAI2000/100By100BranchPNG dataset. More information needed.
|
|
|
## Intended uses & limitations

More information needed
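The card does not yet show how to run the model. Below is a minimal inference sketch using the standard 🤗 Transformers SegFormer API; `MODEL_ID` is a placeholder for wherever this checkpoint is actually hosted, and `branch.png` is a hypothetical input file:

```python
# Usage sketch: MODEL_ID and branch.png are placeholders, not values
# confirmed by this card; the API calls are standard SegFormer usage.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

MODEL_ID = "segformer-b0-finetuned-100by100PNG-50epochs-attempt2-removeNAN"  # placeholder

def segment(image: Image.Image, model, processor) -> torch.Tensor:
    """Return an (H, W) tensor of predicted class ids for one image."""
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)
    # SegFormer predicts at 1/4 resolution; upsample back to the input size.
    upsampled = torch.nn.functional.interpolate(
        logits, size=image.size[::-1], mode="bilinear", align_corners=False
    )
    return upsampled.argmax(dim=1)[0]  # (H, W) class ids

if __name__ == "__main__":
    processor = SegformerImageProcessor.from_pretrained(MODEL_ID)
    model = SegformerForSemanticSegmentation.from_pretrained(MODEL_ID)
    mask = segment(Image.open("branch.png").convert("RGB"), model, processor)
    print(mask.shape)
```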
|
|
|
## Training and evaluation data

The model was trained and evaluated on the JCAI2000/100By100BranchPNG dataset. More information needed.
|
|
|
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
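With `lr_scheduler_type: linear` and no reported warmup, the learning rate decays linearly from 6e-05 to 0 over training. A sketch of that schedule, assuming roughly 19 optimizer steps per epoch (hence ~950 total steps, inferred from the step/epoch columns in the results table):

```python
def linear_lr(step: int, base_lr: float = 6e-5, total_steps: int = 950,
              warmup_steps: int = 0) -> float:
    """Linear decay after an optional warmup, in the style of transformers'
    get_linear_schedule_with_warmup. total_steps=950 is an estimate
    (~19 optimizer steps per epoch x 50 epochs), not a logged value."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

linear_lr(0)    # 6e-05 at the start
linear_lr(475)  # half the base rate midway
linear_lr(950)  # 0.0 at the end
```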
|
|
|
### Training results
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Branch | Iou Branch |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:---------------:|:----------:|
| 0.4487        | 1.05  | 20   | 0.6365          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.466         | 2.11  | 40   | 0.5024          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.4431        | 3.16  | 60   | 0.4013          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.3967        | 4.21  | 80   | 0.3739          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.2476        | 5.26  | 100  | 0.3191          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.3577        | 6.32  | 120  | 0.3235          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.2501        | 7.37  | 140  | 0.2839          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.3382        | 8.42  | 160  | 0.2674          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.3191        | 9.47  | 180  | 0.2512          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1632        | 10.53 | 200  | 0.2197          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1888        | 11.58 | 220  | 0.2095          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1443        | 12.63 | 240  | 0.1975          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1348        | 13.68 | 260  | 0.1836          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1772        | 14.74 | 280  | 0.1742          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1524        | 15.79 | 300  | 0.1893          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1135        | 16.84 | 320  | 0.1710          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1676        | 17.89 | 340  | 0.1789          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.131         | 18.95 | 360  | 0.1604          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1693        | 20.0  | 380  | 0.1531          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1031        | 21.05 | 400  | 0.1572          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1432        | 22.11 | 420  | 0.1571          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1711        | 23.16 | 440  | 0.1542          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1287        | 24.21 | 460  | 0.1469          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1228        | 25.26 | 480  | 0.1493          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1316        | 26.32 | 500  | 0.1568          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0737        | 27.37 | 520  | 0.1455          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0914        | 28.42 | 540  | 0.1454          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1122        | 29.47 | 560  | 0.1467          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1482        | 30.53 | 580  | 0.1500          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1006        | 31.58 | 600  | 0.1351          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1069        | 32.63 | 620  | 0.1513          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0985        | 33.68 | 640  | 0.1417          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0794        | 34.74 | 660  | 0.1364          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1065        | 35.79 | 680  | 0.1343          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0993        | 36.84 | 700  | 0.1346          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0904        | 37.89 | 720  | 0.1430          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1159        | 38.95 | 740  | 0.1342          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1787        | 40.0  | 760  | 0.1343          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0621        | 41.05 | 780  | 0.1363          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0844        | 42.11 | 800  | 0.1301          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0919        | 43.16 | 820  | 0.1318          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0728        | 44.21 | 840  | 0.1348          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1073        | 45.26 | 860  | 0.1391          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0563        | 46.32 | 880  | 0.1310          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0827        | 47.37 | 900  | 0.1303          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.0633        | 48.42 | 920  | 0.1304          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
| 0.1452        | 49.47 | 940  | 0.1268          | 0.8754   | 1.0           | 1.0              | 1.0             | 0.8754     |
|
|
|
|
|
### Framework versions

- Transformers 4.33.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3
|
|