
segformer-b1-finetuned-segments-ic-chip-sample

This model is a fine-tuned version of nvidia/mit-b1 on the yijisuk/ic-chip-sample dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1227
  • Mean Iou: 0.4744
  • Mean Accuracy: 0.9489
  • Overall Accuracy: 0.9489
  • Accuracy Unlabeled: nan
  • Accuracy Circuit: 0.9489
  • Iou Unlabeled: 0.0
  • Iou Circuit: 0.9489
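
The snippet below is a minimal inference sketch using the standard Transformers SegFormer classes. It assumes the checkpoint ships a preprocessor config (otherwise the nvidia/mit-b1 processor can be substituted); the image path is a placeholder, and the label ids are inferred from the metric names above rather than from the model config.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "yijisuk/segformer-b1-finetuned-segments-ic-chip-sample"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("ic_chip.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # assumed mapping: 0 = unlabeled, 1 = circuit
```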

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
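
For reference, a sketch of how these hyperparameters map onto Hugging Face TrainingArguments. The output_dir and the per-epoch evaluation/save strategies are assumptions (the results table below records one evaluation per epoch); dataset preparation, the image processor, and the metric function are omitted.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; the Adam betas and epsilon
# match the Trainer's optimizer defaults.
training_args = TrainingArguments(
    output_dir="segformer-b1-finetuned-segments-ic-chip-sample",  # assumption
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption, consistent with the table below
    save_strategy="epoch",        # assumption
)
```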

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|
| 0.4185 | 1.0 | 20 | 0.5878 | 0.3632 | 0.7265 | 0.7265 | nan | 0.7265 | 0.0 | 0.7265 |
| 0.4477 | 2.0 | 40 | 0.4288 | 0.4894 | 0.9788 | 0.9788 | nan | 0.9788 | 0.0 | 0.9788 |
| 0.9304 | 3.0 | 60 | 0.2053 | 0.4520 | 0.9041 | 0.9041 | nan | 0.9041 | 0.0 | 0.9041 |
| 0.1409 | 4.0 | 80 | 0.1817 | 0.4738 | 0.9477 | 0.9477 | nan | 0.9477 | 0.0 | 0.9477 |
| 0.392 | 5.0 | 100 | 0.1824 | 0.4900 | 0.9800 | 0.9800 | nan | 0.9800 | 0.0 | 0.9800 |
| 0.1589 | 6.0 | 120 | 0.1594 | 0.4814 | 0.9628 | 0.9628 | nan | 0.9628 | 0.0 | 0.9628 |
| 0.1848 | 7.0 | 140 | 0.1551 | 0.4625 | 0.9251 | 0.9251 | nan | 0.9251 | 0.0 | 0.9251 |
| 0.0874 | 8.0 | 160 | 0.1503 | 0.4829 | 0.9657 | 0.9657 | nan | 0.9657 | 0.0 | 0.9657 |
| 0.2172 | 9.0 | 180 | 0.1558 | 0.4591 | 0.9182 | 0.9182 | nan | 0.9182 | 0.0 | 0.9182 |
| 0.9914 | 10.0 | 200 | 0.1457 | 0.4698 | 0.9396 | 0.9396 | nan | 0.9396 | 0.0 | 0.9396 |
| 0.2387 | 11.0 | 220 | 0.1494 | 0.4709 | 0.9419 | 0.9419 | nan | 0.9419 | 0.0 | 0.9419 |
| 0.1242 | 12.0 | 240 | 0.1463 | 0.4743 | 0.9486 | 0.9486 | nan | 0.9486 | 0.0 | 0.9486 |
| 0.0819 | 13.0 | 260 | 0.1492 | 0.4757 | 0.9515 | 0.9515 | nan | 0.9515 | 0.0 | 0.9515 |
| 0.6077 | 14.0 | 280 | 0.1442 | 0.4793 | 0.9586 | 0.9586 | nan | 0.9586 | 0.0 | 0.9586 |
| 0.3156 | 15.0 | 300 | 0.1430 | 0.4813 | 0.9627 | 0.9627 | nan | 0.9627 | 0.0 | 0.9627 |
| 0.2564 | 16.0 | 320 | 0.1483 | 0.4673 | 0.9347 | 0.9347 | nan | 0.9347 | 0.0 | 0.9347 |
| 0.107 | 17.0 | 340 | 0.1467 | 0.4695 | 0.9390 | 0.9390 | nan | 0.9390 | 0.0 | 0.9390 |
| 1.1592 | 18.0 | 360 | 0.1437 | 0.4814 | 0.9628 | 0.9628 | nan | 0.9628 | 0.0 | 0.9628 |
| 0.0586 | 19.0 | 380 | 0.1396 | 0.4811 | 0.9622 | 0.9622 | nan | 0.9622 | 0.0 | 0.9622 |
| 0.9815 | 20.0 | 400 | 0.1399 | 0.4812 | 0.9624 | 0.9624 | nan | 0.9624 | 0.0 | 0.9624 |
| 0.3101 | 21.0 | 420 | 0.1411 | 0.4836 | 0.9672 | 0.9672 | nan | 0.9672 | 0.0 | 0.9672 |
| 0.2325 | 22.0 | 440 | 0.1395 | 0.4672 | 0.9344 | 0.9344 | nan | 0.9344 | 0.0 | 0.9344 |
| 0.1504 | 23.0 | 460 | 0.1420 | 0.4720 | 0.9441 | 0.9441 | nan | 0.9441 | 0.0 | 0.9441 |
| 0.2831 | 24.0 | 480 | 0.1393 | 0.4697 | 0.9395 | 0.9395 | nan | 0.9395 | 0.0 | 0.9395 |
| 0.0921 | 25.0 | 500 | 0.1418 | 0.4701 | 0.9401 | 0.9401 | nan | 0.9401 | 0.0 | 0.9401 |
| 0.141 | 26.0 | 520 | 0.1318 | 0.4648 | 0.9296 | 0.9296 | nan | 0.9296 | 0.0 | 0.9296 |
| 0.1381 | 27.0 | 540 | 0.1316 | 0.4697 | 0.9395 | 0.9395 | nan | 0.9395 | 0.0 | 0.9395 |
| 1.1864 | 28.0 | 560 | 0.1292 | 0.4774 | 0.9548 | 0.9548 | nan | 0.9548 | 0.0 | 0.9548 |
| 0.9492 | 29.0 | 580 | 0.1290 | 0.4709 | 0.9418 | 0.9418 | nan | 0.9418 | 0.0 | 0.9418 |
| 0.3061 | 30.0 | 600 | 0.1303 | 0.4536 | 0.9071 | 0.9071 | nan | 0.9071 | 0.0 | 0.9071 |
| 0.2511 | 31.0 | 620 | 0.1318 | 0.4725 | 0.9451 | 0.9451 | nan | 0.9451 | 0.0 | 0.9451 |
| 0.2706 | 32.0 | 640 | 0.1284 | 0.4790 | 0.9580 | 0.9580 | nan | 0.9580 | 0.0 | 0.9580 |
| 0.1508 | 33.0 | 660 | 0.1264 | 0.4698 | 0.9396 | 0.9396 | nan | 0.9396 | 0.0 | 0.9396 |
| 0.2802 | 34.0 | 680 | 0.1308 | 0.4733 | 0.9467 | 0.9467 | nan | 0.9467 | 0.0 | 0.9467 |
| 0.1897 | 35.0 | 700 | 0.1315 | 0.4681 | 0.9361 | 0.9361 | nan | 0.9361 | 0.0 | 0.9361 |
| 0.1981 | 36.0 | 720 | 0.1289 | 0.4766 | 0.9531 | 0.9531 | nan | 0.9531 | 0.0 | 0.9531 |
| 0.2742 | 37.0 | 740 | 0.1284 | 0.4818 | 0.9635 | 0.9635 | nan | 0.9635 | 0.0 | 0.9635 |
| 0.0418 | 38.0 | 760 | 0.1240 | 0.4762 | 0.9525 | 0.9525 | nan | 0.9525 | 0.0 | 0.9525 |
| 0.1946 | 39.0 | 780 | 0.1253 | 0.4750 | 0.9500 | 0.9500 | nan | 0.9500 | 0.0 | 0.9500 |
| 0.1692 | 40.0 | 800 | 0.1253 | 0.4836 | 0.9672 | 0.9672 | nan | 0.9672 | 0.0 | 0.9672 |
| 0.3071 | 41.0 | 820 | 0.1227 | 0.4751 | 0.9503 | 0.9503 | nan | 0.9503 | 0.0 | 0.9503 |
| 0.2003 | 42.0 | 840 | 0.1250 | 0.4762 | 0.9524 | 0.9524 | nan | 0.9524 | 0.0 | 0.9524 |
| 0.2099 | 43.0 | 860 | 0.1235 | 0.4740 | 0.9480 | 0.9480 | nan | 0.9480 | 0.0 | 0.9480 |
| 0.1218 | 44.0 | 880 | 0.1222 | 0.4743 | 0.9486 | 0.9486 | nan | 0.9486 | 0.0 | 0.9486 |
| 0.1583 | 45.0 | 900 | 0.1226 | 0.4708 | 0.9415 | 0.9415 | nan | 0.9415 | 0.0 | 0.9415 |
| 0.1506 | 46.0 | 920 | 0.1215 | 0.4686 | 0.9372 | 0.9372 | nan | 0.9372 | 0.0 | 0.9372 |
| 0.0643 | 47.0 | 940 | 0.1234 | 0.4779 | 0.9559 | 0.9559 | nan | 0.9559 | 0.0 | 0.9559 |
| 0.2006 | 48.0 | 960 | 0.1213 | 0.4757 | 0.9515 | 0.9515 | nan | 0.9515 | 0.0 | 0.9515 |
| 0.0783 | 49.0 | 980 | 0.1241 | 0.4726 | 0.9452 | 0.9452 | nan | 0.9452 | 0.0 | 0.9452 |
| 0.0552 | 50.0 | 1000 | 0.1227 | 0.4744 | 0.9489 | 0.9489 | nan | 0.9489 | 0.0 | 0.9489 |

Framework versions

  • Transformers 4.36.2
  • Pytorch 1.11.0+cu115
  • Datasets 2.15.0
  • Tokenizers 0.15.0
