segformer-b5-finetuned-apple-dms-run8

This model is a fine-tuned version of nvidia/segformer-b5-finetuned-ade-640-640 on the AllanK24/apple-dms-materials dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3158
  • Mean Iou: 0.3623
  • Mean Accuracy: 0.4418
  • Overall Accuracy: 0.7931
  • Accuracy Animal Skin: 0.0006
  • Iou Animal Skin: 0.0006
  • Accuracy Bone Teeth Horn: 0.0
  • Iou Bone Teeth Horn: 0.0
  • Accuracy Brickwork: 0.6866
  • Iou Brickwork: 0.5481
  • Accuracy Cardboard: 0.5888
  • Iou Cardboard: 0.4554
  • Accuracy Carpet Rug: 0.8537
  • Iou Carpet Rug: 0.7194
  • Accuracy Ceiling Tile: 0.8663
  • Iou Ceiling Tile: 0.7477
  • Accuracy Ceramic: 0.7220
  • Iou Ceramic: 0.5581
  • Accuracy Chalkboard Blackboard: 0.5775
  • Iou Chalkboard Blackboard: 0.5219
  • Accuracy Clutter: 0.0
  • Iou Clutter: 0.0
  • Accuracy Concrete: 0.5451
  • Iou Concrete: 0.3830
  • Accuracy Cork Corkboard: 0.0
  • Iou Cork Corkboard: 0.0
  • Accuracy Engineered Stone: 0.0
  • Iou Engineered Stone: 0.0
  • Accuracy Fabric Cloth: 0.8904
  • Iou Fabric Cloth: 0.7591
  • Accuracy Fiberglass Wool: 0.0
  • Iou Fiberglass Wool: 0.0
  • Accuracy Fire: 0.0
  • Iou Fire: 0.0
  • Accuracy Foliage: 0.9360
  • Iou Foliage: 0.8185
  • Accuracy Food: 0.8928
  • Iou Food: 0.7456
  • Accuracy Fur: 0.9172
  • Iou Fur: 0.7602
  • Accuracy Gemstone Quartz: 0.0
  • Iou Gemstone Quartz: 0.0
  • Accuracy Glass: 0.7353
  • Iou Glass: 0.5745
  • Accuracy Hair: 0.8039
  • Iou Hair: 0.6858
  • Accuracy Ice: 0.0
  • Iou Ice: 0.0
  • Accuracy Leather: 0.4730
  • Iou Leather: 0.3979
  • Accuracy Liquid Non-water: 0.0
  • Iou Liquid Non-water: 0.0
  • Accuracy Metal: 0.3912
  • Iou Metal: 0.3013
  • Accuracy Mirror: 0.5493
  • Iou Mirror: 0.4350
  • Accuracy Paint Plaster Enamel: 0.8595
  • Iou Paint Plaster Enamel: 0.7257
  • Accuracy Paper: 0.7236
  • Iou Paper: 0.5573
  • Accuracy Pearl: 0.0
  • Iou Pearl: 0.0
  • Accuracy Photograph Painting: 0.4124
  • Iou Photograph Painting: 0.3168
  • Accuracy Plastic Clear: 0.1583
  • Iou Plastic Clear: 0.1403
  • Accuracy Plastic Non-clear: 0.4829
  • Iou Plastic Non-clear: 0.3457
  • Accuracy Rubber Latex: 0.0001
  • Iou Rubber Latex: 0.0001
  • Accuracy Sand: 0.1871
  • Iou Sand: 0.1813
  • Accuracy Skin Lips: 0.8032
  • Iou Skin Lips: 0.6802
  • Accuracy Sky: 0.9700
  • Iou Sky: 0.9207
  • Accuracy Snow: 0.4513
  • Iou Snow: 0.4329
  • Accuracy Soap: 0.0
  • Iou Soap: 0.0
  • Accuracy Soil Mud: 0.5892
  • Iou Soil Mud: 0.3437
  • Accuracy Sponge: 0.0
  • Iou Sponge: 0.0
  • Accuracy Stone Natural: 0.6717
  • Iou Stone Natural: 0.4867
  • Accuracy Stone Polished: 0.1281
  • Iou Stone Polished: 0.1186
  • Accuracy Styrofoam: 0.0
  • Iou Styrofoam: 0.0
  • Accuracy Tile: 0.8029
  • Iou Tile: 0.6400
  • Accuracy Wallpaper: 0.5318
  • Iou Wallpaper: 0.4360
  • Accuracy Water: 0.9102
  • Iou Water: 0.7919
  • Accuracy Wax: 0.0
  • Iou Wax: 0.0
  • Accuracy Whiteboard: 0.7470
  • Iou Whiteboard: 0.5827
  • Accuracy Wicker: 0.4441
  • Iou Wicker: 0.3946
  • Accuracy Wood: 0.8514
  • Iou Wood: 0.7134
  • Accuracy Wood Tree: 0.1826
  • Iou Wood Tree: 0.1617
  • Accuracy Asphalt: 0.6368
  • Iou Asphalt: 0.4573
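One pattern worth noting in these numbers: Mean IoU (0.3623) and Mean Accuracy (0.4418) sit far below Overall Accuracy (0.7931) because many rare classes (Fire, Ice, Soap, Styrofoam, ...) are never predicted and contribute 0.0 to the unweighted means. The sketch below is illustrative only (it is not the card's evaluation code) and shows how per-class accuracy/IoU and their aggregates are typically derived from a confusion matrix; the toy labels are made up.

```python
import numpy as np

def segmentation_metrics(y_true, y_pred, num_classes):
    """Per-class accuracy and IoU plus their aggregates, from flat label arrays."""
    # Confusion matrix: rows = ground truth, columns = prediction.
    cm = np.zeros((num_classes, num_classes), dtype=np.int64)
    np.add.at(cm, (y_true, y_pred), 1)

    tp = np.diag(cm).astype(np.float64)
    gt = cm.sum(axis=1).astype(np.float64)    # pixels per true class
    pred = cm.sum(axis=0).astype(np.float64)  # pixels per predicted class

    with np.errstate(divide="ignore", invalid="ignore"):
        acc = np.where(gt > 0, tp / gt, np.nan)                        # per-class recall
        iou = np.where(gt + pred - tp > 0, tp / (gt + pred - tp), np.nan)

    return {
        "per_class_accuracy": acc,
        "per_class_iou": iou,
        "mean_accuracy": np.nanmean(acc),      # unweighted: rare classes count fully
        "mean_iou": np.nanmean(iou),
        "overall_accuracy": tp.sum() / cm.sum(),  # pixel-weighted
    }

# Toy example with 3 classes; class 2 is never predicted, so its accuracy
# and IoU are 0.0 -- the effect that drags Mean IoU below Overall Accuracy.
y_true = np.array([0, 0, 0, 1, 1, 2])
y_pred = np.array([0, 0, 0, 1, 0, 0])
m = segmentation_metrics(y_true, y_pred, num_classes=3)
```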

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 8
  • total_train_batch_size: 256
  • total_eval_batch_size: 128
  • optimizer: adamw_torch_fused (betas=(0.9, 0.999), epsilon=1e-08); no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 0.1
  • num_epochs: 20
  • label_smoothing_factor: 0.1
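The per-device batch size of 32 across 8 GPUs gives the total train batch size of 256 listed above. The learning-rate shape these hyperparameters imply is linear warmup followed by cosine decay; the sketch below mirrors the formula of transformers' `get_cosine_schedule_with_warmup` in plain Python. Note an assumption: the card lists `lr_scheduler_warmup_steps: 0.1`, which we interpret here as a warmup *ratio* (10% of total steps), and we take 1650 (the final step in the training table) as the total step count.

```python
import math

BASE_LR = 1e-4           # learning_rate from the card
TOTAL_STEPS = 1650       # final step reported in the training results
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # assumed interpretation of "0.1"

def lr_at(step):
    """Learning rate at a given optimizer step: linear warmup, then cosine decay to 0."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / max(1, WARMUP_STEPS)
    progress = (step - WARMUP_STEPS) / max(1, TOTAL_STEPS - WARMUP_STEPS)
    return BASE_LR * 0.5 * (1.0 + math.cos(math.pi * progress))
```

Under this reading, the rate climbs from 0 to 1e-4 over the first ~165 steps, then decays along a half-cosine to 0 at step 1650.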

Training results

Aggregate metrics per checkpoint are tabulated below. (The card also reports per-class accuracy and IoU at each checkpoint; for the final checkpoint, epoch 18.75, those are the values listed in the evaluation results above.)

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy |
|---|---|---|---|---|---|---|
| 3.5688 | 1.7045 | 150 | 2.6137 | 0.0684 | 0.0998 | 0.5339 |
| 1.9211 | 3.4091 | 300 | 1.6497 | 0.1718 | 0.2221 | 0.7026 |
| 1.5028 | 5.1136 | 450 | 1.4704 | 0.2406 | 0.3013 | 0.7487 |
| 1.3738 | 6.8182 | 600 | 1.3994 | 0.2871 | 0.3563 | 0.7682 |
| 1.3074 | 8.5227 | 750 | 1.3624 | 0.3062 | 0.3833 | 0.7779 |
| 1.2658 | 10.2273 | 900 | 1.3462 | 0.3228 | 0.4012 | 0.7828 |
| 1.2378 | 11.9318 | 1050 | 1.3312 | 0.3380 | 0.4175 | 0.7870 |
| 1.2215 | 13.6364 | 1200 | 1.3248 | 0.3517 | 0.4307 | 0.7900 |
| 1.2064 | 15.3409 | 1350 | 1.3207 | 0.3571 | 0.4362 | 0.7914 |
| 1.1984 | 17.0455 | 1500 | 1.3180 | 0.3601 | 0.4398 | 0.7925 |
| 1.1937 | 18.75 | 1650 | 1.3158 | 0.3623 | 0.4418 | 0.7931 |
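When reading the validation loss plateau near 1.32, keep the `label_smoothing_factor: 0.1` in mind: with a smoothed (non-one-hot) target distribution, cross-entropy has a strictly positive floor, the entropy of the smoothed target, so the raw loss value is not comparable to an unsmoothed run. The sketch below is illustrative only (not the trainer's actual loss code) and uses toy numbers, not values from this run.

```python
import numpy as np

def smoothed_cross_entropy(logits, target, eps=0.1):
    """Cross-entropy between softmax(logits) and a label-smoothed one-hot target."""
    k = logits.shape[-1]
    log_p = logits - np.log(np.sum(np.exp(logits)))  # log-softmax
    q = np.full(k, eps / k)                          # uniform mass eps spread over k classes
    q[target] += 1.0 - eps                           # remaining mass on the true class
    return -np.sum(q * log_p)

# Toy setting: 5 classes, smoothing 0.1. Logits that reproduce q exactly
# achieve the minimum possible loss, which is the entropy of q -- well above 0.
k, eps = 5, 0.1
q = np.full(k, eps / k)
q[0] += 1.0 - eps
floor = smoothed_cross_entropy(np.log(q), target=0, eps=eps)
```

Any sharper (more confident) prediction actually *increases* the smoothed loss above this floor, which is why smoothed training losses level off at a nonzero value.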

Framework versions

  • Transformers 5.0.0
  • Pytorch 2.9.1+cu128
  • Datasets 4.5.0
  • Tokenizers 0.22.2
Model size

  • 84.6M parameters (Safetensors, F32 tensors)