# ARC-Easy_Llama-3.2-1B-dygan5pn
This model is a fine-tuned version of meta-llama/Llama-3.2-1B on an unknown dataset (the repo name indicates ARC-Easy). It achieves the following results on the evaluation set:
- Loss: 2.7461
- Model Preparation Time: 0.0057
- Mdl: 2258.2292
- Accumulated Loss: 1565.2852
- Correct Preds: 410.0
- Total Preds: 570.0
- Accuracy: 0.7193
- Correct Gen Preds: 387.0
- Gen Accuracy: 0.6789
- Correct Gen Preds 32: 99.0
- Correct Preds 32: 113.0
- Total Labels 32: 158.0
- Accuracy 32: 0.7152
- Gen Accuracy 32: 0.6266
- Correct Gen Preds 33: 113.0
- Correct Preds 33: 115.0
- Total Labels 33: 152.0
- Accuracy 33: 0.7566
- Gen Accuracy 33: 0.7434
- Correct Gen Preds 34: 96.0
- Correct Preds 34: 98.0
- Total Labels 34: 142.0
- Accuracy 34: 0.6901
- Gen Accuracy 34: 0.6761
- Correct Gen Preds 35: 79.0
- Correct Preds 35: 84.0
- Total Labels 35: 118.0
- Accuracy 35: 0.7119
- Gen Accuracy 35: 0.6695
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 0.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
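The numeric suffixes 32 through 36 on the per-label metrics are most likely the Llama 3 token IDs of the answer letters "A" through "E": the per-label totals (158 + 152 + 142 + 118) sum to the 570 total predictions, and no evaluation question used a fifth choice, hence the zeros for label 36. "Correct Preds" appear to score likelihood-based answer selection, while "Correct Gen Preds" appear to score whether the generated text matched the label.

A minimal usage sketch, assuming the standard `transformers` loading path; the repo id is taken from this card's title, and the prompt template is hypothetical (the card does not record the one used during fine-tuning):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "donoway/ARC-Easy_Llama-3.2-1B-dygan5pn"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)
model.eval()

# Hypothetical prompt format -- the card does not record the template
# used during fine-tuning.
prompt = (
    "Question: Which gas do plants absorb from the air?\n"
    "A. oxygen\nB. carbon dioxide\nC. nitrogen\nD. helium\n"
    "Answer:"
)
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    next_token_logits = model(**inputs).logits[0, -1]

# Rank the answer letters by their next-token logits.
letters = ["A", "B", "C", "D"]
choice_ids = [tokenizer.encode(f" {c}", add_special_tokens=False)[0] for c in letters]
print(letters[next_token_logits[choice_ids].argmax().item()])
```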
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
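For reference, the list above maps onto `transformers.TrainingArguments` roughly as follows. This is a minimal sketch, assuming the batch sizes are per-device; `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="ARC-Easy_Llama-3.2-1B-dygan5pn",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
)
```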
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.0057 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.3867 | 1.0 | 6 | 1.0016 | 0.0057 | 823.6421 | 570.9052 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 111.0 | 111.0 | 158.0 | 0.7025 | 0.7025 | 120.0 | 120.0 | 152.0 | 0.7895 | 0.7895 | 85.0 | 85.0 | 142.0 | 0.5986 | 0.5986 | 63.0 | 63.0 | 118.0 | 0.5339 | 0.5339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.1746 | 2.0 | 12 | 0.9085 | 0.0057 | 747.0919 | 517.8447 | 384.0 | 570.0 | 0.6737 | 384.0 | 0.6737 | 96.0 | 96.0 | 158.0 | 0.6076 | 0.6076 | 130.0 | 130.0 | 152.0 | 0.8553 | 0.8553 | 97.0 | 97.0 | 142.0 | 0.6831 | 0.6831 | 61.0 | 61.0 | 118.0 | 0.5169 | 0.5169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0054 | 3.0 | 18 | 1.5269 | 0.0057 | 1255.6062 | 870.3199 | 409.0 | 570.0 | 0.7175 | 409.0 | 0.7175 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 116.0 | 116.0 | 152.0 | 0.7632 | 0.7632 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 87.0 | 87.0 | 118.0 | 0.7373 | 0.7373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 4.0 | 24 | 2.6585 | 0.0057 | 2186.1764 | 1515.3420 | 406.0 | 570.0 | 0.7123 | 405.0 | 0.7105 | 93.0 | 94.0 | 158.0 | 0.5949 | 0.5886 | 126.0 | 126.0 | 152.0 | 0.8289 | 0.8289 | 104.0 | 104.0 | 142.0 | 0.7324 | 0.7324 | 82.0 | 82.0 | 118.0 | 0.6949 | 0.6949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 5.0 | 30 | 2.9638 | 0.0057 | 2437.2343 | 1689.3621 | 396.0 | 570.0 | 0.6947 | 388.0 | 0.6807 | 123.0 | 129.0 | 158.0 | 0.8165 | 0.7785 | 107.0 | 107.0 | 152.0 | 0.7039 | 0.7039 | 88.0 | 89.0 | 142.0 | 0.6268 | 0.6197 | 70.0 | 71.0 | 118.0 | 0.6017 | 0.5932 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 6.0 | 36 | 2.7461 | 0.0057 | 2258.2292 | 1565.2852 | 410.0 | 570.0 | 0.7193 | 387.0 | 0.6789 | 99.0 | 113.0 | 158.0 | 0.7152 | 0.6266 | 113.0 | 115.0 | 152.0 | 0.7566 | 0.7434 | 96.0 | 98.0 | 142.0 | 0.6901 | 0.6761 | 79.0 | 84.0 | 118.0 | 0.7119 | 0.6695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.6357 | 7.0 | 42 | 2.8084 | 0.0057 | 2309.4303 | 1600.7751 | 403.0 | 570.0 | 0.7070 | 267.0 | 0.4684 | 46.0 | 108.0 | 158.0 | 0.6835 | 0.2911 | 87.0 | 115.0 | 152.0 | 0.7566 | 0.5724 | 74.0 | 98.0 | 142.0 | 0.6901 | 0.5211 | 60.0 | 82.0 | 118.0 | 0.6949 | 0.5085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9179 | 8.0 | 48 | 2.8006 | 0.0057 | 2303.0086 | 1596.3239 | 402.0 | 570.0 | 0.7053 | 60.0 | 0.1053 | 5.0 | 121.0 | 158.0 | 0.7658 | 0.0316 | 20.0 | 106.0 | 152.0 | 0.6974 | 0.1316 | 15.0 | 100.0 | 142.0 | 0.7042 | 0.1056 | 20.0 | 75.0 | 118.0 | 0.6356 | 0.1695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 9.0 | 54 | 3.0054 | 0.0057 | 2471.4361 | 1713.0690 | 405.0 | 570.0 | 0.7105 | 153.0 | 0.2684 | 18.0 | 121.0 | 158.0 | 0.7658 | 0.1139 | 45.0 | 106.0 | 152.0 | 0.6974 | 0.2961 | 49.0 | 104.0 | 142.0 | 0.7324 | 0.3451 | 41.0 | 74.0 | 118.0 | 0.6271 | 0.3475 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 60 | 3.1819 | 0.0057 | 2616.5819 | 1813.6764 | 404.0 | 570.0 | 0.7088 | 302.0 | 0.5298 | 70.0 | 120.0 | 158.0 | 0.7595 | 0.4430 | 84.0 | 105.0 | 152.0 | 0.6908 | 0.5526 | 87.0 | 102.0 | 142.0 | 0.7183 | 0.6127 | 61.0 | 77.0 | 118.0 | 0.6525 | 0.5169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 66 | 3.2572 | 0.0057 | 2678.5170 | 1856.6065 | 402.0 | 570.0 | 0.7053 | 349.0 | 0.6123 | 93.0 | 121.0 | 158.0 | 0.7658 | 0.5886 | 95.0 | 105.0 | 152.0 | 0.6908 | 0.625 | 95.0 | 101.0 | 142.0 | 0.7113 | 0.6690 | 66.0 | 75.0 | 118.0 | 0.6356 | 0.5593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 72 | 3.2819 | 0.0057 | 2698.8538 | 1870.7029 | 405.0 | 570.0 | 0.7105 | 365.0 | 0.6404 | 99.0 | 121.0 | 158.0 | 0.7658 | 0.6266 | 99.0 | 105.0 | 152.0 | 0.6908 | 0.6513 | 96.0 | 101.0 | 142.0 | 0.7113 | 0.6761 | 71.0 | 78.0 | 118.0 | 0.6610 | 0.6017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 78 | 3.3385 | 0.0057 | 2745.3597 | 1902.9383 | 404.0 | 570.0 | 0.7088 | 373.0 | 0.6544 | 103.0 | 121.0 | 158.0 | 0.7658 | 0.6519 | 101.0 | 105.0 | 152.0 | 0.6908 | 0.6645 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 73.0 | 78.0 | 118.0 | 0.6610 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 84 | 3.3290 | 0.0057 | 2737.5652 | 1897.5356 | 403.0 | 570.0 | 0.7070 | 376.0 | 0.6596 | 106.0 | 121.0 | 158.0 | 0.7658 | 0.6709 | 102.0 | 105.0 | 152.0 | 0.6908 | 0.6711 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 72.0 | 77.0 | 118.0 | 0.6525 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 90 | 3.3171 | 0.0057 | 2727.7934 | 1890.7623 | 403.0 | 570.0 | 0.7070 | 375.0 | 0.6579 | 105.0 | 121.0 | 158.0 | 0.7658 | 0.6646 | 101.0 | 105.0 | 152.0 | 0.6908 | 0.6645 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 73.0 | 77.0 | 118.0 | 0.6525 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 96 | 3.3260 | 0.0057 | 2735.1033 | 1895.8291 | 402.0 | 570.0 | 0.7053 | 374.0 | 0.6561 | 105.0 | 121.0 | 158.0 | 0.7658 | 0.6646 | 101.0 | 105.0 | 152.0 | 0.6908 | 0.6645 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 72.0 | 76.0 | 118.0 | 0.6441 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 102 | 3.3211 | 0.0057 | 2731.0335 | 1893.0082 | 406.0 | 570.0 | 0.7123 | 380.0 | 0.6667 | 108.0 | 122.0 | 158.0 | 0.7722 | 0.6835 | 102.0 | 106.0 | 152.0 | 0.6974 | 0.6711 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 74.0 | 78.0 | 118.0 | 0.6610 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 108 | 3.3253 | 0.0057 | 2734.5364 | 1895.4362 | 404.0 | 570.0 | 0.7088 | 377.0 | 0.6614 | 105.0 | 121.0 | 158.0 | 0.7658 | 0.6646 | 102.0 | 106.0 | 152.0 | 0.6974 | 0.6711 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 74.0 | 77.0 | 118.0 | 0.6525 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 114 | 3.3377 | 0.0057 | 2744.6939 | 1902.4768 | 402.0 | 570.0 | 0.7053 | 378.0 | 0.6632 | 108.0 | 121.0 | 158.0 | 0.7658 | 0.6835 | 102.0 | 105.0 | 152.0 | 0.6908 | 0.6711 | 95.0 | 99.0 | 142.0 | 0.6972 | 0.6690 | 73.0 | 77.0 | 118.0 | 0.6525 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 120 | 3.3364 | 0.0057 | 2743.6099 | 1901.7254 | 405.0 | 570.0 | 0.7105 | 382.0 | 0.6702 | 108.0 | 121.0 | 158.0 | 0.7658 | 0.6835 | 103.0 | 105.0 | 152.0 | 0.6908 | 0.6776 | 97.0 | 101.0 | 142.0 | 0.7113 | 0.6831 | 74.0 | 78.0 | 118.0 | 0.6610 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 126 | 3.3373 | 0.0057 | 2744.3570 | 1902.2433 | 405.0 | 570.0 | 0.7105 | 379.0 | 0.6649 | 107.0 | 121.0 | 158.0 | 0.7658 | 0.6772 | 102.0 | 106.0 | 152.0 | 0.6974 | 0.6711 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 74.0 | 78.0 | 118.0 | 0.6610 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 132 | 3.3504 | 0.0057 | 2755.1293 | 1909.7101 | 404.0 | 570.0 | 0.7088 | 381.0 | 0.6684 | 109.0 | 121.0 | 158.0 | 0.7658 | 0.6899 | 103.0 | 106.0 | 152.0 | 0.6974 | 0.6776 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 73.0 | 77.0 | 118.0 | 0.6525 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 138 | 3.3447 | 0.0057 | 2750.4965 | 1906.4989 | 407.0 | 570.0 | 0.7140 | 380.0 | 0.6667 | 108.0 | 122.0 | 158.0 | 0.7722 | 0.6835 | 103.0 | 106.0 | 152.0 | 0.6974 | 0.6776 | 96.0 | 101.0 | 142.0 | 0.7113 | 0.6761 | 73.0 | 78.0 | 118.0 | 0.6610 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 144 | 3.3547 | 0.0057 | 2758.6616 | 1912.1585 | 404.0 | 570.0 | 0.7088 | 382.0 | 0.6702 | 109.0 | 121.0 | 158.0 | 0.7658 | 0.6899 | 103.0 | 105.0 | 152.0 | 0.6908 | 0.6776 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 74.0 | 78.0 | 118.0 | 0.6610 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 150 | 3.3695 | 0.0057 | 2770.8605 | 1920.6142 | 405.0 | 570.0 | 0.7105 | 381.0 | 0.6684 | 109.0 | 121.0 | 158.0 | 0.7658 | 0.6899 | 102.0 | 105.0 | 152.0 | 0.6908 | 0.6711 | 96.0 | 101.0 | 142.0 | 0.7113 | 0.6761 | 74.0 | 78.0 | 118.0 | 0.6610 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 156 | 3.3721 | 0.0057 | 2772.9899 | 1922.0901 | 403.0 | 570.0 | 0.7070 | 383.0 | 0.6719 | 111.0 | 121.0 | 158.0 | 0.7658 | 0.7025 | 102.0 | 104.0 | 152.0 | 0.6842 | 0.6711 | 96.0 | 101.0 | 142.0 | 0.7113 | 0.6761 | 74.0 | 77.0 | 118.0 | 0.6525 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 162 | 3.3721 | 0.0057 | 2772.9858 | 1922.0873 | 404.0 | 570.0 | 0.7088 | 382.0 | 0.6702 | 111.0 | 121.0 | 158.0 | 0.7658 | 0.7025 | 102.0 | 105.0 | 152.0 | 0.6908 | 0.6711 | 96.0 | 101.0 | 142.0 | 0.7113 | 0.6761 | 73.0 | 77.0 | 118.0 | 0.6525 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 168 | 3.3810 | 0.0057 | 2780.3503 | 1927.1920 | 401.0 | 570.0 | 0.7035 | 380.0 | 0.6667 | 110.0 | 121.0 | 158.0 | 0.7658 | 0.6962 | 103.0 | 105.0 | 152.0 | 0.6908 | 0.6776 | 95.0 | 99.0 | 142.0 | 0.6972 | 0.6690 | 72.0 | 76.0 | 118.0 | 0.6441 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 174 | 3.3540 | 0.0057 | 2758.0755 | 1911.7523 | 404.0 | 570.0 | 0.7088 | 385.0 | 0.6754 | 112.0 | 121.0 | 158.0 | 0.7658 | 0.7089 | 103.0 | 105.0 | 152.0 | 0.6908 | 0.6776 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 74.0 | 78.0 | 118.0 | 0.6610 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 180 | 3.3641 | 0.0057 | 2766.4616 | 1917.5651 | 402.0 | 570.0 | 0.7053 | 381.0 | 0.6684 | 111.0 | 121.0 | 158.0 | 0.7658 | 0.7025 | 102.0 | 105.0 | 152.0 | 0.6908 | 0.6711 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 72.0 | 76.0 | 118.0 | 0.6441 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 186 | 3.3698 | 0.0057 | 2771.1427 | 1920.8097 | 404.0 | 570.0 | 0.7088 | 381.0 | 0.6684 | 110.0 | 121.0 | 158.0 | 0.7658 | 0.6962 | 102.0 | 105.0 | 152.0 | 0.6908 | 0.6711 | 96.0 | 101.0 | 142.0 | 0.7113 | 0.6761 | 73.0 | 77.0 | 118.0 | 0.6525 | 0.6186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 192 | 3.3674 | 0.0057 | 2769.1561 | 1919.4328 | 402.0 | 570.0 | 0.7053 | 382.0 | 0.6702 | 110.0 | 121.0 | 158.0 | 0.7658 | 0.6962 | 102.0 | 104.0 | 152.0 | 0.6842 | 0.6711 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 74.0 | 77.0 | 118.0 | 0.6525 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 198 | 3.3907 | 0.0057 | 2788.3243 | 1932.7191 | 403.0 | 570.0 | 0.7070 | 385.0 | 0.6754 | 111.0 | 121.0 | 158.0 | 0.7658 | 0.7025 | 103.0 | 105.0 | 152.0 | 0.6908 | 0.6776 | 96.0 | 100.0 | 142.0 | 0.7042 | 0.6761 | 75.0 | 77.0 | 118.0 | 0.6525 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 204 | 3.3846 | 0.0057 | 2783.2820 | 1929.2241 | 404.0 | 570.0 | 0.7088 | 383.0 | 0.6719 | 111.0 | 121.0 | 158.0 | 0.7658 | 0.7025 | 102.0 | 105.0 | 152.0 | 0.6908 | 0.6711 | 95.0 | 100.0 | 142.0 | 0.7042 | 0.6690 | 75.0 | 78.0 | 118.0 | 0.6610 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 210 | 3.3754 | 0.0057 | 2775.7349 | 1923.9928 | 403.0 | 570.0 | 0.7070 | 386.0 | 0.6772 | 113.0 | 121.0 | 158.0 | 0.7658 | 0.7152 | 102.0 | 105.0 | 152.0 | 0.6908 | 0.6711 | 97.0 | 100.0 | 142.0 | 0.7042 | 0.6831 | 74.0 | 77.0 | 118.0 | 0.6525 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 36.0 | 216 | 3.3802 | 0.0057 | 2779.6781 | 1926.7260 | 406.0 | 570.0 | 0.7123 | 387.0 | 0.6789 | 112.0 | 122.0 | 158.0 | 0.7722 | 0.7089 | 103.0 | 105.0 | 152.0 | 0.6908 | 0.6776 | 96.0 | 101.0 | 142.0 | 0.7113 | 0.6761 | 76.0 | 78.0 | 118.0 | 0.6610 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
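Although 100 epochs were configured, the log ends at epoch 36, and the headline metrics at the top of this card match the epoch-6 row (validation loss 2.7461, 410 correct predictions), which suggests training stopped early and the best checkpoint was kept. Two consistency checks on the logged metrics, as a sketch: accuracy is Correct Preds / Total Preds, and the Mdl column appears to be Accumulated Loss converted from nats to bits (divide by ln 2):

```python
import math

# Values from the final-summary metrics (the epoch-6 row they match).
correct_preds, total_preds = 410.0, 570.0
accumulated_loss_nats = 1565.2852

accuracy = correct_preds / total_preds          # 0.7193
mdl_bits = accumulated_loss_nats / math.log(2)  # ~2258.23, matching Mdl

print(f"accuracy = {accuracy:.4f}")
print(f"mdl      = {mdl_bits:.4f}")
```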
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1