# ARC-Easy_Llama-3.2-1B-l00pih28
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B) on an unspecified dataset (the model name suggests ARC-Easy). It achieves the following results on the evaluation set:
- Loss: 1.0121
- Model Preparation Time: 0.0058
- Mdl: 832.3148
- Accumulated Loss: 576.9166
- Correct Preds: 402.0
- Total Preds: 570.0
- Accuracy: 0.7053
- Correct Gen Preds: 402.0
- Gen Accuracy: 0.7053
- Correct Gen Preds 32: 101.0
- Correct Preds 32: 101.0
- Total Labels 32: 158.0
- Accuracy 32: 0.6392
- Gen Accuracy 32: 0.6392
- Correct Gen Preds 33: 120.0
- Correct Preds 33: 120.0
- Total Labels 33: 152.0
- Accuracy 33: 0.7895
- Gen Accuracy 33: 0.7895
- Correct Gen Preds 34: 110.0
- Correct Preds 34: 110.0
- Total Labels 34: 142.0
- Accuracy 34: 0.7746
- Gen Accuracy 34: 0.7746
- Correct Gen Preds 35: 71.0
- Correct Preds 35: 71.0
- Total Labels 35: 118.0
- Accuracy 35: 0.6017
- Gen Accuracy 35: 0.6017
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 0.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
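The checkpoint loads with the standard Transformers API. Below is a minimal usage sketch, assuming the repository id `donoway/ARC-Easy_Llama-3.2-1B-l00pih28`; the multiple-choice prompt format is illustrative and may not match the template used during fine-tuning:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "donoway/ARC-Easy_Llama-3.2-1B-l00pih28"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)
model.eval()

question = "Which gas do plants absorb from the atmosphere for photosynthesis?"
choices = {"A": "Oxygen", "B": "Carbon dioxide", "C": "Nitrogen", "D": "Hydrogen"}

# Illustrative prompt; the exact fine-tuning template is not documented here.
prompt = question + "\n" + "\n".join(f"{k}. {v}" for k, v in choices.items()) + "\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    next_token_logits = model(**inputs).logits[0, -1]

# Rank the answer letters by their next-token logits (" A", " B", ...).
letter_ids = {k: tokenizer.encode(" " + k, add_special_tokens=False)[0] for k in choices}
pred = max(letter_ids, key=lambda k: next_token_logits[letter_ids[k]].item())
print(pred, "->", choices[pred])
```

Ranking the letters against the next-token distribution mirrors how the per-label counts above appear to be computed; see the notes after the results table.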
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed. The 570-example evaluation set matches the size of the ARC-Easy validation split, so the model was presumably trained on ARC-Easy's training set and evaluated on its validation set, though this is unconfirmed.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
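These map onto a Transformers `TrainingArguments` configuration roughly as follows. This is a sketch: `output_dir` and the per-epoch evaluation cadence are assumptions (the results table logs one evaluation per epoch):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ARC-Easy_Llama-3.2-1B-l00pih28",  # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
    eval_strategy="epoch",        # assumed: the table reports one eval per epoch
)
```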
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.0058 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7537 | 1.0 | 7 | 1.0766 | 0.0058 | 885.2932 | 613.6385 | 375.0 | 570.0 | 0.6579 | 375.0 | 0.6579 | 128.0 | 128.0 | 158.0 | 0.8101 | 0.8101 | 93.0 | 93.0 | 152.0 | 0.6118 | 0.6118 | 90.0 | 90.0 | 142.0 | 0.6338 | 0.6338 | 64.0 | 64.0 | 118.0 | 0.5424 | 0.5424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.521 | 2.0 | 14 | 1.0121 | 0.0058 | 832.3148 | 576.9166 | 402.0 | 570.0 | 0.7053 | 402.0 | 0.7053 | 101.0 | 101.0 | 158.0 | 0.6392 | 0.6392 | 120.0 | 120.0 | 152.0 | 0.7895 | 0.7895 | 110.0 | 110.0 | 142.0 | 0.7746 | 0.7746 | 71.0 | 71.0 | 118.0 | 0.6017 | 0.6017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1506 | 3.0 | 21 | 1.2469 | 0.0058 | 1025.3714 | 710.7333 | 381.0 | 570.0 | 0.6684 | 380.0 | 0.6667 | 93.0 | 94.0 | 158.0 | 0.5949 | 0.5886 | 111.0 | 111.0 | 152.0 | 0.7303 | 0.7303 | 91.0 | 91.0 | 142.0 | 0.6408 | 0.6408 | 85.0 | 85.0 | 118.0 | 0.7203 | 0.7203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0034 | 4.0 | 28 | 2.0883 | 0.0058 | 1717.2546 | 1190.3102 | 387.0 | 570.0 | 0.6789 | 387.0 | 0.6789 | 105.0 | 105.0 | 158.0 | 0.6646 | 0.6646 | 113.0 | 113.0 | 152.0 | 0.7434 | 0.7434 | 98.0 | 98.0 | 142.0 | 0.6901 | 0.6901 | 71.0 | 71.0 | 118.0 | 0.6017 | 0.6017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 5.0 | 35 | 3.1970 | 0.0058 | 2628.9738 | 1822.2658 | 385.0 | 570.0 | 0.6754 | 385.0 | 0.6754 | 106.0 | 106.0 | 158.0 | 0.6709 | 0.6709 | 109.0 | 109.0 | 152.0 | 0.7171 | 0.7171 | 101.0 | 101.0 | 142.0 | 0.7113 | 0.7113 | 69.0 | 69.0 | 118.0 | 0.5847 | 0.5847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0215 | 6.0 | 42 | 3.4074 | 0.0058 | 2802.0180 | 1942.2109 | 382.0 | 570.0 | 0.6702 | 382.0 | 0.6702 | 96.0 | 96.0 | 158.0 | 0.6076 | 0.6076 | 114.0 | 114.0 | 152.0 | 0.75 | 0.75 | 109.0 | 109.0 | 142.0 | 0.7676 | 0.7676 | 63.0 | 63.0 | 118.0 | 0.5339 | 0.5339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 7.0 | 49 | 3.6803 | 0.0058 | 3026.4624 | 2097.7839 | 382.0 | 570.0 | 0.6702 | 378.0 | 0.6632 | 87.0 | 87.0 | 158.0 | 0.5506 | 0.5506 | 116.0 | 116.0 | 152.0 | 0.7632 | 0.7632 | 109.0 | 109.0 | 142.0 | 0.7676 | 0.7676 | 66.0 | 70.0 | 118.0 | 0.5932 | 0.5593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 8.0 | 56 | 3.6821 | 0.0058 | 3027.9010 | 2098.7811 | 385.0 | 570.0 | 0.6754 | 383.0 | 0.6719 | 95.0 | 95.0 | 158.0 | 0.6013 | 0.6013 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 70.0 | 72.0 | 118.0 | 0.6102 | 0.5932 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 63 | 3.8801 | 0.0058 | 3190.7166 | 2211.6362 | 386.0 | 570.0 | 0.6772 | 385.0 | 0.6754 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 107.0 | 107.0 | 142.0 | 0.7535 | 0.7535 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 10.0 | 70 | 3.9489 | 0.0058 | 3247.2899 | 2250.8498 | 386.0 | 570.0 | 0.6772 | 385.0 | 0.6754 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 107.0 | 107.0 | 142.0 | 0.7535 | 0.7535 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 77 | 4.0069 | 0.0058 | 3295.0467 | 2283.9523 | 386.0 | 570.0 | 0.6772 | 385.0 | 0.6754 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 107.0 | 107.0 | 142.0 | 0.7535 | 0.7535 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 84 | 4.0100 | 0.0058 | 3297.5344 | 2285.6767 | 386.0 | 570.0 | 0.6772 | 385.0 | 0.6754 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 107.0 | 107.0 | 142.0 | 0.7535 | 0.7535 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 91 | 4.0224 | 0.0058 | 3307.7362 | 2292.7480 | 387.0 | 570.0 | 0.6789 | 387.0 | 0.6789 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 75.0 | 75.0 | 118.0 | 0.6356 | 0.6356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 98 | 4.0302 | 0.0058 | 3314.2083 | 2297.2341 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 105 | 4.0472 | 0.0058 | 3328.1717 | 2306.9128 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 112 | 4.0484 | 0.0058 | 3329.1411 | 2307.5847 | 387.0 | 570.0 | 0.6789 | 385.0 | 0.6754 | 94.0 | 95.0 | 158.0 | 0.6013 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 107.0 | 107.0 | 142.0 | 0.7535 | 0.7535 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 119 | 4.0769 | 0.0058 | 3352.6059 | 2323.8493 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 126 | 4.0711 | 0.0058 | 3347.7892 | 2320.5106 | 385.0 | 570.0 | 0.6754 | 384.0 | 0.6737 | 93.0 | 93.0 | 158.0 | 0.5886 | 0.5886 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 107.0 | 107.0 | 142.0 | 0.7535 | 0.7535 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 133 | 4.0933 | 0.0058 | 3366.1069 | 2333.2075 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 140 | 4.0916 | 0.0058 | 3364.6355 | 2332.1876 | 388.0 | 570.0 | 0.6807 | 387.0 | 0.6789 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 109.0 | 109.0 | 142.0 | 0.7676 | 0.7676 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 147 | 4.0825 | 0.0058 | 3357.2105 | 2327.0410 | 388.0 | 570.0 | 0.6807 | 387.0 | 0.6789 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 109.0 | 109.0 | 142.0 | 0.7676 | 0.7676 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 154 | 4.0640 | 0.0058 | 3341.9560 | 2316.4674 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 161 | 4.0841 | 0.0058 | 3358.4668 | 2327.9118 | 388.0 | 570.0 | 0.6807 | 387.0 | 0.6789 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 109.0 | 109.0 | 142.0 | 0.7676 | 0.7676 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 168 | 4.0792 | 0.0058 | 3354.4495 | 2325.1272 | 388.0 | 570.0 | 0.6807 | 386.0 | 0.6772 | 94.0 | 95.0 | 158.0 | 0.6013 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 175 | 4.0917 | 0.0058 | 3364.7455 | 2332.2639 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 182 | 4.1026 | 0.0058 | 3373.7140 | 2338.4803 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 189 | 4.1036 | 0.0058 | 3374.5095 | 2339.0317 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 196 | 4.1163 | 0.0058 | 3385.0067 | 2346.3078 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 203 | 4.1011 | 0.0058 | 3372.5132 | 2337.6480 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 95.0 | 95.0 | 158.0 | 0.6013 | 0.6013 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 107.0 | 107.0 | 142.0 | 0.7535 | 0.7535 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 210 | 4.1281 | 0.0058 | 3394.7200 | 2353.0406 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 95.0 | 95.0 | 158.0 | 0.6013 | 0.6013 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 107.0 | 107.0 | 142.0 | 0.7535 | 0.7535 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 217 | 4.1409 | 0.0058 | 3405.1901 | 2360.2979 | 387.0 | 570.0 | 0.6789 | 385.0 | 0.6754 | 94.0 | 95.0 | 158.0 | 0.6013 | 0.5949 | 109.0 | 109.0 | 152.0 | 0.7171 | 0.7171 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 224 | 4.1452 | 0.0058 | 3408.7242 | 2362.7476 | 387.0 | 570.0 | 0.6789 | 386.0 | 0.6772 | 94.0 | 94.0 | 158.0 | 0.5949 | 0.5949 | 110.0 | 110.0 | 152.0 | 0.7237 | 0.7237 | 108.0 | 108.0 | 142.0 | 0.7606 | 0.7606 | 74.0 | 75.0 | 118.0 | 0.6356 | 0.6271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
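A few observations, hedged since the training and evaluation scripts are not included. The headline metrics match the epoch-2 row, and the log stops at epoch 32 despite `num_epochs: 100`, which suggests early stopping with the best checkpoint restored. The validation loss appears to be the accumulated cross-entropy averaged over the 570 predictions, and `Mdl` the same quantity converted from nats to bits. The numeric suffixes 32–36 on the per-label metrics most likely index the answer letters A–E by their token ids in the Llama 3 tokenizer: the per-letter correct counts at epoch 2 (101 + 120 + 110 + 71) sum to the 402 total correct predictions, and `Total Labels 36` is zero because no question in this split has answer E. The `Gen` variants presumably count answers recovered from free generation rather than from ranking the letter logits, which is why they occasionally fall one or two short of the ranked counts. A quick arithmetic check of the epoch-2 row:

```python
import math

# Figures from the epoch-2 row of the results table.
accumulated_loss = 576.9166  # summed cross-entropy over the eval set, in nats
total_preds = 570

print(accumulated_loss / total_preds)  # ~1.0121 -> reported validation loss
print(accumulated_loss / math.log(2))  # ~832.31 -> reported Mdl (bits)
print(101 + 120 + 110 + 71)            # 402     -> reported correct preds
```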
### Framework versions
- Transformers 4.51.3
- PyTorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1