# ARC-Easy_Llama-3.2-1B-p985fyrb
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B) on an unknown dataset (presumably ARC-Easy, given the model name). It achieves the following results on the evaluation set:
- Loss: 3.1280
- Model Preparation Time: 0.0059
- Mdl: 2572.2448
- Accumulated Loss: 1782.9442
- Correct Preds: 391.0
- Total Preds: 570.0
- Accuracy: 0.6860
- Correct Gen Preds: 391.0
- Gen Accuracy: 0.6860
- Correct Gen Preds 32: 102.0
- Correct Preds 32: 102.0
- Total Labels 32: 158.0
- Accuracy 32: 0.6456
- Gen Accuracy 32: 0.6456
- Correct Gen Preds 33: 112.0
- Correct Preds 33: 112.0
- Total Labels 33: 152.0
- Accuracy 33: 0.7368
- Gen Accuracy 33: 0.7368
- Correct Gen Preds 34: 98.0
- Correct Preds 34: 98.0
- Total Labels 34: 142.0
- Accuracy 34: 0.6901
- Gen Accuracy 34: 0.6901
- Correct Gen Preds 35: 79.0
- Correct Preds 35: 79.0
- Total Labels 35: 118.0
- Accuracy 35: 0.6695
- Gen Accuracy 35: 0.6695
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 0.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
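The numeric suffixes on the per-label metrics (32-36) appear to be tokenizer ids for the answer letters: in the Llama 3 tokenizer the single-character tokens "A" through "E" map to ids 32-36, and the zero counts for label 36 are consistent with this evaluation split containing no five-choice questions. The "Gen" variants presumably count answers produced by generation rather than by comparing answer-token logits. The headline numbers are internally consistent: Loss = Accumulated Loss / Total Preds (1782.9442 / 570 ≈ 3.1280), and Mdl = Accumulated Loss / ln 2 ≈ 2572.24, i.e. the accumulated negative log-likelihood converted from nats to bits.

A minimal usage sketch (not part of the original card; the prompt format is an assumption, not necessarily the one used during fine-tuning):

```python
# Minimal usage sketch: load the checkpoint and pick the answer letter whose
# next-token logit is highest. The prompt format below is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "donoway/ARC-Easy_Llama-3.2-1B-p985fyrb"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
model.eval()

prompt = (
    "Question: Which of these is a renewable resource?\n"
    "A. coal\nB. wind\nC. oil\nD. natural gas\n"
    "Answer: "
)
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[0, -1]  # logits for the next token

# Single-letter answer tokens; these encode to ids 32-35 ("A"-"D") in the
# Llama 3 tokenizer, matching the per-label metric suffixes above.
choice_ids = [tokenizer.encode(c, add_special_tokens=False)[0] for c in "ABCD"]
pred = "ABCD"[logits[choice_ids].argmax().item()]
print("Predicted answer:", pred)
```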
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
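For reference, a sketch of `TrainingArguments` mirroring the listed hyperparameters (a reconstruction, not the original training script; the output directory is hypothetical, and the per-device batch sizes assume single-GPU training):

```python
# Reconstruction of the listed hyperparameters as TrainingArguments.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="ARC-Easy_Llama-3.2-1B-p985fyrb",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=112,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=100,
)
```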
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.5354 | 0.0059 | 1262.6022 | 875.1692 | 172.0 | 570.0 | 0.3018 | 170.0 | 0.2982 | 154.0 | 154.0 | 158.0 | 0.9747 | 0.9747 | 0.0 | 0.0 | 152.0 | 0.0 | 0.0 | 15.0 | 17.0 | 142.0 | 0.1197 | 0.1056 | 1.0 | 1.0 | 118.0 | 0.0085 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8001 | 1.0 | 4 | 1.2027 | 0.0059 | 989.0572 | 685.5622 | 300.0 | 570.0 | 0.5263 | 300.0 | 0.5263 | 22.0 | 22.0 | 158.0 | 0.1392 | 0.1392 | 115.0 | 115.0 | 152.0 | 0.7566 | 0.7566 | 73.0 | 73.0 | 142.0 | 0.5141 | 0.5141 | 90.0 | 90.0 | 118.0 | 0.7627 | 0.7627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5383 | 2.0 | 8 | 0.9091 | 0.0059 | 747.5608 | 518.1697 | 383.0 | 570.0 | 0.6719 | 383.0 | 0.6719 | 89.0 | 89.0 | 158.0 | 0.5633 | 0.5633 | 116.0 | 116.0 | 152.0 | 0.7632 | 0.7632 | 102.0 | 102.0 | 142.0 | 0.7183 | 0.7183 | 76.0 | 76.0 | 118.0 | 0.6441 | 0.6441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0249 | 3.0 | 12 | 1.9204 | 0.0059 | 1579.2329 | 1094.6408 | 390.0 | 570.0 | 0.6842 | 390.0 | 0.6842 | 124.0 | 124.0 | 158.0 | 0.7848 | 0.7848 | 101.0 | 101.0 | 152.0 | 0.6645 | 0.6645 | 97.0 | 97.0 | 142.0 | 0.6831 | 0.6831 | 68.0 | 68.0 | 118.0 | 0.5763 | 0.5763 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0007 | 4.0 | 16 | 2.5739 | 0.0059 | 2116.5801 | 1467.1015 | 386.0 | 570.0 | 0.6772 | 385.0 | 0.6754 | 89.0 | 90.0 | 158.0 | 0.5696 | 0.5633 | 117.0 | 117.0 | 152.0 | 0.7697 | 0.7697 | 98.0 | 98.0 | 142.0 | 0.6901 | 0.6901 | 81.0 | 81.0 | 118.0 | 0.6864 | 0.6864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0017 | 5.0 | 20 | 3.1280 | 0.0059 | 2572.2448 | 1782.9442 | 391.0 | 570.0 | 0.6860 | 391.0 | 0.6860 | 102.0 | 102.0 | 158.0 | 0.6456 | 0.6456 | 112.0 | 112.0 | 152.0 | 0.7368 | 0.7368 | 98.0 | 98.0 | 142.0 | 0.6901 | 0.6901 | 79.0 | 79.0 | 118.0 | 0.6695 | 0.6695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0003 | 6.0 | 24 | 3.8452 | 0.0059 | 3162.0871 | 2191.7918 | 386.0 | 570.0 | 0.6772 | 386.0 | 0.6772 | 110.0 | 110.0 | 158.0 | 0.6962 | 0.6962 | 103.0 | 103.0 | 152.0 | 0.6776 | 0.6776 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 79.0 | 79.0 | 118.0 | 0.6695 | 0.6695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 7.0 | 28 | 4.4219 | 0.0059 | 3636.2606 | 2520.4638 | 382.0 | 570.0 | 0.6702 | 382.0 | 0.6702 | 113.0 | 113.0 | 158.0 | 0.7152 | 0.7152 | 100.0 | 100.0 | 152.0 | 0.6579 | 0.6579 | 90.0 | 90.0 | 142.0 | 0.6338 | 0.6338 | 79.0 | 79.0 | 118.0 | 0.6695 | 0.6695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 8.0 | 32 | 4.6772 | 0.0059 | 3846.1986 | 2665.9817 | 380.0 | 570.0 | 0.6667 | 380.0 | 0.6667 | 114.0 | 114.0 | 158.0 | 0.7215 | 0.7215 | 97.0 | 97.0 | 152.0 | 0.6382 | 0.6382 | 91.0 | 91.0 | 142.0 | 0.6408 | 0.6408 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 9.0 | 36 | 4.8253 | 0.0059 | 3968.0250 | 2750.4254 | 377.0 | 570.0 | 0.6614 | 377.0 | 0.6614 | 113.0 | 113.0 | 158.0 | 0.7152 | 0.7152 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 92.0 | 92.0 | 142.0 | 0.6479 | 0.6479 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 10.0 | 40 | 4.9416 | 0.0059 | 4063.6575 | 2816.7127 | 378.0 | 570.0 | 0.6632 | 378.0 | 0.6632 | 113.0 | 113.0 | 158.0 | 0.7152 | 0.7152 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 11.0 | 44 | 4.9382 | 0.0059 | 4060.8541 | 2814.7696 | 377.0 | 570.0 | 0.6614 | 377.0 | 0.6614 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 48 | 5.0121 | 0.0059 | 4121.6647 | 2856.9202 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 114.0 | 114.0 | 158.0 | 0.7215 | 0.7215 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 52 | 4.9747 | 0.0059 | 4090.8914 | 2835.5898 | 380.0 | 570.0 | 0.6667 | 380.0 | 0.6667 | 113.0 | 113.0 | 158.0 | 0.7152 | 0.7152 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 56 | 5.0345 | 0.0059 | 4140.0879 | 2869.6903 | 380.0 | 570.0 | 0.6667 | 380.0 | 0.6667 | 114.0 | 114.0 | 158.0 | 0.7215 | 0.7215 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 60 | 5.0372 | 0.0059 | 4142.3083 | 2871.2294 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 113.0 | 113.0 | 158.0 | 0.7152 | 0.7152 | 94.0 | 94.0 | 152.0 | 0.6184 | 0.6184 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 64 | 5.0324 | 0.0059 | 4138.2840 | 2868.4399 | 377.0 | 570.0 | 0.6614 | 377.0 | 0.6614 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 68 | 5.0466 | 0.0059 | 4150.0401 | 2876.5886 | 378.0 | 570.0 | 0.6632 | 378.0 | 0.6632 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 72 | 5.0758 | 0.0059 | 4174.0155 | 2893.2071 | 376.0 | 570.0 | 0.6596 | 376.0 | 0.6596 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 94.0 | 94.0 | 152.0 | 0.6184 | 0.6184 | 92.0 | 92.0 | 142.0 | 0.6479 | 0.6479 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 76 | 5.0498 | 0.0059 | 4152.6162 | 2878.3742 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 113.0 | 113.0 | 158.0 | 0.7152 | 0.7152 | 96.0 | 96.0 | 152.0 | 0.6316 | 0.6316 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 80 | 5.0611 | 0.0059 | 4161.9058 | 2884.8132 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 84 | 5.0557 | 0.0059 | 4157.5229 | 2881.7753 | 378.0 | 570.0 | 0.6632 | 378.0 | 0.6632 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 94.0 | 94.0 | 152.0 | 0.6184 | 0.6184 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 88 | 5.0334 | 0.0059 | 4139.1444 | 2869.0363 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 113.0 | 113.0 | 158.0 | 0.7152 | 0.7152 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 92 | 5.0884 | 0.0059 | 4184.3384 | 2900.3623 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 113.0 | 113.0 | 158.0 | 0.7152 | 0.7152 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 96 | 5.0960 | 0.0059 | 4190.6606 | 2904.7446 | 380.0 | 570.0 | 0.6667 | 380.0 | 0.6667 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 96.0 | 96.0 | 152.0 | 0.6316 | 0.6316 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 100 | 5.0158 | 0.0059 | 4124.6379 | 2858.9811 | 383.0 | 570.0 | 0.6719 | 383.0 | 0.6719 | 114.0 | 114.0 | 158.0 | 0.7215 | 0.7215 | 96.0 | 96.0 | 152.0 | 0.6316 | 0.6316 | 95.0 | 95.0 | 142.0 | 0.6690 | 0.6690 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 104 | 5.0697 | 0.0059 | 4168.9987 | 2889.7297 | 378.0 | 570.0 | 0.6632 | 378.0 | 0.6632 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 108 | 5.0597 | 0.0059 | 4160.7773 | 2884.0311 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 112 | 5.0579 | 0.0059 | 4159.2565 | 2882.9769 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 114.0 | 114.0 | 158.0 | 0.7215 | 0.7215 | 94.0 | 94.0 | 152.0 | 0.6184 | 0.6184 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 116 | 5.0731 | 0.0059 | 4171.8091 | 2891.6777 | 378.0 | 570.0 | 0.6632 | 378.0 | 0.6632 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 94.0 | 94.0 | 152.0 | 0.6184 | 0.6184 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 78.0 | 78.0 | 118.0 | 0.6610 | 0.6610 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 120 | 5.0587 | 0.0059 | 4159.9854 | 2883.4822 | 377.0 | 570.0 | 0.6614 | 377.0 | 0.6614 | 113.0 | 113.0 | 158.0 | 0.7152 | 0.7152 | 93.0 | 93.0 | 152.0 | 0.6118 | 0.6118 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 124 | 5.0970 | 0.0059 | 4191.4450 | 2905.2883 | 376.0 | 570.0 | 0.6596 | 376.0 | 0.6596 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 94.0 | 94.0 | 152.0 | 0.6184 | 0.6184 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 128 | 5.0980 | 0.0059 | 4192.2751 | 2905.8637 | 375.0 | 570.0 | 0.6579 | 375.0 | 0.6579 | 111.0 | 111.0 | 158.0 | 0.7025 | 0.7025 | 94.0 | 94.0 | 152.0 | 0.6184 | 0.6184 | 93.0 | 93.0 | 142.0 | 0.6549 | 0.6549 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 132 | 5.0987 | 0.0059 | 4192.8484 | 2906.2611 | 377.0 | 570.0 | 0.6614 | 377.0 | 0.6614 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 94.0 | 94.0 | 152.0 | 0.6184 | 0.6184 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 136 | 5.0753 | 0.0059 | 4173.6419 | 2892.9481 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 113.0 | 113.0 | 158.0 | 0.7152 | 0.7152 | 95.0 | 95.0 | 152.0 | 0.625 | 0.625 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 140 | 5.0517 | 0.0059 | 4154.1764 | 2879.4557 | 379.0 | 570.0 | 0.6649 | 379.0 | 0.6649 | 112.0 | 112.0 | 158.0 | 0.7089 | 0.7089 | 96.0 | 96.0 | 152.0 | 0.6316 | 0.6316 | 94.0 | 94.0 | 142.0 | 0.6620 | 0.6620 | 77.0 | 77.0 | 118.0 | 0.6525 | 0.6525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
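Note that the headline metrics above match the epoch-5 row (validation loss 3.1280, accuracy 0.6860): the training loss collapses to zero by epoch 7 while validation loss keeps climbing, so the reported model appears to be the best early checkpoint rather than the final one. Although `num_epochs` was set to 100, the log stops at epoch 35, which suggests early stopping.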
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
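The environment can presumably be reproduced by pinning the versions above, e.g.:

```bash
pip install transformers==4.51.3 datasets==3.5.0 tokenizers==0.21.1
pip install torch==2.6.0 --index-url https://download.pytorch.org/whl/cu124
```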