# exceptions_exp2_swap_0.3_last_to_carry_5039
This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 3.5498
- Accuracy: 0.3741
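Since the accuracy reported here appears to be token-level prediction accuracy, one hedged way to put the evaluation loss in perspective is to convert it to perplexity. The sketch below assumes the checkpoint is a causal language model and that it is hosted under the model name above; the repo id and architecture are assumptions, not stated in this card.

```python
# Minimal sketch: load the checkpoint and convert the reported eval loss to perplexity.
# NOTE: the repo id and the causal-LM head are assumptions; this card does not state them.
import math

from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "exceptions_exp2_swap_0.3_last_to_carry_5039"  # placeholder; adjust to the actual hub path
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# An eval loss of 3.5498 (mean cross-entropy in nats) corresponds to a perplexity of roughly exp(3.5498) ≈ 34.8.
print(math.exp(3.5498))
```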
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0006
- train_batch_size: 16
- eval_batch_size: 16
- seed: 5039
- gradient_accumulation_steps: 5
- total_train_batch_size: 80
- optimizer: AdamW (torch fused) with betas=(0.9, 0.98) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 50.0
- mixed_precision_training: Native AMP
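Note that the total train batch size follows from train_batch_size × gradient_accumulation_steps = 16 × 5 = 80. A minimal `TrainingArguments` sketch matching the values above is shown below; the output directory, the exact mixed-precision flag (fp16 vs. bf16), and the evaluation cadence are assumptions inferred from this card, not the original training script.

```python
# A minimal sketch of TrainingArguments matching the hyperparameters listed above.
# The output directory and the fp16/bf16 choice are assumptions; "Native AMP" is mapped to fp16 here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="exceptions_exp2_swap_0.3_last_to_carry_5039",  # assumed output path
    learning_rate=6e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=5,   # effective batch size: 16 * 5 = 80
    seed=5039,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=50.0,
    fp16=True,                       # "Native AMP" mixed precision (assumed fp16)
    eval_strategy="steps",           # evaluation every 1000 steps, matching the table below
    eval_steps=1000,
    logging_steps=1000,
)
```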
### Training results
| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|---|---|---|---|---|
| 4.8376 | 0.2915 | 1000 | 0.2542 | 4.7589 |
| 4.3508 | 0.5830 | 2000 | 0.2977 | 4.2982 |
| 4.1599 | 0.8745 | 3000 | 0.3137 | 4.1119 |
| 4.0058 | 1.1659 | 4000 | 0.3232 | 4.0022 |
| 3.9397 | 1.4574 | 5000 | 0.3305 | 3.9227 |
| 3.8862 | 1.7488 | 6000 | 0.3356 | 3.8669 |
| 3.7487 | 2.0402 | 7000 | 0.3400 | 3.8238 |
| 3.7661 | 2.3317 | 8000 | 0.3429 | 3.7931 |
| 3.7449 | 2.6232 | 9000 | 0.3457 | 3.7612 |
| 3.7305 | 2.9147 | 10000 | 0.3482 | 3.7342 |
| 3.636 | 3.2061 | 11000 | 0.3503 | 3.7234 |
| 3.6523 | 3.4976 | 12000 | 0.3518 | 3.7036 |
| 3.6479 | 3.7891 | 13000 | 0.3535 | 3.6866 |
| 3.5494 | 4.0805 | 14000 | 0.3549 | 3.6778 |
| 3.5719 | 4.3719 | 15000 | 0.3561 | 3.6679 |
| 3.5815 | 4.6634 | 16000 | 0.3575 | 3.6531 |
| 3.5827 | 4.9549 | 17000 | 0.3585 | 3.6418 |
| 3.5121 | 5.2463 | 18000 | 0.3592 | 3.6428 |
| 3.5172 | 5.5378 | 19000 | 0.3597 | 3.6341 |
| 3.5469 | 5.8293 | 20000 | 0.3607 | 3.6216 |
| 3.4552 | 6.1207 | 21000 | 0.3614 | 3.6244 |
| 3.4922 | 6.4122 | 22000 | 0.3619 | 3.6169 |
| 3.499 | 6.7037 | 23000 | 0.3626 | 3.6087 |
| 3.4927 | 6.9952 | 24000 | 0.3639 | 3.5962 |
| 3.4408 | 7.2865 | 25000 | 0.3633 | 3.6067 |
| 3.4523 | 7.5780 | 26000 | 0.3640 | 3.5957 |
| 3.4617 | 7.8695 | 27000 | 0.3650 | 3.5857 |
| 3.396 | 8.1609 | 28000 | 0.3648 | 3.5959 |
| 3.4334 | 8.4524 | 29000 | 0.3652 | 3.5926 |
| 3.4364 | 8.7439 | 30000 | 0.3661 | 3.5817 |
| 3.3397 | 9.0353 | 31000 | 0.3660 | 3.5855 |
| 3.3905 | 9.3268 | 32000 | 0.3661 | 3.5862 |
| 3.4009 | 9.6183 | 33000 | 0.3670 | 3.5789 |
| 3.4185 | 9.9098 | 34000 | 0.3677 | 3.5687 |
| 3.348 | 10.2011 | 35000 | 0.3672 | 3.5799 |
| 3.3779 | 10.4926 | 36000 | 0.3676 | 3.5705 |
| 3.3988 | 10.7841 | 37000 | 0.3681 | 3.5676 |
| 3.3056 | 11.0755 | 38000 | 0.3679 | 3.5776 |
| 3.345 | 11.3670 | 39000 | 0.3681 | 3.5718 |
| 3.3619 | 11.6585 | 40000 | 0.3689 | 3.5643 |
| 3.387 | 11.9500 | 41000 | 0.3694 | 3.5561 |
| 3.3217 | 12.2414 | 42000 | 0.3688 | 3.5691 |
| 3.3443 | 12.5329 | 43000 | 0.3691 | 3.5631 |
| 3.3572 | 12.8243 | 44000 | 0.3696 | 3.5544 |
| 3.282 | 13.1157 | 45000 | 0.3690 | 3.5689 |
| 3.3217 | 13.4072 | 46000 | 0.3696 | 3.5634 |
| 3.333 | 13.6987 | 47000 | 0.3703 | 3.5528 |
| 3.3499 | 13.9902 | 48000 | 0.3704 | 3.5476 |
| 3.2773 | 14.2816 | 49000 | 0.3698 | 3.5656 |
| 3.3086 | 14.5731 | 50000 | 0.3703 | 3.5580 |
| 3.343 | 14.8646 | 51000 | 0.3706 | 3.5467 |
| 3.2598 | 15.1559 | 52000 | 0.3702 | 3.5646 |
| 3.2774 | 15.4474 | 53000 | 0.3705 | 3.5598 |
| 3.3126 | 15.7389 | 54000 | 0.3711 | 3.5496 |
| 3.2012 | 16.0303 | 55000 | 0.3705 | 3.5610 |
| 3.2648 | 16.3218 | 56000 | 0.3711 | 3.5536 |
| 3.2782 | 16.6133 | 57000 | 0.3710 | 3.5503 |
| 3.3024 | 16.9048 | 58000 | 0.3720 | 3.5396 |
| 3.2318 | 17.1962 | 59000 | 0.3710 | 3.5579 |
| 3.2653 | 17.4877 | 60000 | 0.3715 | 3.5493 |
| 3.274 | 17.7792 | 61000 | 0.3722 | 3.5431 |
| 3.1885 | 18.0705 | 62000 | 0.3713 | 3.5596 |
| 3.2372 | 18.3620 | 63000 | 0.3717 | 3.5528 |
| 3.2585 | 18.6535 | 64000 | 0.3721 | 3.5450 |
| 3.279 | 18.9450 | 65000 | 0.3726 | 3.5389 |
| 3.2051 | 19.2364 | 66000 | 0.3718 | 3.5559 |
| 3.2422 | 19.5279 | 67000 | 0.3718 | 3.5507 |
| 3.2517 | 19.8194 | 68000 | 0.3726 | 3.5421 |
| 3.1962 | 20.1108 | 69000 | 0.3719 | 3.5548 |
| 3.2241 | 20.4023 | 70000 | 0.3721 | 3.5511 |
| 3.2393 | 20.6938 | 71000 | 0.3724 | 3.5465 |
| 3.2653 | 20.9853 | 72000 | 0.3729 | 3.5397 |
| 3.2169 | 21.2766 | 73000 | 0.3719 | 3.5543 |
| 3.2173 | 21.5681 | 74000 | 0.3726 | 3.5482 |
| 3.2412 | 21.8596 | 75000 | 0.3729 | 3.5411 |
| 3.1695 | 22.1510 | 76000 | 0.3725 | 3.5561 |
| 3.2119 | 22.4425 | 77000 | 0.3726 | 3.5493 |
| 3.2316 | 22.7340 | 78000 | 0.3731 | 3.5434 |
| 3.1426 | 23.0254 | 79000 | 0.3723 | 3.5559 |
| 3.1847 | 23.3169 | 80000 | 0.3729 | 3.5535 |
| 3.1668 | 23.6083 | 81000 | 0.3725 | 3.5551 |
| 3.2001 | 23.8998 | 82000 | 0.3731 | 3.5479 |
| 3.1768 | 24.1915 | 83000 | 0.3723 | 3.5602 |
| 3.1871 | 24.4830 | 84000 | 0.3727 | 3.5519 |
| 3.2136 | 24.7745 | 85000 | 0.3733 | 3.5428 |
| 3.1267 | 25.0659 | 86000 | 0.3731 | 3.5576 |
| 3.1684 | 25.3574 | 87000 | 0.3727 | 3.5551 |
| 3.2026 | 25.6489 | 88000 | 0.3735 | 3.5472 |
| 3.2202 | 25.9404 | 89000 | 0.3737 | 3.5379 |
| 3.1603 | 26.2317 | 90000 | 0.3729 | 3.5586 |
| 3.1896 | 26.5232 | 91000 | 0.3735 | 3.5494 |
| 3.1884 | 26.8147 | 92000 | 0.3740 | 3.5417 |
| 3.1285 | 27.1061 | 93000 | 0.3731 | 3.5546 |
| 3.1577 | 27.3976 | 94000 | 0.3736 | 3.5499 |
| 3.183 | 27.6891 | 95000 | 0.3736 | 3.5481 |
| 3.1974 | 27.9806 | 96000 | 0.3741 | 3.5389 |
| 3.1403 | 28.2720 | 97000 | 0.3733 | 3.5531 |
| 3.1545 | 28.5635 | 98000 | 0.3738 | 3.5498 |
| 3.165 | 28.8550 | 99000 | 0.3740 | 3.5403 |
| 3.119 | 29.1463 | 100000 | 0.3732 | 3.5620 |
| 3.1416 | 29.4378 | 101000 | 0.3740 | 3.5507 |
| 3.1652 | 29.7293 | 102000 | 0.3741 | 3.5452 |
| 3.0738 | 30.0207 | 103000 | 0.3740 | 3.5542 |
| 3.1227 | 30.3122 | 104000 | 0.3737 | 3.5568 |
| 3.1518 | 30.6037 | 105000 | 0.3741 | 3.5512 |
| 3.1593 | 30.8952 | 106000 | 0.3742 | 3.5423 |
| 3.1051 | 31.1866 | 107000 | 0.3737 | 3.5578 |
| 3.1409 | 31.4781 | 108000 | 0.3742 | 3.5499 |
| 3.1566 | 31.7695 | 109000 | 0.3741 | 3.5498 |
### Framework versions
- Transformers 4.55.2
- Pytorch 2.8.0+cu128
- Datasets 4.0.0
- Tokenizers 0.21.4