|
--- |
|
library_name: transformers |
|
license: mit |
|
base_model: timm/efficientvit_m4.r224_in1k |
|
tags: |
|
- generated_from_trainer |
|
metrics: |
|
- accuracy |
|
model-index: |
|
- name: efficientvit_m4.r224_in1k_rice-leaf-disease-augmented-v4_v5_fft |
|
results: [] |
|
--- |
|
|
|
|
|
|
# efficientvit_m4.r224_in1k_rice-leaf-disease-augmented-v4_v5_fft |
|
|
|
This model is a fine-tuned version of [timm/efficientvit_m4.r224_in1k](https://huggingface.co/timm/efficientvit_m4.r224_in1k). The training dataset is not recorded here, though the model name suggests an augmented rice leaf disease dataset (v4).
|
It achieves the following results on the evaluation set: |
|
- Loss: 0.4767 |
|
- Accuracy: 0.8658 |
|
|
|
## Model description |
|
|
|
This is an image-classification model built on the EfficientViT-M4 backbone (pretrained on ImageNet-1k at 224×224 resolution, from `timm`) and fully fine-tuned (the `fft` suffix), apparently for rice leaf disease classification per the model name. It reaches 86.6% accuracy on its evaluation set.
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training: |
|
- learning_rate: 5e-05 |
|
- train_batch_size: 64 |
|
- eval_batch_size: 64 |
|
- seed: 42 |
|
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
|
- lr_scheduler_type: cosine_with_restarts |
|
- lr_scheduler_warmup_steps: 256 |
|
- num_epochs: 30 |
|
- mixed_precision_training: Native AMP |
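For intuition, the `cosine_with_restarts` schedule with 256 warmup steps can be sketched as a pure-Python LR multiplier. This is a sketch of the schedule's shape, assuming a single cycle (the Trainer default) and the 3840 total steps seen in the results table below; it is not the exact library code.

```python
import math

def lr_lambda(step, warmup_steps=256, total_steps=3840, num_cycles=1):
    """Multiplier applied to the base LR (5e-05) at a given optimizer step."""
    if step < warmup_steps:
        # Linear warmup from 0 up to the base learning rate.
        return step / max(1, warmup_steps)
    # Fraction of the post-warmup budget consumed so far.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0
    # Cosine decay, restarting from the peak num_cycles times.
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0))))

base_lr = 5e-05
lr_mid_warmup = base_lr * lr_lambda(128)   # halfway through warmup
lr_peak = base_lr * lr_lambda(256)         # warmup complete, full base LR
lr_mid_decay = base_lr * lr_lambda(2048)   # halfway through the cosine decay
```

With one cycle the multiplier rises linearly to 1.0 at step 256, then decays along a half cosine toward 0 at step 3840.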
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Accuracy | |
|
|:-------------:|:-----:|:----:|:----------------:|:--------:| |
|
| 2.0834 | 0.5 | 64 | 2.0788 | 0.1577 | |
|
| 2.0669 | 1.0 | 128 | 2.0488 | 0.1711 | |
|
| 2.0293 | 1.5 | 192 | 2.0006 | 0.2785 | |
|
| 1.9774 | 2.0 | 256 | 1.9270 | 0.3792 | |
|
| 1.8929 | 2.5 | 320 | 1.8456 | 0.4597 | |
|
| 1.8113 | 3.0 | 384 | 1.7539 | 0.5235 | |
|
| 1.7181 | 3.5 | 448 | 1.6894 | 0.5570 | |
|
| 1.6738 | 4.0 | 512 | 1.6412 | 0.5839 | |
|
| 1.6197 | 4.5 | 576 | 1.6008 | 0.6141 | |
|
| 1.5808 | 5.0 | 640 | 1.5555 | 0.6208 | |
|
| 1.5563 | 5.5 | 704 | 1.5397 | 0.6174 | |
|
| 1.5374 | 6.0 | 768 | 1.5281 | 0.6409 | |
|
| 1.5426 | 6.5 | 832 | 1.5175 | 0.6309 | |
|
| 1.5093 | 7.0 | 896 | 1.4774 | 0.6376 | |
|
| 1.4656 | 7.5 | 960 | 1.4045 | 0.6443 | |
|
| 1.3943 | 8.0 | 1024 | 1.3379 | 0.6577 | |
|
| 1.3244 | 8.5 | 1088 | 1.2769 | 0.6879 | |
|
| 1.2782 | 9.0 | 1152 | 1.2230 | 0.6946 | |
|
| 1.2293 | 9.5 | 1216 | 1.2051 | 0.6980 | |
|
| 1.1952 | 10.0 | 1280 | 1.1664 | 0.7114 | |
|
| 1.1759 | 10.5 | 1344 | 1.1598 | 0.7215 | |
|
| 1.1638 | 11.0 | 1408 | 1.1507 | 0.7248 | |
|
| 1.1612 | 11.5 | 1472 | 1.1345 | 0.7282 | |
|
| 1.1221 | 12.0 | 1536 | 1.0794 | 0.7383 | |
|
| 1.0554 | 12.5 | 1600 | 1.0158 | 0.7584 | |
|
| 0.9903 | 13.0 | 1664 | 0.9986 | 0.7651 | |
|
| 0.9281 | 13.5 | 1728 | 0.9145 | 0.7718 | |
|
| 0.9074 | 14.0 | 1792 | 0.8825 | 0.7919 | |
|
| 0.86 | 14.5 | 1856 | 0.8671 | 0.7919 | |
|
| 0.8338 | 15.0 | 1920 | 0.8936 | 0.7785 | |
|
| 0.8242 | 15.5 | 1984 | 0.8743 | 0.7886 | |
|
| 0.8269 | 16.0 | 2048 | 0.8563 | 0.7886 | |
|
| 0.8116 | 16.5 | 2112 | 0.8288 | 0.7987 | |
|
| 0.7591 | 17.0 | 2176 | 0.7901 | 0.7987 | |
|
| 0.7088 | 17.5 | 2240 | 0.7543 | 0.8087 | |
|
| 0.6646 | 18.0 | 2304 | 0.7242 | 0.8221 | |
|
| 0.6291 | 18.5 | 2368 | 0.7118 | 0.8188 | |
|
| 0.6018 | 19.0 | 2432 | 0.6792 | 0.8255 | |
|
| 0.5824 | 19.5 | 2496 | 0.6707 | 0.8289 | |
|
| 0.5794 | 20.0 | 2560 | 0.6707 | 0.8322 | |
|
| 0.5722 | 20.5 | 2624 | 0.6688 | 0.8356 | |
|
| 0.5643 | 21.0 | 2688 | 0.6503 | 0.8255 | |
|
| 0.5286 | 21.5 | 2752 | 0.6360 | 0.8221 | |
|
| 0.5141 | 22.0 | 2816 | 0.6289 | 0.8289 | |
|
| 0.4557 | 22.5 | 2880 | 0.5956 | 0.8255 | |
|
| 0.4438 | 23.0 | 2944 | 0.5746 | 0.8389 | |
|
| 0.4084 | 23.5 | 3008 | 0.5673 | 0.8490 | |
|
| 0.4007 | 24.0 | 3072 | 0.5566 | 0.8456 | |
|
| 0.3777 | 24.5 | 3136 | 0.5547 | 0.8456 | |
|
| 0.3824 | 25.0 | 3200 | 0.5598 | 0.8523 | |
|
| 0.3807 | 25.5 | 3264 | 0.5528 | 0.8523 | |
|
| 0.3549 | 26.0 | 3328 | 0.5500 | 0.8490 | |
|
| 0.3357 | 26.5 | 3392 | 0.5255 | 0.8523 | |
|
| 0.3206 | 27.0 | 3456 | 0.5039 | 0.8523 | |
|
| 0.2941 | 27.5 | 3520 | 0.4959 | 0.8725 | |
|
| 0.2754 | 28.0 | 3584 | 0.4910 | 0.8523 | |
|
| 0.2536 | 28.5 | 3648 | 0.4837 | 0.8658 | |
|
| 0.2583 | 29.0 | 3712 | 0.4753 | 0.8658 | |
|
| 0.2519 | 29.5 | 3776 | 0.4844 | 0.8792 | |
|
| 0.2458 | 30.0 | 3840 | 0.4767 | 0.8658 | |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.48.3 |
|
- Pytorch 2.5.1+cu124 |
|
- Datasets 3.3.2 |
|
- Tokenizers 0.21.1 |
|
|