---
library_name: transformers
license: apache-2.0
base_model: microsoft/swin-base-patch4-window7-224
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: swin-base-patch4-window7-224_rice-leaf-disease-augmented-v4_v5_pft
    results: []
---

# swin-base-patch4-window7-224_rice-leaf-disease-augmented-v4_v5_pft

This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.4024
- Accuracy: 0.8490
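
For inference, the checkpoint can be loaded with the standard `transformers` image-classification pipeline. A minimal sketch; the hub repo id (inferred from the model name) and the image path are assumptions:

```python
from transformers import pipeline

# Repo id is an assumption inferred from the model name; replace with the actual hub path.
classifier = pipeline(
    "image-classification",
    model="SodaXII/swin-base-patch4-window7-224_rice-leaf-disease-augmented-v4_v5_pft",
)

# "leaf.jpg" is a placeholder path to a rice-leaf image.
print(classifier("leaf.jpg"))
```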

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0003
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 256
- num_epochs: 30
- mixed_precision_training: Native AMP
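
As a rough guide, the settings above map onto `transformers.TrainingArguments` as in the sketch below; `output_dir`, the eval/save cadence, and any argument not listed above are assumptions, since the card does not record them:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-base-patch4-window7-224_rice-leaf-disease-augmented-v4_v5_pft",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",                       # AdamW implemented in PyTorch
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine_with_restarts",
    warmup_steps=256,
    num_train_epochs=30,
    fp16=True,                                 # Native AMP mixed-precision training
)
```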

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.0846        | 0.5   | 64   | 1.9602          | 0.2483   |
| 1.7504        | 1.0   | 128  | 1.5308          | 0.5034   |
| 1.3704        | 1.5   | 192  | 1.1825          | 0.6107   |
| 1.113         | 2.0   | 256  | 0.9313          | 0.7148   |
| 0.9305        | 2.5   | 320  | 0.8132          | 0.7617   |
| 0.8171        | 3.0   | 384  | 0.7214          | 0.7651   |
| 0.7497        | 3.5   | 448  | 0.6650          | 0.7785   |
| 0.7039        | 4.0   | 512  | 0.6244          | 0.8188   |
| 0.6696        | 4.5   | 576  | 0.6003          | 0.8188   |
| 0.649         | 5.0   | 640  | 0.5976          | 0.8121   |
| 0.6334        | 5.5   | 704  | 0.6032          | 0.8020   |
| 0.6256        | 6.0   | 768  | 0.5859          | 0.8188   |
| 0.6417        | 6.5   | 832  | 0.5851          | 0.8188   |
| 0.5991        | 7.0   | 896  | 0.5835          | 0.8154   |
| 0.6014        | 7.5   | 960  | 0.5394          | 0.8322   |
| 0.5614        | 8.0   | 1024 | 0.5211          | 0.8356   |
| 0.536         | 8.5   | 1088 | 0.5184          | 0.8121   |
| 0.5443        | 9.0   | 1152 | 0.5256          | 0.8154   |
| 0.5129        | 9.5   | 1216 | 0.5026          | 0.8221   |
| 0.5084        | 10.0  | 1280 | 0.5028          | 0.8188   |
| 0.5081        | 10.5  | 1344 | 0.4996          | 0.8188   |
| 0.4936        | 11.0  | 1408 | 0.5004          | 0.8188   |
| 0.5           | 11.5  | 1472 | 0.5091          | 0.8121   |
| 0.4934        | 12.0  | 1536 | 0.4892          | 0.8356   |
| 0.4831        | 12.5  | 1600 | 0.4736          | 0.8322   |
| 0.4638        | 13.0  | 1664 | 0.4727          | 0.8255   |
| 0.4549        | 13.5  | 1728 | 0.4552          | 0.8456   |
| 0.4454        | 14.0  | 1792 | 0.4646          | 0.8322   |
| 0.44          | 14.5  | 1856 | 0.4610          | 0.8322   |
| 0.4304        | 15.0  | 1920 | 0.4574          | 0.8356   |
| 0.4255        | 15.5  | 1984 | 0.4550          | 0.8356   |
| 0.4353        | 16.0  | 2048 | 0.4548          | 0.8356   |
| 0.4456        | 16.5  | 2112 | 0.4465          | 0.8322   |
| 0.4047        | 17.0  | 2176 | 0.4619          | 0.8255   |
| 0.4119        | 17.5  | 2240 | 0.4497          | 0.8389   |
| 0.4009        | 18.0  | 2304 | 0.4329          | 0.8423   |
| 0.3901        | 18.5  | 2368 | 0.4286          | 0.8456   |
| 0.3936        | 19.0  | 2432 | 0.4318          | 0.8456   |
| 0.3761        | 19.5  | 2496 | 0.4297          | 0.8456   |
| 0.3885        | 20.0  | 2560 | 0.4279          | 0.8456   |
| 0.3806        | 20.5  | 2624 | 0.4271          | 0.8456   |
| 0.3779        | 21.0  | 2688 | 0.4352          | 0.8523   |
| 0.3746        | 21.5  | 2752 | 0.4256          | 0.8490   |
| 0.3708        | 22.0  | 2816 | 0.4253          | 0.8557   |
| 0.362         | 22.5  | 2880 | 0.4205          | 0.8490   |
| 0.3558        | 23.0  | 2944 | 0.4122          | 0.8490   |
| 0.3507        | 23.5  | 3008 | 0.4147          | 0.8423   |
| 0.3481        | 24.0  | 3072 | 0.4134          | 0.8456   |
| 0.3452        | 24.5  | 3136 | 0.4120          | 0.8456   |
| 0.3437        | 25.0  | 3200 | 0.4117          | 0.8490   |
| 0.3499        | 25.5  | 3264 | 0.4164          | 0.8523   |
| 0.3373        | 26.0  | 3328 | 0.4109          | 0.8490   |
| 0.3468        | 26.5  | 3392 | 0.3999          | 0.8523   |
| 0.3297        | 27.0  | 3456 | 0.4079          | 0.8523   |
| 0.329         | 27.5  | 3520 | 0.3997          | 0.8423   |
| 0.3293        | 28.0  | 3584 | 0.4051          | 0.8423   |
| 0.3147        | 28.5  | 3648 | 0.3987          | 0.8523   |
| 0.3239        | 29.0  | 3712 | 0.4013          | 0.8523   |
| 0.3147        | 29.5  | 3776 | 0.4031          | 0.8490   |
| 0.3167        | 30.0  | 3840 | 0.4024          | 0.8490   |
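
The accuracy column comes from the `Trainer` metrics hook, evaluated every 64 steps (half an epoch at batch size 64). The exact implementation is not included in this card; a minimal sketch of such a `compute_metrics` function, using the `evaluate` library:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred packs (logits, labels); predict the highest-scoring class per image.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```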

### Framework versions

- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.1