---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224-pt22k-ft22k
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: beit-base-patch16-224-pt22k-ft22k-finetuned-tekno24
  results: []
---

# beit-base-patch16-224-pt22k-ft22k-finetuned-tekno24

This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on an unknown dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the list):

- Loss: 1.0072
- Accuracy: 0.5785
- F1: 0.5643
- Precision: 0.5602
- Recall: 0.5785
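
For quick inference, the model can be loaded through the standard `transformers` image-classification API. The snippet below is a minimal sketch, not taken from the card: the repo id `BTX24/beit-base-patch16-224-pt22k-ft22k-finetuned-tekno24` is assembled from the model name above, and `example.jpg` is a placeholder input.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id, assembled from the model name in this card.
model_id = "BTX24/beit-base-patch16-224-pt22k-ft22k-finetuned-tekno24"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])  # predicted class label
```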

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
- mixed_precision_training: Native AMP
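
As a reference, the list above maps directly onto the `transformers` `Trainer` API. This is a minimal sketch assuming the standard `TrainingArguments` class; `output_dir` is an assumption, and the Adam betas/epsilon are the `Trainer` defaults, which already equal the values reported above.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-pt22k-ft22k-finetuned-tekno24",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=4,   # total train batch size: 64 * 4 = 256
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=15,
    fp16=True,                       # Native AMP mixed-precision training
)
```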

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.4008        | 0.9855  | 17   | 1.2967          | 0.4059   | 0.3220 | 0.3791    | 0.4059 |
| 1.2363        | 1.9710  | 34   | 1.1309          | 0.5032   | 0.4187 | 0.4871    | 0.5032 |
| 1.1716        | 2.9565  | 51   | 1.0983          | 0.5161   | 0.4385 | 0.4610    | 0.5161 |
| 1.1479        | 4.0     | 69   | 1.0550          | 0.5409   | 0.5014 | 0.5067    | 0.5409 |
| 1.1058        | 4.9855  | 86   | 1.0397          | 0.5500   | 0.4942 | 0.5208    | 0.5500 |
| 1.0656        | 5.9710  | 103  | 1.0558          | 0.5556   | 0.5396 | 0.5486    | 0.5556 |
| 1.0328        | 6.9565  | 120  | 1.0216          | 0.5730   | 0.5465 | 0.5513    | 0.5730 |
| 1.0116        | 8.0     | 138  | 1.0469          | 0.5363   | 0.5187 | 0.5119    | 0.5363 |
| 1.012         | 8.9855  | 155  | 1.0216          | 0.5629   | 0.5226 | 0.5344    | 0.5629 |
| 1.0076        | 9.9710  | 172  | 1.0186          | 0.5675   | 0.5275 | 0.5379    | 0.5675 |
| 0.9714        | 10.9565 | 189  | 1.0205          | 0.5638   | 0.5499 | 0.5549    | 0.5638 |
| 0.9843        | 12.0    | 207  | 1.0117          | 0.5657   | 0.5488 | 0.5495    | 0.5657 |
| 0.9427        | 12.9855 | 224  | 1.0072          | 0.5785   | 0.5643 | 0.5602    | 0.5785 |
| 0.9268        | 13.9710 | 241  | 1.0068          | 0.5785   | 0.5652 | 0.5621    | 0.5785 |
| 0.9525        | 14.7826 | 255  | 1.0073          | 0.5785   | 0.5641 | 0.5641    | 0.5785 |
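
Recall equals accuracy at every evaluation step in the table above, which suggests the metrics were computed with weighted averaging. The function below is a sketch of such a `compute_metrics` callback for the `Trainer`; the `average="weighted"` choice is an inference from the table, not stated in the card.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Compute the four metrics in the table from Trainer eval predictions."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Weighted averaging is assumed; it makes recall coincide with accuracy.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```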

### Framework versions

- Transformers 4.42.4
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
