---
license: cc-by-nc-sa-4.0
base_model: InstaDeepAI/nucleotide-transformer-2.5b-multi-species
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - accuracy
model-index:
  - name: nucleotide-transformer-2.5b-multi-species_ft_Hepg2_1kbpHG19_DHSs_H3K27AC
    results: []
---

# nucleotide-transformer-2.5b-multi-species_ft_Hepg2_1kbpHG19_DHSs_H3K27AC

This model is a fine-tuned version of [InstaDeepAI/nucleotide-transformer-2.5b-multi-species](https://huggingface.co/InstaDeepAI/nucleotide-transformer-2.5b-multi-species) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 2.2344
- F1 Score: 0.8878
- Precision: 0.8690
- Recall: 0.9074
- Accuracy: 0.8836
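
A minimal inference sketch is shown below. It assumes the repository id `tanoManzo/nucleotide-transformer-2.5b-multi-species_ft_Hepg2_1kbpHG19_DHSs_H3K27AC` and a binary sequence-classification head; adjust both to the actual checkpoint.

```python
# Minimal inference sketch (assumed repo id and binary classification head).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "tanoManzo/nucleotide-transformer-2.5b-multi-species_ft_Hepg2_1kbpHG19_DHSs_H3K27AC"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

sequence = "ATTCCGATTCCGATTCCG" * 50  # placeholder ~1 kbp DNA sequence
inputs = tokenizer(sequence, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted class index
```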

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a rough reconstruction is sketched after this list):

- learning_rate: 1e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
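
The exact training command is not included in this card; the following `TrainingArguments` sketch only mirrors the values listed above, and the 500-step evaluation cadence is inferred from the results table rather than stated explicitly.

```python
# Approximate reconstruction of the listed hyperparameters; not the exact
# script used to produce this checkpoint.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="nucleotide-transformer-2.5b-multi-species_ft_Hepg2_1kbpHG19_DHSs_H3K27AC",
    learning_rate=1e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    fp16=True,  # "Native AMP" mixed precision
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",
    eval_steps=500,  # inferred: the results table logs every 500 steps
)
```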

### Training results

| Training Loss | Epoch  | Step | Validation Loss | F1 Score | Precision | Recall | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:--------:|
| 0.3737        | 0.5593 | 500  | 0.3067          | 0.8745   | 0.8377    | 0.9147 | 0.8668   |
| 0.292         | 1.1186 | 1000 | 0.3043          | 0.8852   | 0.8332    | 0.9441 | 0.8757   |
| 0.1865        | 1.6779 | 1500 | 0.3219          | 0.8854   | 0.8324    | 0.9456 | 0.8757   |
| 0.1744        | 2.2371 | 2000 | 2.0896          | 0.8683   | 0.8712    | 0.8654 | 0.8668   |
| 0.2341        | 2.7964 | 2500 | 1.9718          | 0.8887   | 0.8383    | 0.9456 | 0.8799   |
| 0.1357        | 3.3557 | 3000 | 2.2402          | 0.8787   | 0.8268    | 0.9375 | 0.8687   |
| 0.0779        | 3.9150 | 3500 | 2.0910          | 0.8857   | 0.8664    | 0.9059 | 0.8813   |
| 0.0426        | 4.4743 | 4000 | 2.2457          | 0.8724   | 0.8911    | 0.8544 | 0.8731   |
| 0.0549        | 5.0336 | 4500 | 1.8462          | 0.8952   | 0.8718    | 0.9199 | 0.8907   |
| 0.0331        | 5.5928 | 5000 | 1.9918          | 0.8803   | 0.8790    | 0.8816 | 0.8784   |
| 0.0206        | 6.1521 | 5500 | 2.0727          | 0.8847   | 0.8811    | 0.8882 | 0.8825   |
| 0.0101        | 6.7114 | 6000 | 2.0343          | 0.8882   | 0.8853    | 0.8912 | 0.8862   |
| 0.0099        | 7.2707 | 6500 | 2.1056          | 0.8831   | 0.8914    | 0.875  | 0.8825   |
| 0.0058        | 7.8300 | 7000 | 2.3964          | 0.8673   | 0.9036    | 0.8338 | 0.8705   |
| 0.0           | 8.3893 | 7500 | 2.2680          | 0.8808   | 0.8814    | 0.8801 | 0.8791   |
| 0.0047        | 8.9485 | 8000 | 2.1908          | 0.8863   | 0.8730    | 0.9    | 0.8828   |
| 0.0           | 9.5078 | 8500 | 2.2344          | 0.8878   | 0.8690    | 0.9074 | 0.8836   |
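
As a quick consistency check, the final reported F1 Score matches the harmonic mean of the reported precision and recall, assuming the standard binary F1 definition:

```python
# Recompute F1 from the final evaluation row (precision 0.8690, recall 0.9074).
precision, recall = 0.8690, 0.9074
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.8878, matching the reported F1 Score
```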

### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.0+cu121
- Datasets 2.18.0
- Tokenizers 0.19.0