
roberta-large-ner-ghtk-cs-add-3label-11-new-data-3090-14Sep-1

This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2967
  • Overall Precision: 0.8791
  • Overall Recall: 0.9280
  • Overall F1: 0.9029
  • Overall Accuracy: 0.9579

Per-label results on the evaluation set (entity level):

| Label | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| Tk | 0.7714 | 0.6983 | 0.7330 | 116 |
| A | 0.9327 | 0.9652 | 0.9487 | 431 |
| Gày | 0.7317 | 0.8824 | 0.8000 | 34 |
| Gày trừu tượng | 0.9068 | 0.8975 | 0.9022 | 488 |
| Iền | 0.7059 | 0.9231 | 0.8000 | 39 |
| Iờ | 0.5686 | 0.7632 | 0.6517 | 38 |
| Ã đơn | 0.8333 | 0.8374 | 0.8354 | 203 |
| Đt | 0.9300 | 0.9989 | 0.9632 | 878 |
| Đt trừu tượng | 0.7546 | 0.8841 | 0.8142 | 233 |
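The per-label and overall figures above are entity-level metrics of the kind produced by seqeval for token-classification evaluations. A minimal sketch, assuming IOB2-style tags; the tag names below are illustrative placeholders derived from the label names in the table, not necessarily the model's exact `id2label` mapping:

```python
# Minimal sketch: entity-level precision/recall/F1 with seqeval (assumed metric backend).
# Tag names are illustrative placeholders, not necessarily the model's exact labels.
from seqeval.metrics import classification_report, f1_score, precision_score, recall_score

y_true = [["B-Đt", "I-Đt", "O", "B-A", "O"]]   # gold IOB2 tags per token
y_pred = [["B-Đt", "I-Đt", "O", "O", "O"]]     # predicted IOB2 tags per token

print(precision_score(y_true, y_pred))          # overall entity-level precision
print(recall_score(y_true, y_pred))             # overall entity-level recall
print(f1_score(y_true, y_pred))                 # overall entity-level F1
print(classification_report(y_true, y_pred))    # per-label breakdown, like the table above
```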

Model description

More information needed

Intended uses & limitations

More information needed
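
No usage details are provided in this card. As a hedged sketch, a RoBERTa-based token-classification (NER) checkpoint can normally be loaded with the Transformers pipeline API; the repository id and the example sentence below are assumptions, not part of this card:

```python
# Hedged usage sketch for an NER (token-classification) checkpoint.
# The repository id is assumed to match the model name above; adjust if needed.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "roberta-large-ner-ghtk-cs-add-3label-11-new-data-3090-14Sep-1"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

# Example Vietnamese customer-service sentence (illustrative only).
print(ner("Khách đặt đơn lúc 9 giờ ngày 14/09, số điện thoại là 0901234567."))
```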

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a configuration sketch follows the list:

  • learning_rate: 2.5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
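
As a hedged sketch, these settings map onto a Transformers `TrainingArguments` configuration roughly as follows. The output directory and the per-epoch evaluation are assumptions (the results table below does report one evaluation per epoch); unlisted arguments are left at their defaults:

```python
# Hedged sketch of TrainingArguments matching the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-large-ner-ghtk-cs-add-3label-11-new-data-3090-14Sep-1",  # assumed
    learning_rate=2.5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",  # assumed: the results table reports one evaluation per epoch
)
```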

Training results

Each label column lists precision / recall / F1 on the validation set; per-label support is the same as in the evaluation table above.

| Training Loss | Epoch | Step | Validation Loss | Tk | A | Gày | Gày trừu tượng | Iền | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 294 | 0.1535 | 0.8173 / 0.7328 / 0.7727 | 0.9017 / 0.9791 / 0.9388 | 0.7600 / 0.5588 / 0.6441 | 0.9010 / 0.8955 / 0.8983 | 0.7500 / 0.9231 / 0.8276 | 0.6364 / 0.3684 / 0.4667 | 0.7804 / 0.8227 / 0.8010 | 0.9403 / 0.9875 / 0.9633 | 0.9104 / 0.8283 / 0.8674 | 0.8960 | 0.9106 | 0.9032 | 0.9564 |
| 0.0981 | 2.0 | 588 | 0.1800 | 0.7558 / 0.5603 / 0.6436 | 0.9284 / 0.9629 / 0.9453 | 0.7188 / 0.6765 / 0.6970 | 0.9064 / 0.8730 / 0.8894 | 0.6727 / 0.9487 / 0.7872 | 0.7500 / 0.6316 / 0.6857 | 0.8109 / 0.8030 / 0.8069 | 0.9005 / 1.0000 / 0.9477 | 0.8038 / 0.8970 / 0.8479 | 0.8757 | 0.9106 | 0.8928 | 0.9539 |
| 0.0981 | 3.0 | 882 | 0.1885 | 0.8533 / 0.5517 / 0.6702 | 0.9409 / 0.9606 / 0.9506 | 0.7045 / 0.9118 / 0.7949 | 0.8710 / 0.8996 / 0.8851 | 0.7143 / 0.8974 / 0.7955 | 0.6316 / 0.9474 / 0.7579 | 0.9080 / 0.7783 / 0.8382 | 0.9361 / 0.9841 / 0.9595 | 0.7546 / 0.8841 / 0.8142 | 0.8850 | 0.9134 | 0.8990 | 0.9586 |
| 0.0452 | 4.0 | 1176 | 0.1982 | 0.7345 / 0.7155 / 0.7249 | 0.9248 / 0.9698 / 0.9468 | 0.7632 / 0.8529 / 0.8056 | 0.8501 / 0.9180 / 0.8828 | 0.7200 / 0.9231 / 0.8090 | 0.5645 / 0.9211 / 0.7000 | 0.8300 / 0.8177 / 0.8238 | 0.9232 / 0.9989 / 0.9595 | 0.7645 / 0.9056 / 0.8291 | 0.8632 | 0.9362 | 0.8982 | 0.9567 |
| 0.0452 | 5.0 | 1470 | 0.2384 | 0.7172 / 0.6121 / 0.6605 | 0.9176 / 0.9814 / 0.9484 | 0.7222 / 0.7647 / 0.7429 | 0.8848 / 0.8975 / 0.8911 | 0.6981 / 0.9487 / 0.8043 | 0.6735 / 0.8684 / 0.7586 | 0.8300 / 0.8177 / 0.8238 | 0.9258 / 0.9954 / 0.9594 | 0.7849 / 0.8455 / 0.8140 | 0.8752 | 0.9207 | 0.8974 | 0.9566 |
| 0.0281 | 6.0 | 1764 | 0.2465 | 0.7456 / 0.7328 / 0.7391 | 0.9270 / 0.9722 / 0.9490 | 0.7143 / 0.8824 / 0.7895 | 0.8765 / 0.9016 / 0.8889 | 0.6923 / 0.9231 / 0.7912 | 0.5833 / 0.9211 / 0.7143 | 0.8190 / 0.8473 / 0.8329 | 0.9379 / 0.9977 / 0.9669 | 0.7556 / 0.8755 / 0.8111 | 0.8714 | 0.9337 | 0.9015 | 0.9565 |
| 0.0147 | 7.0 | 2058 | 0.2598 | 0.7290 / 0.6724 / 0.6996 | 0.9350 / 0.9675 / 0.9510 | 0.7317 / 0.8824 / 0.8000 | 0.8931 / 0.9078 / 0.9004 | 0.7500 / 0.9231 / 0.8276 | 0.6327 / 0.8158 / 0.7126 | 0.8325 / 0.8325 / 0.8325 | 0.9368 / 0.9954 / 0.9652 | 0.7279 / 0.8841 / 0.7984 | 0.8764 | 0.9285 | 0.9017 | 0.9587 |
| 0.0147 | 8.0 | 2352 | 0.2884 | 0.7525 / 0.6552 / 0.7005 | 0.9313 / 0.9745 / 0.9524 | 0.7436 / 0.8529 / 0.7945 | 0.8934 / 0.9098 / 0.9015 | 0.6923 / 0.9231 / 0.7912 | 0.6400 / 0.8421 / 0.7273 | 0.8608 / 0.8227 / 0.8413 | 0.9311 / 1.0000 / 0.9643 | 0.7314 / 0.8884 / 0.8023 | 0.8770 | 0.9305 | 0.9030 | 0.9592 |
| 0.0072 | 9.0 | 2646 | 0.2868 | 0.7692 / 0.6897 / 0.7273 | 0.9287 / 0.9675 / 0.9477 | 0.7250 / 0.8529 / 0.7838 | 0.9033 / 0.8996 / 0.9014 | 0.7347 / 0.9231 / 0.8182 | 0.6279 / 0.7105 / 0.6667 | 0.8333 / 0.8374 / 0.8354 | 0.9300 / 0.9989 / 0.9632 | 0.7687 / 0.8841 / 0.8224 | 0.8821 | 0.9272 | 0.9041 | 0.9588 |
| 0.0072 | 10.0 | 2940 | 0.2967 | 0.7714 / 0.6983 / 0.7330 | 0.9327 / 0.9652 / 0.9487 | 0.7317 / 0.8824 / 0.8000 | 0.9068 / 0.8975 / 0.9022 | 0.7059 / 0.9231 / 0.8000 | 0.5686 / 0.7632 / 0.6517 | 0.8333 / 0.8374 / 0.8354 | 0.9300 / 0.9989 / 0.9632 | 0.7546 / 0.8841 / 0.8142 | 0.8791 | 0.9280 | 0.9029 | 0.9579 |

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1