---
license: mit
language:
  - en
tags:
  - biology
  - CV
  - images
  - animals
  - lepidoptera
  - butterflies
  - detection
  - heliconius
  - forewings
  - hindwings
  - separated wings
  - full body
  - butterfly
  - RGB
  - ruler
  - whitebalance
  - label
  - colorchecker
---

Model Card for butterfly_detection_yolo

This model takes in images of butterflies as photographed for museum collections and detects butterfly components (left/right forewings, left/right hindwings, and body) as well as rulers, color checkers, white-balance cards, and metadata labels. The detection model described here is used in the repository https://github.com/Imageomics/wing-segmentation to detect these components, which are then segmented with Meta's Segment Anything Model (SAM).

Model Details

yolo_detection_8m_shear_10.0_scale_0.5_translate_0.1_fliplr_0.0_best.pt is the butterfly detection model.

The YOLOv8 detection model was trained on a dataset of 800 images drawn from the Heliconius Collection-Cambridge Butterfly, OM_STRI, and Monteiro datasets. Training started from the pretrained yolov8m.pt checkpoint.

Model Description

The model takes an input RGB image and generates bounding boxes for every class listed below that is present in the image. Data augmentations applied during training include shear (10.0), scale (0.5), and translate (0.1). The model was trained for 50 epochs at an image size of 256. Note that although the image size was set to 256, the normalized coordinates predicted by YOLO can be rescaled to the original image size (see the sketch below).
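
As a minimal sketch (not code from this model card), the rescaling behavior looks like this with the ultralytics package; the image filename is a placeholder:

```python
from ultralytics import YOLO

# Load the released detection weights (path is illustrative).
model = YOLO("yolo_detection_8m_shear_10.0_scale_0.5_translate_0.1_fliplr_0.0_best.pt")

# "specimen.jpg" is a placeholder image of a pinned butterfly.
result = model.predict("specimen.jpg", imgsz=256)[0]

print(result.boxes.xyxyn)  # normalized (0-1) box corners
print(result.boxes.xyxy)   # the same boxes, rescaled to the original image's pixel size
```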

Segmentation Classes

[pixel class] corresponding category (restated as a code mapping after this list)

  • [0] background
  • [1] right_forewing
  • [2] left_forewing
  • [3] right_hindwing
  • [4] left_hindwing
  • [5] ruler
  • [6] white_balance
  • [7] label
  • [8] color_card
  • [9] body
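
Restating the list above as code (purely for convenience; this dictionary is not shipped with the weights), a class-ID lookup can be written as:

```python
# Class IDs predicted by the detector, matching the list above.
CLASS_NAMES = {
    0: "background",
    1: "right_forewing",
    2: "left_forewing",
    3: "right_hindwing",
    4: "left_hindwing",
    5: "ruler",
    6: "white_balance",
    7: "label",
    8: "color_card",
    9: "body",
}
```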

Details

model.train(data=YAML, imgsz=256, epochs=50, batch=16, device=DEVICE, optimizer='auto', verbose=True, val=True, shear=10.0, scale=0.5, translate=0.1, fliplr=0.0)
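
For context, a fuller sketch of that training call is shown below, assuming the ultralytics package; the dataset YAML path and DEVICE value are placeholders rather than values recorded in this card:

```python
from ultralytics import YOLO

# Placeholders: point these at your own dataset config and hardware.
YAML = "butterfly_detection.yaml"  # hypothetical dataset config file
DEVICE = 0                         # e.g. a GPU index, or "cpu"

# Start from the pretrained yolov8m.pt checkpoint, as described above.
model = YOLO("yolov8m.pt")
model.train(
    data=YAML,
    imgsz=256,
    epochs=50,
    batch=16,
    device=DEVICE,
    optimizer="auto",
    verbose=True,
    val=True,
    shear=10.0,
    scale=0.5,
    translate=0.1,
    fliplr=0.0,
)
```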

Metrics

Validation results per class (box precision P, recall R, mAP50, and mAP50-95):

             Class     Images  Instances          P          R      mAP50   mAP50-95
               all         64        358      0.979      0.887      0.919      0.877
        background         64          3          1          0      0.315      0.169
    right_forewing         64         58      0.995      0.983      0.986      0.977
     left_forewing         64         51      0.975          1      0.985      0.982
    right_hindwing         64         59      0.997      0.966      0.993      0.977
     left_hindwing         64         50      0.975          1      0.993       0.98
             ruler         64         31      0.951          1      0.995      0.952
     white_balance         64         18      0.984          1      0.995      0.995
             label         64         50      0.996          1      0.995      0.935
        color_card         64         24      0.988          1      0.995      0.992
              body         64         14      0.928      0.921      0.939      0.815
              

Developed by: Michelle Ramirez

How to Get Started with the Model

To see examples of how to load the model file and predict masks on images, please refer to the wing-segmentation GitHub repository: https://github.com/Imageomics/wing-segmentation
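
As a starting point, the sketch below shows one way to run the detector and hand its boxes to SAM for segmentation, assuming the ultralytics and segment-anything packages are installed; the image path and SAM checkpoint are placeholders, and the repository above remains the authoritative reference:

```python
import cv2
from ultralytics import YOLO
from segment_anything import sam_model_registry, SamPredictor

# Placeholder paths: substitute local copies of the weights and a specimen image.
detector = YOLO("yolo_detection_8m_shear_10.0_scale_0.5_translate_0.1_fliplr_0.0_best.pt")
image = cv2.cvtColor(cv2.imread("specimen.jpg"), cv2.COLOR_BGR2RGB)

# 1. Detect wings, body, ruler, color card, white balance, and labels.
detections = detector.predict(image, imgsz=256)[0]

# 2. Use each detected box as a prompt for SAM to get a segmentation mask.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")  # placeholder checkpoint
predictor = SamPredictor(sam)
predictor.set_image(image)

for box, cls_id in zip(detections.boxes.xyxy.cpu().numpy(), detections.boxes.cls.cpu().numpy()):
    masks, scores, _ = predictor.predict(box=box, multimask_output=False)
    print(detector.names[int(cls_id)], masks[0].shape, scores[0])
```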

Citation

BibTeX:

@software{Ramirez_Lepidoptera_Wing_Segmentation_2024,
  author = {Ramirez, Michelle},
  doi = {10.5281/zenodo.10869579},
  month = mar,
  title = {{Lepidoptera Wing Segmentation}},
  url = {https://github.com/Imageomics/wing-segmentation},
  version = {1.0.0},
  year = {2024}
}

APA:

Ramirez, M. (2024). Lepidoptera Wing Segmentation (Version 1.0.0) [Computer software]. https://doi.org/10.5281/zenodo.10869579

Acknowledgements

The Imageomics Institute is funded by the US National Science Foundation's Harnessing the Data Revolution (HDR) program under Award #2118240 (Imageomics: A New Frontier of Biological Information Powered by Knowledge-Guided Machine Learning). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.