---
license: cc-by-4.0
tags:
- ocean
- object-detection
---

# FathomNet Vulnerable Marine Ecosystems (VME) Detector

## Model Details

- Trained by researchers at the [Monterey Bay Aquarium Research Institute](https://www.mbari.org/) (MBARI)
- Ultralytics [YOLOv8x](https://github.com/ultralytics/ultralytics) object detection model
- Fine-tuned to detect 4 high-level classes of benthic animals in deep-sea imagery, specifically identified as indicators of vulnerable marine ecosystems
- The VME categories are *corals*, *crinoids*, *sponges*, and *fishes*
- [Baco et al. 2023](https://peerj.com/articles/16024) (Table 2) was used to determine classes that were useful for detecting VMEs; fishes were added as an additional class because VME and fishery management concerns frequently overlap

## Intended Use

- Post-process video and images collected by marine researchers to determine the presence of VME indicator species

## Factors

- Distribution shifts related to sampling platform, camera parameters, illumination, and deployment environment are expected to impact model performance
- Evaluation was performed on an IID subset of the available training data as well as on out-of-distribution data

## Metrics

- [Normalized confusion matrix](plots/confusion_matrix_normalized.png), [precision-recall curve](plots/PR_curve.png), and [F1-confidence curve](plots/F1_curve.png) were evaluated at test time
- mAP@0.5 = 0.713

## Training and Evaluation Data

- Publicly available data on [FathomNet](https://fathomnet.org/)
- TODO: Add the specific class-to-concept mapping used to query FathomNet

## Deployment

1. Clone this repository
2. In an environment with the [`ultralytics` Python package](https://github.com/ultralytics/ultralytics) installed, run:

```bash
yolo predict model=best.pt
```
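Once predictions are produced, a downstream step might summarize which VME indicator classes appear above a confidence threshold. The sketch below is illustrative only: the class-ID-to-name mapping is an assumption (check the model's actual `names` attribute in `ultralytics` before relying on it), and detections are represented as plain `(class_id, confidence)` pairs rather than a real `Results` object.

```python
from collections import Counter

# Assumed class mapping for illustration; the real mapping should be read
# from the trained model (e.g. model.names in the ultralytics API).
VME_CLASSES = {0: "coral", 1: "crinoid", 2: "sponge", 3: "fish"}


def summarize_detections(detections, conf_threshold=0.5):
    """Count detections per VME indicator class above a confidence threshold.

    detections: iterable of (class_id, confidence) pairs, e.g. extracted
    from a YOLO prediction result.
    """
    counts = Counter()
    for class_id, conf in detections:
        # Keep only confident detections of known VME indicator classes
        if conf >= conf_threshold and class_id in VME_CLASSES:
            counts[VME_CLASSES[class_id]] += 1
    return dict(counts)


# Example: one confident coral, one low-confidence fish, one confident sponge
print(summarize_detections([(0, 0.9), (3, 0.4), (2, 0.7)]))
# -> {'coral': 1, 'sponge': 1}
```

A summary like this could feed a simple presence/absence flag per video frame or transect, which matches the intended post-processing use described above.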