
F1 vs Confidence
Use this graph to pick the confidence threshold where precision and recall are best balanced for deployment.
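Picking the threshold that maximizes F1 can be sketched with a small sweep. This is a minimal toy example, not the page's actual pipeline; the scores, labels, and function names are all illustrative.

```python
import numpy as np

# Hypothetical detector outputs: confidence scores with ground-truth
# labels (1 = true detection, 0 = false detection). Toy data.
scores = np.array([0.95, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1])
labels = np.array([1,    1,   1,   0,   1,   0,   1,   0,   0,   0])

def f1_at_threshold(scores, labels, t):
    """Precision, recall, and F1 when detections below t are discarded."""
    pred = scores >= t
    tp = np.sum(pred & (labels == 1))
    fp = np.sum(pred & (labels == 0))
    fn = np.sum(~pred & (labels == 1))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Sweep candidate thresholds and keep the one with the highest F1.
thresholds = np.arange(0.1, 1.0, 0.05)
best_t = max(thresholds, key=lambda t: f1_at_threshold(scores, labels, t)[2])
```

The sweep mirrors reading the peak off the F1-vs-confidence curve: `best_t` is the deployment threshold this graph is meant to help choose.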
Dedicated model page
This page focuses solely on model behavior. It brings the confidence curves and quality plots together in one place so threshold decisions and model updates can be discussed quickly.


Precision vs Confidence
A higher confidence threshold usually increases precision. This helps reduce false positives in rapid-response workflows.

Recall vs Confidence
Recall drops as the threshold rises. This graph helps avoid missing true detections when monitoring invasive outbreaks.
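The precision–recall trade-off behind these two curves can be shown with a small sweep over toy data (scores, labels, and thresholds here are illustrative, not from the deployed model):

```python
import numpy as np

# Toy detector outputs: confidence scores and ground-truth labels.
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2])
labels = np.array([1,   1,   0,   1,   1,   0,   0])

def precision_recall(t):
    """Precision and recall when detections below threshold t are dropped."""
    pred = scores >= t
    tp = np.sum(pred & (labels == 1))
    fp = np.sum(pred & (labels == 0))
    fn = np.sum(~pred & (labels == 1))
    p = tp / (tp + fp) if tp + fp else 1.0
    r = tp / (tp + fn) if tp + fn else 0.0
    return p, r

# Raising the threshold trades recall for precision.
low_p, low_r = precision_recall(0.25)    # permissive threshold
high_p, high_r = precision_recall(0.75)  # strict threshold
```

At the strict threshold precision rises (fewer false positives) while recall falls (more missed detections), which is exactly the trade-off the two graphs visualize.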

Precision-Recall Curve
This curve summarizes detector quality across all thresholds and makes class separability easier to compare.
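A single threshold-free summary of the curve can be computed as the area under it. The sketch below uses toy scores and a step-wise (average-precision-style) integration; all values are illustrative.

```python
import numpy as np

# Toy detections sorted by confidence, with ground-truth labels.
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.5, 0.4])
labels = np.array([1,   1,   0,   1,   0,   1])

# Rank detections by descending confidence, then accumulate
# true/false positives to trace the precision-recall curve.
order = np.argsort(-scores)
tp = np.cumsum(labels[order] == 1)
fp = np.cumsum(labels[order] == 0)
precision = tp / (tp + fp)
recall = tp / labels.sum()

# Step-wise area under the curve (average precision):
# sum of precision weighted by each increment in recall.
r = np.concatenate(([0.0], recall))
ap = np.sum((r[1:] - r[:-1]) * precision)
```

A value near 1.0 means the detector separates classes well at every threshold; comparing this number across classes is what makes separability easy to read off the plot.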

Confusion Matrix
Highlights where classes are confused, so data collection and labeling can target the weak categories.
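Building the matrix and finding the most-confused class is straightforward; this is a minimal sketch with made-up class names and predictions.

```python
import numpy as np

# Toy 3-class problem; class names are purely illustrative.
classes = ["native", "invasive_a", "invasive_b"]
y_true = np.array([0, 0, 1, 1, 1, 2, 2, 2])
y_pred = np.array([0, 0, 1, 2, 2, 2, 2, 2])

n = len(classes)
cm = np.zeros((n, n), dtype=int)
for t, p in zip(y_true, y_pred):
    cm[t, p] += 1  # rows: true class, columns: predicted class

# Off-diagonal mass per row = how often that true class is misclassified.
confused = cm.sum(axis=1) - np.diag(cm)
weakest = classes[int(np.argmax(confused))]
```

The row with the most off-diagonal mass (`weakest`) is the category where extra data collection and labeling effort would pay off first.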

Loss Curves
Shows optimization behavior during training and whether convergence remains stable over time.
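One common way to read convergence off a noisy loss curve is to smooth it and check that the smoothed loss has stopped improving. A minimal sketch, with hypothetical per-epoch losses and an arbitrary tolerance:

```python
import numpy as np

# Hypothetical per-epoch training losses (toy values).
losses = np.array([2.1, 1.5, 1.1, 0.9, 0.8, 0.75, 0.72, 0.71, 0.70, 0.70])

def moving_average(x, k=3):
    """Smooth the loss curve so epoch-to-epoch noise doesn't hide the trend."""
    return np.convolve(x, np.ones(k) / k, mode="valid")

smooth = moving_average(losses)
# Convergence heuristic: the smoothed loss no longer improves
# by more than a small tolerance between consecutive epochs.
converged = abs(smooth[-1] - smooth[-2]) < 0.01
```

A curve that keeps oscillating after smoothing, or whose smoothed value starts rising again, is the instability this plot is meant to surface.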