Most Vet AI Companies Won't Show Their Numbers. Vetology Just Published All of Them.
Vetology expanded its public AI performance dashboard to eleven metrics per classifier across 89+ conditions, backed by 300,000+ validated patient cases. The move highlights a transparency gap in veterinary AI, where a 2026 audit found nearly two-thirds of vendors disclose no validation data publicly.

Vetology expanded its public AI performance dashboard from four metrics to eleven per classifier, covering 89+ validated conditions across canine and feline imaging. The move exposes a wider industry problem: according to a 2026 Frontiers in Veterinary Science audit, nearly two-thirds of commercial veterinary AI vendors don't disclose validation data at all.
What Happened
Vetology, which provides AI-generated radiology screening reports and board-certified veterinary teleradiology, announced on April 1 that its publicly available dashboard at vetology.net/ai-classifier-performance now reports eleven metrics per classifier, including sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), AUC, F1 score, accuracy, prevalence, confidence intervals, and a Radiologist Agreement Rate.
The expansion covers 89+ classifiers spanning thoracic, abdominal, and musculoskeletal imaging for both dogs and cats. Of those, 31 have been retrained and revalidated against updated board-certified radiologist consensus data since their original release. All classifiers carry confusion matrices generated as recently as February 2026.
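To make the dashboard's statistics concrete: every metric in the list above (except AUC, which needs per-case scores) can be derived from a single 2x2 confusion matrix. The sketch below uses hypothetical counts for one condition; it is not Vetology's code or data, just the standard definitions, plus a Wilson score interval to show where the published confidence intervals come from.

```python
import math

# Hypothetical confusion-matrix counts for one binary classifier
# (one condition): these numbers are illustrative, not Vetology's.
tp, fp, fn, tn = 412, 38, 29, 1521

total = tp + fp + fn + tn
sensitivity = tp / (tp + fn)      # true-positive rate (recall)
specificity = tn / (tn + fp)      # true-negative rate
ppv = tp / (tp + fp)              # positive predictive value (precision)
npv = tn / (tn + fn)              # negative predictive value
accuracy = (tp + tn) / total
prevalence = (tp + fn) / total    # how common the condition is in the dataset
f1 = 2 * ppv * sensitivity / (ppv + sensitivity)

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a proportion, e.g. sensitivity."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

sens_lo, sens_hi = wilson_ci(tp, tp + fn)

print(f"sensitivity={sensitivity:.3f} ({sens_lo:.3f}-{sens_hi:.3f}) "
      f"specificity={specificity:.3f} PPV={ppv:.3f} NPV={npv:.3f} "
      f"accuracy={accuracy:.3f} prevalence={prevalence:.3f} F1={f1:.3f}")
```

The practical point for buyers: with the confusion matrix published, any of these numbers can be independently recomputed, which is exactly what a marketing-claims-only vendor cannot offer.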
The underlying validation dataset comprises 300,000+ multi-image patient cases, with performance benchmarked against board-certified veterinary radiologist consensus. Vetology first released public classifier metrics earlier this year, becoming what it called the first veterinary imaging AI company to do so. This update deepens that commitment with a broader statistical picture and a pledge to ongoing model retraining.
Why It Matters
The veterinary AI diagnostics market is growing fast, but buyer confidence hasn't kept pace. The Frontiers in Veterinary Science audit found that 63.3% of commercial veterinary AI vendors do not publicly disclose validation data. That means most practices purchasing AI radiology tools are evaluating products based on marketing claims rather than performance evidence.
1. This creates a de facto transparency standard. By publishing eleven statistical measures per condition, Vetology is setting the bar competitors will eventually need to clear. For practice operators currently evaluating AI imaging tools (or renegotiating existing contracts), the dashboard provides a like-for-like comparison framework. If a competing vendor can't show you sensitivity and specificity by condition, you now know what questions to ask.
2. The Radiologist Agreement Rate metric is new and telling. Most AI validation in veterinary medicine benchmarks against a single radiologist's reads. Vetology's inclusion of radiologist agreement rate acknowledges what every practice owner knows: radiologists themselves disagree on reads. Publishing this metric sets a more honest baseline for what "accurate" means in clinical reality.
3. The retraining commitment matters as much as the metrics. AI models degrade over time as patient populations and imaging equipment change. Vetology's disclosure that 31 classifiers have been retrained and revalidated signals an ongoing investment in model maintenance, not just a one-time validation exercise. Practices evaluating AI tools should ask vendors not just "how accurate is your model today?" but "when was it last retrained, and against what data?"
4. The timing aligns with rising scrutiny of AI claims across healthcare. Human healthcare regulators have been tightening AI validation requirements for years. Veterinary medicine is behind that curve but moving in the same direction. Early movers on transparency will have a regulatory and reputational advantage when the bar rises for everyone.
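The Radiologist Agreement Rate idea from point 2 is easy to sketch. In its simplest form it is percent agreement: the share of cases where two readers (AI vs. radiologist, or radiologist vs. radiologist) give the same call. The reads below are invented for illustration; this is not Vetology's data or its exact methodology.

```python
# Hypothetical reads for eight cases: "pos" = condition flagged, "neg" = clear.
ai_reads = ["pos", "neg", "neg", "pos", "neg", "pos", "neg", "neg"]
rad_a    = ["pos", "neg", "neg", "pos", "pos", "pos", "neg", "neg"]
rad_b    = ["pos", "neg", "pos", "pos", "pos", "pos", "neg", "neg"]

def agreement(x: list[str], y: list[str]) -> float:
    """Fraction of cases where two readers give the same call."""
    return sum(a == b for a, b in zip(x, y)) / len(x)

ai_vs_rad = agreement(ai_reads, rad_a)  # AI vs. radiologist A
rad_vs_rad = agreement(rad_a, rad_b)    # radiologist vs. radiologist baseline

print(f"AI vs radiologist: {ai_vs_rad:.1%}, "
      f"radiologist vs radiologist: {rad_vs_rad:.1%}")
```

The point of the baseline: if radiologists agree with each other on, say, 87% of reads, demanding that an AI exceed 87% agreement with any single reader asks it to outperform the gold standard's own consistency.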
What to Watch
Competitor response: Will other vet AI vendors (SignalPET, InnVet, Onyx) publish comparable dashboards? If they don't, sales conversations shift from "does your product have AI?" to "why won't you show us how it performs?" Watch for either matching disclosures or defensive messaging about why public metrics are "misleading" or "not the right way to evaluate."
Practice purchasing behavior: The real test is whether transparency drives switching. If practices start citing Vetology's public data in vendor evaluations, it validates the strategy. If they continue buying on price and PIMS integration, the transparency play is a brand differentiator rather than a market mover.
Regulatory signals: Watch for AVMA or state veterinary board guidance on AI disclosure standards. If regulatory bodies begin requiring performance transparency for AI diagnostics tools, Vetology is years ahead of compliance.