I’m often asked a version of the same question when speaking with pathologists and laboratory professionals:
“Is AI really worth it? And more importantly, can we trust it in our laboratory?”
It’s a fair question.
Clinical labs are built on consistency, reproducibility, and evidence. New technologies are not adopted simply because they are innovative. They are adopted when data demonstrates that they can support real workflows and address real diagnostic challenges.
Over the past several years, I’ve worked alongside teams developing and validating AI-assisted digital pathology solutions across both veterinary and human medicine. What stands out most isn’t a single breakthrough, but the growing body of peer-reviewed research showing how AI can be validated and successfully integrated into routine laboratory settings.
We’re quickly approaching the point where the question is no longer whether AI belongs in the lab, but whether labs that aren’t using AI are leaving quality and efficiency on the table.

Where AI development really began
Many people are surprised to learn that a significant amount of Techcyte’s early AI development happened in veterinary diagnostics. These studies evaluated AI-assisted workflows for fecal parasite detection, blood smear analysis, and urine sediment evaluation across canine, feline, and equine samples.
Throughout this time, multiple peer-reviewed veterinary publications have demonstrated agreement rates approaching or exceeding 95% across multiple parasite classes. Just as important, these studies exposed algorithms to significant variation in staining quality, morphology, and specimen preparation, the same challenges seen every day in human clinical laboratories.
That experience mattered. It helped establish the data pipelines, validation frameworks, and quality processes that later shaped human clinical applications.
What the human studies tell us
As digital pathology adoption expanded, our research collaborations with institutions such as ARUP Laboratories, CorePlusPR, Mayo Clinic, Quest Diagnostics, The University of Bern, and other partners began evaluating AI-assisted workflows in human clinical settings.
In clinical parasitology, studies published in the Journal of Clinical Microbiology reported positive and negative agreement above 98% for AI-assisted detection of intestinal protozoa. A prospective study published in Diagnostics demonstrated 98.1% overall agreement and a Cohen’s kappa of 0.915 between AI-assisted workflows and manual microscopy using routine clinical samples.
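For readers less familiar with Cohen's kappa: it measures agreement between two raters after correcting for the agreement expected by chance alone, so a value of 0.915 indicates near-perfect concordance. A minimal sketch of the calculation, using made-up counts for illustration (not the actual data from the Diagnostics study):

```python
# Cohen's kappa from a 2x2 agreement table.
# Counts are illustrative only, not the study's actual data.
def cohens_kappa(both_pos, both_neg, ai_only, manual_only):
    """Agreement between AI-assisted and manual review, corrected for chance."""
    n = both_pos + both_neg + ai_only + manual_only
    observed = (both_pos + both_neg) / n  # raw percent agreement
    # Expected agreement if the two methods called positives independently
    ai_pos = (both_pos + ai_only) / n
    manual_pos = (both_pos + manual_only) / n
    expected = ai_pos * manual_pos + (1 - ai_pos) * (1 - manual_pos)
    return (observed - expected) / (1 - expected)

# 98% raw agreement on a low-prevalence sample set
print(round(cohens_kappa(both_pos=90, both_neg=890,
                         ai_only=10, manual_only=10), 3))  # → 0.889
```

Note how 98% raw agreement yields a lower kappa on low-prevalence samples: chance agreement is high when most specimens are negative, which is why kappa is reported alongside percent agreement.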
One finding consistently stood out: AI-assisted review identified additional organisms that were initially missed during manual examination. This wasn’t because experts were wrong; rather, AI provided a consistent, expanded second look across every field of view.
Beyond microbiology, our AI-assisted cervical cytology screening workflow was clinically evaluated by a laboratory across 1,400 whole slide images, achieving 97% accuracy and 99% specificity. Based on these findings, the laboratory implemented the screening system to support 100% quality-control review.
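Accuracy and specificity summarize different cells of a screening confusion matrix: accuracy is the fraction of all slides called correctly, while specificity is the fraction of truly normal slides correctly cleared. A minimal sketch with hypothetical counts, chosen only to show how figures near 97% and 99% could arise on 1,400 slides (not the evaluation's actual breakdown):

```python
# Screening metrics from confusion-matrix counts.
# tp/tn/fp/fn values are hypothetical, not the study's actual data.
def screening_metrics(tp, tn, fp, fn):
    total = tp + tn + fp + fn
    accuracy = (tp + tn) / total     # all correct calls over all slides
    specificity = tn / (tn + fp)     # normal slides correctly cleared
    return accuracy, specificity

acc, spec = screening_metrics(tp=120, tn=1238, fp=12, fn=30)
print(round(acc, 2), round(spec, 2))  # → 0.97 0.99
```

High specificity matters most in a 100% quality-control review workflow: it keeps the number of normal slides flagged for rework low.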
These are not hypothetical studies or controlled lab demonstrations built on curated datasets. They reflect real laboratory environments and routine clinical workflows.
What years of research have taught me
When people ask whether AI can be trusted, I don’t point to a single algorithm or a single dataset. I point to the pattern that emerges across independent studies.
Across veterinary and human diagnostics, the evidence consistently demonstrates:
- Strong agreement between AI-assisted workflows and traditional microscopy
- Successful validation across diverse specimen types, including fecal samples, blood smears, cytology slides, and transplant pathology
- AI functioning as a valuable tool to support quality control, triage, and efficiency within existing laboratory workflows
Trust in AI doesn’t come from replacing expertise. It comes from building systems that reinforce consistency, reduce variability, and provide additional layers of review.

Why the question is changing
Earlier in my career, the question was often whether AI belonged in the laboratory at all.
Today, the conversation is shifting. Laboratories are no longer asking if AI will play a role in digital pathology; they are asking which solutions are grounded in real scientific evidence.
What gives me confidence is seeing years of iterative development across species, specimen types, and clinical environments. From veterinary parasitology to human cytology and microbiology, the growing publication record shows that AI development is becoming more mature, more collaborative, and more deeply integrated into laboratory workflows.
AI is not a shortcut. It is the result of sustained research, careful validation, and collaboration between clinicians, laboratorians, and engineers.
So when I’m asked whether laboratories can trust AI, my answer is simple: look at the evidence.
The following peer-reviewed studies reflect independent evaluations of AI-assisted digital pathology workflows developed using the Techcyte platform:
- Clinical Detection of Intestinal Protozoa in Digitized Modified Trichrome-Stained Stool Specimens Using a Deep Convolutional Neural Network–Assisted Workflow
Journal of Clinical Microbiology
https://journals.asm.org/doi/10.1128/jcm.02053-19
- Development and Validation of an Artificial Intelligence Model for Detection of Gastrointestinal Parasites from Concentrated Wet-Mount Stool Examinations
Journal of Clinical Microbiology
https://journals.asm.org/doi/10.1128/jcm.01062-25
- Enhancing Organ Allocation Efficiency: A Pilot Study Evaluating Artificial Intelligence-Assisted Assessment of Donor Kidney Pathology
Cureus
https://www.cureus.com/articles/354850-enhancing-organ-allocation-efficiency-a-pilot-study-evaluating-artificial-intelligence-assisted-assessment-of-donor-kidney-pathology#!/
- Artificial Intelligence–Assisted Digital Cytology Quality Control Workflow
https://pmc.ncbi.nlm.nih.gov/articles/PMC12087438/
- AI-Assisted Digital Parasitology Workflow Validation
Diagnostics
https://techcyte.com/wp-content/uploads/2025/12/diagnostics-15-02974.pdf
- Evaluation of the VETSCAN IMAGYST: an in-clinic canine and feline fecal parasite detection system integrated with a deep learning algorithm
https://pubmed.ncbi.nlm.nih.gov/32653042/
- Further Evaluation and Validation of the VETSCAN IMAGYST: in-clinic feline and canine fecal parasite detection system integrated with a deep learning algorithm
https://pubmed.ncbi.nlm.nih.gov/33514412/
- Multicenter evaluation of the Vetscan Imagyst system using Ocus 40 and EasyScan One scanners to detect gastrointestinal parasites in feces of dogs and cats
https://pubmed.ncbi.nlm.nih.gov/38014739/
- Point-of-care platform integrated with deep-learning, convolutional neural network algorithms effectively evaluates canine and feline peripheral blood smears
https://pubmed.ncbi.nlm.nih.gov/39705813/
- Validation of Vetscan Imagyst®, a diagnostic test utilizing an artificial intelligence deep learning algorithm, for detecting strongyles and Parascaris spp. in equine fecal samples
https://link.springer.com/article/10.1186/s13071-024-06525-w
In addition to peer-reviewed publications, Techcyte’s AI development has been evaluated in more than 25 scientific abstracts presented across veterinary and human clinical meetings.