Stay informed and up-to-date on our AI-empowered digital cytology solutions.
- De-identified ThinPrep urine cytology slides (200) were retrospectively selected. Two cytopathologists (CP) provided consensus diagnoses (ground truth, GT) for all cases: 100 Negative for High-Grade Urothelial Carcinoma (NHGUC), 35 Atypical Urothelial Cells (AUC), 32 Suspicious for HGUC (SHGUC), and 33 HGUC.
- Slides were digitized into WSIs utilizing Mikroscan SLxCyto and customized Huron WSI imagers and examined using AI-assisted WSI review (AIxURO)
- 1 cytopathologist (CP) and 2 cytologists (CT) blindly reviewed slides with a 2-week washout period between research arms:
o Arm 1 - Microscopy only
o Arm 2 - AIxURO-assisted review of Mikroscan SLxCyto WSIs
o Arm 3 - AIxURO-assisted review of customized Huron WSIs
- Performance Metrics:
o Comparison of study diagnoses with the ground truth diagnosis across the 3 arms using the following thresholds (a worked sketch follows this summary):
(1) AUC+ (AUC, SHGUC and HGUC) cases as positive; NHGUC cases as negative
(2) SHGUC+ (SHGUC and HGUC) cases as positive; NHGUC and AUC cases as negative
o The slide evaluation time (SET) in each arm was documented and compared.
- Using the AUC+ threshold, AIxURO WSI review (Arms 2 and 3) demonstrated higher sensitivity than microscopy (85.0% and 88.3% vs. 79.3% overall) but lower specificity (85.7% and 82.7% vs. 94.3%).
- Using the SHGUC+ threshold, AIxURO WSI review showed overall sensitivities of 74.9% and 86.2% vs. 76.9% for microscopy, with slightly lower specificity (96.0% and 92.3% vs. 97.5% overall).
- AIxURO WSI review markedly reduced the SET versus microscopy (35.9 s and 36.4 s vs. 102.6 s overall). SETs for AUC+ cases were 45 s and 43.6 s vs. 116.4 s, while SETs for negative cases were 26.5 s and 29.1 s vs. 88.9 s.
AIxURO WSI review demonstrated higher sensitivity but lower specificity than microscopy at the AUC+ and SHGUC+ thresholds. Notably, AIxURO WSI review reduced slide evaluation time by at least 64.5%, offering substantial efficiency gains. These findings highlight AIxURO's potential to enhance workflow efficiency in settings with staffing shortages while maintaining diagnostic performance.
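To make the two reporting thresholds concrete, below is a minimal sketch of how TPS 2.0 categories can be dichotomized before computing sensitivity and specificity. The category labels and thresholds come from the study above; the tallying code and the example diagnoses are purely illustrative and are not the study's analysis pipeline.

```python
# Illustrative sketch: dichotomize TPS 2.0 categories at the AUC+ or SHGUC+
# threshold, then tally sensitivity and specificity against ground truth.
# The example diagnoses below are invented.

def is_positive(category, threshold):
    """True if the TPS category counts as positive under the chosen threshold."""
    positives = {"AUC+": {"AUC", "SHGUC", "HGUC"},
                 "SHGUC+": {"SHGUC", "HGUC"}}
    return category in positives[threshold]

def sensitivity_specificity(truth, reads, threshold):
    tp = fp = tn = fn = 0
    for gt, dx in zip(truth, reads):
        gt_pos, dx_pos = is_positive(gt, threshold), is_positive(dx, threshold)
        if gt_pos and dx_pos:
            tp += 1
        elif gt_pos:
            fn += 1
        elif dx_pos:
            fp += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)

gt_dx    = ["NHGUC", "AUC", "SHGUC", "HGUC", "NHGUC"]   # hypothetical ground truth
study_dx = ["NHGUC", "NHGUC", "SHGUC", "HGUC", "AUC"]   # hypothetical reviewer calls
print(sensitivity_specificity(gt_dx, study_dx, "AUC+"))    # ≈ (0.67, 0.50)
print(sensitivity_specificity(gt_dx, study_dx, "SHGUC+"))  # (1.0, 1.0)

# The reported SET reduction of "at least 64.5%" follows from the slower AI arm:
# (102.6 - 36.4) / 102.6 ≈ 0.645
```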
- 116 urine cytology slides (76 Cytospin, 40 CytoRich)
- Consensus diagnosis by two senior cytopathologists served as ground truth, using TPS 2.0, resulting in:
- 30 positive slides (AUC/SHGUC/HGUC)
- 86 negative slides (NHGUC)
- 3-arm study with 1 cytopathologist (CP) and 2 cytologist (CT) reviewers and a 2-week washout period between arms; results for the cytopathologist paired with each cytologist were analyzed to mimic the clinical practice of cytologist screening with referral to a cytopathologist
- Arm 1: Microscopy
- Arm 2: Digital whole slide image review
- Arm 3: Digital image review using artificial intelligence software (AIxURO)
- Performance Metrics: Sensitivity, specificity, PPV, NPV, accuracy, and total diagnostic (review) time (see the sketch after this summary)
- Sensitivity: Improved with AI assistance for CP+CTA (90% vs. 76.7%) and unchanged for CP+CTB (76.7% vs. 76.7%) compared with microscopy (but not with WSI review alone)
- Negative Predictive Value: Improved with AI assistance for CP+CTA (96.4% vs. 92.2%) and essentially unchanged for CP+CTB (92% vs. 92.3%) compared with microscopy (but not with WSI review alone)
- Specificity: Decreased with AI assistance for CP+CTA (93% vs. 96.5%) and CP+CTB (92% vs. 92.3%) compared to microscopy alone
- Positive Predictive Value: Decreased with AI assistance for CP+CTA (81.8% vs. 88.5%) and CP+CTB (79.3% vs. 92%) compared to microscopy alone
- Overall, Arm 2 (WSI review) showed no improvement compared to microscopy for either CP+CTA or CP+CTB
- Overall Review Time: AI-assisted review decreased the total review time (72.2 min and 110.4 min) compared to microscopy (210.2 min and 244.7 min), whereas WSI review alone took about as long as or longer than microscopy (227.1 min and 243.8 min)
AI-assisted urine cytology review using TPS 2.0 markedly reduces review time while increasing sensitivity and NPV compared to conventional microscopy. However, specificity, PPV, and accuracy are slightly diminished. Pairing a cytopathologist (CP) with different cytologists (CT) also influences the diagnostic outcomes and metrics.
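Because the case mix was 30 positive and 86 negative slides, PPV and NPV can move in opposite directions from sensitivity and specificity. The sketch below, which is not the study's code, simply derives the predictive values from a sensitivity/specificity pair and the case mix; fed the CP+CTA values reported above, it reproduces the reported PPV and NPV.

```python
# Sketch: derive PPV, NPV, and accuracy from sensitivity, specificity, and the
# study's case mix (30 positive, 86 negative slides). Not the study's own code.

def predictive_values(sens, spec, n_pos=30, n_neg=86):
    tp = sens * n_pos
    fn = n_pos - tp
    tn = spec * n_neg
    fp = n_neg - tn
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    accuracy = (tp + tn) / (n_pos + n_neg)
    return round(ppv, 3), round(npv, 3), round(accuracy, 3)

# CP+CTA with AI assistance: sensitivity 90%, specificity 93% (reported above)
print(predictive_values(0.90, 0.93))    # (0.818, 0.964, 0.922): matches reported PPV 81.8%, NPV 96.4%
# CP+CTA with microscopy: sensitivity 76.7%, specificity 96.5% (reported above)
print(predictive_values(0.767, 0.965))  # PPV ≈ 88%, NPV ≈ 92%
```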
- 116 urine cytology slides with consensus diagnoses (ground truth) assigned by a panel of experts (86 NHGUC, 12 AUC, 11 SHGUC, and 7 HGUC), scanned with a Leica Aperio AT2 to create whole slide images (WSI)
- 1 cytopathologist and 2 cytologists reviewed all slides/images in each arm independently, recording the time required to reach a diagnosis and The Paris System (TPS) cytologic diagnosis, with a 2-week washout period between reviews
- Arm 1: Microscopic review of the cytology slide
- Arm 2: Review of the scanned whole slide image (WSI)
- Arm 3: Review of images with AI-assistance software (AIxURO)
- Metrics: TPS diagnostic category compared with the expert panel diagnosis (per-category performance; a sketch follows this summary) and total time spent on review
- NHGUC, SHGUC and HGUC: AIxURO showed higher specificity than microscopy, but lower sensitivity
- AUC: AIxURO showed higher sensitivity but lower specificity than microscopy
- The performance of Arm 2 (review of the whole slide digital image alone) was the poorest overall compared to the other 2 arms
- There were performance inconsistencies between reviewers. For example, sensitivity for SHGUC decreased for the pathologist and one cytologist, whereas the other cytologist showed an increase in sensitivity. One cytologist had a decrease in sensitivity for HGUC compared to the other 2 reviewers.
- AIxURO showed the largest reduction in time spent on review (32-45% less for the pathologist and 10-62% for the cytologists). The cytopathologist took the longest time to review AUC and the shortest for HGUC, whereas the cytologists took the most time to review SHGUC and the least on NHGUC.
AI-assisted AIxURO outperformed microscopy in diagnostic accuracy for AUC while maintaining comparable accuracy across other TPS categories and significantly reducing total review time for all reviewers. WSI review alone did not improve diagnostic accuracy or efficiency.
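The per-category comparisons above (e.g., specificity for NHGUC/SHGUC/HGUC, sensitivity for AUC) are most naturally computed one-vs-rest, treating each TPS category in turn as the positive class. The abstract does not spell out its exact computation, so the sketch below is only one plausible reading, with invented example labels.

```python
# One-vs-rest per-category sensitivity/specificity for TPS categories.
# A plausible illustration only; the example labels are invented.

CATEGORIES = ["NHGUC", "AUC", "SHGUC", "HGUC"]

def per_category_metrics(truth, reads):
    metrics = {}
    for cat in CATEGORIES:
        tp = sum(t == cat and r == cat for t, r in zip(truth, reads))
        fn = sum(t == cat and r != cat for t, r in zip(truth, reads))
        fp = sum(t != cat and r == cat for t, r in zip(truth, reads))
        tn = sum(t != cat and r != cat for t, r in zip(truth, reads))
        metrics[cat] = {
            "sensitivity": tp / (tp + fn) if (tp + fn) else None,
            "specificity": tn / (tn + fp) if (tn + fp) else None,
        }
    return metrics

truth = ["NHGUC", "NHGUC", "AUC", "SHGUC", "HGUC"]   # hypothetical ground truth
reads = ["NHGUC", "AUC",   "AUC", "HGUC",  "HGUC"]   # hypothetical reviewer calls
print(per_category_metrics(truth, reads))
```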
- 14 urine cytology specimens, equally divided and prepared as Cytospin and ThinPrep slides, were scanned for WSI by Leica Aperio AT2 and Hamamatsu NanoZoomer S360 scanners using 3 focus modes (default, semiautomatic, and manual) for single-layer scanning and the manual focus mode for 21 Z-layer scanning
- Performance metrics evaluated included scanning success rate, AI-algorithm-inferred atypical cell numbers, coverage rates (atypical cells detected in single or multiple Z-layers divided by the total atypical cells detected across all layers; see the sketch after this summary), scanning time, and image file size
Scanning Success Rates
- Default Mode Scanning: 85.7% (Cytospin, Leica) and 92.9% (ThinPrep, Leica; Cytospin and ThinPrep, Hamamatsu) success rates
- Semi-Auto Mode Scanning: 92.9% (Cytospin, Leica; Cytospin and ThinPrep, Hamamatsu) or 100% (ThinPrep, Leica) success rates
- Manual Scanning: 100% success (all preps with all scanners)
Scanning Times (median)
- Cytospin, Leica: 1010 seconds
- Cytospin, Hamamatsu: 275 seconds
- ThinPrep, Leica: 5357 seconds
- ThinPrep, Hamamatsu: 1429 seconds
WSI Image File Sizes (using manual focus mode and 21 Z-layer scan settings)
- Cytospin, Leica: 2.0 GB
- Cytospin, Hamamatsu: 4.4 GB
- ThinPrep, Leica: 13.1 GB
- ThinPrep, Hamamatsu: 25.5 GB
Semi-automatic and manual focus modes provide the most successful slide scanning, with rates up to 100%. A minimum of 9-layer Z-stacking at 1 µm intervals is necessary to cover 80% of atypical cells. Cytospin preparations take less time to scan and result in smaller file sizes than ThinPrep slides, regardless of the scanner. Z-stacking enhances AI-inferred quality and coverage rates of atypical cells but at the cost of longer scanning times and larger image files.
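The coverage rate defined above (atypical cells found within a chosen set of Z-layers divided by the total found across all 21 layers) can be sketched as follows. The per-layer detections and the symmetric-stack assumption are illustrative only, not the study's data or algorithm.

```python
# Sketch: coverage rate as a function of the number of Z-layers retained,
# following the definition above (cells found in the kept layers / cells found
# in all layers). The per-layer detections below are invented for illustration.

def coverage_rate(cells_per_layer, kept_layers):
    """cells_per_layer maps a Z-offset (-10..+10, 1 µm spacing) to the set of
    atypical-cell IDs detected in that layer."""
    total = set().union(*cells_per_layer.values())
    kept = set().union(*(cells_per_layer.get(z, set()) for z in kept_layers))
    return len(kept) / len(total)

# Hypothetical detections: cell IDs seen at each Z-offset
detections = {0: {1, 2, 3}, 1: {2, 3, 4}, -1: {1, 5}, 2: {4, 6}, -2: {5},
              4: {7}, -4: {8}}
for n_layers in (1, 3, 5, 9):                     # symmetric stacks around Z = 0
    half = n_layers // 2
    kept = list(range(-half, half + 1))
    print(n_layers, round(coverage_rate(detections, kept), 2))
```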
- 20 urine cytology slides across 5 preparation types (including Cytospin, ThinPrep nonGYN, ThinPrep UroCyte, and BD CytoRich) were digitized using 3 digital scanners (Roche DP200, Roche DP600, and Hamamatsu NanoZoomer S360)
- Images were evaluated for quality, focus and color from the default, manual and advanced scanning modes by a senior cytologist
- The Roche DP200 and DP600 scanners achieved good-quality WSIs in 30-50% of scans in default mode and 40-65% in manual mode
- The Hamamatsu scanner achieved 90% good-quality imaging only in manual mode, whereas quality in default mode dropped to 15%. It also had a lower coefficient of variation for the average number of atypical cells across all preparation types (see the sketch after this summary)
Overall, the Hamamatsu scanner showed better image quality in manual mode than the Roche scanners, which performed better in default mode. The Hamamatsu scanner also showed more consistent detection of atypical urothelial cells among triplicate images from the same samples compared to the Roche scanners.
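The coefficient of variation cited above is the usual std/mean of the AI-detected atypical-cell counts across repeat scans of the same slide; a lower value means more reproducible detection. A minimal sketch with invented counts:

```python
# Sketch: coefficient of variation (CV = std / mean) of AI-detected atypical-cell
# counts across triplicate scans of the same slide. The counts are invented.
from statistics import mean, pstdev

def coefficient_of_variation(counts):
    return pstdev(counts) / mean(counts)

triplicates = {
    "scanner_A": [42, 44, 43],   # tight agreement -> low CV
    "scanner_B": [30, 55, 41],   # variable detection -> high CV
}
for scanner, counts in triplicates.items():
    print(scanner, round(coefficient_of_variation(counts), 3))
```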
- 116 urine cytology slides were scanned as WSIs and analyzed by an AI deep-learning software system, which ranked the most suspicious or atypical urothelial cells based on The Paris System 2.0 (TPS 2.0) for Reporting Urinary Cytology
- A cell image gallery displayed the top 14 most abnormal cells in the viewing software for expert analysis (a ranking sketch follows this summary)
- 1 cytopathologist (CP) and 2 cytologists (CT) assigned a TPS 2.0 diagnosis to each case: NHGUC, AUC, SHGUC, or HGUC; results were compared between reviewers
- The CP demonstrated the highest specificity, PPV, accuracy, and agreement with the ground truth, while the cytologists (CTs) had greater sensitivity and NPV
- Both the CP and the CTs had reduced sensitivity, PPV, and agreement when differentiating AUC from NHGUC
Decreased sensitivity and PPV in differentiating AUC from NHGUC may be the result of the usual clinical workflow, where cytologists initially screen and mark suspicious regions as abnormal, but defer to CP for a more specific interpretation; it may also reflect less experience with digital cytology interpretation.
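The gallery workflow described above amounts to ranking AI-detected cells by a suspicion score and displaying the top 14 to the reviewer. The sketch below shows that generic top-k selection; the field names and scores are assumptions, not the AIxURO API.

```python
# Sketch: selecting the top-k most abnormal cells for a gallery display, as the
# workflow above describes (top 14 shown to the reviewer). The scoring field and
# example values are assumptions.
from typing import NamedTuple

class DetectedCell(NamedTuple):
    cell_id: int
    suspicion_score: float   # higher = more likely SHGUC/HGUC morphology

def gallery_cells(cells, k=14):
    return sorted(cells, key=lambda c: c.suspicion_score, reverse=True)[:k]

cells = [DetectedCell(i, score) for i, score in
         enumerate([0.12, 0.91, 0.45, 0.87, 0.33, 0.78, 0.05])]
for cell in gallery_cells(cells, k=3):
    print(cell.cell_id, cell.suspicion_score)
```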
- 60 paired Cytospin and CytoRich slides from 30 patients were scanned into WSIs and analyzed using the AIxURO AI-driven software, which ranked the top 24 most suspicious cells/cell groups
- 3 senior cytologists with variable digital interpretive experience (A = over 1 year; B = 6 months to 1 year; C = less than 1 month) evaluated the interface's gallery of images to render a diagnosis
- Diagnostic performance was consistent across the 2 preparation types, with cytologists A and B showing excellent performance
- All reviewers spent similar amounts of time on review per preparation type, with cytologist C taking the least amount of time.
The AI-assisted tool supported excellent diagnostic performance, aiding the interpretation of upper GU tract urothelial carcinoma.
- 116 urine cytology slides digitized into WSI and analyzed by AI-assisted software to identify and categorize abnormal cells into suspicious (likely SHGUC or HGUC) or atypical (AUC) categories
- 1 cytopathologist (CP) and 2 cytologists (CT) reviewed all slides microscopically (Arm 1), as WSIs (Arm 2), and with AI assistance presenting the top 24 most abnormal cells in a gallery display (Arm 3), with a 2-week washout period between reviews
- Performance Metrics: Comparison of the expert panel consensus for the glass slides (86 negative, 30 positive) with the review diagnoses to calculate sensitivity, specificity, PPV, and NPV, along with the time each reviewer spent examining the slides in each arm
- AI-assisted software (AIxURO) improved overall sensitivity (from 82.2% to 92.2%) and NPV (from 93.4% to 96.5%) but decreased specificity (from 87.2% to 75.6%), PPV (from 69.2% to 56.8%), and accuracy (from 85.9% to 79.9%) compared to microscopy. WSI review (Arm 2) performed the worst of the three arms.
- Time for slide review was significantly reduced overall (from 159.9 min for microscopy to 106.3 min for AI-assisted review), as well as per individual slide (from 1.38 min to 0.92 min; see the check after this summary)
AI assistance improves the sensitivity and NPV of urine cytology review, but at the expense of a decrease in specificity, PPV, and accuracy. These findings indicate that AI assistance can facilitate the detection of abnormal urothelial cells while reducing the time required for slide review.
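As a quick arithmetic check (not from the abstract's code), the per-slide figures above follow directly from dividing total review time by the 116 slides:

```python
# Quick check: per-slide review times above = total review time / 116 slides
n_slides = 116
print(round(159.9 / n_slides, 2))   # 1.38 min/slide, microscopy
print(round(106.3 / n_slides, 2))   # 0.92 min/slide, AI-assisted
```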
- 52 urine cytology slides from bladder cancer patients (24 Cytospin, 16 ThinPrep, 12 CytoRich) were digitally scanned with a Leica AT2 using conventional Z-stacking (Z = 0 plus 10 layers above and below at 1 µm intervals, 21 layers total) and compared with a heuristic scanning simulation method in which AI determines regions of interest (ROI) for scanning and the software selects the ideal number of layers to scan, from 3 layers (Z = 0 ± 1) to 21 layers (Z = 0 ± 10); a sketch of one possible heuristic follows this summary.
- Performance Metrics: Total number of suspicious cells; coverage rate (ratio of suspicious cells detected in single or multiple Z-layers to the total suspicious cells in all 21 Z-layers); scanning time (seconds); and image file size for storage (gigabytes)
- Heuristic scanning was comparable to Z-stacking, with similar suspicious-cell coverage rates (79.3% Cytospin; 85.9% ThinPrep; 78.3% CytoRich) to those of fixed Z-stacking (81.9% with 5 layers for Cytospin, 87.1% with 9 layers for ThinPrep, 82.9% with 7 layers for CytoRich).
- Heuristic scanning significantly reduced scanning time (by 65-72%) and image file size (by 46-64%).
- Heuristic scanning of low-cellularity slides was more accurate (7/7 correct diagnoses) than using single Z-layer WSIs (4/7 correct diagnoses)
Heuristic scanning is an effective alternative to conventional Z-stacking to identify suspicious cells on urine cytology slides, with the advantage of reducing scanning time and image file size.
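The abstract does not publish the heuristic itself, only that the software picks an ideal number of Z-layers (3 to 21) per slide after AI identifies ROIs. The sketch below illustrates one plausible stopping rule, expanding the stack around the best-focus plane until new layers stop contributing suspicious cells; all names and data in it are assumptions.

```python
# Sketch of one plausible heuristic for choosing how many Z-layers to keep:
# expand the stack symmetrically around the best-focus plane (Z = 0) and stop
# when an added pair of layers contributes few new suspicious cells. The
# stopping rule and example data are assumptions, not the published algorithm.

def choose_layers(cells_per_layer, min_new_cells=1, max_layers=21):
    kept = {0}
    seen = set(cells_per_layer.get(0, set()))
    for offset in range(1, max_layers // 2 + 1):
        new = (cells_per_layer.get(offset, set()) |
               cells_per_layer.get(-offset, set())) - seen
        if len(new) < min_new_cells:
            break
        kept |= {offset, -offset}
        seen |= new
    return sorted(kept), seen

# Hypothetical per-layer suspicious-cell IDs
layers = {0: {1, 2}, 1: {3}, -1: {2}, 2: {4}, -2: set(), 3: set(), -3: set()}
kept, cells = choose_layers(layers)
print(kept, len(cells))   # [-2, -1, 0, 1, 2] with 4 suspicious cells found
```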