- 106 urine cytology slides (from 46 lower and 60 upper urinary tract high-grade urothelial carcinoma/carcinoma in situ [HGUC/CIS] cases) were digitized and analyzed using AIxURO, an AI-based whole slide image (WSI) analysis tool.
- The AI quantified atypical and suspicious urothelial cells, their nuclear-to-cytoplasmic (N/C) ratios, and their nuclear areas.
- Morphologic data were correlated with biopsy-confirmed HGUC/CIS diagnoses. Statistical comparisons were conducted using Kruskal–Wallis tests.
- Suspicious vs Atypical Cells (Total):
- Fewer suspicious cells detected (median 20.5 vs 242.0, p<.001)
- Higher median N/C ratio: 0.66 vs 0.58 (p<.001)
- Larger nuclear area: 102.3 µm² vs 85.7 µm² (p<.001)
- By Cytology Category (AUC, SHGUC, HGUC):
- Suspicious cells consistently had higher N/C ratios and nuclear areas than atypical cells.
- Median N/C ratios across groups: suspicious cells 0.66-0.67 vs atypical cells 0.58-0.59
- Median nuclear areas of suspicious cells: AUC 108.9 µm², SHGUC 99.2 µm², HGUC 101.6 µm²; atypical cells: AUC 88.5 µm², SHGUC 86.8 µm², HGUC 83.5 µm²
- Upper vs Lower Tract:
- No significant difference in N/C ratios
- Nuclear areas were smaller in upper tract (UUT) than in lower tract (LUT) cases (e.g., suspicious cells: 98.0 µm² vs 108.0 µm², p<.001)
- Biopsy Correlation:
- CIS had the largest nuclear areas among suspicious (116.3 µm²) and atypical (101.5 µm²) cells
- Cell number and N/C ratio did not differ significantly across CIS, CIS-HGUC, and HGUC biopsy categories
AIxURO provides objective measurement of the N/C ratio and nuclear area in urine cytology, challenging the current The Paris System (TPS) N/C ratio threshold (>0.7) for diagnosing HGUC. The study suggests that a revised N/C ratio cutoff of 0.66 may be more appropriate for SHGUC/HGUC categorization using AI. The findings also support using consistent thresholds across upper and lower urinary tract cases. Nuclear area offers additional discriminative value, particularly for differentiating CIS from HGUC.
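As a rough illustration of the morphometrics above, the sketch below computes an N/C ratio from segmented areas and applies the proposed 0.66 cutoff. It is a minimal sketch under assumed inputs; the helper names and the nuclear-area over total-cell-area definition of the N/C ratio are our assumptions, not AIxURO's implementation.

```python
# Minimal sketch of the per-cell morphometrics discussed above. Assumes segmented
# nuclear and whole-cell areas (in µm²) are already available; the helper names and
# the nuclear-area / total-cell-area definition of N/C ratio are illustrative
# assumptions, not AIxURO's implementation.

def nc_ratio(nuclear_area_um2: float, cell_area_um2: float) -> float:
    """Nuclear-to-cytoplasmic (N/C) ratio, here taken as nuclear area over total cell area."""
    return nuclear_area_um2 / cell_area_um2

def flag_suspicious(nuclear_area_um2: float, cell_area_um2: float,
                    nc_cutoff: float = 0.66) -> bool:
    """Flag a cell whose N/C ratio meets the study's proposed 0.66 cutoff."""
    return nc_ratio(nuclear_area_um2, cell_area_um2) >= nc_cutoff

# Example: a 102 µm² nucleus inside a 150 µm² cell gives N/C ≈ 0.68 -> flagged.
print(flag_suspicious(102.0, 150.0))  # True
```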
- 52 urine cytology slides (cytospin, ThinPrep, and CytoRich) were scanned with 21 Z-planes and with a heuristic scan simulation method to generate whole slide images (WSI), using a Leica Aperio AT2 scanner
- An AI algorithm inferred the 21 Z-plane WSIs to quantitate the total number of cells suspicious for high-grade urothelial carcinoma (SHGUC) or diagnostic of high-grade urothelial carcinoma (HGUC), collectively termed SHGUC+
- The heuristic scan simulation calculated the total number of SHGUC+ cells using the 21 Z-plane scan data
- Performance metrics included SHGUC+ cell coverage rates, scanning times, file sizes, and AI-aided interpretation of the WSI compared with the original cytology diagnosis, for both the 21 Z-plane scan and the heuristic scan
- SHGUC+ Coverage Rates: Heuristic scanning coverage rates were similar to 5 Z-plane scans for all 3 preparation types (0.78 to 0.91 vs 0.75 to 0.88; p = 0.451 to 0.578)
- Scanning Time: Heuristic scanning significantly reduced scanning time (137.2 to 635.0 seconds vs 332.6 to 1,278.8 seconds; p < 0.05)
- Image File Size: Heuristic scanning significantly reduced image file size (0.51 to 2.10 GB vs. 1.16 to 3.10 GB; p < 0.05)
- AI-aided Interpretation: Heuristic scanning had higher rates of accurate interpretation compared to single Z-plane scanning (62.5% vs. 37.5%)
Heuristic scanning showed improved scanning times and AI-aided cytologic interpretation while reducing overall image file size and maintaining similar scanning coverage for SHGUC+ cells for 3 urine cytology preparation types.
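For context, the coverage rate here can be read as the fraction of reference SHGUC+ cells (from the full 21 Z-plane scan) that a reduced scan also captures. The sketch below is a hypothetical illustration of that calculation; representing cells as ID sets is our assumption, not the study's pipeline.

```python
# Hypothetical sketch of the SHGUC+ coverage-rate metric: the fraction of reference
# SHGUC+ cells (from the full 21 Z-plane scan) that a reduced scan (heuristic or
# fewer Z-planes) also captures. Representing cells as ID sets is an assumption.

def coverage_rate(reduced_scan_cells: set, reference_cells: set) -> float:
    """SHGUC+ cells recovered by the reduced scan / total reference SHGUC+ cells."""
    if not reference_cells:
        return 0.0
    return len(reduced_scan_cells & reference_cells) / len(reference_cells)

reference = {f"cell_{i}" for i in range(100)}  # 100 SHGUC+ cells across 21 Z-planes
heuristic = {f"cell_{i}" for i in range(85)}   # 85 of them captured by the heuristic scan
print(coverage_rate(heuristic, reference))     # 0.85, within the reported 0.78-0.91 range
```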
- 116 urine cytology slides (76 Cytospin, 40 CytoRich)
- Consensus diagnosis by two senior cytopathologists for ground truth, using TPS 2.0, resulting in:
- 30 positive slides (AUC/SHGUC/HGUC)
- 86 negative slides (NHGUC)
- Three-arm study with 1 cytopathologist (CP) and 2 cytologist (CT) reviewers and a 2-week washout period between arms; the CP was analyzed paired with each cytologist (CP+CTA, CP+CTB) to mimic the clinical practice of cytologist screening with referral to a cytopathologist
- Arm 1: Microscopy
- Arm 2: Digital whole slide image review
- Arm 3: Digital image review using artificial intelligence software (AIxURO)
- Performance Metrics: Sensitivity, specificity, PPV, NPV, accuracy, and total diagnostic (review) time
- Sensitivity: Improved with AI assistance for CP+CTA (90% vs 76.7%) and unchanged for CP+CTB (76.7% vs 76.7%) compared with microscopy (but not with WSI review alone)
- Negative Predictive Value: Improved with AI assistance for CP+CTA (96.4% vs 92.2%) and essentially unchanged for CP+CTB (92% vs 92.3%) compared with microscopy (but not with WSI review alone)
- Specificity: Decreased with AI assistance for CP+CTA (93% vs 96.5%) and CP+CTB (92% vs 92.3%) compared to microscopy alone
- Positive Predictive Value: Decreased with AI assistance for CP+CTA (81.8% vs 88.5%) and CP+CTB (79.3% vs 92%) compared to microscopy alone
- Overall, Arm 2 (WSI review) showed no improvement compared to microscopy for either CP+CTA or CP+CTB
- Overall Review Time: AI-assisted review decreased the total review time (72.2 min and 110.4 min), compared to microscopy (210.2 min and 244.7 min), whereas WSI review took as long or longer (227.1 min and 243.8 min) than microscopy
AI-assisted urine cytology review markedly reduces review time while increasing sensitivity and NPV using TPS 2.0 compared to conventional microscopy. However, specificity, PPV, and accuracy are slightly diminished. Pairing a cytopathologist (CP) with different cytologists (CT) also influences the diagnostic outcomes and metrics.
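The binary metrics reported above follow the standard confusion-matrix definitions (positive = AUC/SHGUC/HGUC, negative = NHGUC). The sketch below reproduces them; the example counts are an illustrative reconstruction consistent with the 30 positive and 86 negative slides, not the study's raw data.

```python
# Standard confusion-matrix metrics used above (positive = AUC/SHGUC/HGUC,
# negative = NHGUC). The example counts are illustrative, chosen only to be
# consistent with the 30 positive / 86 negative slide split, not raw study data.

def binary_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Example: 27 of 30 positives and 80 of 86 negatives called correctly
# -> sensitivity 0.90, specificity ~0.93, PPV ~0.82, NPV ~0.96.
for name, value in binary_metrics(tp=27, fp=6, tn=80, fn=3).items():
    print(f"{name}: {value:.3f}")
```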
- 116 urine cytology slides with consensus diagnoses (ground truth) assigned by an expert panel as 86 NHGUC, 12 AUC, 11 SHGUC, and 7 HGUC, scanned with a Leica Aperio AT2 to create whole slide images (WSI)
- 1 cytopathologist and 2 cytologists reviewed all slides/images in each arm independently, recording the time required to reach a diagnosis and The Paris System (TPS) cytologic diagnosis, with a 2-week washout period between each review
- Arm 1: Microscopic review of the cytology slide
- Arm 2: Review of the scanned whole slide image (WSI)
- Arm 3: Review of images with AI-assistance software (AIxURO)
- Metrics: TPS diagnostic category compared with the expert panel consensus, and total time spent on review
- NHGUC, SHGUC and HGUC: AIxURO showed higher specificity than microscopy, but lower sensitivity
- AUC: AIxURO showed higher sensitivity but lower specificity than microscopy
- The performance of Arm 2 (using the whole slide digital image only for analysis) was poorest overall, compared to the other 2 arms
- There were performance inconsistencies between reviewers. For example, sensitivity for SHGUC decreased for the cytopathologist and one cytologist, whereas the other cytologist showed an increase in sensitivity. One cytologist had a decrease in sensitivity for HGUC compared to the other 2 reviewers.
- AIxURO showed the largest reduction in time spent on review (32-45% less for the cytopathologist and 10-62% less for the cytologists). The cytopathologist took the longest time to review AUC and the shortest for HGUC, whereas the cytologists took the most time to review SHGUC and the least on NHGUC.
AI-assisted AIxURO outperformed microscopy in diagnostic accuracy for AUC while maintaining comparable accuracy across other TPS categories and significantly reducing total review time for all reviewers. WSI review alone did not improve diagnostic accuracy or efficiency.
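Because performance here is broken out by TPS category, a one-vs-rest tally per category is one way to compute per-category sensitivity and specificity like those described above. The sketch below shows that tally under assumed paired label lists; the placeholder data are not from the study, and the one-vs-rest approach is one plausible reading of the per-category comparison.

```python
# Hypothetical one-vs-rest tally of per-TPS-category sensitivity and specificity,
# given paired ground-truth and reviewer labels. The label lists are placeholders,
# not study data.

CATEGORIES = ("NHGUC", "AUC", "SHGUC", "HGUC")

def per_category_metrics(truth, calls):
    out = {}
    for cat in CATEGORIES:
        tp = sum(t == cat and c == cat for t, c in zip(truth, calls))
        fn = sum(t == cat and c != cat for t, c in zip(truth, calls))
        tn = sum(t != cat and c != cat for t, c in zip(truth, calls))
        fp = sum(t != cat and c == cat for t, c in zip(truth, calls))
        out[cat] = {
            "sensitivity": tp / (tp + fn) if (tp + fn) else None,
            "specificity": tn / (tn + fp) if (tn + fp) else None,
        }
    return out

truth = ["NHGUC", "NHGUC", "NHGUC", "NHGUC", "AUC", "SHGUC", "HGUC"]
calls = ["NHGUC", "NHGUC", "AUC",   "NHGUC", "AUC", "HGUC",  "HGUC"]
print(per_category_metrics(truth, calls))
```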
- 14 urine cytology specimens, equally divided and prepared as Cytospin and ThinPrep specimens, were scanned for WSI by the Leica Aperio AT2 and Hamamatsu NanoZoomer S360 at 3 focus modes (default, semiautomatic, and manual) for single-layer scanning and with the manual focus mode for 21 Z-layer scanning
- Performance metrics evaluated included scanning success rate, AI-algorithm-inferred atypical cell numbers and coverage rates (atypical cells in single or multiple Z-layers divided by total atypical cells), scanning time, and image file size
Scanning Success Rates
- Default Mode Scanning: 85.7% (Cytospin, Leica) and 92.9% (ThinPrep, Leica; Cytospin and ThinPrep, Hamamatsu) success rates
- Semi-Auto Mode Scanning: 92.9% (Cytospin, Leica; Cytospin, Hamamatsu and ThinPrep, Hamamatsu) or 100% (ThinPrep, Leica) success
- Manual Scanning: 100% success (all preps with all scanners)
Scanning Times (median)
- Cytospin, Leica: 1010 seconds
- Cytospin, Hamamatsu: 275 seconds
- ThinPrep, Leica: 5357 seconds
- ThinPrep, Hamamatsu: 1429 seconds
WSI Image File Sizes (manual focus mode, 21 Z-layer scan)
- Cytospin, Leica: 2.0 GB
- Cytospin, Hamamatsu: 4.4 GB
- ThinPrep, Leica: 13.1 GB
- ThinPrep, Hamamatsu: 25.5 GB
Semi-automatic and manual focus modes provide the most successful slide scanning, with rates up to 100%. A minimum of 9-layer Z-stacking at 1 µm intervals is necessary to cover 80% of atypical cells. Cytospin preparations take less time to scan and result in smaller file sizes than ThinPrep slides, regardless of the scanner. Z-stacking enhances AI inference quality and coverage rates of atypical cells, but at the cost of longer scanning times and larger image files.
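One way to arrive at a layer count like the 9-layer figure above is to compute, for each candidate stack depth k, the fraction of atypical cells that are in focus in at least one of the first k layers. The sketch below illustrates that calculation with made-up per-cell focus layers; it is not the study's data or code, and the study may define layer selection differently.

```python
# Hypothetical sketch of the Z-stack coverage calculation: for a candidate stack
# depth k (layers 1 µm apart), the fraction of atypical cells that are in focus in
# at least one of the first k layers. Per-cell focus layers below are made up.

def coverage_by_layers(in_focus_layers, k):
    """Fraction of cells with at least one in-focus layer among layer indices 0..k-1."""
    covered = sum(1 for layers in in_focus_layers if any(z < k for z in layers))
    return covered / len(in_focus_layers)

# Ten cells, each visible in a small range of the 21 Z-layers (indices 0-20).
cells = [{0, 1}, {2, 3}, {4}, {5, 6}, {7}, {8}, {8, 9}, {8, 10}, {14}, {18, 19}]
for k in (5, 9, 21):
    print(k, coverage_by_layers(cells, k))  # e.g. 9 layers -> 0.8 coverage here
```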
- 20 urine cytology slides across 5 preparation types (including cytospin, ThinPrep nonGYN, ThinPrep Urocyte, and BD CytoRich) were digitized using 3 digital scanners (Roche DP200, Roche DP600, and Hamamatsu NanoZoomer S360)
- Images were evaluated for quality, focus and color from the default, manual and advanced scanning modes by a senior cytologist
- Roche DP200 and DP600 scanners achieved good quality WSI (30-50% in default mode and 40-65% in manual mode)
- Hamamatsu achieved 90% good-quality imaging in the manual mode, whereas only 15% of default-mode images were good quality. It also had a lower coefficient of variation for the average number of atypical cells across all preparation types
Overall, the Hamamatsu scanner showed better image quality in the manual mode than the Roche scanners, which performed better in the default mode. The Hamamatsu scanner also showed more consistent detection of atypical urothelial cells among triplicate images of the same samples than the Roche scanners.
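The coefficient of variation (CV) cited above is the standard deviation of repeated counts divided by their mean, so a lower CV means more consistent atypical-cell counts across triplicate scans of the same slide. A minimal sketch with made-up counts:

```python
# Coefficient of variation (CV) of AI-counted atypical cells across triplicate scans
# of the same slide: standard deviation divided by mean (lower = more consistent).
# The counts below are made up for illustration.
from statistics import mean, stdev

def coefficient_of_variation(counts):
    return stdev(counts) / mean(counts)

triplicate_counts = [118, 124, 121]  # atypical-cell counts from three scans of one slide
print(round(coefficient_of_variation(triplicate_counts), 3))  # 0.025
```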
- 116 urine cytology slides scanned as WSI and analyzed by an AI deep-learning software system, which ranked the most suspicious or atypical urothelial cells based on The Paris System 2.0 (TPS 2.0) for Reporting Urinary Cytology
- A cell image gallery displayed the top 14 abnormal cells in the viewing software for expert review
- 1 cytopathologist (CP) and 2 cytologists (CTs) assigned a TPS 2.0 diagnosis to each case (NHGUC, AUC, SHGUC, or HGUC); results were compared between reviewers
- The CP demonstrated the highest specificity, PPV, accuracy, and agreement with the ground truth, whereas the CTs had greater sensitivity and NPV
- Both the CP and the CTs had reduced sensitivity, PPV, and agreement when differentiating AUC from NHGUC
Decreased sensitivity and PPV in differentiating AUC from NHGUC may reflect the usual clinical workflow, in which cytologists initially screen and mark suspicious regions as abnormal but defer to the CP for a more specific interpretation; it may also reflect less experience with digital cytology interpretation.
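The gallery workflow described above amounts to sorting model detections by suspicion score and surfacing a fixed number for review (14 here, 24 in the studies below). A minimal sketch follows; the CellDetection type and score field are our own names, not the AIxURO data model.

```python
# Minimal sketch of gallery-style triage: detections are sorted by a model suspicion
# score and the top N are surfaced for reviewer confirmation. The CellDetection type
# and score field are assumed names, not the AIxURO data model.
from typing import NamedTuple

class CellDetection(NamedTuple):
    cell_id: str
    suspicion_score: float  # higher = more suspicious for SHGUC/HGUC

def top_cells(detections, n=14):
    """Return the n most suspicious detections for the reviewer gallery."""
    return sorted(detections, key=lambda d: d.suspicion_score, reverse=True)[:n]

detections = [CellDetection(f"cell_{i}", s) for i, s in
              enumerate([0.12, 0.87, 0.45, 0.91, 0.33, 0.76])]
print([d.cell_id for d in top_cells(detections, n=3)])  # ['cell_3', 'cell_1', 'cell_5']
```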
- 60 paired cytospin and CytoRich slides from 30 patients were scanned into WSI and inferred using the AIxURO AI-driven software, which ranked the top 24 most suspicious cells/groups
- 3 senior cytologists with variable digital interpretive experience (A = over 1 year; B = 6 months to 1 year; C = less than 1 month) evaluated the gallery interface of images to render a diagnosis
- Diagnostic performance was consistent across the 2 preparation types, with cytologists A and B showing excellent performance
- All reviewers spent similar amounts of time on review per preparation type, with cytologist C taking the least amount of time.
The AI-assisted tool enabled excellent performance in aiding the interpretation of upper genitourinary (GU) tract urothelial carcinoma.
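One simple way to quantify the consistency across paired preparations noted above is percent agreement between the diagnoses rendered on the cytospin and CytoRich slides from the same patients. This is a stand-in measure for illustration only, not necessarily the study's own analysis.

```python
# Hypothetical check of consistency across paired preparations: percent agreement
# between diagnoses on the cytospin and CytoRich slides from the same patients.
# The diagnosis lists are placeholders; percent agreement is a stand-in measure.

def percent_agreement(cytospin_dx, cytorich_dx):
    assert len(cytospin_dx) == len(cytorich_dx)
    matches = sum(a == b for a, b in zip(cytospin_dx, cytorich_dx))
    return matches / len(cytospin_dx)

cytospin = ["NHGUC", "SHGUC", "HGUC", "AUC",   "NHGUC"]
cytorich = ["NHGUC", "SHGUC", "HGUC", "NHGUC", "NHGUC"]
print(percent_agreement(cytospin, cytorich))  # 0.8
```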
- 116 urine cytology slides digitized into WSI and analyzed by AI-assisted software to identify and categorize abnormal cells into suspicious (likely SHGUC or HGUC) or atypical (AUC) categories
- 1 cytopathologist (CP) and 2 cytologists (CTs) reviewed all slides microscopically (Arm 1), as WSI (Arm 2), and with AI assistance presenting the top 24 most abnormal cells in a gallery display (Arm 3), with a 2-week washout period between reviews
- Performance Metrics: Review diagnoses were compared with the expert panel consensus on the glass slides (86 negative, 30 positive) to calculate sensitivity, specificity, PPV, NPV, and accuracy, along with the time each individual spent examining the slides in each arm
- AI-assisted software (AIxURO) improved the overall sensitivity (from 82.2% to 92.2%) and NPV (from 93.4% to 96.5%) but decreased specificity (from 87.2% to 75.6%), PPV (from 69.2% to 56.8%), and accuracy (from 85.9% to 79.9%) compared to microscopy. WSI review (Arm 2) performed worst of the three arms.
- Time for slide review was significantly reduced overall (from 159.9 min for microscopy to 106.3 min for AI-assisted review), as well as per individual slide (from 1.38 min to 0.92 min)
AI-assistance improves the sensitivity and NPV of urine cytology review, but at the expense of a decrease in specificity, PPV and accuracy. This may indicate that AI-assistance will facilitate the detection of abnormal urothelial cells while lowering the time required for slide review.
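As a quick check, the per-slide averages above follow directly from dividing the total review times by the 116 slides:

```python
# Arithmetic check: dividing each arm's total review time by the 116 slides
# reproduces the reported per-slide averages (~1.38 and ~0.92 min/slide).
slides = 116
for label, total_minutes in [("microscopy", 159.9), ("AI-assisted", 106.3)]:
    print(label, round(total_minutes / slides, 2))
```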