Researchers developed a real-time, accurate artificial intelligence-based system for monitoring blind spots during esophagogastroduodenoscopy (EGD). These findings were published in Digestive and Liver Disease.
High-definition white-light videos (N=5779 endoscopic videos) from patients who underwent EGD between 2018 and 2019 at a single center in China were retrospectively reviewed. The study developed 2 network algorithms: one to determine whether the endoscope was inside or outside the body and one to identify the site being imaged in the digestive tract.
The first network was trained on 46,923 in vivo and 25,804 in vitro images; the second was trained on 170,297 images that had been classified into 31 sites (by location in the gastrointestinal tract and retroflex views A, G, P, or L) or as NA. The networks were tested using 500 in vitro, 500 in vivo, and 3100 EGD images, and the combined model was tested using 129 videos.
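As a rough illustration of this two-stage design (not the authors' implementation), the sketch below chains a binary in vivo/in vitro classifier with a 32-class site classifier (31 sites plus NA). The ResNet-50 backbone and the label conventions are assumptions for illustration only.

```python
# Minimal sketch of a two-stage frame-classification pipeline.
# Backbone (ResNet-50) and label conventions are assumptions, not the paper's architecture.
from typing import Optional

import torch
import torch.nn as nn
from torchvision import models

def build_classifier(num_classes: int) -> nn.Module:
    """Return a CNN with a freshly initialized classification head."""
    backbone = models.resnet50(weights=None)  # assumed backbone
    backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)
    return backbone

# Network 1: in vivo vs in vitro (2 classes)
in_body_net = build_classifier(num_classes=2)
# Network 2: 31 anatomical sites plus an NA class (32 classes)
site_net = build_classifier(num_classes=32)

@torch.no_grad()
def classify_frame(frame: torch.Tensor) -> Optional[int]:
    """Classify one preprocessed frame of shape [1, 3, H, W].

    Returns a site index 0..31 if the frame is judged to be inside the body,
    otherwise None (frame skipped by the site classifier).
    """
    inside = in_body_net(frame).argmax(dim=1).item() == 1  # assumed label: 1 = in vivo
    if not inside:
        return None
    return site_net(frame).argmax(dim=1).item()
```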
The accuracy of the first deep convolutional neural network was 99.20% and that of the second was 99.83%.
Stratified by site type, accuracy ranged from 99.24% to 100%, sensitivity from 78% to 100%, and specificity from 99.31% to 100%. Only 2 sites were observed to have sensitivities below 90% (middle-upper body [P] and middle-upper body [L]).
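The per-site figures above correspond to a one-vs-rest reading of the multi-class confusion matrix. A minimal sketch, using scikit-learn and placeholder labels rather than study data, of how accuracy, sensitivity, and specificity can be derived per site:

```python
# Per-class accuracy, sensitivity, and specificity from a confusion matrix (one-vs-rest).
# Labels and predictions below are placeholders, not the study's data.
from sklearn.metrics import confusion_matrix

def per_class_metrics(y_true, y_pred, labels):
    cm = confusion_matrix(y_true, y_pred, labels=labels)
    total = cm.sum()
    results = {}
    for i, label in enumerate(labels):
        tp = cm[i, i]
        fn = cm[i, :].sum() - tp
        fp = cm[:, i].sum() - tp
        tn = total - tp - fn - fp
        results[label] = {
            "accuracy": (tp + tn) / total,
            "sensitivity": tp / (tp + fn) if (tp + fn) else float("nan"),
            "specificity": tn / (tn + fp) if (tn + fp) else float("nan"),
        }
    return results

# Example with dummy site labels
y_true = ["antrum", "pylorus", "antrum", "cardia", "pylorus"]
y_pred = ["antrum", "pylorus", "cardia", "cardia", "pylorus"]
print(per_class_metrics(y_true, y_pred, labels=["antrum", "cardia", "pylorus"]))
```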
In the video assessment, the Intelligent Detection Endoscopic Assistant (IDEA) model correctly detected the start time in all but 1 of the videos and identified the endpoint in all videos, for an overall video timing accuracy of 99.22%.
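The paper's exact timing logic is not reproduced here; the sketch below is one plausible approach, assuming per-frame in vivo/in vitro predictions from the first network and a simple majority-vote smoothing window to suppress single-frame noise before reading off the first and last in-body frames.

```python
# Illustrative (assumed) derivation of examination start and end times
# from per-frame in vivo flags; window size is an arbitrary choice.
from typing import List, Optional, Tuple

def exam_interval(in_vivo_flags: List[bool], fps: float,
                  window: int = 15) -> Optional[Tuple[float, float]]:
    """Return (start_seconds, end_seconds) of the in-body segment, or None."""
    n = len(in_vivo_flags)
    smoothed = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        votes = in_vivo_flags[lo:hi]
        # Frame counts as "inside" only if most frames in its window agree.
        smoothed.append(sum(votes) > len(votes) / 2)

    inside_indices = [i for i, flag in enumerate(smoothed) if flag]
    if not inside_indices:
        return None
    return inside_indices[0] / fps, inside_indices[-1] / fps
```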
The 31 sites were accurately detected 95.30% of the time. Stratified by site type, accuracy ranged from 88.37% to 100%, sensitivity from 81.25% to 100%, and specificity from 80.49% to 100%.
Accuracy below 90% was observed at the lower body (P) and middle-upper body (L) sites. Sensitivity below 90% was observed at the gastric cardia (P), lower body (P), gastric cardia (L) retro, and angularis (A) sites. Specificity below 90% was observed at the hypopharynx, gastroesophageal junction, lower body (A), lower body (L), middle-upper body (L), antrum (P), antrum (L), pylorus, and duodenal bulb (P) sites.
Compared with expert assessment of the videos, the IDEA model demonstrated significant agreement (κ, 0.894; P <.001).
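The agreement statistic reported above is Cohen's kappa; a minimal sketch of computing it from paired model and expert site labels (placeholder values, not the study's annotations):

```python
# Cohen's kappa between model and expert labels; data below are placeholders.
from sklearn.metrics import cohen_kappa_score

model_labels  = ["antrum", "pylorus", "cardia", "antrum", "duodenal bulb"]
expert_labels = ["antrum", "pylorus", "cardia", "fundus", "duodenal bulb"]

kappa = cohen_kappa_score(model_labels, expert_labels)
print(f"Cohen's kappa = {kappa:.3f}")  # values near 1 indicate near-perfect agreement
```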
This study was limited by the lack of assessment of the model in a clinical setting; the clinical value of the IDEA tool therefore remains unclear.
These findings indicated that a computer-aided monitoring system may automate the detection of gastric sites in real time and may have clinical application following verification of its effectiveness.
Reference
Li Y-D, Zhu S-W, Yu J-P, et al. Intelligent detection endoscopic assistant: an artificial intelligence-based system for monitoring blind spots during esophagogastroduodenoscopy in real-time. Dig Liver Dis. 2020;S1590-8658(20)31038-0. doi:10.1016/j.dld.2020.11.017