(175e) Quantitative Analysis of Fundus Images for Grading of Vitreous Haze

Purpose: Intraocular inflammatory processes lead to progressive accumulation of cells and protein exudate in the vitreous, which can be detected by ophthalmoscopic inspection of the eye as decreased visibility of the retinal vasculature, optic nerve head, and other fundus details. The objective of this work was to develop a quantitative method for grading the blurriness of ocular fundus images in order to provide automated grading of the severity of uveitis.

Methods: A computer algorithm was developed to quantify image blurriness by applying entropy filtering and Fourier spatial-frequency analysis. The algorithm was refined using a set of 8 reference images created by optically filtering a single standard fundus photograph to varying degrees to simulate increasing vitreous haze. It was then applied without modification to a dataset of 120 digital fundus images collected from patients with different grades of uveitis. Algorithm performance was evaluated against the acutance method, which is commonly used to quantify the overall sharpness of edges in an image. For both methods, computed vitreous haze scores were compared in a masked fashion with the subjective readings of three expert clinicians.
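To make the two image measures concrete, the following is a minimal Python sketch (using NumPy and scikit-image) of an entropy-filtered blur score, a Fourier high-frequency measure, and a gradient-based acutance baseline. The footprint radius, frequency cutoff, and acutance formulation are illustrative assumptions; the abstract does not specify the exact parameters or how the measures are combined into a final haze grade.

```python
import numpy as np
from skimage import color, io, util
from skimage.filters.rank import entropy
from skimage.morphology import disk

def haze_measures(image_path):
    """Compute two blur-sensitive measures of a fundus image.

    Returns (mean local entropy, high-frequency power fraction).
    Footprint radius and frequency cutoff are illustrative choices,
    not parameters taken from the study.
    """
    img = io.imread(image_path)
    if img.ndim == 3:                       # collapse RGB(A) to grayscale
        img = color.rgb2gray(img[..., :3])
    img_u8 = util.img_as_ubyte(img)         # rank filters need integer images

    # Entropy filtering: sharp retinal detail (vessels, optic disc)
    # produces high local entropy; vitreous haze suppresses it.
    mean_entropy = entropy(img_u8, disk(9)).mean()

    # Fourier spatial-frequency analysis: blur attenuates high
    # frequencies, so measure the power fraction beyond a cutoff radius.
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    rows, cols = img.shape
    y, x = np.ogrid[:rows, :cols]
    radius = np.hypot(y - rows / 2, x - cols / 2)
    cutoff = 0.1 * min(rows, cols)          # assumed cutoff, not from the abstract
    high_freq_fraction = spectrum[radius > cutoff].sum() / spectrum.sum()

    return mean_entropy, high_freq_fraction

def acutance(img):
    """Acutance baseline as mean gradient magnitude, one common
    formulation; the abstract does not state the exact variant used."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy).mean()
```

Under this reading, a hazier image should yield lower values on all three measures, since haze both flattens local intensity variation and attenuates the high spatial frequencies that sharp vessel edges contribute.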

Results: The Pearson correlation coefficient between the algorithm's computed grades and clinician grades of blurriness was 0.81-0.88 (R² = 0.66-0.78). Acutance measurements calculated from the raw images and from images with background fluctuations subtracted out did not perform as well: the Pearson correlation coefficients between acutance and clinician grades were -0.05 (R² = 0.003) and 0.33 (R² = 0.109), respectively, indicating at best a weak correlation.
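For reference, the R² values quoted above are simply the squares of the reported Pearson coefficients (e.g., 0.81² ≈ 0.66, 0.33² ≈ 0.109). A hypothetical evaluation helper in the same Python setting, assuming per-image score and grade arrays:

```python
from scipy.stats import pearsonr

def correlate_with_clinicians(scores, grades):
    """Pearson r between computed haze scores and masked clinician
    grades; the R^2 reported in the Results is the square of r."""
    r, p_value = pearsonr(scores, grades)
    return r, r ** 2
```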

Conclusions: We have developed a computer algorithm for grading vitreous haze in an unbiased and quantitative manner that correlates strongly with the subjective readings of expert clinicians. On fundus images, the algorithm substantially outperforms standard acutance measures.