Purpose:
The purpose of this study is to test the hypothesis that the relationship between baseline visual field damage and the rate of progression depends on whether sensitivity is expressed on a logarithmic (dB) or a linear (1/Lambert) scale.

Methods:
A total of 60 eyes of 60 patients with treated, established glaucoma and at least 5 reliable 24-2 visual fields were included. Baseline visual field mean deviation (MD) in dB was transformed to 1/Lambert using the standard equation. Mixed-effects linear regression was used to calculate the slopes (MD rates of progression over time) on both the linear and nonlinear scales. We tested the relationship between baseline MD and MD slope for each scale of measurement.

Results:
On the dB scale, worse baseline visual field loss was associated with faster MD slopes (P=0.037), whereas the opposite effect was seen on the 1/Lambert scale (P=0.001). For a similar rate of progression in dB/y, eyes with mild baseline visual field damage lost more linear sensitivity over a given period of time than eyes with more severe baseline damage.

Conclusions:
There is a significant relationship between baseline visual field severity and the rate of MD progression, although the direction of this association depends on the scale on which sensitivity is measured. The definition of fast versus slow visual field progression should be revised to take into account that sensitivity expressed on a linear scale correlates better with structural loss than sensitivity conventionally measured on a nonlinear (dB) scale.
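The scale dependence described above can be illustrated with a short numeric sketch. The abstract cites "the standard equation" without stating it; the usual relation in perimetry is dB = 10 × log10(linear sensitivity), so 1/Lambert = 10^(dB/10). The baseline values (−2 and −15 dB) and the progression rate (−1 dB/y over 5 years) below are hypothetical, chosen only to show the asymmetry, and are not data from the study.

```python
# Illustrative sketch (hypothetical values, not study data): convert dB to
# linear 1/Lambert units using the standard relation
#   dB = 10 * log10(linear), hence linear = 10 ** (dB / 10).

def db_to_linear(md_db: float) -> float:
    """Convert a sensitivity value from dB to linear 1/Lambert units."""
    return 10.0 ** (md_db / 10.0)

# Two hypothetical eyes, both progressing at the same -1 dB/y for 5 years:
for label, baseline_db in [("mild damage (-2 dB)", -2.0),
                           ("severe damage (-15 dB)", -15.0)]:
    linear_loss = db_to_linear(baseline_db) - db_to_linear(baseline_db - 5.0)
    print(f"{label}: linear sensitivity lost over 5 y = {linear_loss:.4f}")
```

With these hypothetical numbers, the mildly damaged eye loses roughly 20 times more linear sensitivity than the severely damaged eye, even though both progress at an identical rate in dB/y, mirroring the direction reversal reported in the Results.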