Visual Acuity Measured as a Function of Retinal Eccentricity

Visual acuity is a measure of an observer's ability to see fine spatial detail (Cavonius & Schumacher, 1966). A number of factors affect visual acuity, such as illumination and contrast, and there are various ways to measure it (Kalloniatis & Luu, 2005). One approach is target detection, which requires the perception of the orientation of a stimulus such as a Landolt C or a Snellen E (Kalloniatis & Luu, 2005). The participant in the current experiment was referred to have their acuity tested. Target detection of a stimulus was used to measure the participant's visual acuity as a function of the retinal eccentricity of the target.

Retinal eccentricity refers to the angle subtended from the fovea into the periphery (Cowey & Rolls, 1974). Visual acuity is known to be sharpest at the fovea and to decrease with increasing angle of eccentricity (Cowey & Rolls, 1974). This has been attributed to factors such as a decline in cone density and an increase in receptive field size with eccentricity (Millodot et al., 1975).

Normal measures of visual acuity as a function of retinal eccentricity are best described with reference to the results of Millodot et al. (1975), who plotted the minimum angle of resolution (MAR) as a function of target distance from fixation (eccentricity). Their results showed a significant increase in the MAR with increasing retinal eccentricity, indicating that visual acuity decreases considerably as the distance of the target from fixation increases (Millodot et al., 1975). These results are used as normative data for comparison with the results obtained in the following experiment.

Method
The participant in the current experiment was aged 20 years and had no optical correction (i.e. did not wear glasses or contact lenses). The experimenter measured the participant's visual acuity by testing monocular vision in the right eye....
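The MAR reported in studies such as Millodot et al. (1975) is simply the visual angle subtended by the smallest resolvable detail (e.g. the gap in a Landolt C). As a minimal sketch of that underlying computation, the following Python function converts a target's physical size and viewing distance into a visual angle in minutes of arc; the specific gap size and distance in the example are illustrative values, not measurements from this experiment:

```python
import math

def visual_angle_arcmin(target_size_mm, viewing_distance_mm):
    """Visual angle subtended by a target, in minutes of arc.

    Uses the exact trigonometric form; for the small angles typical
    of acuity testing, size / distance would be a close approximation.
    """
    theta_rad = 2 * math.atan(target_size_mm / (2 * viewing_distance_mm))
    return math.degrees(theta_rad) * 60

# Illustrative example: a gap of about 1.75 mm viewed at 6 m (6000 mm)
# subtends roughly 1 arcmin, the conventional MAR for normal (6/6) acuity.
angle = visual_angle_arcmin(1.75, 6000)
```

A MAR of 1 arcmin at the fovea, rising steeply with eccentricity, is the pattern the normative data above describe.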