Real Time Simulator of Age-Related Macular Degeneration
Introduction
Some of the most important applications of Virtual Reality (VR) have been in the context of medical research. For this project, we have simulated some of the typical symptoms of age-related macular degeneration (AMD) within a Unity3D environment, to be displayed via the Oculus Rift. The goal was to produce visual effects subtle enough to capture the actual experience of affected patients. Once the results are deemed accurate via subjective comparison, this setup could serve not only as a pseudo-phenomenological model but also as a starting point for further simulation and testing of visual acuity or impairment. Additionally, it has the potential to convey to healthy individuals, e.g. family members, friends, and physicians, the experience of a person suffering from AMD.
Background
AMD generally affects individuals above the age of 60, though symptoms may begin earlier and are not always noticeable at onset. The region of the retina affected by AMD is the macula, whose center (the fovea) corresponds to the point of fixation in the visual field and is the most sensitive to spatial and chromatic detail.
Based on its cause, AMD can take one of two forms. Dry AMD occurs when yellow protein deposits called drusen accumulate underneath the macula. This form of AMD also occurs when the photoreceptors or surrounding tissue break down, causing gaps in the retinal image. This can lead to partial or total loss of central vision.
Wet AMD is caused by neovascularization, in which abnormal blood vessels grow beneath the macula and leak fluid, damaging and distorting its surface. This again can lead to vision loss, but it also introduces warping, in which straight lines appear curved (a condition called metamorphopsia). The latter condition is tested qualitatively by presenting patients with the Amsler grid.
Of all patients affected by AMD, roughly 10% have the wet form; these patients are more likely to experience permanent loss of vision at fixation.
Methods
Rendering Pipeline
The chosen rendering environment is Unity3D, a powerful multi-platform engine for real-time simulation, typically used for gaming applications. Simple UI elements can be designed programmatically to interact with the program while it renders the output in real time. This is a useful feature for subjective testing, allowing patients to continuously vary the parameters driving each perceptual effect. For our purposes, a scene model containing various 3D objects was provided by Matt Vitelli.
Each perceptual effect was scripted as a post-processing shader using embedded CG/HLSL syntax. These scripts can be called from any component within Unity and can thus be layered in any desired combination. Individual effects can be activated or deactivated according to user feedback.
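As a rough illustration of this setup, the sketch below shows the skeleton of one such full-screen pass in ShaderLab/Cg syntax; the shader name and the fixation and radius properties (_Center, _Radius) are hypothetical placeholders rather than the actual project scripts. In Unity, a shader of this kind is typically wrapped in a material and applied from a camera script's OnRenderImage callback via Graphics.Blit, and several such materials can be blitted in sequence to layer effects.

    // Minimal sketch of a full-screen post-processing pass, assuming Unity's
    // built-in image-effect pipeline. Names are hypothetical placeholders.
    Shader "Hidden/AMD/EffectTemplate"
    {
        Properties
        {
            _MainTex ("Source", 2D) = "white" {}
            _Center  ("Fixation point (UV)", Vector) = (0.5, 0.5, 0, 0)
            _Radius  ("Scotoma radius (UV units)", Float) = 0.2
        }
        SubShader
        {
            Cull Off ZWrite Off ZTest Always
            Pass
            {
                CGPROGRAM
                #pragma vertex vert_img      // standard full-screen quad vertex shader
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                float4 _Center;
                float  _Radius;

                fixed4 frag (v2f_img i) : SV_Target
                {
                    fixed4 src = tex2D(_MainTex, i.uv);
                    // Each effect below replaces this body: compute a modified
                    // color and blend it in as a function of the distance of
                    // i.uv from _Center.
                    return src;
                }
                ENDCG
            }
        }
    }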
A shader component can be attached to any of the camera objects positioned in the virtual space. Two cameras generate the fields of view of the left and right video channels respectively, which are then streamed directly to the Oculus Rift. This simulation of stereoscopic vision is useful not just for realism but also for testing: by applying a post-processing shader to only one of the two channels (a realistic assumption, since AMD does not affect both eyes identically), it is possible to carry out immediate left/right visual comparisons.
Implementation
We have simulated various hypothesized symptoms of AMD. We refer to them collectively as scotomas to indicate any region of impaired vision, though the exact nature of the impairment can range from blind spots the patient is unaware of to perceptible blur. The effects we considered are the following.
Desaturation
Pixel-by-pixel RGB values were extracted and used to estimate luminance as a weighted sum of the gamma-compressed inputs. The result is then interpolated radially with the original image, up to a user-defined radius from the point of fixation.
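A minimal sketch of the corresponding fragment body is shown below, slotting into the template above; Rec. 601 luma weights and a linear radial ramp are assumed here, and the actual weights and blending curve used in our shader may differ.

    // Desaturation: luma from gamma-compressed RGB (Rec. 601 weights assumed),
    // fully desaturated at the fixation point, untouched beyond _Radius.
    fixed4 frag (v2f_img i) : SV_Target
    {
        fixed4 src  = tex2D(_MainTex, i.uv);
        fixed  luma = dot(src.rgb, fixed3(0.299, 0.587, 0.114));
        float  t    = saturate(distance(i.uv, _Center.xy) / _Radius);  // 0 at fixation, 1 at the rim
        fixed3 rgb  = lerp(fixed3(luma, luma, luma), src.rgb, t);
        return fixed4(rgb, src.a);
    }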
Blur
We implemented a simple Gaussian filter using a 4x4 kernel. As before, the result is blended with the original according to distance from the center.
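The sketch below illustrates this in the same template, assuming a separable 4x4 tap pattern with binomial (1, 3, 3, 1) weights per axis; the exact kernel used in the project may differ.

    // Gaussian blur: 4x4 taps around the pixel, blended so the image is
    // blurred at fixation and sharp beyond _Radius.
    float4 _MainTex_TexelSize;   // filled in by Unity (x = 1/width, y = 1/height)

    fixed4 frag (v2f_img i) : SV_Target
    {
        float offs[4] = { -1.5, -0.5, 0.5, 1.5 };        // tap offsets in texels
        float w[4]    = { 0.125, 0.375, 0.375, 0.125 };  // binomial weights, sum to 1

        fixed4 blur = 0;
        for (int x = 0; x < 4; x++)
            for (int y = 0; y < 4; y++)
            {
                float2 uv = i.uv + float2(offs[x], offs[y]) * _MainTex_TexelSize.xy;
                blur += w[x] * w[y] * tex2D(_MainTex, uv);
            }

        fixed4 src = tex2D(_MainTex, i.uv);
        float  t   = saturate(distance(i.uv, _Center.xy) / _Radius);
        return lerp(blur, src, t);   // blurred at fixation, sharp outside _Radius
    }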
Dark spot
Another interpolated effect was implemented as a simple dark spot at the point of fixation. The radius of this effect can be manipulated independently of the others, and it constitutes perhaps the most naive interpretation of what a patient "sees."
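A sketch of the corresponding fragment body, assuming a smoothstep falloff from the fixation point:

    // Dark spot: attenuate the color toward black at the fixation point,
    // recovering the original image at _Radius and beyond.
    fixed4 frag (v2f_img i) : SV_Target
    {
        fixed4 src = tex2D(_MainTex, i.uv);
        float  t   = smoothstep(0.0, _Radius, distance(i.uv, _Center.xy));
        return fixed4(src.rgb * t, src.a);
    }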
Warping
We implemented a separate shader to distort the image around the center point. The current transformation simply stretches the image radially.
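A sketch of such a radial stretch is shown below; _Stretch is a hypothetical user-controlled factor, not necessarily the parameterization used in our shader.

    // Radial stretch: pixels inside _Radius sample the source closer to the
    // fixation point, so the central content appears stretched outward.
    float _Stretch;   // e.g. 0.5 = strong stretch, 1.0 = no effect

    fixed4 frag (v2f_img i) : SV_Target
    {
        float2 offset = i.uv - _Center.xy;
        float  t      = saturate(length(offset) / _Radius);   // 0 at fixation, 1 at the rim
        float  scale  = lerp(_Stretch, 1.0, t);               // full stretch at the center, none at the rim
        return tex2D(_MainTex, _Center.xy + offset * scale);
    }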
In order to produce a more interesting warping effect, we have also generated a series of noise textures in Matlab. We applied four Gaussian bandpass filters to a 2D Gaussian noise image, with center frequency increasing by octaves from 4 to 64 cpi. These textures can then be used to determine the horizontal and vertical offsets of each pixel within the warped region. Lower-frequency textures turn straight lines in the input image into smooth waves, whereas higher-frequency textures produce grainier results. This is the same setup as in [6], and can be used as a virtual experimental setup to measure the effects of metamorphopsia on image recognition.
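The sketch below shows how such a texture could drive the warp in the fragment shader; _NoiseTex and _Amplitude are hypothetical names, and the assumption that the red and green channels encode the horizontal and vertical offsets may differ from the actual texture layout.

    // Noise-driven warp: offsets are read from a band-passed noise texture,
    // mapped to [-1, 1] and attenuated away from the fixation point.
    sampler2D _NoiseTex;    // one of the Matlab-generated bandpass noise textures
    float     _Amplitude;   // maximum displacement in UV units

    fixed4 frag (v2f_img i) : SV_Target
    {
        float2 noise  = tex2D(_NoiseTex, i.uv).rg * 2.0 - 1.0;
        float  t      = saturate(distance(i.uv, _Center.xy) / _Radius);
        float2 offset = noise * _Amplitude * (1.0 - t);   // strongest at fixation, zero at the rim
        return tex2D(_MainTex, i.uv + offset);
    }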
Results
In the following snapshots, we show the resulting visual effects. In the case of blur and desaturation, we hope that these may be subtle enough to affect detail in the center of the visual field without overtly interrupting the surrounding region.
Conclusions and Future Work
- Real time textural + structural inpainting
- Live capture
  - must account for distance
  - assume a given point of fixation or incorporate depth sensing
  - the lack of stimulus does not entail lack of perception, especially when motion is considered => real time rendering
- Extensive subjective testing to measure the accuracy of each effect
Acknowledgements
Appendix
Related Literature
[1] Marmor, David J., and Michael F. Marmor. "Simulating vision with and without macular disease." Archives of Ophthalmology 128.1 (2010): 117-125.
[2] Lewis, J., L. Shires, and D. J. Brown. "Development of a visual impairment simulator using the Microsoft XNA Framework." Proceedings of the 9th International Conference on Disability, Virtual Reality & Associated Technologies, Laval, France, 2012.
[3] Ai, Zhuming, et al. "Simulation of eye diseases in a virtual environment." Proceedings of the 33rd Annual Hawaii International Conference on System Sciences. IEEE, 2000.
[4] Jin, Bei, Zhuming Ai, and Mary Rasmussen. "Simulation of eye disease in virtual reality." Proceedings of the 27th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEEE-EMBS 2005). IEEE, 2005.
[5] Ates, Halim Cagri, Alexander Fiannaca, and Eelke Folmer. "Immersive simulation of visual impairments using a wearable see-through display." Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction. ACM, 2015.
[6] Lewis, James, et al. "Simulating visual impairments using the Unreal Engine 3 game engine." Proceedings of the 2011 IEEE 1st International Conference on Serious Games and Applications for Health (SeGAH). IEEE, 2011.
Oculus Specifications
- Display Technology: OLED
- Resolution: 2160×1200 (1080×1200 per eye)
- Refresh Rate: 90 Hz
- FOV (Nominal): 110 degrees or greater
- Head Tracking: 6DOF (3-axis rotational tracking + 3-axis positional tracking)
- Weight: TBA (Lighter than 380g)
- Platforms: Microsoft Windows (OS X and Linux planned)
- Connection: 1x HDMI 1.3 and 2x USB 3.0
