PyOKR: A Semi-Automated Method for Quantifying Optokinetic Reflex Tracking Ability

Published: April 12, 2024
doi: 10.3791/66779

Summary

We describe here PyOKR, a semi-automated quantitative analysis method that directly measures eye movements resulting from visual responses to two-dimensional image motion. A Python-based user interface and analysis algorithm allow for higher throughput and more accurate quantitative measurement of eye-tracking parameters than previous methods.

Abstract

The study of behavioral responses to visual stimuli is a key component of understanding visual system function. One notable response is the optokinetic reflex (OKR), a highly conserved innate behavior necessary for image stabilization on the retina. The OKR provides a robust readout of image tracking ability and has been extensively studied to understand visual system circuitry and function in animals from different genetic backgrounds. The OKR consists of two phases: a slow tracking phase as the eye follows a stimulus to the edge of the visual plane and a compensatory fast phase saccade that resets the position of the eye in the orbit. Previous methods of tracking gain quantification, although reliable, are labor intensive and can be subjective or arbitrarily derived. To obtain more rapid and reproducible quantification of eye tracking ability, we have developed a novel semi-automated analysis program, PyOKR, that allows for quantification of two-dimensional eye tracking motion in response to any directional stimulus, in addition to being adaptable to any type of video-oculography equipment. This method provides automated filtering, selection of slow tracking phases, modeling of vertical and horizontal eye vectors, quantification of eye movement gains relative to stimulus speed, and organization of resultant data into a usable spreadsheet for statistical and graphical comparisons. This quantitative and streamlined analysis pipeline, readily accessible via PyPI import, provides a fast and direct measurement of OKR responses, thereby facilitating the study of visual behavioral responses.

Introduction

Image stabilization relies on precise oculomotor responses to compensate for global optic flow that occurs during self-motion. This stabilization is driven primarily by two motor responses: the optokinetic reflex (OKR) and the vestibulo-ocular reflex (VOR)1,2,3. Slow global motion across the retina induces the OKR, which elicits reflexive eye rotation in the corresponding direction to stabilize the image1,2. This movement, known as the slow phase, is interrupted by compensatory saccades, known as the fast phase, in which the eye rapidly resets in the opposite direction to allow for a new slow phase. Here, we define these fast-phase saccades as eye-tracking movements (ETMs). Whereas the VOR relies on the vestibular system to elicit eye movements to compensate for head movements3, the OKR is initiated in the retina by the firing of ON direction-selective ganglion cells (ON DSGCs) and their subsequent signaling to the Accessory Optic System (AOS) in the midbrain4,5. Due to its direct reliance on retinal circuits, the OKR has been frequently used to determine visual tracking ability in both research and clinical settings6,7.

The OKR has been studied extensively as a tool for assessing basic visual ability2,6,8, DSGC development9,10,11,12, oculomotor responses13, and physiological differences among genetic backgrounds7. The OKR is evaluated in head-fixed animals presented with a moving stimulus14. Oculomotor responses are typically recorded using a variety of video tools, and eye-tracking motions are represented as OKR waveforms in the horizontal and vertical directions9. To quantify tracking ability, two primary metrics have been described: tracking gain (the velocity of the eye relative to the velocity of the stimulus) and ETM frequency (the number of fast phase saccades over a given time frame). Gain calculation has historically been used to measure the angular velocity of the eye directly and thereby estimate tracking ability; however, these calculations are labor intensive and can be arbitrarily derived based on video-oculography collection methods and subsequent quantification. For more rapid OKR assessment, counting of ETM frequency has been used as an alternate method for measuring tracking acuity7. Although this provides a fairly accurate estimation of tracking ability, this method relies on an indirect metric to quantify the slow phase response and introduces a number of biases. These include an observer bias in saccade determination, a reliance on temporally consistent saccadic responses across a set epoch, and an inability to assess the magnitude of the slow phase response.
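
To make the two metrics concrete, the following is a minimal, self-contained sketch (not PyOKR's own code) that computes tracking gain and ETM frequency from a synthetic sawtooth eye-position trace; the sampling rate, stimulus speed, and velocity threshold are illustrative assumptions.

```python
import numpy as np

FS = 100.0          # sampling rate in Hz (hypothetical)
STIM_SPEED = 10.0   # stimulus speed in deg/s (hypothetical)
EYE_SPEED = 8.0     # simulated slow-phase eye speed -> true gain of 0.8

# Synthetic 10 s sawtooth trace: the eye drifts at EYE_SPEED and is
# reset by a fast-phase saccade each time it reaches 4 deg of excursion.
t = np.arange(0, 10, 1.0 / FS)
pos = (EYE_SPEED * t) % 4.0

vel = np.gradient(pos, 1.0 / FS)           # eye velocity, deg/s
is_saccade = np.abs(vel) > 3 * EYE_SPEED   # velocity-threshold saccade mask

# Tracking gain: mean slow-phase eye velocity relative to stimulus velocity.
gain = np.mean(vel[~is_saccade]) / STIM_SPEED

# ETM frequency: count fast-phase resets (rising edges of the saccade mask)
# over the recording duration.
etm_count = int(np.sum(np.diff(is_saccade.astype(int)) == 1))
etm_freq = etm_count / t[-1]

print(f"gain: {gain:.2f}, ETM frequency: {etm_freq:.2f}/s")
```

Note that the gain computed this way recovers the simulated slow-phase speed directly, whereas ETM frequency depends on both the slow-phase speed and the reset amplitude, which is why saccade counting is only an indirect proxy for tracking ability.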

To address these concerns with current OKR assessment approaches and to enable high-throughput, in-depth quantification of OKR parameters, we have developed a new analysis method to quantify OKR waveforms. Our approach uses an accessible Python-based software platform named "PyOKR." Using this software, modeling and quantification of OKR slow phase responses can be studied in greater depth and with increased parameterization. The software provides accessible and reproducible quantitative assessments of responses to a wide range of visual stimuli, as well as two-dimensional visual tracking in response to horizontal and vertical motion.
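
The modeling step can be illustrated conceptually as follows: segment the trace at saccades, fit each remaining slow-phase run, and average the fitted slopes relative to stimulus speed. This is a hypothetical sketch of that idea on synthetic data, not PyOKR's actual implementation; the function name, thresholds, and sampling parameters are illustrative assumptions.

```python
import numpy as np

FS = 100.0         # sampling rate in Hz (hypothetical)
STIM_SPEED = 10.0  # stimulus speed in deg/s (hypothetical)

def slow_phase_gain(t, pos, stim_speed, saccade_thresh):
    """Illustrative slow-phase gain estimate: remove fast-phase samples
    by velocity threshold, split the remainder into contiguous
    slow-phase segments, fit a line to each, and average the fitted
    slopes relative to the stimulus speed."""
    vel = np.gradient(pos, t)
    keep = np.abs(vel) < saccade_thresh           # drop fast-phase samples
    idx = np.flatnonzero(keep)
    # Break kept samples into contiguous slow-phase runs.
    runs = np.split(idx, np.flatnonzero(np.diff(idx) > 1) + 1)
    slopes = [np.polyfit(t[r], pos[r], 1)[0] for r in runs if len(r) > 5]
    return np.mean(slopes) / stim_speed

# Synthetic sawtooth: 8 deg/s slow drift reset every 4 deg (true gain 0.8).
t = np.arange(0, 10, 1.0 / FS)
pos = (8.0 * t) % 4.0
print(f"fitted gain: {slow_phase_gain(t, pos, STIM_SPEED, 50.0):.2f}")
```

Fitting each segment, rather than averaging raw point-to-point velocities, suppresses sample-level noise and extends naturally to two dimensions by fitting the horizontal and vertical position vectors separately.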

Protocol

All animal experiments performed at The Johns Hopkins University School of Medicine (JHUSOM) were approved by the Institutional Animal Care and Use Committee (IACUC) at the JHUSOM. All experiments performed at the University of California, San Francisco (UCSF) were performed in accordance with protocols approved by the UCSF Institutional Animal Care and Use Program.

1. Behavioral data collection
Record OKR eye movements using the video-oculography method of choice …

Representative Results

To validate the analysis method described above, we quantified OKR tracking gain on wave traces collected from wild-type mice and a conditional knockout mutant with a known tracking deficit. In addition, to test the broader applicability of our analysis method, we analyzed traces derived from a separate cohort of wild-type mice acquired using a different video-oculography collection method. The automatic filtering of saccades facilitates OKR data processing and analysis (Figure 3). Using rec…

Discussion

PyOKR provides several advantages for studying visual responses reflected in eye movements: accuracy, accessibility, and flexible data collection, as well as the ability to incorporate parameterization and variable stimulus speeds.

Direct eye tracking gain assessment provides an accurate characterization of eye movement that is a more direct quantitative metric than traditional manual counting of fast phase saccades (ETMs). Although useful, saccade counting provides an indirec…

Disclosures

The authors have nothing to disclose.

Acknowledgements

This work was supported by R01 EY032095 (ALK), VSTP pre-doctoral fellowship 5T32 EY7143-27 (JK), F31 EY-033225 (SCH), R01 EY035028 (FAD and ALK) and R01 EY-029772 (FAD).

Materials

C57BL/6J mice Jackson Labs 664
Igor Pro WaveMetrics RRID: SCR_000325
MATLAB MathWorks RRID: SCR_001622
Optokinetic reflex recording chamber – JHUSOM Custom-built N/A As described in Al-Khindi et al. (2022)9 and Kodama et al. (2016)13
Optokinetic reflex recording chamber – UCSF Custom-built N/A As described in Harris and Dunn (2023)10
Python Python Software Foundation RRID: SCR_008394
Tbx5 flox/+ mice Gift from B. Bruneau N/A As described in Al-Khindi et al. (2022)9
Tg(Pcdh9-cre)NP276Gsat/Mmucd MMRRC MMRRC Stock # 036084-UCD; RRID: MMRRC_036084-UCD

References

  1. Stahl, J. S. Using eye movements to assess brain function in mice. Vision Res. 44 (28), 3401-3410 (2004).
  2. Kretschmer, F., Tariq, M., Chatila, W., Wu, B., Badea, T. C. Comparison of optomotor and optokinetic reflexes in mice. J Neurophysiol. 118, 300-316 (2017).
  3. Bronstein, A. M., Patel, M., Arshad, Q. A brief review of the clinical anatomy of the vestibular-ocular connections – How much do we know. Eye. 29 (2), 163-170 (2015).
  4. Simpson, J. I. The accessory optic system. Ann Rev Neurosci. 7, 13-41 (1984).
  5. Hamilton, N. R., Scasny, A. J., Kolodkin, A. L. Development of the vertebrate retinal direction-selective circuit. Dev Biol. 477, 273-283 (2021).
  6. Dobson, V., Teller, D. Y. Visual acuity in human infants: a review and comparison of behavioral and electrophysiological studies. Vision Res. 18 (11), 1469-1483 (1978).
  7. Cahill, H., Nathans, J. The optokinetic reflex as a tool for quantitative analyses of nervous system function in mice: Application to genetic and drug-induced variation. PLoS One. 3 (4), e2055 (2008).
  8. Cameron, D. J., et al. The optokinetic response as a quantitative measure of visual acuity in zebrafish. J Vis Exp. (80), e50832 (2013).
  9. Al-Khindi, T., et al. The transcription factor Tbx5 regulates direction-selective retinal ganglion cell development and image stabilization. Curr Biol. 32 (19), 4286-4298 (2022).
  10. Harris, S. C., Dunn, F. A. Asymmetric retinal direction tuning predicts optokinetic eye movements across stimulus conditions. eLife. 12, e81780 (2023).
  11. Sun, L. O., et al. Functional assembly of accessory optic system circuitry critical for compensatory eye movements. Neuron. 86 (4), 971-984 (2015).
  12. Yonehara, K., et al. Congenital Nystagmus gene FRMD7 is necessary for establishing a neuronal circuit asymmetry for direction selectivity. Neuron. 89 (1), 177-193 (2016).
  13. Kodama, T., Du Lac, S. Adaptive acceleration of visually evoked smooth eye movements in mice. J Neurosci. 36 (25), 6836-6849 (2016).
  14. Stahl, J. S., Van Alphen, A. M., De Zeeuw, C. I. A comparison of video and magnetic search coil recordings of mouse eye movements. J Neurosci Methods. 99 (1-2), 101-110 (2000).

Cite This Article
Kiraly, J. K., Harris, S. C., Al-Khindi, T., Dunn, F. A., Kolodkin, A. L. PyOKR: A Semi-Automated Method for Quantifying Optokinetic Reflex Tracking Ability. J. Vis. Exp. (206), e66779, doi:10.3791/66779 (2024).
