
PyOKR: A Semi-Automated Method for Quantifying Optokinetic Reflex Tracking Ability

Published: April 12, 2024 doi: 10.3791/66779

Abstract

The study of behavioral responses to visual stimuli is a key component of understanding visual system function. One notable response is the optokinetic reflex (OKR), a highly conserved innate behavior necessary for image stabilization on the retina. The OKR provides a robust readout of image tracking ability and has been extensively studied to understand visual system circuitry and function in animals from different genetic backgrounds. The OKR consists of two phases: a slow tracking phase as the eye follows a stimulus to the edge of the visual plane and a compensatory fast phase saccade that resets the position of the eye in the orbit. Previous methods of tracking gain quantification, although reliable, are labor intensive and can be subjective or arbitrarily derived. To obtain more rapid and reproducible quantification of eye tracking ability, we have developed a novel semi-automated analysis program, PyOKR, that allows for quantification of two-dimensional eye tracking motion in response to any directional stimulus, in addition to being adaptable to any type of video-oculography equipment. This method provides automated filtering, selection of slow tracking phases, modeling of vertical and horizontal eye vectors, quantification of eye movement gains relative to stimulus speed, and organization of resultant data into a usable spreadsheet for statistical and graphical comparisons. This quantitative and streamlined analysis pipeline, readily installable via PyPI, provides a fast and direct measurement of OKR responses, thereby facilitating the study of visual behavioral responses.

Introduction

Image stabilization relies on precise oculomotor responses to compensate for global optic flow that occurs during self-motion. This stabilization is driven primarily by two motor responses: the optokinetic reflex (OKR) and the vestibulo-ocular reflex (VOR)1,2,3. Slow global motion across the retina induces the OKR, which elicits reflexive eye rotation in the corresponding direction to stabilize the image1,2. This movement, known as the slow phase, is interrupted by compensatory saccades, known as the fast phase, in which the eye rapidly resets in the opposite direction to allow for a new slow phase. Here, we define these fast-phase saccades as eye-tracking movements (ETMs). Whereas the VOR relies on the vestibular system to elicit eye movements that compensate for head movements3, the OKR is initiated in the retina by the firing of ON direction-selective ganglion cells (ON DSGCs) and their subsequent signaling to the Accessory Optic System (AOS) in the midbrain4,5. Due to its direct reliance on retinal circuits, the OKR has been frequently used to determine visual tracking ability in both research and clinical settings6,7.

The OKR has been studied extensively as a tool for assessing basic visual ability2,6,8, ON DSGC development9,10,11,12, oculomotor responses13, and physiological differences among genetic backgrounds7. The OKR is evaluated in head-fixed animals presented with a moving stimulus14. Oculomotor responses are typically captured using a variety of video tools, and eye-tracking motions are captured as OKR waveforms in the horizontal and vertical directions9. To quantify tracking ability, two primary metrics have been described: tracking gain (the velocity of the eye relative to the velocity of the stimulus) and ETM frequency (the number of fast phase saccades over a given time frame). Gain calculation has historically been used to measure the angular velocity of the eye directly and thereby estimate tracking ability; however, these calculations are labor intensive and can be arbitrarily derived based on video-oculography collection methods and subsequent quantification. For more rapid OKR assessment, counting of ETM frequency has been used as an alternate method for measuring tracking acuity7. Although this provides a fairly accurate estimation of tracking ability, this method relies on an indirect metric to quantify the slow phase response and introduces a number of biases. These include an observer bias in saccade determination, a reliance on temporally consistent saccadic responses across a set epoch, and an inability to assess the magnitude of the slow phase response.

In order to address these concerns with current OKR assessment approaches and to enable a high throughput in-depth quantification of OKR parameters, we have developed a new analysis method to quantify OKR waveforms. Our approach uses an accessible Python-based software platform named "PyOKR." Using this software, modeling and quantification of OKR slow phase responses can be studied in greater depth and with increased parameterization. The software provides accessible and reproducible quantitative assessments of responses to a myriad of visual stimuli and also two-dimensional visual tracking in response to horizontal and vertical motion.


Protocol

All animal experiments performed at The Johns Hopkins University School of Medicine (JHUSOM) were approved by the Institutional Animal Care and Use Committee (IACUC) at the JHUSOM. All experiments performed at the University of California, San Francisco (UCSF) were performed in accordance with protocols approved by the UCSF Institutional Animal Care and Use Program.

1. Behavioral data collection

  1. Record OKR eye movements using the video-oculography method of choice to generate wave data (i.e., a time series of the eye's gaze angle in spherical coordinates).
    NOTE: Representative data collected at JHUSOM were obtained using headpost implantation surgery and video-oculography, as previously described9,13 (Figure 1). Representative data collected at UCSF were obtained through headpost implantation surgery and video-oculography, as previously described10 (Figure 7).
    1. Make note of stimulus and recording parameters: recording frame rate, stimulus speed and direction, and lengths of time between and after stimulus epochs. For sinusoidal stimuli, note the amplitude and frequency of the stimulus wave as well.
  2. Export collected wave data as a .CSV file containing horizontal and vertical (azimuth and elevation) wave data.
    1. Organize wave data as a tab-delimited .CSV file with two columns containing horizontal data (epxWave) and vertical data (epyWave).
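
For reference, the expected input is simply two aligned, tab-delimited columns of gaze-angle samples. The sketch below, assuming NumPy and pandas are available, writes such a file; the synthetic trace and file name are illustrative stand-ins, while the epxWave/epyWave headers follow step 1.2.1.

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for a gaze-angle time series (degrees), one sample per
# video frame; in practice these values come from the video-oculography system.
t = np.arange(0, 60, 1 / 60)               # 60 s recorded at 60 Hz (example rate)
azimuth = 5 * np.sin(2 * np.pi * 0.2 * t)  # horizontal eye position (epxWave)
elevation = np.zeros_like(t)               # vertical eye position (epyWave)

# Two tab-delimited columns with the headers described in step 1.2.1.
waves = pd.DataFrame({"epxWave": azimuth, "epyWave": elevation})
waves.to_csv("WT_123_example_wave.csv", sep="\t", index=False)
```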

2. Installation of analysis software

  1. Download and install Python.
    1. For graph supervision, install Spyder via Anaconda.
    2. To ensure graphs function correctly in Spyder, go to Tools > Preferences > IPython Console > Graphics > Graphics Backend and change the backend from Inline to Automatic.
  2. Create a new Anaconda environment with Python.
  3. Install PyOKR via PyPI with pip install PyOKR to obtain the newest version along with package dependencies (Supplementary Coding File 1 and Supplementary Coding File 2).
  4. If a Windows computer is being used, run from PyOKR import OKR_win as o and then o.run().
  5. If a Mac computer is being used, run from PyOKR import OKR_osx as o and then o.run().
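
Taken together, steps 2.3-2.5 reduce to the following minimal sketch; the pip command is shown as a comment, and OKR_osx replaces OKR_win on a Mac.

```python
# In a terminal, inside the new Anaconda environment (step 2.3):
#   pip install PyOKR

# In a .py script run from Spyder (Windows; on a Mac, import OKR_osx instead):
from PyOKR import OKR_win as o

o.run()  # opens the PyOKR user interface (step 3.1.1)
```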

3. Analysis of wave data

  1. Initialization of analysis and file imports
    1. Run o.run() in a .py script to open the user interface.
    2. Under File, use the function Open or the command Ctrl+O (or the equivalent macOS command) to open a browser that will allow the user to select the desired wave file.
    3. Under File, use the button Export Folder or the command Ctrl+E to open a folder browser that will allow the selection of an output folder to which final analyses will be exported.
    4. Input the final analysis file name in the Output file field, using a recommended format such as AnimalGenotype_AnimalNumber_Analysis.
    5. Initialize the dataset for an individual animal using the command Set Subject under File or the command Ctrl+S.
  2. Definition of wave file parameters
    1. To begin setting stimulus parameters, define directionality under Select stimulus direction by selecting one of the four cardinal directions. For sinusoidal stimuli, select an option labeled (Horizontal) or (Vertical) accordingly; the cardinal direction defines the initial direction of the sinusoidal wave.
    2. Set stimulus type under Select stimulus type as either Unidirectional, Oscillatory, or Oblique.
    3. After setting the directionality, import either one's own stimulus position dataset (Import own stimulus vector data) or automatically generate a vector based on parameters (Generate stimulus vector from parameters). If importing a stimulus vector, proceed with step 3.2.3.1 and then skip to step 3.3. If generating a stimulus vector, proceed with the next steps.
      1. If importing one's own vector data, import the distance values of the stimulus (i.e., a time series describing how far the stimulus moves between each adjacent acquisition frame) in the same format described in step 3.2.1. Additionally, analyze the entire dataset as one epoch rather than splitting it into individual epochs, as functionality for subsetting imported stimulus values has not been added as of PyOKR v1.1.2.
    4. Under Stimulus parameters, set the parameters of the stimulus used for data collection.
      1. Set the length of time of no stimulus at the beginning (head) and end (tail) of a given trial with Head and Tail.
      2. Set the amount of time a stimulus is shown, the amount of time of no-stimulus after, and the number of total epochs within a given trial with Length of epoch, Length of post-stimulus, and Number of Epochs, respectively.
      3. For unidirectional and oblique stimuli, set stimulus speed in degrees per second with Horizontal Speed and Vertical Speed.
      4. Set the capture rate of the collection camera with Capture frame rate.
      5. For sinusoidal stimuli, set Frequency and Amplitude to generate the sinusoidal wave used for modeling oscillatory stimuli.
    5. After parameterization, generate the stimulus model from the inputted stimulus information above with Generate stimulus vector from parameters.
    6. Select a given epoch for the inputted stimulus using Select epoch to scan through the total wave file.
  3. Supervised selection of tracking phases
    1. To identify regions of slow tracking, automatically select fast phase saccades with Preliminary adjustment by clicking either Unfiltered Data or Filtered Data, which will label potential saccades based on maximal velocity changes.
    2. Under Unfiltered Data, confirm that saccades are accurately selected with a blue dot. If automatic selection is not accurate, manually remove points with the Left Mouse Button (LMB) or add points with the Right Mouse Button (RMB). When fast phase saccades are adequately selected, save the points with the Middle Mouse Button (MMB) and close the graph.
    3. If automatic filtering is desired, set a Z-Score Threshold and click Filtered Data to automatically filter saccades. If necessary, use the same manual supervision as described in step 3.3.2 to remove any noise.
    4. After proper saccade selection, press Point Adjustment to select the region to remove. Alter top and bottom points through a similar control scheme as described previously in step 3.3.2. Edit top (green) points with the LMB or the RMB and edit bottom (red) points with the Shift+LMB or Shift+RMB. When points are properly placed, use the MMB to save the points.
      NOTE: If using a Mac, bottom and top point adjustment are handled by two separate buttons and follow the same control scheme as described in step 3.3.2.
  4. Analysis of slow-tracking phases
    1. Use Set Polynomial Order to define the order of the polynomial model that will be fitted to individual slow phases.
      NOTE: For unidirectional or oblique stimuli, the default value is 1 since linearity is necessary to calculate tracking gain. For sinusoidal stimuli, a higher order is needed to model the curve of the wave, with a default of 15.
    2. To analyze the trace, select Final Analysis to generate models of the selected slow phases (Figure 2A-D) and to calculate the distances, velocities, and tracking gains averaged across the epoch (Figure 2E). A conceptual sketch of the stimulus generation, saccade flagging, and gain calculation in steps 3.2-3.4 appears after this protocol section.
    3. To view the two-dimensional (2D) or three-dimensional (3D) graph of the selected regions, select View 2D graph or View 3D graph, respectively.
    4. Select Add epoch to save the collected values generated in step 3.4.2. To view all added values for a given animal as well as averages for collected trials, select View current dataset.
    5. After an epoch is added, cycle through the rest of the file with Select epoch, following steps 3.3.1 to 3.4.4.
    6. Once a wave file is fully analyzed, repeat this process for all other files for a given animal by opening new files, setting appropriate parameters, and analyzing them accordingly. By repeating steps 3.2.1-3.4.5 for each file, generate a final dataset containing all wave data for a given animal.
  5. Final export of data
    1. After data analysis is complete for a given animal, with all directions or stimuli analyzed, export the dataset via Export data.
      NOTE: The raw dataset will be exported based on the Output file name and saved along the path set by Output Folder as a CSV containing individual epoch data with the total mean for each stimulus parameter.
    2. After exporting an individual animal, re-initialize the dataset with Ctrl+S and then repeat all previous steps to analyze a new animal.
    3. If needed, re-organize all the output data collected for multiple animals for easier analysis using the command Sort Data under the Analysis tab.
      NOTE: This function will compile and sort all average values for all the analyzed animal files stored within the output folder to allow for easier generation of graphs and statistical comparisons. Sorting is reliant on the naming of the files as of v1.1.2. Use the recommended naming scheme as described in step 3.1.4 for each file (e.g., WT_123_Analysis).
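
To make the arithmetic behind steps 3.2-3.4 concrete, the following is a minimal, self-contained sketch on a synthetic trace: a stimulus vector generated from speed and frame-rate parameters, saccade flagging by velocity z-score, and gain computed from a first-order polynomial fit to one slow phase. It illustrates the underlying concepts only; the variable names, parameter values, and threshold are assumptions for this example, not PyOKR's internal code.

```python
import numpy as np

FRAME_RATE = 60.0   # Hz; capture frame rate (step 3.2.4.4), assumed here
STIM_SPEED = 10.0   # deg/s; unidirectional stimulus speed (step 3.2.4.3)
Z_THRESH = 3.0      # illustrative z-score threshold for saccade flagging

# Synthetic OKR-like trace: slow tracking at ~8 deg/s interrupted by
# fast-phase resets every 5 degrees (a sawtooth waveform).
t = np.arange(0, 10, 1 / FRAME_RATE)
eye = np.mod(np.cumsum(np.full_like(t, 8.0 / FRAME_RATE)), 5.0)

# Stimulus position vector generated from parameters (step 3.2.5 analogue).
stim = STIM_SPEED * t

# Frame-to-frame eye velocity; saccades appear as extreme velocity samples.
vel = np.diff(eye) * FRAME_RATE
z_scores = (vel - vel.mean()) / vel.std()
is_saccade = np.abs(z_scores) > Z_THRESH    # step 3.3.3 analogue

# Fit a first-order polynomial (the step 3.4.1 default) to one slow phase
# and compute gain as fitted eye velocity over stimulus velocity (step 3.4.2).
saccade_frames = np.flatnonzero(is_saccade)
start, stop = saccade_frames[0] + 2, saccade_frames[1]
eye_speed = np.polyfit(t[start:stop], eye[start:stop], 1)[0]  # deg/s
stim_speed = np.diff(stim).mean() * FRAME_RATE                # recovers 10 deg/s
print(f"saccades flagged: {len(saccade_frames)}, gain ~ {eye_speed / stim_speed:.2f}")
```

In PyOKR itself, these operations run behind the interface, with the flagged points exposed for the manual supervision described in step 3.3.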


Representative Results

To validate the analysis method described above, we quantified OKR tracking gain on wave traces collected from wild-type mice and a conditional knockout mutant with a known tracking deficit. In addition, to test the broader applicability of our analysis method, we analyzed traces derived from a separate cohort of wild-type mice acquired using a different video-oculography collection method. The automatic filtering of saccades facilitates OKR data processing and analysis (Figure 3). Using recordings from unidirectional and sinusoidal stimuli (Figure 1D), we calculated OKR tracking gains of wild-type animals (n = 13) in response to unidirectional stimuli in the four cardinal directions (Figure 2F), as well as in response to horizontal and vertical sinusoidal stimuli (Figure 4). The disparity in tracking ability relative to stimulus direction for both unidirectional and sinusoidal stimuli is consistently observed in all wild-type mice, with equally robust horizontal responses that demonstrate significantly higher tracking gains than vertical responses, as has been described2. Further, asymmetric tracking gains between upward and downward responses are also observed in wild-type mice using both video-oculography collection methods, as has been previously reported2,10. The relative magnitudes and the consistency of tracking gains compared to published characterizations of OKR responses indicate that tracking gains calculated using the software accurately reflect tracking ability. In addition to single-directional gain calculations, horizontal and vertical eye movements can be modeled simultaneously (Figure 5), allowing for a three-dimensional reconstruction of eye movement in response to a given stimulus. This provides an additional quantification capability that is useful for future studies investigating cross-coupled horizontal and vertical responses9.

To validate the utility of the software for identifying significant behavioral changes across different experimental conditions, we re-analyzed our published data9 to confirm that deficits in vertical tracking, assessed in that study by manual counting of fast phase saccades, are reflected in tracking gains calculated using the methodology presented here. Previous work shows that genetic inactivation in the retina of the transcription factor Tbx5 (T-box transcription factor 5) through conditional knockout using Protocadherin 9-Cre (Pcdh9-Cre) causes specific loss of upward-tuned ON direction-selective ganglion cells (up-oDSGCs), and that Tbx5 flox/flox (Tbx5f/f); Pcdh9-Cre mutants exhibit specific loss of vertical OKR tracking9. Quantitative analysis using the method described here shows that Tbx5f/f; Pcdh9-Cre animals retain normal horizontal tracking gains (Figure 6A), similar to those described previously and obtained by manual counting of fast phase saccades (ETMs) (Figure 2F); however, these mice show significant loss of vertical tracking, with near-zero gains in response to both upward and downward stimuli (Figure 6B,C). Additionally, analysis of sinusoidal responses confirms that Tbx5 cKO animals exhibit greater horizontal tracking gains while showing significantly decreased vertical tracking (Figure 6D-F). This reanalysis of a previously described phenotype using PyOKR demonstrates the precision and sensitivity of the new methodology, which allows for quantitative comparisons of OKR responses in mice of different genetic strains.

Finally, we analyzed wild-type vertical OKR traces collected at UCSF to validate the software application's utility with different video-oculography methods and stimulus parameters. Data from UCSF were collected using a hemispherical projection system in which moving gratings are presented to the mouse through the reflection of a 405 nm wavelength projector onto a hemisphere surrounding the head-fixed animal10 (Figure 7A). Unidirectional vertical gratings were presented to mice at a speed of 10 degrees per second, and the OKR responses were recorded over 60-s intervals (Figure 7B,C). Vertical traces were quantitatively analyzed via PyOKR, and upward responses were compared to downward responses (Figure 7D). Upward responses were significantly stronger than downward, as expected10; however, tracking gains were slightly reduced compared to traces recorded at JHUSOM (Figure 2F). In addition, sinusoidal responses were quantified via PyOKR (Figure 7E), and a significant asymmetry in the vertical responses to sinusoidally moving stimuli is reflected in calculated gains (Figure 7F). Differences between gain values collected at JHUSOM and UCSF can be attributed to differences between stimulus parameters, including different stimulus speeds, types, and wavelengths; however, the overall consistency we observe in our analysis of data obtained using each collection method shows that PyOKR can be easily adapted beyond our JHUSOM OKR data collection system and applied to other OKR recordings, independent of video-oculography methods. These results demonstrate that the software platform described here is accurate and can be generally applied to the study of oculomotor responses, allowing for precise quantitative comparisons among animals belonging to different groups to further the study of visual image stabilization circuitry.

Figure 1: Collection of OKR response data. (A) OKR virtual arena apparatus for behavioral stimulation, as previously described9,13. Four monitors surround a head-fixed animal (1), displaying a continuously moving checkerboard stimulus (2). The virtual drum can present unidirectional movement in all four cardinal directions as well as oscillatory sinusoidal stimuli. The mouse's left eye is illuminated by an infrared (IR) light and recorded with a camera (3) to capture visual system responses reflected in eye tracking. (B) Analysis of eye tracking occurs by capturing the pupil and a corneal reflection generated by IR light. Data collection and calculation of eye movements in response to the virtual drum were performed as described previously9,13. (C) Schematic of eye vectors moving vertically (Y wave) and horizontally (X wave). (D) Sample traces of an eye's tracking response to unidirectional upward and backward motion, as well as vertical and horizontal sinusoidal motion.

Figure 2: Tracking analysis of unidirectional visual responses. (A-D) Identification and selection of slow tracking phases for gain analysis. Sample unidirectional traces are shown with visual responses to forward (A), backward (B), upward (C), and downward (D) motion in relation to the mouse's eye. Slow phases are identified by the addition of the red and green points described in step 3.3.4 to remove saccades, and the selected slow phases are highlighted in yellow. Polynomial regressions are overlaid on the traces as lines. (E) Quantification of the sample traces (A-D) as organized in the PyOKR readout. For each trace, total XY speeds and respective gains are calculated, regardless of directionality. In unidirectional responses, these total speeds will usually reflect the individual velocity in a certain direction; however, for sinusoidal responses, this value will reflect the average overall speed of the eye. Horizontal and vertical velocity components are broken down to show velocity in each respective direction. Gain is then calculated based on the presented stimulus velocities. (F) Calculated tracking gains of wild-type animals (n = 13) in the four cardinal directions compared to their associated ETM quantification. Data are presented as mean ± SD. Data analyzed with a one-way ANOVA with multiple comparisons. *p<0.05, **p<0.01, ***p<0.005, ****p<0.0001.

Figure 3: Automatic filtering of saccades facilitates OKR data processing and analysis. (A-D) Automatic filtering of traces from Figure 2A-D removes saccades and models only slow phase motion by removing rapid velocity changes and stitching slow phases together. The final slope represents the total eye movement over the given epoch. (E) Quantification of gains from filtered sample data, as organized in the PyOKR readout. (F) Comparison of gain values between unfiltered vs. filtered sample eye traces reflects no significant differences. Data are presented as mean ± SD. Data analyzed with a Mann-Whitney U test between unfiltered and filtered results.

Figure 4: Derivation of tracking gains in response to oscillatory visual stimuli. (A,B) Vertical (A) and horizontal (B) eye movement responses to sinusoidally moving stimuli can be modeled relative to defined oscillatory stimulus parameters. Selected regions are labeled in yellow with the polynomial approximation overlaid on top of the trace. A model of the stimulus is presented as an orange sinusoid wave behind the trace for reference to the stimulus position at each point. (C) Gain calculations of wild-type sinusoidal responses (n = 7) reflect asymmetrical responses between horizontal and vertical tracking ability. Data are presented as mean ± SD. Data analyzed with a one-way ANOVA with multiple comparisons. **p<0.01, ***p<0.005.

Figure 5: Directional tracking can be modeled in its horizontal and vertical components. (A) Vertical component of an eye tracking wave in response to an upward stimulus. (B) Horizontal component of an eye tracking wave in response to an upward stimulus. (C) Overall eye trajectory in both vertical and horizontal directions. (D) Three-dimensional model of the eye's movement vector over time in response to downward motion. Raw trace data is displayed in red and the regression model of trajectory is displayed in blue.

Figure 6: Analysis of the OKR in Tbx5f/f; Pcdh9-Cre mice shows significant deficits in unidirectional vertical tracking gains. (A) Tbx5f/f; Pcdh9-Cre animals show no significant change in horizontal tracking gain. (B,C) Tbx5f/f; Pcdh9-Cre animals show a significant reduction of gain in their vertical responses: upward (B) and downward (C). (D,E) Sinusoidal responses of Tbx5f/f; Pcdh9-Cre animals in response to horizontal (D) and vertical (E) oscillatory stimuli. (F) Quantification of Tbx5f/f; Pcdh9-Cre oscillatory responses shows significant increases in horizontal tracking gains but decreases in vertical responses. Data are presented as mean ± SD. Data analyzed with Mann-Whitney U tests. *p<0.05, **p<0.01, ****p<0.0001.

Figure 7: Application of PyOKR to data acquired from alternative video-oculography methods. (A) Apparatus for OKR virtual drum stimulation, as described10. A 405 nm wavelength DLP projector is reflected via a convex mirror onto a hemisphere to create a virtual drum that surrounds the animal's field of view. Eye movements are measured using an NIR camera positioned outside of the hemisphere. Unidirectional and sinusoidal bar gratings are shown to a head-fixed animal in vertical directions. (B,C) Upward (B) and downward (C) tracking phases are identified and selected for quantitative analysis. Slow phases are highlighted in yellow. (D) Tracking gains calculated from vertical tracking of wild-type animals (n = 5) using methods described here. Asymmetric tracking ability is observed, with a significant decrease in downward tracking. (E) Oscillatory response to sinusoidal stimuli modeled to quantify tracking gains in wild-type animals (n = 8). Slow phases are highlighted in yellow. (F) Quantification of sinusoidal gains reveals decreased downward tracking gains compared to upward gains. Data are presented as mean ± SD. Data analyzed with Mann-Whitney U tests. *p<0.05.

Supplementary Coding File 1: PyOKR Windows

Supplementary Coding File 2: PyOKR Mac


Discussion

PyOKR provides several advantages for studying visual responses reflected in eye movements. These include accuracy, accessibility, and flexibility across data collection methods, in addition to the ability to incorporate parameterization and variable stimulus speeds.

Direct eye tracking gain assessment provides an accurate characterization of eye movement that is a more direct quantitative metric than traditional manual counting of fast phase saccades (ETMs). Although useful, saccade counting provides an indirect assessment of the actual tracking motion. The PyOKR software platform provides direct quantification of tracking speeds, taking into account additional parameters including saccade amplitudes and the length of slow phases. In addition, fast-phase saccade counting relies on the assessment of saccade frequency over a set period of time, requiring that each trace has little to no noise, including random saccades from animal stress. These factors result in many collected traces being unusable and, therefore, add significant time to data collection in order to accurately assess eye tracking. However, the software described here addresses this issue by directly quantifying individual slow phases within a given trace and calculating gains at instantaneous points along the trace. It accomplishes this by comparing the velocity of the eye motion to the speed of the stimulus at a given point. Since saccadic frequency is no longer the defining parameter for analysis, noise can be easily removed by user supervision, reducing the impact of noise on the quality of the analysis. This allows for more usable collected data and higher statistical power when analyzing these data, increasing accuracy and reproducibility when quantifying eye tracking and characterizing OKR parameters for individual mice and different groups of mice. Automated filtering of saccades based on rapid changes in eye velocity found via a Z-score method has also been used to characterize tracking gain10. To incorporate this filtering method within the software, we included this type of velocity filtering but with additional tools, such as threshold parameterization and manual saccadic supervision, to reduce potential errors generated by this approach. If raw traces are used for analysis instead of automated filtering, rapid changes in eye movement caused by saccades, blinks, or mouse distress are identified through maximal velocity changes via a Gaussian kernel density estimation of trace velocity. This allows for the automatic selection of segmenting regions and, eventually, their removal based on manual user supervision. Manual supervision provides a higher tolerance of noisy traces, allowing users to quickly correct false positive or false negative markings for rapid eye changes. Additionally, since PyOKR uses polynomial approximations to model general eye motion in a given direction, noise at individual points is smoothed out, allowing trajectory speeds to be assessed with some tolerance for noise in the trace. Only in the event of extremely poor data quality across the entire trace or improper calibration during data collection will the software be unable to generate an accurate analysis. Taken together, by incorporating user supervision, filtering, and instantaneous gain calculation tools, the PyOKR software application described here generates tracking gains with high accuracy and more direct metrics than previous methods.
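
As a rough picture of the kernel density approach described above, the sketch below flags velocity samples that fall in low-density regions of the estimated velocity distribution. It assumes SciPy; the function name and density cutoff are illustrative choices for this example, not PyOKR's API.

```python
import numpy as np
from scipy.stats import gaussian_kde

def flag_rapid_changes(eye_position, frame_rate, density_cutoff=0.01):
    """Flag frames whose velocity is improbable under a KDE of trace velocity.

    Slow-phase velocities cluster in a high-density mode; saccades, blinks,
    and distress movements land in the sparse tails and are flagged.
    """
    velocity = np.diff(eye_position) * frame_rate
    kde = gaussian_kde(velocity)
    density = kde(velocity)  # estimated density at each sample's velocity
    return density < density_cutoff * density.max()

# Example on a synthetic trace with one abrupt jump (expected flag: frame 119).
trace = np.concatenate([np.linspace(0, 4, 120), np.linspace(-1, 3, 120)])
print(np.flatnonzero(flag_rapid_changes(trace, frame_rate=60.0)))
```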

PyOKR also has many features that allow for OKR analyses to be customized to meet user needs. The input of stimulus parameters, as well as raw stimulus vectors, allows the user to precisely define any desired stimulus. These may include unidirectional, sinusoidal, or oblique stimuli, all of which the software can use to generate accurate tracking gains for the associated visual responses. With the stimulus parameterization of this methodology, instantaneous gains calculated at each frame can be generated reliably regardless of the inputted stimulus. Although oblique stimuli data were not analyzed here, accurate gains can be calculated through decomposed XY velocity vectors to reconstruct an oblique angle, allowing for accurate gain calculations of oblique stimuli. Through accessible parameterization of stimulus settings, our tool is widely applicable to wave tracing in response to any visual stimulus. Additionally, the software facilitates higher-throughput and more reproducible quantification compared to previous methods. Sorting through OKR traces is time-consuming and labor-intensive; however, PyOKR's streamlined user interface can screen through many traces at a higher rate than previously possible. This not only allows for faster trace quantification but also reveals additional OKR parameters, including direct quantification of eye velocities, directional vector components of eye motion, and instantaneous gains relative to the stimulus in both the horizontal and vertical components of the movement vector. Further, given the automated identification of saccades and calculation of tracking speeds afforded by our method, potential experimenter bias in data analysis is greatly reduced compared to other quantification methods, such as manual ETM counting or automated trace filtering, which can generate false negatives. Further, large amounts of behavioral data can be generated and automatically compiled for simplified downstream analysis. Through the data export and sorting functions available in PyOKR, tracking data across multiple animals and conditions can be automatically processed to allow for organized data storage as well as rapid statistical analysis. For experiments that employ multiple conditions during the same recording session, such as circuit manipulations or varied visual stimuli, we recommend collecting data in discrete wave files or subsetted epochs so that separate datasets can be stored for each independent variable of interest. For example, if one is testing differences in responses to different sinusoidal wave frequencies within an experimental paradigm, each parameter set should be saved into its own wave file for separate analysis, such as WT_1_Freq0.1Analysis, WT_1_Freq0.2Analysis, and WT_1_Freq0.3Analysis. In the current version, once epochs are analyzed, there is no functionality for selecting individual epoch values within the dataset, though this could be added in the future if necessary.
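
To illustrate the kind of compilation the Sort Data function performs downstream, here is a minimal sketch, assuming pandas, the naming scheme from step 3.1.4, and a hypothetical gain column in each exported CSV:

```python
from pathlib import Path
import pandas as pd

output_folder = Path("okr_exports")  # folder chosen via Export Folder (step 3.1.3)

frames = []
for csv_path in sorted(output_folder.glob("*_Analysis.csv")):
    # File names follow the step 3.1.4 scheme, e.g., "WT_123_Analysis.csv".
    genotype, animal, _ = csv_path.stem.split("_", 2)
    table = pd.read_csv(csv_path)
    table["genotype"], table["animal"] = genotype, animal
    frames.append(table)

combined = pd.concat(frames, ignore_index=True)
# "gain" is a hypothetical column name standing in for the exported means.
print(combined.groupby("genotype")["gain"].mean())
```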

Finally, our method is adaptable to multiple video-oculography collection methods and provides a robust analysis platform that can be easily tailored to a laboratory's specific needs. Through analysis of wild-type and existing mutant OKR data, and also wild-type OKR data collected using different methods and stimuli, we show here that our analysis tool is capable of a) quantifying OKR tracking from any directional stimulus with high accuracy and throughput; b) identifying visual system behavioral differences resulting from genetic perturbation; and c) assessing visual tracking data from different video-oculography methods. The accessibility and general adaptability of our analysis platform facilitate further study of OKR responses and will enhance behavioral studies that characterize neural circuit assembly and dynamics in the context of oculomotor responses.

To obtain accurate measurements and subsequent useful data analyses, several steps are necessary for data collection. We recommend collecting OKR data over multiple sessions to allow the animal to acclimate to the behavioral testing apparatus and reduce the impact of animal stress on behavioral responses; however, excessive recording may lead to potentiation of the OKR response13, so care in designing testing regimes is recommended. During OKR data collection, proper calibration of the video-oculography equipment is critical for accurate quantification since the quality of analyzed data is a direct function of the processed trace. Importantly, the use of the Spyder IDE is necessary for graph supervision through Matplotlib. Given the accessibility and framework design of our platform, all the necessary tools are available for others to expand the software's capabilities and tailor this platform for distinct behavioral experimental paradigms.
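
Because graph supervision depends on an interactive Matplotlib backend (step 2.1.2), it can be worth verifying the environment before launching the interface; this quick check is a suggestion rather than part of the protocol itself.

```python
import matplotlib

# Point editing during supervision (step 3.3) requires an interactive backend
# such as Qt; a static "inline" backend will not accept mouse input.
print(matplotlib.get_backend())
```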

In conclusion, we describe here a new, accessible, and versatile tool for the analysis of OKR behavioral responses in more depth and with more quantitative power than is currently available using existing methodologies. PyOKR can be easily used by novice Python users and contains an established analysis pipeline and interface for the rapid and accurate analysis of OKR waves with increased rigor and reproducibility. The adaptability of this software provides a flexible framework that users can easily tailor to fit their specific needs and data collection procedures. We anticipate that this quantitative method will advance the study of oculomotor responses and further our understanding of the development and function of the neural circuitry that drives visual system behaviors.


Disclosures

The authors have no conflicts of interest.

Acknowledgments

This work was supported by R01 EY032095 (ALK), VSTP pre-doctoral fellowship 5T32 EY7143-27 (JK), F31 EY-033225 (SCH), R01 EY035028 (FAD and ALK) and R01 EY-029772 (FAD).

Materials

Name | Company | Catalog Number | Comments
C57BL/6J mice | Jackson Labs | 664 |
Igor Pro | WaveMetrics | RRID: SCR_000325 |
MATLAB | MathWorks | RRID: SCR_001622 |
Optokinetic reflex recording chamber - JHUSOM | Custom-built | N/A | As described in Al-Khindi et al. (2022)9 and Kodama et al. (2016)13
Optokinetic reflex recording chamber - UCSF | Custom-built | N/A | As described in Harris and Dunn (2023)10
Python | Python Software Foundation | RRID: SCR_008394 |
Tbx5 flox/+ mice | Gift from B. Bruneau | N/A | As described in Al-Khindi et al. (2022)9
Tg(Pcdh9-cre)NP276Gsat/Mmucd | MMRRC | MMRRC Stock # 036084-UCD; RRID: MMRRC_036084-UCD |


References

  1. Stahl, J. S. Using eye movements to assess brain function in mice. Vision Res. 44 (28), 3401-3410 (2004).
  2. Kretschmer, F., Tariq, M., Chatila, W., Wu, B., Badea, T. C. Comparison of optomotor and optokinetic reflexes in mice. J Neurophysiol. 118, 300-316 (2017).
  3. Bronstein, A. M., Patel, M., Arshad, Q. A brief review of the clinical anatomy of the vestibular-ocular connections - How much do we know. Eye. 29 (2), 163-170 (2015).
  4. Simpson, J. I. The accessory optic system. Ann Rev Neurosci. 7, 13-41 (1984).
  5. Hamilton, N. R., Scasny, A. J., Kolodkin, A. L. Development of the vertebrate retinal direction-selective circuit. Dev Biol. 477, 273-283 (2021).
  6. Dobson, V., Teller, D. Y. Visual acuity in human infants: a review and comparison of behavioral and electrophysiological studies. Vision Res. 18 (11), 1469-1483 (1978).
  7. Cahill, H., Nathans, J. The optokinetic reflex as a tool for quantitative analyses of nervous system function in mice: Application to genetic and drug-induced variation. PLoS One. 3 (4), e2055 (2008).
  8. Cameron, D. J., et al. The optokinetic response as a quantitative measure of visual acuity in zebrafish. J Vis Exp. 80, 50832 (2013).
  9. Al-Khindi, T., et al. The transcription factor Tbx5 regulates direction-selective retinal ganglion cell development and image stabilization. Curr Biol. 32 (19), 4286-4298.e5 (2022).
  10. Harris, S. C., Dunn, F. A. Asymmetric retinal direction tuning predicts optokinetic eye movements across stimulus conditions. eLife. 12, e81780 (2023).
  11. Sun, L. O., et al. Functional assembly of accessory optic system circuitry critical for compensatory eye movements. Neuron. 86 (4), 971-984 (2015).
  12. Yonehara, K., et al. Congenital Nystagmus gene FRMD7 is necessary for establishing a neuronal circuit asymmetry for direction selectivity. Neuron. 89 (1), 177-193 (2016).
  13. Kodama, T., Du Lac, S. Adaptive acceleration of visually evoked smooth eye movements in mice. J Neurosci. 36 (25), 6836-6849 (2016).
  14. Stahl, J. S., Van Alphen, A. M., De Zeeuw, C. I. A comparison of video and magnetic search coil recordings of mouse eye movements. J Neurosci Methods. 99 (1-2), 101-110 (2000).

Tags

optokinetic reflex ocular behavior visual perception direction-selectivity video-oculography accessory optic system

Cite this Article


Kiraly, J. K., Harris, S. C., Al-Khindi, T., Dunn, F. A., Kolodkin, A. L. PyOKR: A Semi-Automated Method for Quantifying Optokinetic Reflex Tracking Ability. J. Vis. Exp. (206), e66779, doi:10.3791/66779 (2024).
