Large-scale analysis of circuit dynamics underlying behavior in zebrafish larva


Prediction of the larva’s trajectory from the kinematics of tail movements

Zebrafish larvae navigate by producing discrete stereotypical tail movements called swim bouts. Larvae do not track moving gratings faster than 10 Hz (Rinner et al., 2005), which indicates that the 60 Hz refresh rate of a video projector is sufficient to accommodate the temporal acuity of zebrafish vision. The typical frequency of tail oscillations during a bout in restrained larvae is 20 to 30 Hz (Severi et al., 2014). To provide real-time feedback, the tail kinematics must therefore be filmed at high acquisition rates (above 200 Hz), and the processing of the acquired images must be completed in just a few milliseconds. The Reynolds number of swimming larvae is between 50 and 900, which places them in a transitional flow regime (McHenry and Lauder, 2005); thus neither inertial nor viscous forces can be neglected. Approximations of the flow regime have made it possible to compute, in real time, the thrust generated by the tail movements of an adult fish (Bergmann and Iollo, 2011). However, computing the thrust in the transitional flow regime remains, so far, unachievable.
To predict trajectories from tail kinematics, I used a data-driven approach to learn the relationship between tail movements and fish kinematics in the horizontal plane. I recorded the displacement and tail kinematics of freely swimming larvae in shallow water to generate a large library of movements. Paramecia were also introduced to induce the larvae to generate prey-capture behaviors (5% of the library). Our library of movements consisted of ∼300 tail bouts from 6-8 days post-fertilization wild-type larvae. The shape of the tail was quantified by computing the tail deflection using a method developed by Raphael Candelier. Figure 2.2 shows the time series of the tail deflection associated with stereotypical movements. This quantification of tail kinematics was fast (∼1 ms/frame for a 100 px square image in C++), and it resulted in a low-noise, smooth and oscillating time series. To describe the change in orientation and position of the larva in the swimming plane, I used three parameters: axial, lateral and yaw speed (Figure 2.3.A). Figure 2.3.C shows the kinematic parameters associated with four different types of movements in freely swimming larvae. The kinematic parameters were chosen to be smooth oscillating time series during swim bouts. To identify the relation between the oscillating tail deflection and the changes in orientation and position, I used an auto-regressive model with external input (ARX model, Ljung (1998)). This technique predicts the value of a kinematic parameter (axial, lateral or yaw speed) from a linear combination of the past values of that parameter and the past and current values of the tail deflection (Figure 2.3.B). Thus, a simple regression suffices to fit the relationship between the tail deflection and the resulting trajectories.
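The ARX fit described above reduces to ordinary least squares once the past outputs and inputs are stacked into a regression matrix. The sketch below illustrates the idea; the model orders `na` and `nb` are hypothetical, as the thesis does not state the orders actually used.

```python
import numpy as np

def fit_arx(y, u, na=2, nb=3):
    """Least-squares fit of an ARX model:
    y[t] = sum_i a[i]*y[t-1-i] + sum_j b[j]*u[t-j],
    where y is a kinematic parameter and u the tail deflection."""
    T = len(y)
    lag = max(na, nb - 1)
    rows, targets = [], []
    for t in range(lag, T):
        past_y = y[t - na:t][::-1]          # y[t-1] ... y[t-na]
        inputs = u[t - nb + 1:t + 1][::-1]  # u[t] ... u[t-nb+1]
        rows.append(np.concatenate([past_y, inputs]))
        targets.append(y[t])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta[:na], theta[na:]

def simulate_arx(a, b, u, y0):
    """Predict a kinematic time series from the tail deflection alone,
    feeding the model's own past predictions back as the AR terms."""
    na, nb = len(a), len(b)
    y = list(y0)
    for t in range(len(y0), len(u)):
        past_y = [y[t - 1 - i] for i in range(na)]
        inputs = [u[t - j] for j in range(nb)]
        y.append(np.dot(a, past_y) + np.dot(b, inputs))
    return np.array(y)
```

Simulating with the model's own predictions fed back (rather than the observed past values) is what allows a trajectory to be generated from the tail deflection alone, at the cost of the error accumulation noted below.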
To assess our model, I predicted the trajectories in the test dataset of freely swimming larvae using only the tail deflection. The resulting trajectories were then compared to the actual trajectories of the larvae. Figure 2.3.C shows that the trajectories resulting from different categories of tail movements can be fitted using the same model. Errors accumulate, such that the trajectory predicted from the tail deflection diverges from the observed trajectory, but the overall kinematics were similar.
The quality of the predictions of the final orientation and position after a tail bout is shown in Figure 2.3.D. I computed the error between predicted and observed paths using bootstrapping between a training and a test dataset. The error in the prediction of the direction of movement had a standard deviation of 19.41° (Figure 2.3.D.ii); a similar standard deviation of 23.41° was observed in the prediction of the change in the larva's head direction (Figure 2.3.D.i). The difference between the predicted and observed amplitude of the movement had a standard deviation of 0.3 mm, which represents 1/10 of the body length of the larva (Figure 2.3.D.iii).
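The angular error statistics above require wrapping the predicted-minus-observed difference into (−180°, 180°] before taking the standard deviation. A minimal sketch of that computation, with a bootstrap interval on the spread, is shown below; the exact bootstrap procedure used in the thesis is not specified, so this is only an illustrative reconstruction.

```python
import numpy as np

def angular_error_std(pred_deg, obs_deg, n_boot=1000, seed=0):
    """Standard deviation of the angular prediction error, with the
    difference wrapped into (-180, 180], plus a bootstrap 95% interval
    on that standard deviation."""
    err = (np.asarray(pred_deg, float) - np.asarray(obs_deg, float)
           + 180.0) % 360.0 - 180.0
    rng = np.random.default_rng(seed)
    boot = [np.std(rng.choice(err, size=err.size, replace=True))
            for _ in range(n_boot)]
    return float(np.std(err)), np.percentile(boot, [2.5, 97.5])
```

Wrapping matters because a naive subtraction would report a 358° error for headings of 359° and 1°, inflating the standard deviation.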

Optomotor response in a two-dimensional visual virtual reality system

The optomotor response, a visual component of rheotaxis, is a highly reproducible behavior in zebrafish larvae. Presenting a moving grating below the larva elicits movement in the same direction. I tested whether larvae were capable of orienting towards and following a moving grating in the VR.
The speed of the grating (1 cm/s) and its spatial period (1 cm) were chosen according to previous studies (Portugues and Engert (2011); Ahrens et al. (2013a)). At the beginning of each trial, I randomly chose the angle between the initial direction of the grating motion and the head direction of the larva (between −180° and 180°).
During the stimulation, the speed and orientation of the grating were updated according to the tail movements of the larva. The stability of the trajectories in the VR was improved by applying a gain of 3 to the axial speed. Each experiment consisted of 120 trials; each trial comprised a 6 s period of visual stimulation followed by a 20 s resting period.
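The closed-loop update can be sketched as follows: since the larva is head-fixed, its predicted self-motion is applied to the virtual world in the opposite direction, with the gain amplifying the axial component. The coordinate conventions, the time step, and the representation of the grating as a 2-D position and orientation are assumptions for illustration.

```python
import numpy as np

def update_grating(pos, angle, axial, lateral, yaw, dt, axial_gain=3.0):
    """One closed-loop step for a head-fixed larva. Speeds (axial,
    lateral) are in the larva's reference frame; yaw is in rad/s.
    The gain on the axial speed stabilizes the virtual trajectories."""
    angle = angle - yaw * dt          # world rotates opposite to the larva's turn
    dx = axial_gain * axial * dt      # displacement along the heading
    dy = lateral * dt                 # sideways displacement
    c, s = np.cos(angle), np.sin(angle)
    world_step = np.array([c * dx - s * dy, s * dx + c * dy])
    return pos - world_step, angle    # world translates opposite to the larva
```

Called once per processed camera frame, this keeps the visual scene consistent with the trajectory predicted by the ARX model.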
Using this paradigm, we found that when the whole-field motion was aligned with the larvae, they displayed a shorter response time before the first bout (Figure 2.4.F). During the stimulation, larvae maintained an average speed of 0.15 cm/s in the direction of the grating. They produced on average 3 bouts per trial (3.26, N=549, from 9 larvae), and the average bout duration was ∼300 ms (0.313 s, N=1783, from 9 larvae), which is consistent with previous reports (Severi et al., 2014).
As expected, the distribution of the angles between the larva and the grating's direction narrowed with time (Figure 2.4.C,D). Successive bouts brought the head angle of the larva to an average deviation of 20° from the grating (Figure 2.4.G). Considering a larva to be aligned with the motion if the difference between its head angle and the angle of the grating motion is lower than 30° (the deviation observed in free-swimming OMR, Ahrens et al. (2013a)), the proportion of aligned larvae increased twofold during the 6 s trial (from 28.2% to 51.6%, N=546, from 9 larvae, Figure 2.4.E,H).
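The alignment criterion above amounts to counting wrapped angular deviations below a threshold. A minimal sketch, with the 30° threshold from the text:

```python
import numpy as np

def fraction_aligned(head_deg, grid_deg, threshold_deg=30.0):
    """Fraction of observations whose head direction deviates from the
    grating direction by less than the threshold, after wrapping the
    difference into (-180, 180]."""
    dev = (np.asarray(head_deg, float) - np.asarray(grid_deg, float)
           + 180.0) % 360.0 - 180.0
    return float(np.mean(np.abs(dev) < threshold_deg))
```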


Prey-capture behavior in a two-dimensional visual virtual reality

Zebrafish larvae begin to hunt paramecia 5 days post fertilization, just two days after hatching. This visually driven behavior is crucial for their survival. After detecting a prey, the larva orients itself towards it using forward scoots and J-turns. When the paramecium is closer than 400 μm, the larva executes a capture maneuver and swallows its prey.
Under head-restrained conditions, the larvae could perform orienting and pursuit maneuvers toward the pseudo-paramecia in the visual virtual environment (Figure 2.5.A,B). Each trial mimicked a situation in which a 100 μm paramecium appears 1.5 mm away from the larva. In this configuration, the apparent angle of the paramecium (a diameter of 4°) was chosen to optimally elicit prey-capture behavior (Bianco and Engert (2015); Semmelhack et al. (2015)).
At the beginning of each trial, we projected on the circular screen a 4° circular black spot moving on a white background at an angular speed of ±90°/s along the azimuthal plane. An angular velocity of 90°/s is not consistent with the speed of a moving paramecium (∼100 μm/s) at a distance of 1.5 mm from the larva, but it has been shown to be optimal for eliciting prey capture (Semmelhack et al., 2015). It is possible that this optimal speed results from the relative velocity between the larva and the paramecium when the larva is actively foraging. Right after the onset of the larva's first tail bout, the angular speed of the prey in the virtual environment was set to 0°/s, and the changes in size and position of the black circle projected on the screen were computed according to the predicted trajectory of the larva. Figure 2.5.B illustrates the experimental design. If the larva oriented itself toward the virtual paramecium, the projected circle moved to the center of the larva's field of view and its radius increased as the larva swam in its direction. We considered that a larva had captured the virtual prey if its trajectory in the virtual environment came closer than 400 μm to the virtual prey (at which point the apparent angle of the virtual prey would have a diameter of 15°).
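The apparent sizes quoted above follow from simple viewing geometry: a 100 μm prey at 1.5 mm subtends about 4°, and at the 400 μm capture distance the same geometry gives roughly 14°, close to the ∼15° stated in the text. A minimal sketch of that computation:

```python
import math

def apparent_angle_deg(diameter_um, distance_um):
    """Apparent angular diameter (degrees) of a prey of the given
    physical diameter seen from the given distance."""
    return 2.0 * math.degrees(math.atan(0.5 * diameter_um / distance_um))
```

This is the relation the VR uses in reverse: as the predicted distance to the virtual prey shrinks, the radius of the projected circle grows accordingly.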

Integration of visual information during tail bouts

In the absence of vestibular input, external landmarks can provide feedback on the result of a motor action, by comparing the visual scene before and after a movement. An alternative strategy is to update the action continuously rather than discretely, by integrating the angular speed of the visual environment during the movement. Computing the cumulative rotation would, however, require the visual system to integrate over large angular displacements, as the oscillations of the head can reach angular velocities of up to 4000°/s during a turn (Figure 2.3.C.iv).
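The integration demanded by the continuous-update strategy can be sketched as a running sum of the yaw-speed trace. The sampling rate and trace below are hypothetical, chosen only to illustrate that at 4000°/s even a brief turn sweeps hundreds of degrees.

```python
import numpy as np

def cumulative_rotation(yaw_dps, fs_hz):
    """Rectangular integration of an instantaneous yaw-speed trace
    (deg/s), sampled at fs_hz, into the cumulative head rotation a
    continuous-update strategy would have to track."""
    return np.cumsum(np.asarray(yaw_dps, float)) / fs_hz
```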
Previous studies have reported that the larva is less sensitive to sensory feedback during movement, and uses visual feedback in between swim bouts to compare the observed and expected positions (Trivedi and Bollmann, 2013). To test whether the larva uses visual feedback during swimming, I altered the visual feedback provided during the movements. We performed experiments in which the feedback was updated only at the end of the bout, once the speed dropped below 0.2 mm/s (Figure 2.6.A). In comparison to trials in which visual feedback was provided in real time, this delay lengthened bout durations by ∼200 ms (199.3 ms, N=668, from 27 larvae, Figure 2.6.C,D). This subtle change in the visual feedback also halved the percentage of captures (Figure 2.6.B). Overall, these findings demonstrate that the zebrafish larva is sensitive to, and capable of integrating, visual information provided during movements.

Table of contents :

List of figures
List of tables
1 Introduction 
1.1 Understanding behavior: the sensory-motor dialogue
1.2 Sensory feedback in the perception-action loop
1.2.1 Recording from behaving animals: Virtual reality in neuroscience
1.2.2 How real is virtual reality?
1.3 Internally driven behaviors
1.3.1 Motivation for action in absence of sensory stimulation
1.3.2 Neural basis of spontaneous behavior
1.4 Large-scale analysis of circuit dynamics underlying behavior in zebrafish larva
1.4.1 The zebrafish as a model for systems neuroscience
1.4.2 Locomotion of zebrafish larva
1.4.3 Goal-driven behavior in the larval zebrafish
1.5 Main aims
2 A visual virtual reality system for the zebrafish larva 
2.1 Introduction
2.2 Results
2.2.1 Prediction of the larva’s trajectory from the kinematics of tail movements
2.2.2 Optomotor response in a two-dimensional visual virtual reality system
2.2.3 Prey-capture behavior in a two-dimensional visual virtual reality
2.2.4 Integration of visual information during tail bouts
2.3 Materials and methods
3 Internally driven behavior in zebrafish larvae 
3.1 Introduction
3.2 Internally driven behaviors of zebrafish larva
3.2.1 Locomotor repertoire of zebrafish larva
3.2.2 Chaining of spontaneous motor actions
3.2.3 Supplementary Methods
3.3 Neuronal patterns predictive of spontaneous behaviors
3.3.1 Methods
3.4 Results
3.5 Supplementary Methods
4 Conclusions and perspectives 
4.1 A visual virtual reality system for the zebrafish larva
4.2 Internally driven behaviors in zebrafish larva
4.3 Neural basis of internal decisions

