Neural basis of visual perception and vision-based decision making

We examine the neural underpinnings of the sensory-to-decision transformation, with a particular interest in how broadly defined contextual elements shape the decision-making process. To this end, we adopt behavioral paradigms in which the integration of sensory evidence is selectively routed or gated by contextual cues, and in which decision confidence is modulated by the availability of sensory evidence. Accordingly, we target cortical areas known to be involved in these computations and accessible with our methodologies, such as the visual, parietal, and (anterior) cingulate cortices.
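
To make context-gated evidence integration concrete, here is a minimal sketch in Python (purely illustrative; none of the names or parameters come from our actual task code): a contextual cue selects which of two noisy sensory streams drives an evidence accumulator, and a simple confidence proxy is read out from the accumulated evidence.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_trial(context, signal_a, signal_b, n_steps=100, noise=1.0):
    """Simulate one trial of a context-gated integration task.

    context    : 0 -> integrate stream A, 1 -> integrate stream B
    signal_a/b : signed evidence strength per time step of each stream
    Returns (choice, confidence).
    """
    # Noisy momentary evidence from both streams; the contextual cue
    # gates which stream reaches the accumulator.
    stream_a = signal_a + noise * rng.standard_normal(n_steps)
    stream_b = signal_b + noise * rng.standard_normal(n_steps)
    gated = stream_a if context == 0 else stream_b

    decision_variable = np.cumsum(gated)
    choice = 1 if decision_variable[-1] >= 0 else -1
    # Confidence proxy: magnitude of the final decision variable,
    # squashed into (0.5, 1); weaker evidence -> lower confidence.
    confidence = 1.0 / (1.0 + np.exp(-abs(decision_variable[-1]) / np.sqrt(n_steps)))
    return choice, confidence

# Same sensory input, opposite contexts -> opposite choices.
print(simulate_trial(context=0, signal_a=+0.2, signal_b=-0.2))
print(simulate_trial(context=1, signal_a=+0.2, signal_b=-0.2))
```

In this toy setting, degrading the gated stream's signal strength lowers both accuracy and the confidence readout, mirroring the manipulation described above, in which confidence tracks the availability of sensory evidence.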

Capturing the large-scale neural dynamics with recurrent neural networks

Our preferred theoretical scaffolding is based on recurrent neural networks (RNNs) constrained to reproduce recorded neural responses. By drawing statistical analogies between RNNs and neuronal circuits, and by reverse-engineering the RNN dynamics to infer the underlying computations, we make testable dynamical predictions about how timely perturbations of specific functional groups and cell types can modify the population dynamics, and in turn the animal’s decision-making process and behavioral output. Experimentally, we use the mouse as our animal model, together with all-optical methods to record from, and simultaneously perturb, neural responses in large cortical networks with single-cell spatial resolution. This is achieved by integrating two-photon GCaMP imaging with a spatial light modulator (SLM) for patterned optogenetic stimulation. Behavioral training is facilitated by our high-throughput, fully automated training setups (Aoki et al., 2017).
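
As a flavor of this modeling approach, below is a minimal, self-contained sketch (PyTorch; the shapes, parameters, and toy data are illustrative assumptions, not our actual pipeline) of a rate RNN trained to reproduce a set of neural responses, followed by an in-silico perturbation of a subset of units, analogous to targeted photostimulation, to read off the predicted change in population dynamics.

```python
import torch
import torch.nn as nn

class RateRNN(nn.Module):
    """Minimal leaky rate RNN; hidden units play the role of recorded neurons."""

    def __init__(self, n_inputs, n_units):
        super().__init__()
        self.w_in = nn.Linear(n_inputs, n_units)
        self.w_rec = nn.Linear(n_units, n_units)
        self.tau = 10.0  # time constant, in simulation steps

    def forward(self, stim, perturbation=None):
        batch, n_steps, _ = stim.shape
        x = torch.zeros(batch, self.w_rec.out_features)
        rates = []
        for t in range(n_steps):
            r = torch.tanh(x)
            inp = self.w_in(stim[:, t]) + self.w_rec(r)
            if perturbation is not None:
                inp = inp + perturbation[:, t]  # injected current, mimicking photostimulation
            x = x + (inp - x) / self.tau        # leaky integration of the recurrent dynamics
            rates.append(torch.tanh(x))
        return torch.stack(rates, dim=1)        # (batch, time, n_units)

# Toy surrogate for trial-averaged calcium responses: 64 neurons over 50 frames.
stim = torch.randn(16, 50, 3)
recorded = torch.randn(16, 50, 64)

model = RateRNN(n_inputs=3, n_units=64)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(stim), recorded)  # constrain the RNN to the data
    loss.backward()
    opt.step()

# In-silico perturbation: briefly drive a chosen functional group of units and
# compare the predicted population trajectory against the unperturbed one.
perturb = torch.zeros(1, 50, 64)
perturb[:, 10:15, :8] = 2.0  # transient extra input to units 0..7
with torch.no_grad():
    baseline = model(stim[:1])
    perturbed = model(stim[:1], perturbation=perturb)
print("mean |Δrate|:", (perturbed - baseline).abs().mean().item())
```

Because the perturbation is injected into the recurrent dynamics rather than added to the output, its effects propagate through the trained connectivity, which is what allows the model to make nontrivial predictions about the rest of the population.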

Join the lab

The lab is in a phase of rapid growth, and we are looking for motivated, enthusiastic, bright scientists and technical staff to join the team. Please click on Join The Lab for available positions.


To contact us, please click here.