Students will first learn that understanding the computational mechanisms of the human brain is one of the great scientific challenges of this century, and that major progress has come from large-scale simulations and from theoretical models at different scales of observation. Students will then be introduced to computational techniques for modeling biological neural networks, and will approach the brain and its function through a variety of theoretical constructs and computer-science analogies. Finally, students will gain insight into how the development of in silico models, as well as of neuromorphic computational engines based on the brain's circuitry, can contribute to a better understanding of the coding strategies the "biological" brain uses to process incoming stimuli and to produce cognitive and/or motor outputs.
AIMS AND LEARNING OUTCOMES
The emphasis is on neural information processing at the network level, both in developing quantitative models and in formalizing new paradigms of computation and data representation.
Lectures and case-study discussion.
- Neuron models: i) biophysical neuron models: passive membrane and Hodgkin–Huxley models; ii) reduced neuron models: integrate-and-fire (IF) and Izhikevich models.
- Synaptic transmission and plasticity: i) phenomenological models; ii) dynamical models; iii) spike-timing-dependent plasticity (STDP).
- Network models: i) overview of different strategies (firing-rate vs. spiking) for modeling large-scale neuronal dynamics; ii) meta-networks; iii) balanced networks and synfire chains; iv) role of connectivity in the emergent dynamics; v) overview of graph theory and of metrics for characterizing a network; vi) different kinds of connectivity: functional vs. structural connectivity; vii) interplay between connectivity and dynamics.
- Computational paradigms: i) Coding and decoding information; ii) Feed-forward and recurrent networks, lateral inhibition.
- Multidimensional data processing and representation: i) The case study of early sensory systems: receptive fields, tuning curves, population activity, read-out mechanisms; ii) Efficient coding and reduction of dimensionality; iii) Optimal decoding methods.
- Computational synthesis of brain information processing: models of "perceptual engines", their potential, and design examples.
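As an illustrative sketch of the reduced neuron models listed above (not official course material), the following Python/NumPy snippet simulates a leaky integrate-and-fire (LIF) neuron with forward-Euler integration. All parameter values (time constant, thresholds, resistance) are arbitrary choices for the example, not values prescribed by the course.

```python
import numpy as np

def simulate_lif(i_ext, dt=0.1, tau_m=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-65.0, r_m=10.0):
    """Euler integration of tau_m * dV/dt = -(V - V_rest) + R_m * I.

    i_ext : input current at each time step (arbitrary units).
    Returns the membrane-potential trace and the list of spike times (ms).
    A spike is emitted, and V reset, whenever V crosses threshold.
    """
    v = v_rest
    spikes = []
    trace = np.empty(len(i_ext))
    for t, i in enumerate(i_ext):
        v += dt * (-(v - v_rest) + r_m * i) / tau_m
        if v >= v_thresh:          # threshold crossing: spike and reset
            spikes.append(t * dt)
            v = v_reset
        trace[t] = v
    return trace, spikes

# A constant suprathreshold drive for 100 ms produces regular firing.
trace, spikes = simulate_lif(np.full(1000, 2.0))
```

With these (made-up) parameters the steady-state potential under constant drive lies above threshold, so the neuron fires tonically; a weaker input would leave it silent, which is the basic threshold behavior the IF abstraction captures.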
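To give a flavor of the STDP topic above, here is a minimal sketch of a standard pair-based STDP window: pre-before-post spike pairings potentiate the synapse, post-before-pre pairings depress it, with exponentially decaying magnitude. Amplitudes and time constants are arbitrary illustrative values, not figures from the course.

```python
import numpy as np

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Weight change for a single spike pair, dt_ms = t_post - t_pre.

    dt_ms > 0 (pre leads post): potentiation, +a_plus * exp(-dt/tau_plus).
    dt_ms < 0 (post leads pre): depression, -a_minus * exp(dt/tau_minus).
    """
    if dt_ms > 0:
        return a_plus * np.exp(-dt_ms / tau_plus)
    elif dt_ms < 0:
        return -a_minus * np.exp(dt_ms / tau_minus)
    return 0.0

# Causal pairings strengthen the synapse; anti-causal ones weaken it,
# and the effect fades as the spikes move further apart in time.
dw_causal = stdp_dw(10.0)
dw_anticausal = stdp_dw(-10.0)
```

The slight asymmetry (a_minus > a_plus) is a common modeling choice that keeps weights from growing without bound under uncorrelated firing.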
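The items on tuning curves, population activity, and read-out mechanisms can likewise be illustrated with a classic population-vector decoder: each neuron has a cosine tuning curve around a preferred direction, and the stimulus direction is recovered as the rate-weighted vector sum of preferred directions. The number of neurons and the tuning shape are assumptions made for this sketch.

```python
import numpy as np

def decode_population(rates, preferred_dirs):
    """Population-vector read-out: rate-weighted sum of unit vectors
    pointing along each neuron's preferred direction (radians)."""
    x = np.sum(rates * np.cos(preferred_dirs))
    y = np.sum(rates * np.sin(preferred_dirs))
    return np.arctan2(y, x)

n = 16
prefs = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
stimulus = np.pi / 3

# Rectified cosine tuning: each neuron fires most when the stimulus
# matches its preferred direction, and is silent far from it.
rates = np.maximum(0.0, np.cos(prefs - stimulus))
decoded = decode_population(rates, prefs)
```

With a uniform bank of tuning curves the decoded angle closely tracks the true stimulus, which is the essence of the read-out mechanisms discussed for early sensory systems.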
Slides and other distributed material (available through Aulaweb).
- Koch and Segev. Methods in Neuronal Modeling. MIT Press, 1999.
- Gerstner and Kistler. Spiking Neuron Models. Cambridge University Press, 2002.
- Izhikevich. Dynamical Systems in Neuroscience. MIT Press, 2007.
- Dayan and Abbott. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. MIT Press, 2001.
Office hours: Appointment by e-mail
Office hours: Monday 11:00-13:00 and Thursday 14:30-16:30. Office: c/o Pad. E, Via Opera Pia 13 (3rd floor). Lab: "The Physical Structure of Perception and Computation", Via Opera Pia 11a, 3rd floor (phone: +39-010-3532794; website: www.pspc.unige.it).
SILVIO PAOLO SABATINI (President)
All class schedules are posted on the EasyAcademy portal.
Oral examination and evaluation of the presentation of a scientific paper selected by the student.
After completing this course, the student will be able to:
- Develop computational models of large-scale neuronal networks.
- Analyze and synthesize neuromorphic processing paradigms at the cellular, network, and system levels.
| Date       | Time  | Location | Notes               |
|------------|-------|----------|---------------------|
| 16/02/2018 | 09:00 | GENOVA   | Exam by appointment |
| 27/07/2018 | 09:00 | GENOVA   | Exam by appointment |
| 21/09/2018 | 09:00 | GENOVA   | Exam by appointment |
| 28/02/2019 | 09:00 | GENOVA   | Exam by appointment |