Research

Our research focuses on the neural foundations of visual perception and their development. We are particularly interested in brain areas involved in the recognition and processing of faces, gestures, and written words. We place a special emphasis on the development of these areas during childhood and adolescence and study the interplay between brain development and experience. For example, by examining the neural processes involved in learning to read, we can gain insights into how certain parts of the brain change as a result of experience. In the long term, our research aims to contribute to a better understanding of typical brain development as well as of dyslexia and autism spectrum disorders.

In this study, we aim to find out how brain regions involved in the recognition of faces, hands, and written words develop throughout childhood. To this end, we collect data from preschool children aged 4–6 years and from children aged 10–12 years who have already attended school for several years. Because we are interested in how the development of these brain areas relates to what children see in their everyday lives, the study combines several methods, such as video recordings and magnetic resonance imaging (MRI).

The results will help us better understand the development of brain regions involved in face perception and word recognition. In the long term, the findings will provide a basis for future studies investigating developmental disorders.

In this study, we aim to investigate how sign language may shape the brain regions involved in recognizing faces and hands, since hand movements and facial expressions play a much greater role in communication using sign language than in spoken language. Previous research has shown, for example, that people who use sign language are particularly skilled at recognizing and distinguishing faces. The goal of this study is to better understand the neural basis of this ability.

To do so, we collect data from adult participants in three groups: individuals who are deaf and use sign language, individuals who are hearing and use sign language, and individuals who are hearing and have no knowledge of sign language. We combine several methods, such as eye tracking and magnetic resonance imaging (MRI).

The results are expected to contribute to a better understanding of the brain’s plasticity.

In this study, we investigate how the brain changes when children learn to read. Reading is a complex skill that involves recognizing letters, linking them to sounds, and understanding words. We collect data from children aged 5–6 years at three time points before and during their first year of school. Using magnetic resonance imaging (MRI) and computer-based tasks, we examine how brain regions that are important for learning to read develop over time.

The results will provide a better understanding of the neural basis of learning to read and, in the long term, help develop targeted support for children with reading difficulties.

In this study, we investigate how differences in brain structure are related to reading development. We focus on cortical folding in the ventral temporal lobe, a brain region that is closely linked to reading and varies between individuals. Initial findings indicate that certain anatomical features may be associated with better reading performance.

We analyze large, openly available databases containing structural MRI scans of children with and without developmental dyslexia to examine whether specific anatomical features in early childhood can predict later reading development. In the long term, we aim to contribute to identifying early biological markers of reading difficulties, so that children with dyslexia can receive early, targeted support.