abstract
- While top-down modulation is believed to be central to adult perception, the developmental origins of this ability remain unclear. Here, we present a direct behavioral investigation of top-down modulation of perception in infancy, using emotional face perception as a test case. We asked whether 9-month-olds can modulate their face perception based on predictive auditory emotional cues without any training or familiarization procedure. Infants first heard a 3-second emotional vocal sound (happy or angry) while their gaze was held at the center of the screen. They were then presented with a pair of emotional and neutral face images without any accompanying sound. The faces were small (4.70° × 5.80°) and presented in randomized locations outside the infants' focus of attention. We measured the initial latency to shift gaze toward the emotional face as an index of infants' pre-attentive perception of these faces. We found that infants' face perception was augmented by the preceding emotional cues: they were faster to look at the emotional face after hearing an emotionally congruent sound than an incongruent one. Moreover, the emotional sounds boosted perception of congruent faces 200 ms after face onset. These top-down effects were robust for both happy and angry emotions, indicating flexible and active control of perception based on different top-down cues. A control study further supported the view that the congruency effect reflects a top-down influence on face perception rather than a rapid matching of cross-modal emotional signals. Together, these findings demonstrate that top-down modulation of perception is already quite sophisticated early in development. Raw data are available on GitHub (https://github.com/naiqixiao/CuedEmotion.git).