Journal article

Semantic Learning Modifies Perceptual Face Processing

Abstract

Face processing changes when a face is learned with personally relevant information. In a five-day learning paradigm, faces were presented with rich semantic stories that conveyed personal information about the faces. Event-related potentials were recorded before and after learning during a passive viewing task. When faces were novel, we observed the expected N170 repetition effect: a reduction in amplitude following face repetition. However, when faces were learned with personal information, the N170 repetition effect was eliminated, suggesting that semantic information modulates the N170 repetition effect. To control for the possibility that a simple perceptual effect contributed to the change in the N170 repetition effect, another experiment was conducted using stories that were not related to the person (i.e., stories about rocks and volcanoes). Although viewers were exposed to the faces for an equal amount of time, the typical N170 repetition effect was observed, indicating that personal semantic information associated with a face, and not simply perceptual exposure, produced the observed reduction in the N170 repetition effect. These results are the first to reveal a critical perceptual change in face processing as a result of learning person-related information. The results have important implications for researchers studying face processing, as well as learning and memory in general, as they demonstrate that perceptual information alone is not enough to establish familiarity akin to real-world person learning.

Authors

Heisz JJ; Shedden JM

Journal

Journal of Cognitive Neuroscience, Vol. 21, No. 6, pp. 1127–1134

Publisher

MIT Press

Publication Date

June 1, 2009

DOI

10.1162/jocn.2009.21104

ISSN

0898-929X
