Semantic Learning Modifies Perceptual Face Processing (Journal Article)

abstract

  • Face processing changes when a face is learned with personally relevant information. In a five-day learning paradigm, faces were presented with rich semantic stories that conveyed personal information about the faces. Event-related potentials were recorded before and after learning during a passive viewing task. When faces were novel, we observed the expected N170 repetition effect, a reduction in amplitude following face repetition. However, when faces were learned with personal information, the N170 repetition effect was eliminated, suggesting that semantic information modulates the N170 repetition effect. To control for the possibility that a simple perceptual effect contributed to the change in the N170 repetition effect, another experiment was conducted using stories that were not related to the person (i.e., stories about rocks and volcanoes). Although viewers were exposed to the faces an equal amount of time, the typical N170 repetition effect was observed, indicating that personal semantic information associated with a face, and not simply perceptual exposure, produced the observed reduction in the N170 repetition effect. These results are the first to reveal a critical perceptual change in face processing as a result of learning person-related information. The results have important implications for researchers studying face processing, as well as learning and memory in general, as they demonstrate that perceptual information alone is not enough to establish familiarity akin to real-world person learning.

publication date

  • June 1, 2009

has subject area