ITEM METADATA RECORD
Title: Learning shapes the representation of visual categories in the aging human brain
Authors: Mayhew, Stephen D ×
Li, Shengqiao
Storrar, Joshua K
Tsvetanov, Kamen A
Kourtzi, Zoe #
Issue Date: Dec-2010
Publisher: MIT Press
Series Title: Journal of Cognitive Neuroscience, vol. 22, issue 12, pages 2899-2912
DOI: 10.1162/jocn.2010.21415
Abstract: The ability to make categorical decisions and interpret sensory experiences is critical for survival and interactions across the lifespan. However, little is known about the human brain mechanisms that mediate the learning and representation of visual categories in aging. Here we combine behavioral and fMRI measurements to investigate the neural processes that mediate flexible category learning in the aging human brain. Our findings show that training changes the decision criterion (i.e., categorical boundary) that young and older observers use for making categorical judgments. Comparing the behavioral choices of human observers with those of a pattern classifier based on multivoxel fMRI signals, we demonstrate learning-dependent changes in similar cortical areas for young and older adults. In particular, we show that neural signals in occipito-temporal and posterior parietal regions change through learning to reflect the perceived visual categories. Information about the perceived visual categories is preserved in these areas in aging, whereas information content is compromised in more anterior parietal and frontal circuits. Thus, these findings provide novel evidence for flexible category learning in aging that shapes the neural representations of visual categories to reflect the observers' behavioral judgments.
URI: 
ISSN: 0898-929X
Publication status: published
KU Leuven publication type: IT
Appears in Collections:Non-KU Leuven Association publications
× corresponding author
# (joint) last author

Files in This Item:

There are no files associated with this item.

All items in Lirias are protected by copyright, with all rights reserved.
