Modality-independent encoding of individual concepts in the left parietal cortex
The organization of semantic information in the brain has mainly been explored through category-based models, on the assumption that categories broadly reflect the organization of conceptual knowledge. However, analyzing concepts as individual entities, rather than as items belonging to distinct superordinate categories, may represent a significant advance in understanding how conceptual knowledge is encoded in the human brain.
Here, we studied the individual representation of thirty concrete nouns from six categories, across sensory modalities (auditory and visual) and groups (sighted and congenitally blind individuals), in a core hub of the semantic network, the left angular gyrus, and in its neighboring regions within the lateral parietal cortex. Four models based on either perceptual or semantic features at different levels of complexity (low- or high-level) were used to predict fMRI brain activity through representational similarity encoding analysis. When the superordinate component was controlled for, only the high-level models based on semantic and shape information yielded significant encoding accuracies, and only in the intraparietal sulcus. This region is involved in feature binding and in combining concepts across multiple sensory modalities, suggesting a role in the high-level representation of conceptual knowledge. Moreover, when information about superordinate categories was retained, a large extent of the parietal cortex was engaged. This result indicates the need to control for coarse-level categorical organization when studying higher-level processes related to the retrieval of semantic information.
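To make the general logic of representational similarity encoding concrete, the following is a minimal sketch of one common variant: the activity pattern of a held-out item is predicted as a similarity-weighted combination of the other items' patterns, with the similarities taken from a model feature space. All array shapes, variable names, and the softmax weighting scheme are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

def rsa_encode(model_feats, brain_data, i):
    """Predict the brain pattern of held-out item i as a weighted average
    of the other items' patterns; weights come from cosine similarity
    between item i and each other item in the model feature space."""
    n = model_feats.shape[0]
    others = [j for j in range(n) if j != i]
    f = model_feats[i]
    sims = np.array([
        np.dot(f, model_feats[j])
        / (np.linalg.norm(f) * np.linalg.norm(model_feats[j]))
        for j in others
    ])
    # Softmax over similarities keeps all weights positive (an assumption
    # made here for numerical stability, not taken from the paper).
    w = np.exp(sims)
    w /= w.sum()
    return w @ brain_data[others]

# Hypothetical data: 30 nouns, 8 model features, 100 voxels.
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 8))    # model features (perceptual or semantic)
Y = rng.standard_normal((30, 100))  # fMRI activity patterns
pred = rsa_encode(X, Y, 0)
print(pred.shape)  # (100,)
```

In practice, encoding accuracy is then assessed by comparing the predicted pattern against the observed one for the held-out item (e.g., via pairwise classification across items), which is the quantity the models above are evaluated on.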