Correlation of Individual Differences in Audiovisual Asynchrony Across Stimuli and Tasks: New Constraints on Temporal Renormalization Theory

Abstract

Sight and sound are out of synch in different people by different amounts for different tasks. But surprisingly, different concurrent measures of perceptual asynchrony correlate negatively (Freeman et al., 2013). Thus, if vision subjectively leads audition in one individual, the same individual might show a visual lag in other measures of audiovisual integration (e.g., McGurk illusion, Stream-Bounce illusion). This curious negative correlation was first observed between explicit temporal order judgments and implicit phoneme identification tasks, performed concurrently as a dual task, using incongruent McGurk stimuli. Here we used a new set of explicit and implicit tasks and congruent stimuli, to test whether this negative correlation persists across testing sessions, and whether it might be an artifact of using specific incongruent stimuli. None of these manipulations eliminated the negative correlation between explicit and implicit measures. This supports the generalizability and validity of the phenomenon, and offers new theoretical insights into its explanation. Our previously proposed “temporal renormalization” theory assumes that the timings of sensory events registered within the brain’s different multimodal subnetworks are each perceived relative to a representation of the typical average timing of such events across the wider network. Our new data suggest that this representation is stable and generic, rather than dependent on specific stimuli or task contexts, and that it may be acquired through experience with a variety of simultaneous stimuli. Our results also add further evidence that speech comprehension may be improved in some individuals by artificially delaying voices relative to lip-movements.