Temporal Synchrony
Selected Abstracts

Task-relevance and temporal synchrony between tactile and visual stimuli modulates cortical activity and motor performance during sensory-guided movement
HUMAN BRAIN MAPPING, Issue 2 2009
Sean K. Meehan
Abstract: Sensory-guided movements require the analysis and integration of task-relevant sensory inputs from multiple modalities. This article sought to: (1) assess effects of intermodal temporal synchrony upon modulation of primary somatosensory cortex (S1) during continuous sensorimotor transformations, (2) identify cortical areas sensitive to temporal synchrony, and (3) provide further insight into the reduction of S1 activity during continuous vibrotactile tracking previously observed by our group (Meehan and Staines 2007: Brain Res 1138:148–158). Functional MRI was acquired while participants received simultaneous bimodal (visuospatial/vibrotactile) stimulation and continuously tracked random changes in one modality by applying graded force to a force-sensing resistor. Effects of intermodal synchrony were investigated, unbeknownst to the participants, by varying temporal synchrony so that sensorimotor transformations dictated by the distracter modality either conflicted with (low synchrony) or supplemented (high synchrony) those of the target modality. Temporal synchrony differentially influenced tracking performance depending upon tracking modality. Physiologically, synchrony did not influence S1 activation; however, the insula and superior temporal gyrus were influenced regardless of tracking modality. The left temporal-parietal junction demonstrated increased activation during high synchrony specific to vibrotactile tracking. The superior parietal lobe and superior temporal gyrus demonstrated increased activation during low synchrony specific to visuospatial tracking. As previously reported, vibrotactile tracking resulted in decreased S1 activation relative to when it was task-irrelevant.
We conclude that while temporal synchrony is represented at higher levels than S1, interactions between inter- and intramodal mechanisms determine sensory processing at the level of S1. Hum Brain Mapp, 2009. © 2007 Wiley-Liss, Inc. [source]

Effects of redundant and nonredundant bimodal sensory stimulation on heart rate in bobwhite quail embryos
DEVELOPMENTAL PSYCHOBIOLOGY, Issue 4 2003
Greg D. Reynolds
Abstract: Research with both animal embryos and human infants has provided evidence that information presented redundantly and in temporal synchrony across sensory modalities (intersensory redundancy) can guide selective attention, perceptual learning, and memory during early development. How this facilitation is achieved remains relatively unexamined. This study examined the effects of redundant versus nonredundant bimodal stimulation on a measure of physiological arousal (heart rate) in bobwhite quail embryos. Results show that quail embryos exposed to concurrent but nonredundant auditory and visual stimulation during the late stages of incubation exhibit significantly elevated heart rates following stimulus exposure and during stimulus reexposure when compared to embryos exposed to redundant and synchronous audiovisual stimulation, unimodal auditory stimulation, or no supplemental prenatal sensory stimulation. These findings indicate a functional distinction between redundant and nonredundant bimodal stimulation during early development and suggest that nonredundant bimodal stimulation during the prenatal period can raise arousal levels, thereby potentially interfering with the attentional capacities and perceptual learning of bobwhite quail. In contrast, intersensory redundancy appears to foster arousal levels that facilitate selective attention and perceptual learning during prenatal development. © 2003 Wiley Periodicals, Inc. Dev Psychobiol 43: 304–310, 2003.
[source]

Intersensory redundancy facilitates discrimination of tempo in 3-month-old infants
DEVELOPMENTAL PSYCHOBIOLOGY, Issue 4 2002
Lorraine E. Bahrick
Abstract: L. Bahrick and R. Lickliter (2000) proposed an intersensory redundancy hypothesis that states that information presented redundantly and in temporal synchrony across two or more sensory modalities selectively recruits infant attention and facilitates perceptual learning more effectively than does the same information presented unimodally. In support of this view, they found that 5-month-old infants were able to differentiate between two complex rhythms when they were presented bimodally, but not unimodally. The present study extended our test of the intersensory redundancy hypothesis to younger infants and to a different amodal property: 3-month-olds' sensitivity to the amodal property of tempo was investigated. Results replicated and extended those of Bahrick and Lickliter, demonstrating that infants could discriminate a change in tempo following bimodal, but not unimodal, habituation. It appears that when infants are first learning to differentiate an amodal stimulus property, discrimination is facilitated by intersensory redundancy and attenuated under conditions of unimodal stimulation. © 2002 Wiley Periodicals, Inc. Dev Psychobiol 41: 352–363, 2002. Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/dev.10049 [source]

fMRI reveals that non-local processing in ventral retinotopic cortex underlies perceptual grouping by temporal synchrony
HUMAN BRAIN MAPPING, Issue 6 2008
Gideon P.
Caplovitz
Abstract: When spatially separated objects appear and disappear in a synchronous manner, they perceptually group into a single global object that itself appears and disappears. We employed functional magnetic resonance imaging (fMRI) to identify brain regions involved in this type of perceptual grouping. Subjects viewed four chromatically defined disks (one per visual quadrant) that flashed on and off. We contrasted %BOLD signal changes between blocks of synchronously flashing disks (Grouping) and blocks of asynchronously flashing disks (no-Grouping). Results: A region-of-interest analysis revealed that %BOLD signal change in the Grouping condition was significantly greater than in the no-Grouping condition within retinotopic areas V2, V3, and V4v. Within a single quadrant of the visual field, the spatio-temporal information present in the image was identical across the two stimulus conditions. As such, the two conditions could not be distinguished from each other on the basis of the rate or pattern of flashing within a single visual quadrant. The observed results must therefore arise through nonlocal interactions between or within these retinotopic areas, or arise from outside these retinotopic areas. Furthermore, when V2 and V3 were split into ventral and dorsal sub-ROIs, the ventral retinotopic areas V2v and V3v preferentially differentiated between the two conditions, whereas the corresponding dorsal areas V2d and V3d did not. In contrast, within hMT+, %BOLD signal was significantly greater in the no-Grouping condition. Conclusion: Nonlocal processing within, between, or to ventral retinotopic cortex at least as early as V2v, and including V3v and V4v, underlies perceptual grouping via temporal synchrony. Hum Brain Mapp, 2008. © 2007 Wiley-Liss, Inc. [source]

Type of Maternal Object Motion During Synchronous Naming Predicts Preverbal Infants' Learning of Word–Object Relations
INFANCY, Issue 2 2008
Dalit J.
Matatyaho
Abstract: Mothers' use of specific types of object motion in synchrony with object naming was examined, along with infants' joint attention to the mother and object, as a predictor of word learning. During a semistructured 3-min play episode, mothers (N = 24) taught the names of 2 toy objects to their preverbal 6- to 8-month-old infants. The episodes were recoded from Gogate, Bolzani, and Betancourt (2006) to provide a more fine-grained description of the object motions used by mothers during naming. The results indicated that mothers used forward/downward and shaking motions more frequently, and upward and backward motions less frequently, in temporal synchrony with the spoken words. These motions likely highlight novel word–object relations. Furthermore, maternal use of shaking motions in synchrony with the spoken words and infants' ability to switch gaze from mother to object contributed to infants' learning of the word–object relations, as observed on a posttest. Thus, preverbal infants learn word–object relations within an embodied system involving tightly coupled interaction between infants' perception and joint attention and specific properties of caregivers' naming. [source]

Discrimination of temporal synchrony in intermodal events by children with autism and children with developmental disabilities without autism
THE JOURNAL OF CHILD PSYCHOLOGY AND PSYCHIATRY AND ALLIED DISCIPLINES, Issue 1 2006
James M. Bebko
Background: This project examined the intermodal perception of temporal synchrony in 16 young children (ages 4 to 6 years) with autism compared to a group of children without impairments matched on adaptive age, and a group of children with other developmental disabilities matched on chronological and adaptive age.
Method: A preferential looking paradigm was used, in which participants viewed non-linguistic, simple linguistic, or complex linguistic events on two screens displaying identical video tracks, one offset from the other by 3 seconds, with the single audio track matched to only one of the displays. Results: As predicted, both comparison groups demonstrated significant non-random preferential looking to violations of temporal synchrony with linguistic and non-linguistic stimuli. However, the group with autism showed an impaired, chance level of responding, except when presented with non-linguistic stimuli. Conclusions: Several explanations are offered for this apparently autism-specific, language-specific pattern of responding to temporal synchrony, and potential developmental sequelae are discussed. [source]

The Role of Person Familiarity in Young Infants' Perception of Emotional Expressions
CHILD DEVELOPMENT, Issue 2 2001
Ronit Kahana-Kalman
Abstract: This research investigated the role of person familiarity in the ability of 3.5-month-old infants to recognize emotional expressions. Infants (N = 72) were presented simultaneously with two filmed facial expressions, happy and sad, accompanied by a single vocal expression that was concordant with one of the two facial expressions. Infants' looking preferences and facial expressions were coded. Results indicated that when the emotional expressions were portrayed by each infant's own mother, infants looked significantly longer toward the facial expressions that were accompanied by affectively matching vocal expressions; infants who were presented with emotional expressions of an unfamiliar woman did not. Even when a brief delay was inserted between the presentation of facial and vocal expressions, infants who were presented with emotional expressions of their own mothers looked longer at the facial expression that was sound-specified, indicating that some factor other than temporal synchrony guided their looking preferences.
When infants viewed the films of their own mothers, they were more interactive and expressed more positive and less negative affect. Moreover, infants produced a greater number of full and bright smiles when the sound-specified emotion was "happy," particularly when they viewed the happy expressions of their own mothers. The average duration of negative affect was significantly longer for infants who observed the unfamiliar woman than for those who observed their own mothers. These results show that when more contextual information (that is, person familiarity) was available, infants as young as 3.5 months of age recognized happy and sad expressions. These findings suggest that in the early stages of development, infants are sensitive to contextual information that potentially facilitates some of the meaning of others' emotional expressions. [source]