Gaze Direction (gaze + direction)
Selected Abstracts

Can Infants Use a Nonhuman Agent's Gaze Direction to Establish Word-Object Relations?
INFANCY, Issue 4 2009. Laura O'Connell.

Adopting a procedure developed with human speakers, we examined infants' ability to follow a nonhuman agent's gaze direction and subsequently to use its gaze to learn new words. When a programmable robot acted as the speaker (Experiment 1), infants followed its gaze toward the word referent whether or not it coincided with their own focus of attention, but failed to learn a new word. When the speaker was human, infants correctly mapped the words (Experiment 2). Furthermore, when the robot interacted contingently, this did not facilitate infants' word mapping (Experiment 3). These findings suggest that gaze following upon hearing a novel word is not sufficient for learning the referent of the word when the speaker is nonhuman. [source]

Social influences on formula intake via suckling in 7- to 14-week-old infants
DEVELOPMENTAL PSYCHOBIOLOGY, Issue 4 2007. Julie C. Lumeng.

To investigate social influences on human suckling behavior, 25 healthy, full-term, 7- to 14-week-old infants were each bottle-fed their own formula twice by their mother and once in each of four experimental conditions: (a) held, provided social interaction; (b) held, without interaction; (c) not held, provided interaction; (d) not held, without interaction. Volume intake (VI), Total Sucks, infant gaze direction, and time elapsed since the last feeding were determined. There were three major findings: (1) social interaction increased VI; (2) VI was linearly related to the time since the last feeding in held infants; (3) Total Sucks and VI were both highly correlated with privation length when infants did not look at the feeder and when fed by the mother. Thus, social influences exert strong immediate effects on suckling. Accordingly, suckling functions to obtain both nutrition from and social information about the feeder. © 2007 Wiley Periodicals, Inc. Dev Psychobiol 49: 351-361, 2007. [source]

Eye remember you two: gaze direction modulates face recognition in a developmental study
DEVELOPMENTAL SCIENCE, Issue 5 2006. Alastair D. Smith.

The effects of gaze direction on memory for faces were studied in children from three age groups (6-7, 8-9, and 10-11 years old) using a computerized version of a task devised by Hood, Macrae, Cole-Davies and Dias (2003). Participants were presented with a sequence of faces in an encoding phase and were then required to judge which faces they had previously encountered in a surprise two-alternative forced-choice recognition test. In one condition, stimulus eye gaze was either direct or deviated at the viewing phase, and eyes were closed at the test phase. In another condition, stimulus eyes were closed at the viewing phase, with either direct or deviated gaze at the test phase. Modulation of gaze direction affected hit rates: participants were more accurate for direct-gaze targets than for deviated-gaze targets in both conditions. Reaction times (RTs) to correctly recognized stimuli were faster for direct-gaze stimuli at the viewing phase, but not at the test phase. Age group differentially affected these measures: the hit-rate advantage for direct-gaze stimuli was larger in older children, although RTs were less affected by age. These findings suggest that while the facilitation of face recognition by gaze direction is robust across encoding and recognition stages, the efficiency of the process is affected by the stage at which gaze is modulated. [source]
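The hit-rate comparison in the abstract above is easy to make concrete. The following Python sketch shows one way to tabulate two-alternative forced-choice accuracy by gaze condition; the trial layout and every name in it are invented for illustration and are not taken from the study.

```python
# Hypothetical sketch of a hit-rate comparison by gaze condition.
# The trial data and labels below are illustrative, not from the study.
from collections import defaultdict

trials = [
    # (gaze_at_encoding, responded_correctly) -- one row per 2AFC trial
    ("direct", True), ("direct", True), ("deviated", False),
    ("deviated", True), ("direct", True), ("deviated", False),
]

counts = defaultdict(lambda: [0, 0])  # condition -> [hits, total trials]
for gaze, correct in trials:
    counts[gaze][1] += 1
    counts[gaze][0] += int(correct)

for gaze, (hits, n) in counts.items():
    # The study reports higher hit rates for direct-gaze targets.
    print(f"{gaze}: hit rate = {hits / n:.2f}")
```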
Covert attention allows for continuous control of brain-computer interfaces
EUROPEAN JOURNAL OF NEUROSCIENCE, Issue 8 2010. Ali Bahramisharif.

While brain-computer interfaces (BCIs) can be used for controlling external devices, they also hold the promise of providing a new tool for studying the working brain. In this study we investigated whether modulations of brain activity by changes in covert attention can be used as a continuous control signal for BCI. Covert attention is the act of mentally focusing on a peripheral sensory stimulus without changing gaze direction. Ongoing brain activity was recorded using magnetoencephalography while subjects covertly attended to a moving cue and maintained fixation. Based on posterior alpha power alone, the direction to which subjects were attending could be recovered using circular regression. The angle of attention could be predicted with a mean absolute deviation of 51° in our best subject; averaged over subjects, the mean deviation was approximately 70°. In terms of information transfer rate, the optimal data length for recovering the direction of attention was 1700 ms, which yielded a mean absolute deviation of 60° for the best subject. These results were obtained without any subject-specific feature selection and required no prior subject training. Our findings demonstrate that modulations of posterior alpha activity due to the direction of covert attention have potential as a control signal for continuous control in a BCI setting. This approach has several potential applications, including a brain-controlled computer mouse and improved neurofeedback methods that allow direct training of subjects' ability to modulate posterior alpha activity. [source]
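To make the decoding idea above concrete: one common way to regress onto a circular target such as attended direction is to predict its cosine and sine components separately and recover the angle with atan2, scoring predictions by mean absolute angular deviation. The sketch below illustrates that scheme on synthetic data; the cos/sin trick, the toy data, and all names are assumptions for illustration, not the authors' MEG pipeline.

```python
# Minimal sketch of circular regression and mean absolute angular deviation,
# assuming a cos/sin two-output linear model; not the authors' actual method.
import numpy as np
from numpy.linalg import lstsq

def fit_circular_regression(X, theta):
    """X: (n_trials, n_features) alpha-power features,
    theta: (n_trials,) attended direction in radians."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # add intercept column
    w_cos, _, _, _ = lstsq(Xb, np.cos(theta), rcond=None)
    w_sin, _, _, _ = lstsq(Xb, np.sin(theta), rcond=None)
    return w_cos, w_sin

def predict_angle(X, w_cos, w_sin):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.arctan2(Xb @ w_sin, Xb @ w_cos)  # recover angle from components

def mean_abs_deviation_deg(theta_true, theta_pred):
    """Mean absolute angular error, wrapping differences into [-180, 180)."""
    d = np.degrees(theta_pred - theta_true)
    d = (d + 180.0) % 360.0 - 180.0
    return np.abs(d).mean()

# Synthetic demo: 200 trials, 20 posterior sensors carrying a noisy
# linear image of the attended angle's cos/sin components.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.c_[np.cos(theta), np.sin(theta)] @ rng.normal(size=(2, 20))
X += 0.5 * rng.normal(size=X.shape)  # sensor noise
w_cos, w_sin = fit_circular_regression(X, theta)
print(mean_abs_deviation_deg(theta, predict_angle(X, w_cos, w_sin)))
```

For context, guessing uniformly at random gives a mean absolute deviation of 90°, which is the chance level against which the reported 51° and roughly 70° figures should be read.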
Is there a role of visual cortex in spatial hearing?
EUROPEAN JOURNAL OF NEUROSCIENCE, Issue 11 2004. Ulrike Zimmer.

The integration of auditory and visual spatial information is an important prerequisite for accurate orientation in the environment. However, while visual spatial information is based on retinal coordinates, the auditory system receives information on sound location in relation to the head. Thus, any deviation of the eyes from a central position results in a divergence between the retinal visual and the head-centred auditory coordinates. It has been suggested that this divergence is compensated for by a neural coordinate transformation using a signal of eye-in-head position. Using functional magnetic resonance imaging, we investigated which cortical areas of the human brain participate in such auditory-visual coordinate transformations. Sounds were produced with different interaural level differences, leading to left, right or central intracranial percepts, while subjects directed their gaze to visual targets presented to the left, to the right or straight ahead. When gaze was to the left or right, we found the primary visual cortex (V1/V2) activated in both hemispheres. The occipital activation did not occur with sound lateralization per se, but was found exclusively in combination with eccentric eye positions. This result suggests a link between neural processing in the visual cortex and the transformation of auditory spatial coordinates that maintains the perceptual alignment of audition and vision across changes in gaze direction. [source]
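On the stimulus design just described: an interaural level difference (ILD) is simply a gain difference between the two ear channels, which lateralizes the percept toward the louder ear. Below is a minimal Python sketch of how such left/centre/right percepts might be generated; the tone frequency, duration, and 10 dB ILD are hypothetical parameter choices, not values from the paper.

```python
# Illustrative only: impose an interaural level difference on a tone so it
# is perceived to the left (positive ild_db), centre (0), or right (negative).
import numpy as np

def ild_tone(freq=500.0, dur=0.5, fs=44100, ild_db=10.0):
    """Return a stereo tone whose right channel is ild_db quieter,
    yielding a leftward intracranial percept for positive ild_db."""
    t = np.arange(int(dur * fs)) / fs
    tone = np.sin(2 * np.pi * freq * t)
    right = tone * 10 ** (-ild_db / 20.0)  # attenuate right channel
    return np.stack([tone, right], axis=1)

left_percept = ild_tone(ild_db=10.0)   # lateralized toward the left
central_percept = ild_tone(ild_db=0.0)  # equal levels: central percept
```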
A non-invasive method for studying an index of pupil diameter and visual performance in the rhesus monkey
JOURNAL OF MEDICAL PRIMATOLOGY, Issue 2 2006. Sarah J. Fairhall.

Background: A non-invasive model has been developed to estimate gaze direction and relative pupil diameter in minimally restrained rhesus monkeys, in order to investigate the effects of low doses of ocularly administered cholinergic compounds on visual performance. Methods: Animals were trained to co-operate with a novel device, which enabled eye movements to be recorded using modified human eye-tracking equipment, and to perform a task that determined visual threshold contrast. Responses were made by gaze transfer under twilight conditions. Pilocarpine nitrate (4% w/v) was studied to demonstrate the suitability of the model. Results: Pilocarpine induced marked miosis for more than 3 h, accompanied by a decrement in task performance. Conclusions: The method obviates the need for invasive surgery and, because the point of gaze can be approximately defined, the approach may have utility in other areas of research involving non-human primates. [source]

Mechanisms of face perception in humans: A magneto- and electro-encephalographic study
NEUROPATHOLOGY, Issue 1 2005. Shoko Watanabe.

We have been studying the mechanisms underlying face perception in humans using magnetoencephalography (MEG) and electroencephalography (EEG), including (1) perception of the static face, (2) differences in perception when viewing the eyes versus the whole face, (3) the face inversion effect, (4) the effect of gaze direction, (5) perception of eye motion, (6) perception of mouth motion, and (7) the interaction between auditory and visual stimuli related to vowel sounds. In this review article, we mainly summarize our results on items 3, 5 and 6. With the presentation of both upright and inverted unfamiliar faces, the inferior temporal cortex (IT), centered on the fusiform gyrus, and the lateral temporal cortex (LT), near the superior temporal sulcus, were activated simultaneously but independently between 140 and 200 ms post-stimulus. The right-hemisphere IT and LT were both active in all subjects, and those in the left hemisphere in half of the subjects. Relative to upright faces, latencies for inverted faces were longer in the right hemisphere and shorter in the left hemisphere. Since the regions activated by upright and inverted face stimuli did not differ significantly, we consider that differences in processing upright versus inverted faces are attributable to differences in temporal processing rather than to the processing of information by different brain regions. When viewing motion of the mouth and eyes, a large, clear MEG component, 1M (mean peak latency of approximately 160 ms), was elicited by both mouth and eye movement and was generated mainly at the occipito-temporal border, at human MT/V5. The 1M to mouth movement and the 1M to eye movement showed no significant difference in amplitude or generator location. Our results therefore indicate that human MT/V5 is active in the perception of both mouth and eye motion, and that movement of these facial parts is probably processed similarly. [source]

Biological Motion Displays Elicit Social Behavior in 12-Month-Olds
CHILD DEVELOPMENT, Issue 4 2009. Jennifer M. D. Yoon.

To test the hypothesis that biological motion perception is developmentally integrated with important social cognitive abilities, 12-month-olds (N = 36) were shown a display of a human point-light figure turning to observe a target. Infants spontaneously and reliably followed the figure's "gaze" despite the absence of familiar and socially informative features such as a face or eyes. This suggests that biological motion displays are sufficient to convey rich psychological information, such as attentional orientation, and provides the first evidence that biological motion perception and social cognitive abilities are functionally integrated early in typical development. Whether common neural substrates for biological motion perception and the analysis of gaze direction underlie this behavioral integration is discussed. [source]