American Sign Language
Selected Abstracts

Auditory neuropathy/dys-synchrony: Diagnosis and management
DEVELOPMENTAL DISABILITIES RESEARCH REVIEW, Issue 4 2003
Charles I. Berlin
Abstract: Auditory brainstem responses (ABRs) and otoacoustic emissions (OAEs) are objective measures of auditory function, but they are not hearing tests. Normal OAEs reflect normal cochlear outer hair cell function, and an ABR indicates a synchronous neural response. It is quite possible for a patient to have normal OAEs but an absent or grossly abnormal ABR, and a behavioral audiogram that is inconsistent with either test. These patients, who may constitute as much as 10% of the diagnosed deaf population, have auditory neuropathy/dys-synchrony (AN/AD). To diagnose AN/AD accurately, ABRs are obtained in response to condensation and rarefaction clicks to distinguish cochlear microphonics (CM) from neural responses. Appropriate management is confounded by variation among patients and by changes in auditory function over time in some patients. Recommendations for management include visual language exposure through methods such as American Sign Language (ASL), Cued Speech, or baby signs, and close follow-up of patients. MRDD Research Reviews 2003;9:225–231. © 2003 Wiley-Liss, Inc. [source]

'Seeing voices': fused visual/auditory verbal hallucinations reported by three persons with schizophrenia-spectrum disorder
ACTA PSYCHIATRICA SCANDINAVICA, Issue 4 2006
R. E. Hoffman
Objective: The neurocognitive basis of verbal/auditory hallucinations remains uncertain. A leading hypothesis is that these hallucinations correspond to ordinary inner speech mislabeled as non-self. However, some studies suggest pathogenic activation of receptive language neurocircuitry as the cause. A form of visualized verbal hallucination not previously reported in the literature is described that may shed light on this controversy.
Method: Review of three cases.
Results: Two patients described visual hallucinations of speech-like lip and mouth movements fused with simultaneous auditory verbal hallucinations, superimposed on perceptions of faces of actual persons in their immediate environment. A third patient described similar experiences incorporated into visual hallucinations of human figures who also exhibited finger and hand movements corresponding to American Sign Language.
Conclusion: These fused, multimodal verbal hallucinations seem unlikely to be due to inner speech mislabeled as non-self; they instead suggest top-down re-shaping of activation in visual processing brain centers by pathogenically active receptive language neurocircuitry. [source]

Encoding of Facial Expressions of Emotion and Knowledge of American Sign Language
JOURNAL OF APPLIED SOCIAL PSYCHOLOGY, Issue 1 2000
Naomi E. Goldstein
The relationship between knowledge of American Sign Language (ASL) and the ability to encode facial expressions of emotion was explored. Participants were 55 college students, half of whom were intermediate-level students of ASL and half of whom had no experience with a signed language. In front of a video camera, participants posed the affective facial expressions of happiness, sadness, fear, surprise, anger, and disgust. These facial expressions were randomized onto stimulus tapes that were then shown to 60 untrained judges, who tried to identify the expressed emotions. Results indicated that hearing subjects knowledgeable in ASL were generally more adept than hearing nonsigners at conveying emotions through facial expression.
Results have implications for better understanding the nature of nonverbal communication in hearing and deaf individuals. [source]

Translation of the Multidimensional Health Locus of Control Scales for Users of American Sign Language
PUBLIC HEALTH NURSING, Issue 5 2008
Waheedy Samady
Abstract: This paper describes the translation of the Multidimensional Health Locus of Control (MHLC) scales into American Sign Language (ASL). Translation is an essential first step toward validating the instrument for use in the Deaf community, a commonly overlooked minority community. The translated MHLC/ASL can be used by public health nurses researching the Deaf community to create and evaluate targeted health interventions, and it can be used in clinical settings to guide the context of the provider-patient dialogue. The MHLC was translated using focus groups, following recommended procedures. Five bilingual participants translated the MHLC into ASL; five others back-translated the ASL version into English. Both focus groups identified and addressed language and cultural problems before the final ASL version of the MHLC was permanently captured by motion picture photography for consistent administration. Nine of the 24 items were directly translatable into ASL; the remaining items required further discussion to achieve cultural equivalence with ASL expressions. The MHLC/ASL is now ready for validation within the Deaf community. [source]

Language and Theory of Mind: A Study of Deaf Children
CHILD DEVELOPMENT, Issue 2 2007
Brenda Schick
Theory-of-mind (ToM) abilities were studied in 176 deaf children aged 3 years 11 months to 8 years 3 months who use either American Sign Language (ASL) or oral English, with hearing parents or deaf parents. A battery of tasks tapping understanding of false belief and knowledge state, as well as language skills (ASL or English), was given to each child. There was a significant delay on ToM tasks in deaf children of hearing parents, who typically demonstrate language delays, regardless of whether they used spoken English or ASL. In contrast, deaf children from deaf families performed identically to same-aged hearing controls (N=42). Both vocabulary and understanding of syntactic complements were significant independent predictors of success on verbal and low-verbal ToM tasks. [source]