Bibliographic record - detail view

 
Authors: Marian, Viorica; Lam, Tuan Q.; Hayakawa, Sayuri; Dhar, Sumitrajit
Title: Spontaneous Otoacoustic Emissions Reveal an Efficient Auditory Efferent Network
Source: In: Journal of Speech, Language, and Hearing Research, 61 (2018) 11, pp. 2827-2832 (6 pages)
Availability: PDF full text
Language: English
Document type: print; online; journal article
ISSN: 1092-4388
DOI: 10.1044/2018_JSLHR-H-18-0025
Keywords: Acoustics; Auditory Stimuli; Speech; Listening Comprehension; Hearing (Physiology); Auditory Perception
Abstract: Purpose: Understanding speech often involves processing input from multiple modalities. The availability of visual information may make auditory input less critical for comprehension. This study examines whether the auditory system is sensitive to the presence of complementary sources of input when exerting top-down control over the amplification of speech stimuli. Method: Auditory gain in the cochlea was assessed by monitoring spontaneous otoacoustic emissions (SOAEs), which are by-products of the amplification process. SOAEs were recorded while 32 participants (23 women, nine men; M_age = 21.13) identified speech sounds such as "ba" and "ga." The speech sounds were presented either alone or with complementary visual input, as well as in quiet or with 6-talker babble. Results: Analyses revealed that there was a greater reduction in the amplification of noisy auditory stimuli compared with quiet. This reduced amplification may aid in the perception of speech by improving the signal-to-noise ratio. Critically, there was a greater reduction in amplification when speech sounds were presented bimodally with visual information relative to when they were presented unimodally. This effect was evidenced by greater changes in SOAE levels from baseline to stimuli presentation in audiovisual trials relative to audio-only trials. Conclusions: The results suggest that even the earliest stages of speech comprehension are modulated by top-down influences, resulting in changes to SOAEs depending on the presence of bimodal or unimodal input. Neural processes responsible for changes in cochlear function are sensitive to redundancy across auditory and visual input channels and coordinate activity to maximize efficiency in the auditory periphery. (As Provided).
Notes: American Speech-Language-Hearing Association. 2200 Research Blvd #250, Rockville, MD 20850. Tel: 301-296-5700; Fax: 301-296-8580; e-mail: slhr@asha.org; Web site: http://jslhr.pubs.asha.org
Indexed by: ERIC (Education Resources Information Center), Washington, DC
Update: 2020/1/01
Check document delivery services and library holdings
 

Location-independent services
Libraries that hold the journal "Journal of Speech, Language, and Hearing Research":
Link to the Zeitschriftendatenbank (ZDB)

Article delivery service of the German libraries (subito):
Transfer the record data to the subito order form


