
Bibliographic record - detail view

 
Authors: Loh, Marco; Schmid, Gabriele; Deco, Gustavo; Ziegler, Wolfram
Title: Audiovisual Matching in Speech and Nonspeech Sounds: A Neurodynamical Model
Source: In: Journal of Cognitive Neuroscience, 22 (2010) 2, pp. 240-247 (8 pages)
Availability: PDF full text
Language: English
Document type: print; online; journal article
ISSN: 0898-929X
DOI: 10.1162/jocn.2009.21202
Keywords: Stimuli; Models; Auditory Perception; Speech Communication; Cognitive Processes; Comparative Analysis; Nonverbal Ability; Multisensory Learning; Evaluation Methods; Nonverbal Communication
Abstract: Audiovisual speech perception provides an opportunity to investigate the mechanisms underlying multimodal processing. By using nonspeech stimuli, it is possible to investigate the degree to which audiovisual processing is specific to the speech domain. It has been shown in a match-to-sample design that matching across modalities is more difficult in the nonspeech domain as compared to the speech domain. We constructed a biophysically realistic neural network model simulating this experimental evidence. We propose that a stronger connection between modalities in speech underlies the behavioral difference between the speech and the nonspeech domain. This could be the result of more extensive experience with speech stimuli. Because the match-to-sample paradigm does not allow us to draw conclusions concerning the integration of auditory and visual information, we also simulated two further conditions based on the same paradigm, which tested the integration of auditory and visual information within a single stimulus. New experimental data for these two conditions support the simulation results and suggest that audiovisual integration of discordant stimuli is stronger in speech than in nonspeech stimuli. According to the simulations, the connection strength between auditory and visual information, on the one hand, determines how well auditory information can be assigned to visual information, and on the other hand, it influences the magnitude of multimodal integration. (As provided.)
Notes: MIT Press. Circulation Department, Five Cambridge Center, Cambridge, MA 02142. Tel: 617-253-2889; Fax: 617-577-1545; e-mail: journals-orders@mit.edu; Web site: http://www.mitpressjournals.org/loi/jocn
Indexed by: ERIC (Education Resources Information Center), Washington, DC
Updated: 2017/4/10
Check document delivery options and library holdings

Location-independent services
Libraries holding the journal "Journal of Cognitive Neuroscience":
Link to the German Union Catalogue of Serials (ZDB)

Article delivery service of the German libraries (subito):
Transfer the record data to the subito order form

