Visual Timing Information and Language Background in Audiovisual Speech Perception: Evidence from Lexical Tone Identification Task between Mandarin and English Speakers

Hui Xie, Biao Zeng, Rui Wang, Yajing Li

    Research output: Contribution to conference › Paper › peer-review

    Abstract

    The present study investigated the role of lip movement duration as a visual cue for improving the intelligibility of lexical pitch contours under different noise conditions. Two types of tone pairs were used in the study: a maximum contrast pair (falling vs. dipping tones, with a 100 ms difference in lip movement duration) and a minimum contrast pair (rising vs. falling tones, with a 33 ms difference). Both Mandarin and English speakers were asked to identify a Mandarin lexical tone from one pair of tones under auditory-only (AO) and audiovisual (AV) conditions. The results showed that the duration of lip movement enhanced discriminability, as measured by d-prime, in the maximum contrast falling-dipping tone pairs, whereas the similar durations of the rising and dipping tones attenuated this visual benefit. Compared to non-tonal English speakers, Mandarin listeners utilised visual timing information to facilitate identification of audiovisual lexical tones. The findings suggest that visual timing information, in the form of lip movements, may be a specific cue for audiovisual lexical tone perception.
    Original language: English
    Publication status: Submitted - 2021

    Keywords

    • audiovisual speech • lexical tone • visual benefit • English • Chinese • visual timing
