Eye movements during visual speech perception in deaf and hearing children

Worster, Elizabeth
Pimperton, Hannah
Ralph-Lewis, Amelia
Monroy, Laura
Hulme, Charles
MacSweeney, Mairéad
Abstract
For children who are born deaf, lipreading (speechreading) is an important source of access to spoken language. We used eye tracking to investigate the strategies used by deaf (n = 33) and hearing 5–8‐year‐olds (n = 59) during a sentence speechreading task. The proportion of time spent looking at the mouth during speech correlated positively with speechreading accuracy. In addition, all children showed a tendency to watch the mouth during speech and watch the eyes when the model was not speaking. The extent to which the children used this communicative pattern, which we refer to as social‐tuning, positively predicted their speechreading performance, with the deaf children showing a stronger relationship than the hearing children. These data suggest that better speechreading skills are seen in those children, both deaf and hearing, who are able to guide their visual attention to the appropriate part of the image and in those who have a good understanding of conversational turn‐taking.
Date
2017
Type
Journal article
Journal
Language Learning: A Journal of Research in Language Studies
Volume
68
Issue
1
Page Range
159-179
ACU Department
Institute for Learning Sciences and Teacher Education (ILSTE)
Faculty of Education and Arts
File Access
Controlled