The Effect of Multimodal Infant-Directed Communication on Language Acquisition

Rana Abu-Zhaya, Purdue University


Human infants experience a rich complex of intertwined and patterned cues from their surrounding environment. For example, infant-directed speech is systematically combined with a variety of non-speech cues that provide valuable information about the location and meaning of linguistic units. Little work has examined the role that infant-directed touch plays when combined with infant-directed speech. This neglect of touch cues in the study of multimodal infant-directed communication is surprising given the prominence of infant-directed touch in early infancy. This dissertation addresses this gap by investigating how infant-directed touch is used with infant-directed speech and how infants may benefit from this combination of cues. Specifically, I examine the use of touch cues in combination with speech when access to auditory input is limited, and I examine whether attention plays a role in infants’ ability to track the frequency of audio-tactile events in continuous speech.

In Chapter 1, I review the literature on the multimodality of infant-directed communication, focusing specifically on touch and speech. In Chapter 2, I show that when access to auditory input is limited, caregivers modify their multimodal infant-directed communication when interacting with their children who are deaf or hard of hearing. Specifically, findings show that, compared to caregivers of children with normal hearing, caregivers of children who are deaf or hard of hearing are more variable in the frequency with which they use touch. Further, caregivers of children who are deaf or hard of hearing are more likely to align their touches with utterances directed to their children while maintaining fine temporal alignment between the two streams.

In Chapter 3, I show that when infants are presented with audio-tactile events in a controlled experimental setting, they are able to use the cross-modal transitional probabilities of these events to segment the speech stream; yet attention, as measured through heart-rate deceleration, does not mediate this ability. Thus, these results show that audio-tactile events do not always trigger heart-rate-defined sustained attention, and the proportion with which they do so is not related to how predictable these events are.

Taken together, the results of these studies shed light on the role that touch may play in early language acquisition when it is presented within multimodal infant-directed input; yet they also emphasize the need for further research and for more controlled experiments that directly examine the impact of touch on infants’ attention to multimodal input. These issues, along with ideas for future directions, are addressed in Chapter 4.




Amanda Seidl, Purdue University.
