Please use this identifier to cite or link to this item: https://doi.org/10.1111/j.1551-6709.2010.01141.x
Title: Iconic gestures prime words.
Authors: Yap, D.F.
So, W.C. 
Yap, J.M. 
Tan, Y.Q.
Teoh, R.L.
Issue Date: Jan-2011
Citation: Yap, D.F., So, W.C., Yap, J.M., Tan, Y.Q., Teoh, R.L. (2011-01). Iconic gestures prime words. Cognitive Science 35 (1): 171-183. ScholarBank@NUS Repository. https://doi.org/10.1111/j.1551-6709.2010.01141.x
Abstract: Using a cross-modal semantic priming paradigm, both experiments of the present study investigated the link between the mental representations of iconic gestures and words. Two groups of participants performed a primed lexical decision task in which they had to discriminate between visually presented words and nonwords (e.g., flirp). Word targets (e.g., bird) were preceded by video clips depicting either semantically related (e.g., a pair of hands flapping) or semantically unrelated (e.g., drawing a square with both hands) gestures. The duration of the gestures was on average 3,500 ms in Experiment 1 but only 1,000 ms in Experiment 2. Significant priming effects were observed in both experiments, with faster response latencies for related gesture-word pairs than for unrelated pairs. These results are consistent with the idea of interactions between the gestural and lexical representational systems, such that mere exposure to iconic gestures facilitates the recognition of semantically related words. Copyright © 2010 Cognitive Science Society, Inc.
Source Title: Cognitive Science
URI: http://scholarbank.nus.edu.sg/handle/10635/49924
ISSN: 1551-6709
DOI: 10.1111/j.1551-6709.2010.01141.x
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.