|Title:||Iconic gestures prime words.|
|Citation:||Yap, D.F., So, W.C., Yap, J.M., Tan, Y.Q., Teoh, R.L. (2011-01). Iconic gestures prime words. Cognitive Science 35 (1) : 171-183. ScholarBank@NUS Repository. https://doi.org/10.1111/j.1551-6709.2010.01141.x|
|Abstract:||Using a cross-modal semantic priming paradigm, both experiments of the present study investigated the link between the mental representations of iconic gestures and words. Two groups of participants performed a primed lexical decision task in which they had to discriminate between visually presented words and nonwords (e.g., flirp). Word targets (e.g., bird) were preceded by video clips depicting either semantically related (e.g., a pair of hands flapping) or semantically unrelated (e.g., drawing a square with both hands) gestures. The duration of the gestures averaged 3,500 ms in Experiment 1 but only 1,000 ms in Experiment 2. Significant priming effects were observed in both experiments, with faster response latencies for related gesture-word pairs than for unrelated pairs. These results are consistent with the idea of interactions between the gestural and lexical representational systems, such that mere exposure to iconic gestures facilitates the recognition of semantically related words. Copyright © 2010 Cognitive Science Society, Inc.|
|Source Title:||Cognitive Science|
|Appears in Collections:||Staff Publications|