Toward an Understanding of Automatic Grasping Responses in the Absence of Left-Right Correspondence

Isis Chong de la Cruz, Purdue University

Abstract

Several researchers have claimed that passively viewing manipulable objects results in automatic motor activation of affordances, regardless of any intention to act upon the object. Support for the automatic activation account stems primarily from findings in stimulus-response compatibility paradigms in which responses are fastest when one’s response hand corresponds with the side of an object’s handle. Counter to this view is the spatial coding account, which holds that past findings result from abstract spatial codes generated by salient object properties and their left-right correspondence with responses. Although there is now considerable support for this account, little attention has been paid to determining whether evidence in favor of the automatic activation account remains after accounting for the spatial issues demonstrated by the spatial coding account. The present study comprised five experiments conducted to bridge this gap in two steps. First, I aimed to demonstrate the importance of considering spatial issues and left-right correspondence when studying object-based motor activation, using stimuli championed by past researchers who attempted to address this same issue (Experiments 1 and 2). Second, I sought to determine whether evidence favoring the automatic activation account could be obtained when the possibility for left-right correspondence was absent, using a novel set of stimuli created specifically for this purpose (Experiments 3, 4, and 5). Experiment 1 examined a stimulus set that some researchers have suggested can more definitively tease apart evidence for automatic activation from the influence of spatial factors. Experiment 2 was more narrowly focused, investigating a single object presented in different horizontal orientations.
These experiments demonstrated the importance of giving greater consideration to the nature of the stimuli used in object-based compatibility studies and to how those stimuli are presented. The results of Experiment 1 suggest that a stimulus set claimed to sidestep spatial confounds does not, in fact, do so. Moreover, Experiment 2 demonstrated that performance could be influenced by simple rotation of the object to which a response was required. Having established the importance of controlling the stimuli used to investigate automatic activation of afforded responses, I turned to determining whether a novel stimulus set would yield findings favoring the automatic activation account even after accounting for left-right correspondence (Experiments 3, 4, and 5). Three sets of novel object stimuli were developed that do not allow for left-right correspondence and could iteratively assess support for the automatic activation account based on criteria for activation put forth in the literature. The three stimulus sets contained either no information about shape or functionality (i.e., the silhouette iteration), information about both shape and functionality (i.e., the functional iteration), or an intermediate level between the two (i.e., the intermediate iteration). Critically, the three latter experiments progressively approached the conditions that researchers have suggested are ideal for automatic activation of afforded responses to occur. Experiment 3 tasked participants with completing a color discrimination task in which they viewed only one of the three object iterations and responded with button presses. Experiment 4 used the same experimental configuration but required participants to respond with a grasping response instead. Finally, Experiment 5 required participants to complete a reach-and-grasp response in an object discrimination task using both the silhouette and functional iterations.

Degree

Ph.D.

Advisors

Proctor, Purdue University.

Subject Area

Logic|Neurosciences|Recreation

