JOURNAL ARTICLE

Sequential Movements: When does Binocular Vision Facilitate Object Grasping and Placing

Dave A. Gonzalez, Ewa Niechwiej‐Szwedo

Year: 2015   Journal: Journal of Vision   Vol: 15 (12)   Pages: 1145–1145   Publisher: Association for Research in Vision and Ophthalmology

Abstract

Vision provides a rich source of spatial and temporal information about the environment and one's own actions, which is used to plan and execute upper limb movements. Previous research has shown that viewing with both eyes provides a greater advantage during the grasping phase than during the reaching phase. However, most studies have examined performance using a single reach-to-grasp movement. Since most of our daily activities involve sequential manipulation actions, it is important to examine hand-eye coordination during the performance of these more complex actions. Therefore, we explored the role of binocular vision in a sequential task that involved precision grasping and placing a target onto a vertical needle. Six participants picked up and placed 6 beads (one at a time) onto a needle under binocular and monocular viewing conditions while eye and limb movements were recorded. The difficulty of the grasping task was manipulated by using 2 bead sizes, and the kinematic analysis focused on 4 phases of the movement: approach to the bead, bead grasping, return to the needle, and bead placement on the needle. This analysis allows us to delineate which component of the task (reaching for and grasping the bead vs. transporting and placing the bead) benefits more from binocular vision. We found that binocular vision was most beneficial after the bead had been grasped. Movement times during the return and placement phases were significantly reduced during binocular viewing (0.6 s, SE = 0.055 s) in comparison to monocular viewing (left eye: 0.997 s, SE = 0.106 s; right eye: 1.136 s, SE = 0.119 s; p < 0.01). These results indicate that placing the bead onto the needle requires a higher level of precision, and thus requires binocular visual input, in comparison to the grasping phase. Further analysis will concentrate on quantifying the temporal relation between the hands and eyes during task execution. Meeting abstract presented at VSS 2015.

Keywords:
Binocular vision; Monocular vision; Grasping; Eye movement; Kinematics; Motor control

Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 0.35
References: 0
Citation Normalized Percentile: 0.61

Topics

Muscle activation and electromyography studies
Physical Sciences →  Engineering →  Biomedical Engineering
Musculoskeletal pain and rehabilitation
Health Sciences →  Medicine →  Pharmacology
Motor Control and Adaptation
Life Sciences →  Neuroscience →  Cognitive Neuroscience