Coordinating Object Handovers Between Robots and Humans

Photo of Tunik, Erdogmus, Padir

Bouve Professor Eugene Tunik (principal investigator on the project), ECE Professor Deniz Erdogmus, Associate Professor Taskin Padir, and Bouve Associate Research Scientist Mathew Yarossi were awarded a $760K NSF grant for “Coordination of Dyadic Object Handover for Human-Robot Interactions.”

Picture the deftness with which people hand over objects to each other in everyday interactions, whether in ordinary settings such as shops or in high-stakes circumstances such as operating rooms. Object handover feels effortless because we are adept at inferring and anticipating each other’s intentions and actions. As robots enter our lives, their ability to assist us will depend on how naturally people can engage with them. Although engineering advances have improved robot dexterity for independent actions, human-robot interaction in collaborative physical tasks remains too limited for practical application.

The overarching goal of this project is to bring human-robot collaboration for object handover to new levels of performance, where the bi-directional interaction between the pair is holistic and intuitive. To achieve this goal, the team will (i) perform a systematic investigation to identify the spatiotemporal characteristics of dyadic coordination that allow us to understand our collaborator’s intentions, anticipate their actions, and coordinate our movements with theirs during object handover tasks, and (ii) operationalize this knowledge to develop robots with whom we can collaborate on physical tasks as readily as we do with humans.


Abstract Source: NSF

The research objective of this project is to improve the fluidity of human-robot interactions wherein robots and humans can seamlessly pass objects between one another. Object handover is critical to everyday interactions, whether in ordinary environments or in high-stakes circumstances such as operating rooms. Seemingly effortless object handover results from successful inference and anticipation of shared intentions and actions. Manual object transfer between humans and robots will become increasingly important as robots become more common in the workplace and at home. The project team will perform human subject experiments investigating human-human and human-robot interactions within the context of object handover tasks to identify characteristics of dyadic coordination that allow people to understand their collaborator’s intentions, to anticipate their actions, and to coordinate movements leading to task success. The team will use that new knowledge to develop robots that people can collaborate with on physical tasks as readily as they do with other humans. Broader Impacts of the project include training opportunities for high school, undergraduate, and graduate students, with efforts to increase participation of underrepresented groups.

This project explores human and robotic perception, behavior, and intent inference in a bidirectional and integrated manner within the context of object handover. Three specific aims are planned. The first uses motion capture, eye tracking, and electroencephalography data to build models of human intent and action during object handover. A novelty of the models is that they are sensitive to an individual’s role (giver vs. receiver; leader vs. follower), to the presence or absence of communicative gaze, and to the degree of predictability of certain aspects of handover, such as grasp type, locus of handover, gaze conditions, and dyadic role. The second aim will develop a real-time intent inference engine that uses recursive Bayesian state estimation to obtain a probabilistic assessment of human intent as the handover action evolves in time. The output of this model will inform the robot’s trajectory planner, thereby enabling it to make short time horizon predictions of human actions and to adjust robot motion plans accordingly in real time. The third aim will examine how specific choices in the high-level planning and low-level control of robot motion impact human inference of robotic intent and action during object handover. If successful, this project will advance understanding of robot manipulation during human-robot handover and yield algorithms for achieving advanced autonomy during human collaboration with humanoid robots.
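To illustrate the kind of recursive Bayesian state estimation described in the second aim, the sketch below maintains a belief over a small set of candidate human intents and updates it as noisy observations arrive during a handover. This is a minimal, hypothetical example, not the project's actual inference engine: the intent labels, the transition model, and the Gaussian-style observation model are all assumed placeholders.

```python
# Minimal sketch of recursive Bayesian intent inference (hypothetical models).
import numpy as np

INTENTS = ["power_grasp", "precision_grasp", "handoff_left", "handoff_right"]

# Assumed transition model: intents are mostly stable over a short horizon.
TRANSITION = 0.9 * np.eye(len(INTENTS)) + 0.1 / len(INTENTS)

def likelihood(observation, intent_idx):
    """Hypothetical observation model: likelihood of an observed 2-D feature
    (e.g., hand motion direction, gaze target) given each candidate intent,
    modeled as a Gaussian kernel around a per-intent prototype feature."""
    prototypes = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    diff = observation - prototypes[intent_idx]
    return np.exp(-0.5 * np.dot(diff, diff))

def bayes_update(belief, observation):
    """One recursive Bayes step: predict with the transition model,
    then correct with the observation likelihood and renormalize."""
    predicted = TRANSITION.T @ belief
    weights = np.array([likelihood(observation, i) for i in range(len(INTENTS))])
    posterior = predicted * weights
    return posterior / posterior.sum()

if __name__ == "__main__":
    belief = np.full(len(INTENTS), 1.0 / len(INTENTS))  # uniform prior over intents
    # Simulated observation stream drifting toward the "precision_grasp" prototype.
    for t, obs in enumerate([np.array([0.6, 0.1]),
                             np.array([0.8, 0.05]),
                             np.array([0.95, 0.0])]):
        belief = bayes_update(belief, obs)
        print(f"t={t}: " + ", ".join(f"{n}={p:.2f}" for n, p in zip(INTENTS, belief)))
```

In a full system, the evolving posterior over intents would feed the robot's trajectory planner, which could then bias its short-horizon motion plan toward the most probable handover configuration.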

Related Faculty: Deniz Erdogmus

Related Departments: Electrical & Computer Engineering