
Perception of human activity for a flying co-worker

Author: Francis Colas

General information

Supervisors: Francis Colas, Serena Ivaldi
Phone:       03 54 95 86 30
Email:       francis.colas@inria.fr, serena.ivaldi@inria.fr
Office:      C125 (F. Colas), C104 (S. Ivaldi)

Context

For a long time, industrial robots were secluded in safety cages, and autonomous robots were mainly designed and programmed to work without humans, or to avoid them as obstacles. There is a growing trend towards environment sharing and collaboration between humans and robots, with the objective of having robots assist humans (in manufacturing but also in our daily lives). One of the main challenges in human-robot interaction is for the robot to adequately perceive and predict what the human is doing.

Objectives

This internship is part of the “Flying co-worker” ANR project, which aims at building a collaborative flying robot to help human workers. The objective of this internship is to provide information about the current and future pose of the human worker, that is, the precise motion of the arms, hands, head, etc.

There are two main aspects to be tackled:

  • human pose perception with a flying robot,

  • human pose prediction.

Human pose perception can be achieved with various sensors, such as optical motion capture [1], inertial sensor suits [2], RGB-D cameras (Dib and Charpillet, 2015), or even monocular cameras (Cao et al., 2018). The key challenge is to perform this on a flying platform with vastly varying scale and occlusions. An approach could be to fuse different sensory modalities so as to maintain a minimal quality of reconstruction while keeping an accurate estimate of the relative pose between the robot and the human.
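
To make the fusion idea concrete, here is a minimal sketch, in Python with NumPy, of a per-joint constant-velocity Kalman filter that fuses 3D keypoint measurements from two detectors with different noise levels. It is an illustration under stated assumptions, not the project's design: the joint choice, noise values, and motion model are all hypothetical.

  # Minimal sketch (hypothetical, not the project's implementation): one
  # constant-velocity Kalman filter per joint, fusing 3D keypoint measurements
  # from two sensing modalities with different noise levels, e.g. an RGB-D
  # detector and a monocular detector lifted to 3D.
  import numpy as np

  class JointKalmanFilter:
      """Tracks one joint's 3D position and velocity (state: [x y z vx vy vz])."""

      def __init__(self, q=1e-2):
          self.x = np.zeros(6)            # state estimate
          self.P = np.eye(6)              # state covariance
          self.q = q                      # process noise scale (assumed)
          self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only

      def predict(self, dt):
          F = np.eye(6)
          F[:3, 3:] = dt * np.eye(3)      # constant-velocity motion model
          self.x = F @ self.x
          self.P = F @ self.P @ F.T + self.q * dt * np.eye(6)

      def update(self, z, meas_std):
          """Fuse one 3D position measurement with its per-sensor noise level."""
          R = (meas_std ** 2) * np.eye(3)
          S = self.H @ self.P @ self.H.T + R
          K = self.P @ self.H.T @ np.linalg.inv(S)
          self.x = self.x + K @ (z - self.H @ self.x)
          self.P = (np.eye(6) - K @ self.H) @ self.P

  # Usage: each sensor updates the same filter whenever it produces a detection;
  # during occlusions only predict() runs, so the estimate degrades gracefully.
  wrist = JointKalmanFilter()
  wrist.predict(dt=0.05)
  wrist.update(z=np.array([0.4, 0.1, 1.2]), meas_std=0.02)    # RGB-D: low noise
  wrist.update(z=np.array([0.42, 0.12, 1.25]), meas_std=0.10) # monocular: high noise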

A second task is to perform short-term prediction of human gestures, which is necessary for physical human-robot interaction. A first approach could rely on motion models learned for various activities (Dermy et al., 2018). However, it might be necessary to have generic fallback models to ensure safety even with unseen gestures.
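
As an illustration of prediction from learned motion models, below is a minimal one-degree-of-freedom sketch in the spirit of probabilistic movement primitives (as used by Dermy et al., 2018), not their actual implementation: a distribution over basis-function weights is learned from demonstrations, then conditioned on the observed start of a new gesture to predict its continuation. The synthetic demonstrations, basis sizes, and noise values are all assumptions.

  # Minimal ProMP-style predictor sketch (hypothetical): learn a weight
  # distribution from demonstrations, condition on observed samples, predict.
  import numpy as np

  def rbf_basis(ts, n_basis=10, width=0.02):
      """Normalized radial basis features over normalized time ts in [0, 1]."""
      centers = np.linspace(0, 1, n_basis)
      phi = np.exp(-(ts[:, None] - centers[None, :]) ** 2 / (2 * width))
      return phi / phi.sum(axis=1, keepdims=True)

  T = 100
  ts = np.linspace(0, 1, T)
  Phi = rbf_basis(ts)

  # "Demonstrations": noisy reaching-like trajectories (stand-in for real data).
  rng = np.random.default_rng(0)
  demos = [np.sin(np.pi * ts) * rng.uniform(0.8, 1.2)
           + 0.01 * rng.standard_normal(T) for _ in range(20)]

  # Fit one weight vector per demo by ridge regression, then mean/covariance.
  W = np.array([np.linalg.solve(Phi.T @ Phi + 1e-6 * np.eye(Phi.shape[1]),
                                Phi.T @ y) for y in demos])
  mu_w, Sigma_w = W.mean(axis=0), np.cov(W.T)

  # Condition on the first 20 observed samples of a new gesture.
  n_obs, sigma_y = 20, 0.01
  Phi_o, y_o = Phi[:n_obs], demos[0][:n_obs]
  S = Phi_o @ Sigma_w @ Phi_o.T + sigma_y ** 2 * np.eye(n_obs)
  K = Sigma_w @ Phi_o.T @ np.linalg.solve(S, np.eye(n_obs))
  mu_post = mu_w + K @ (y_o - Phi_o @ mu_w)

  # Predicted continuation of the gesture (mean trajectory under the posterior).
  y_pred = Phi @ mu_post

A generic fallback, by contrast, could be as simple as the constant-velocity prediction already embedded in the Kalman filter above, which makes no assumption about the activity being performed.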

The intern is expected to balance theoretical contributions with experimental validation. To this end, various mobile robots (quadrotors and ground robots) with different sensors (including motion capture systems) are available as test platforms.

Work environment

This internship takes place at Inria Nancy, in the Larsen team, and is funded by the “Flying co-worker” ANR project.

References

Dib, A. and Charpillet, F. (2015) Pose estimation for a partially observable human body from RGB-D cameras. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 4915-4922).

Cao, Z. and Hidalgo, G. and Simon, T. and Wei, S.-E. and Sheikh, Y. (2018) OpenPose: realtime multi-person 2D pose estimation using Part Affinity Fields. arXiv preprint arXiv:1812.08008.

Dermy, O. and Chaveroche, M. and Colas, F. and Charpillet, F. and Ivaldi, S. (2018) Prediction of Human Whole-Body Movements with AE-ProMPs. In IEEE-RAS International Conference on Humanoid Robots (Humanoids).


  [1] https://www.qualisys.com/applications/human-biomechanics/

  [2] https://www.xsens.com/human-machine-interaction
