Nachwa Aboubakr

I am a PhD student in the Pervasive Interaction team (Oct 2016 – Oct 2019, expected). I am working on the recognition of human manipulation actions.

The objective of this doctoral study is to develop and evaluate methods for observing and modeling human manipulation activities using RGB/RGB-D cameras. We are specifically interested in the construction of narratives as a causal sequence of events. We apply the studied techniques to the domain of cooking, as we consider cooking to be a sequence of semi-structured object manipulation activities.


Publications:

  • Nachwa Aboubakr, James L. Crowley, and Rémi Ronfard. "Recognizing Manipulation Actions from State-Transformations." arXiv preprint arXiv:1906.05147 (2019). Accepted for presentation at EPIC@CVPR 2019. [Project Page]
  • Nachwa Aboubakr, Rémi Ronfard, and James Crowley. "Recognition and Localization of Food in Cooking Videos." CEA/MADiMa'18, Jul 2018, Stockholm, Sweden. doi:10.1145/3230519.3230590. Ground-truth annotation of key frames in the 50 Salads dataset. [Project Page] [Download evaluation set: annotation_json]
  • Nachwa Abou Bakr and James Crowley. "Histogram of Oriented Depth Gradients for Action Recognition." ORASIS 2017, Jun 2017, Colleville-sur-Mer, France. pp. 1-2. https://orasis2017.sciencesconf.org/, hal-01694733.


Internship Proposals:

2. How many samples are sufficient for learning image recognition tasks?

  • An experimental study of how many samples are required to solve an image recognition task (a minimal sketch of such an experiment follows this list).
  • Is the number of samples task-dependent? Do certain simple image recognition tasks require fewer samples?
  • How much do transfer learning techniques reduce the number of samples required per class?
  • How can we evaluate whether a dataset contains enough samples?
  • Reference: Stabinger, Sebastian, Antonio Rodríguez-Sánchez, and Justus Piater. "25 Years of CNNs: Can We Compare to Human Abstraction Capabilities?" International Conference on Artificial Neural Networks. Springer, Cham, 2016.
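To make the kind of experiment concrete, here is a minimal sketch of a single run, assuming PyTorch and torchvision are available; MNIST, the tiny CNN, and the sample sizes are illustrative assumptions rather than part of the proposal. It trains the same fixed model on n samples per class for several values of n and reports test accuracy for each.

```python
# Minimal sketch of one sample-efficiency run, assuming PyTorch and torchvision.
# MNIST, the tiny CNN, and all hyperparameters below are illustrative assumptions,
# not choices from the proposal itself.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, transforms


def make_model():
    # Deliberately small CNN; the architecture is held fixed across sample sizes.
    return nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(), nn.Linear(32 * 7 * 7, 10),
    )


def accuracy_for(n_per_class, epochs=5, device="cpu"):
    tfm = transforms.ToTensor()
    train = datasets.MNIST("data", train=True, download=True, transform=tfm)
    test = datasets.MNIST("data", train=False, download=True, transform=tfm)

    # Keep only the first n_per_class examples of each digit.
    keep = []
    for digit in range(10):
        idx = (train.targets == digit).nonzero(as_tuple=True)[0][:n_per_class]
        keep.extend(idx.tolist())
    loader = DataLoader(Subset(train, keep), batch_size=64, shuffle=True)

    model = make_model().to(device)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

    # Test accuracy is the quantity plotted against n_per_class.
    model.eval()
    correct = 0
    with torch.no_grad():
        for x, y in DataLoader(test, batch_size=256):
            correct += (model(x.to(device)).argmax(1).cpu() == y).sum().item()
    return correct / len(test)


if __name__ == "__main__":
    for n in (1, 5, 10, 50, 100, 500):
        print(f"{n:>4} samples/class -> test accuracy {accuracy_for(n):.3f}")
```

In practice each point would be averaged over several random seeds and subset draws, and the same sweep repeated with a pretrained backbone would give a first handle on the transfer learning question above.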
