NAOLab is a middleware library for developing robotic applications in C, C++, Python and Matlab, using the humanoid robot NAO
NAOLab is a middleware for the development of robotic applications in C, C++, Python and Matlab, using the humanoid robot NAO networked with a PC. NAOLab enables the joint use of NAO’s on-board computing resources and external resources. More precisely, it allows the development of applications that combine embedded libraries, e.g., motion control, image/sound acquisition and transmission, etc., with external toolboxes, e.g., OpenCV, Matlab toolboxes, etc.
The NAOLab toolbox has the following characteristics. The middleware complexity is transparent to the user. A user-friendly interface is provided through C++ and Python libraries, extended with mex functions for Matlab. This enables the development of sophisticated audio and visual processing algorithms without the stringent constraints of the NAOqi SDK.
NAOLab and NAOqi share the same modular approach, namely there are three categories of modules: vision, audio and motion. An interface (vision, audio, motion) is associated with each NAOqi module. Each interface deals with sensor-data access and actuator control. The role of these interfaces is twofold: (i) to feed the sensor data into a memory space that is subsequently shared with existing software or with software under development, and (ii) to send to the robot commands generated by the external modules.
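The twofold role of the interfaces can be sketched as follows. This is a minimal illustrative Python sketch of the pattern described above, not NAOLab's actual API: all names (SharedMemory, VisionInterface, etc.) are hypothetical.

```python
# Hypothetical sketch of the NAOLab interface pattern: each interface
# (i) feeds sensor data into a memory space shared with external modules
# and (ii) relays commands generated by those modules back to the robot.
# All class and method names are illustrative, not NAOLab's real API.
from collections import deque


class SharedMemory:
    """Memory space shared between the robot side and external modules."""

    def __init__(self):
        self._data = {}

    def write(self, key, value):
        self._data[key] = value

    def read(self, key):
        return self._data.get(key)


class VisionInterface:
    """Illustrative vision interface: stores camera frames in shared
    memory and queues commands destined for the robot."""

    def __init__(self, shared_memory):
        self.memory = shared_memory
        self.command_queue = deque()  # commands to be sent to the robot

    def on_new_frame(self, frame):
        # (i) feed the sensor data into the shared memory space
        self.memory.write("camera/frame", frame)

    def send_command(self, command):
        # (ii) queue a command generated by an external module
        self.command_queue.append(command)


# An external module (e.g., an OpenCV-based tracker) reads the frame
# from shared memory and issues a motion command via the interface.
memory = SharedMemory()
vision = VisionInterface(memory)
vision.on_new_frame([[0, 255], [255, 0]])   # robot side pushes a frame
frame = memory.read("camera/frame")         # external module consumes it
vision.send_command({"joint": "HeadYaw", "angle": 0.5})
```

In this scheme the external processing code never talks to NAOqi directly; it only reads from the shared memory space and writes commands through the interface, which is what keeps the middleware complexity transparent to the user.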
The NAOLab Software Package can be freely downloaded. NAOLab is distributed under the LGPL. It requires a NAO robot V5 and the NAOqi middleware V2.1 (or higher), both manufactured and distributed by Aldebaran Robotics.
This video provides an overview and illustration of NAOLab.
F. Badeig, Q. Pelorson, S. Arias, V. Drouard, I. D. Gebru, X. Li, G. Evangelidis, R. Horaud. A Distributed Architecture for Interacting with NAO. International Conference on Multimodal Interaction, Nov 2015, Seattle, WA, United States.
This work has received funding from the EU-FP7 STREP project EARS (#609465) and ERC Advanced Grant VHIA (#340113) and has been supported by Aldebaran Robotics.