The Rainbow research group operates a robotics platform consisting of six testbeds (vision robotics, indoor mobile robotics, medical robotics, advanced manipulation robotics, indoor unmanned aerial vehicles (UAVs), and haptics and shared control). They allow team members to use reliable robotic systems to validate their research in visual servoing, visual tracking, active perception, and shared control. From a software point of view, all the equipment is interfaced with ViSP, an open-source software platform developed in the team, as well as with the ROS framework.
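At the heart of the visual servoing validated on these testbeds is the classic image-based control law v = -λ L⁺ (s - s*), where L is the interaction matrix of the visual features. The sketch below is a minimal NumPy illustration of that law for 2D point features; it is a generic pedagogical example, not ViSP's actual API (ViSP is a C++ library, and all names here are our own).

```python
import numpy as np

def point_interaction_matrix(x, y, Z):
    """Interaction matrix of a normalized image point (x, y) at depth Z
    (standard image-based visual servoing formulation)."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_control_law(points, desired, depths, lam=0.5):
    """Camera velocity twist v = -lambda * L^+ (s - s*) for stacked
    point features. `points`/`desired` are lists of (x, y) pairs."""
    L = np.vstack([point_interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    e = (np.asarray(points) - np.asarray(desired)).ravel()  # feature error s - s*
    return -lam * np.linalg.pinv(L) @ e  # 6-vector (v_x, v_y, v_z, w_x, w_y, w_z)
```

When the current and desired features coincide, the error vanishes and the commanded camera velocity is zero, as expected of the control law.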
We operate two industrial robots to validate our research in visual servoing and active vision. The first is a 6 DoF gantry robot built by Afma Robots in the nineties; the other is a 4 DoF cylindrical robot, also built by Afma Robots. These robots are equipped with a collection of RGB and RGB-D cameras used to validate vision-based real-time tracking algorithms. The gantry robot also allows mounting grippers on its end-effector.
© Inria / H. Raguet
For fast prototyping of algorithms in perception, control and autonomous navigation, the team uses a Pioneer 3DX from Adept equipped with various sensors needed for autonomous navigation and sensor-based control.
Moreover, to validate our research on personally assisted living, we have three electric wheelchairs: one from Permobil, one from Sunrise, and one from YouQ. The wheelchair is controlled through a plug-and-play system inserted between the joystick and the wheelchair's low-level controller. This system lets us acquire the user's intention through the joystick position and control the wheelchair by applying corrections to its motion. The wheelchairs have been fitted with cameras and ultrasound sensors to perform the servoing required to assist disabled users. In 2019, we bought a wheelchair haptic simulator to develop new human interaction strategies in a virtual reality environment.
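A common way to realize such correction of the user's motion is to blend the joystick command with an assistive command, weighted by how dangerous the current situation is (e.g. obstacle proximity measured by the ultrasound sensors). The following sketch shows one hypothetical blending law; it is an illustration of the principle, not the team's actual controller.

```python
def blend_command(user_v, user_w, corr_v, corr_w, danger):
    """Blend the user's joystick command (user_v, user_w) with an
    assistive correction (corr_v, corr_w) for a wheelchair's linear
    and angular velocity.

    danger in [0, 1]: 0 = free space (pure user control),
    1 = imminent collision (full correction).
    Hypothetical linear blending law for illustration only."""
    a = max(0.0, min(1.0, danger))        # clamp the blending weight
    v = (1.0 - a) * user_v + a * corr_v   # blended linear velocity
    w = (1.0 - a) * user_w + a * corr_w   # blended angular velocity
    return v, w
```

With `danger = 0` the user's command passes through unchanged, while increasing `danger` progressively hands authority over to the assistive correction.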
In 2016, this platform was extended with Pepper, another human-shaped robot, designed by SoftBank Robotics to be a genuine day-to-day companion. It has 17 DoF mounted on a wheeled holonomic base and a set of sensors (cameras, laser, ultrasound, inertial, microphone) that make this platform interesting for research in vision-based manipulation and visual navigation.
© Inria / H. Raguet, C. Morel
This platform is composed of two 6 DoF Adept Viper arms. Ultrasound probes, connected either to a SonoSite 180 Plus or an Ultrasonix SonixTouch imaging system, can be mounted on a force/torque sensor attached to each robot's end-effector. The Virtuose 6D haptic device from Haption or the Omega 6 device can also be used within this platform.
This testbed is of primary interest for research and experiments on ultrasound visual servoing applied to probe positioning, soft-tissue tracking, elastography, and robotic needle insertion tasks.
© Inria / H. Raguet
This platform is composed of two Panda lightweight arms from Franka Emika, equipped with torque sensors in all seven axes. An electric gripper, a camera, or a soft hand from qbrobotics can be mounted on the robot end-effector to validate our research on coupling force and vision for controlling robot manipulators and on shared control for remote manipulation. Other haptic devices can also be coupled to this platform.
In 2014, we started activities involving perception and control for single and multiple quadrotor UAVs. To this end, we purchased four quadrotors from MikroKopter GmbH, Germany, and one quadrotor from 3DRobotics, USA. The MikroKopter quadrotors have been heavily customized by: (i) reprogramming from scratch the low-level attitude controller running on the quadrotors' onboard microcontroller; (ii) equipping each quadrotor with an NVIDIA Jetson TX2 board running Linux Ubuntu and the TeleKyb-3 software, based on the GenoM3 framework developed at LAAS in Toulouse (the middleware used for managing the experiment flows and the communication among the UAVs and the base station); and (iii) purchasing Flea Color USB3 cameras together with the gimbals needed to mount them on the UAVs. The quadrotor group is used as a robotic platform for testing single and multiple flight control schemes, with special attention to the use of onboard vision as the main sensory modality.
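A low-level attitude controller of the kind reprogrammed in step (i) typically boils down to a per-axis PD loop: a torque command proportional to the attitude error and to the body angular rate. The sketch below is a generic, deliberately simplified illustration of that structure; the gains and interface are illustrative, not those of the actual firmware.

```python
import numpy as np

def pd_attitude_control(att_err, rate, kp=2.0, kd=0.5):
    """Per-axis PD torque command for roll/pitch/yaw stabilization.

    att_err : attitude error (rad) per axis, current - desired
    rate    : measured body angular rate (rad/s) per axis
    Generic textbook sketch of a low-level attitude loop;
    gains kp, kd are illustrative, not flight-tested values."""
    att_err = np.asarray(att_err, dtype=float)
    rate = np.asarray(rate, dtype=float)
    # Proportional term drives the error to zero; derivative term damps motion.
    return -kp * att_err - kd * rate
```

At zero error and zero rate the commanded torque is zero; a positive roll error produces a negative (restoring) roll torque.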
Various haptic devices are used to validate our research in shared control. We have a Virtuose 6D device from Haption, which serves as the master device in many of our shared-control activities. It can also be coupled to a Haption haptic glove on loan from the University of Birmingham. An Omega 6 from Force Dimension and devices on loan from Ultrahaptics complete this platform, which can be coupled to the other robotic platforms.