Projects

National projects

ANNAPOLIS (22-26)  Website  Publications

AutoNomous Navigation Among Personal mObiLty devIceS

Fundings: ANR (2022-2026, 832 (238)k€, Inria (ACENTAURI, CHROMA), LS2N-ARMEN, Heudiasyc-SYRI)

Description: Urban centers are increasingly invaded by new types of PLEVs (Personal Light Electric Vehicles: electric scooters, hoverboards, gyro-wheels, etc.), which are directly or indirectly the source of unpredictable behaviors in the traffic environment. ANNAPOLIS increases the vehicle’s perception capacity in terms of precision, measurement field of view and information semantics, through communication between the vehicle and an intelligent infrastructure. The project also seeks new models and concepts to take into account the unpredictable behaviors of these new means of individual electric transport, to interpret and analyze constantly evolving scenes, and finally to decide the best and safest future motion of the self-driving car even in highly dynamic environments with unexpected and dangerous events.

Keywords: Safety and security, ADAS, autonomous vehicle, PLEV, connected vehicle, smart cities, intelligent transport systems, artificial intelligence, decision making under uncertainty

SAMURAI (22-26) Website  Publications

ShAreable Mapping using heterogeneoUs sensoRs for collAborative robotIcs

Fundings: ANR (2022-2027, 649 (302)k€, ACENTAURI, LS2N-ARMEN, MIS)

Description: The scientific objectives of SAMURAI are: (i) to build shareable maps of a complex environment using high-end heterogeneous sensors (lidar, vision, IMU, GPS, …); (ii) to use these maps for long-term infrastructure monitoring with collaborative robots equipped with low-end sensors different from the high-end ones used to build the maps; (iii) to update the maps when changes are detected, using the data collected by the sensor-limited robots during their monitoring task.

Keywords: Long-term navigation, multi-robot systems, heterogeneous sensors, heterogeneous robots

NINSAR (22-28) Website  Publications

New ItiNerarieS for Agroecology using cooperative Robots

Fundings: PEPR AGRONUM (2022-2028, 2160(425)k€, INRIA (ACENTAURI, CHROMA, RAINBOW), INRAE, CEA, LS2N, IP, LAAS, XLIM, ISIR, CRISTAL, IBISC, IRL, UniLasalle)

Description: The global objective of NINSAR is to define agroecological avenues achievable by an autonomous system composed of several elementary, associable robots acting at the plant scale. The main idea is to propose robotic devices acting on soil and vegetation to conduct technical itineraries fulfilling ecological requirements. The project focuses on the coordination of the fleet, which implies developing both reconfiguration and adaptation processes for the robots’ behavior. The environmental impact of using agricultural robots will be evaluated using the Life Cycle Assessment (LCA) method. As a result, NINSAR will also contribute a relevant and reliable method to assess the environmental impact of agroecological practices using robots.

Keywords: Robotics in agriculture, agroecology, multi-robot systems, agro-ecological impact

ROAD-AI (21-25) Website  Publications

Defi ROAD-AI

Fundings: Defi INRIA (2021-2024, ()k€, INRIA (FUN, ACENTAURI, COATI, TITANE, MODAL, STATIFY), CEREMA (EDSUM, STI, GéOCOD))

Description: The aim of the Inria-Cerema ROAD-AI challenge (“défi”, 2021-2024) is to invent the asset maintenance of infrastructures that could be operated in the coming years, offering a significant qualitative leap compared to traditional methods. Data collection is at the heart of the integrated management of road infrastructure and engineering structures, and could be simplified by deploying fleets of autonomous robots. Indeed, robots are becoming an essential tool in a wide range of applications; among these, data acquisition has attracted increasing interest due to the emergence of a new category of robotic vehicles capable of performing demanding tasks in harsh environments without human supervision.

Keywords: infrastructure inspection and monitoring, SLAM, Digital twin

ASCAR (24-27) Website  Publications

Fundings: DGA ASTRID (2024-2027, ()k€, I3S, INRIA (ACENTAURI))

Description: The physical laws governing the motion of Autonomous Robotic Systems involve natural symmetries, reflected in the sensory measurements and external forces applied to the vehicles, that present invariance and/or equivariance properties. The ASCAR project exploits this structure explicitly by developing design principles and methods tailored to systems with symmetries. More specifically, the project will establish (i) a new paradigm of Guidance and Control for Autonomous Systems that seamlessly integrates modeling, control, and optimization design procedures in a unified framework, (ii) a framework for Navigation that integrates situation awareness for the analysis and design of efficient and reliable state observers for general systems with symmetries, and (iii) a new paradigm and new tools for robust sensor-based control.

Keywords: Visual servoing

STAIRS (25-28) Website  Publications

Shared Tools, impact Assessment and Interoperability for Robotic solution Sustainability

Fundings: GDRA (2025-2028, 1521 (50.4)k€, INRAE, INRIA (ACENTAURI), LAAS, XLIM, CEA, SABI AGRI, AGREENCULTURE, SHERPA ENGINEERING, EXXACT Robotics, Michelin, TIMotion, OSIRIS Agriculture, Opal-RT, Téléespace, MyEasyFarm, Abelio, Kéréval, NAIO Technologies, RobAgri, CTIFL, IFV, ITB, CIVC, FNCUMA, VEGEPOLYS VALLEY, Vinipole Sud Bourgogne)

Description: STAIRS is a project resulting from the Grand Défi de la Robotique Agricole (GDRA). This program aims to accelerate the operational dissemination of a new generation of agricultural equipment based on robotics, enabling the large-scale implementation of more sustainable agricultural practices. The project is structured around three main actions:
– Define and provide designers of robotic solutions with an appropriate framework allowing the development and the adoption of standard software and hardware components,
– Develop standards for interoperability and integration of robots into agricultural management systems and human supervision,
– Design evaluation methodologies and metrics to measure the effectiveness and impact of agricultural robots.

Keywords: Robotics in agriculture, Data collection, Metrics, Standard

International projects

AGRIFOOD-TEF (23-27) Website

The European Testing and Experimentation Facilities for Agrifood Innovation

Fundings: EU&State (2023-2027, 60 (2.5)M€, FBK, EV ILVO, WR, JR, INRAE, POLIMI, INRIA (ACENTAURI), L-PIT, WODR, PSNC, FEM, RISE, UNIMI, WU, LNE, ACTA, IDELE, ARVALIS, IFV, UNINA, AZ, CAPL, RGRD, FHWN, UDL, UCO, GRADIANT, AGACAL, CEP, AGT, ABT, HISPATEC, DATALIFE, TRUST-IT, DTI, CREA)

Description: AgrifoodTEF is a network of test and validation infrastructures in Europe that supports agri-food technology companies in carrying out near-product development of their AI and robotics solutions in real-world facilities. The overall aim is to close the gap between excellent research in these fields and actual products that support efficient and sustainable agriculture while meeting the stringent usability and economic requirements of their end-users. AgrifoodTEF’s foundations are solidly rooted in existing experimental farms and facilities for AI and robotics in agriculture, already operational in various regions highly representative of European agri-food production.

Keywords: AI & Robotics, Agriculture, Agrifood, Test & Evaluation, Benchmarking

AI-SENSE (23-25)  Website  Publications

Artificial intelligence for advanced sensing in autonomous vehicles

Fundings: Inria associate team (2023-2025, ()k€, ACENTAURI, KAIST)

Description: The main scientific objective of the collaboration is to study how to build a long-term perception system that acquires situation awareness for safe navigation of autonomous vehicles. The perception system will fuse different sensor data (lidar and vision) in order to localize a vehicle in a dynamic peri-urban environment, to identify and estimate the state (position, orientation, velocity, etc.) of all possible moving agents (cars, pedestrians, etc.), and to extract high-level semantic information.

Keywords: AI, Hybrid AI, Sensing, Autonomous Vehicle

EUROBIN (22-26)  Website

The European Excellence Network on AI powered robotics

Fundings: Horizon (2022-2026, ()k€, DLR, KIT, INRIA, CEA, DTI, PRAGUE univ., C.R.E.A.T.E., IMEC, KTH, SORBONNE univ., OREBRO univ., CNRS, IST-ID, PISA univ, EPFL, ETHZ, SEVILLE univ., IIT, TUM, TECNALIA, TWENTE univ., JSI, ASTI MOBILE ROBOTICS, DHL, PAL ROBOTICS, VOLKSWAGEN, BREMEN univ., FRAUNHOFER GESELLSCHAFT, FUNDINGBOX ACCELERATOR SP ZOO, VOLOCOPTER, SIEMENS, MATADOR)

Description: euROBIN proposes a threefold strategy. First, leading experts from the European robotics and AI research community tackle the question of transferability. Second, the relevance of the scientific outcomes is demonstrated in robotic manufacturing, in personal robots for enhanced quality of life, and in outdoor robots for sustainable communities; advances are made measurable through collaborative competitions.
Finally, euROBIN creates a sustainable network of excellence to foster exchange and inclusion. Software, data and knowledge are exchanged through the EuroCore repository, designed to become a central platform for robotics in Europe.
The vision of euROBIN is a European ecosystem of robots that share their data and knowledge and exploit their diversity to jointly learn to perform the endless variety of tasks in human environments.

Keywords: Robotics, AI


Industrial Collaborations

La Fontaine (22-25)

Fundings: Naval-Group (2022-2025)

Description: The context is that of decision support for a collaborative autonomous multi-agent system with a common objective. The multi-agent system tries to get around “obstacles” which, in turn, try to prevent it from reaching its goals. As part of a collaboration with NAVAL GROUP, we wish to study a number of issues related to the optimal planning and control of cooperative multi-agent systems. The objective of this contract is therefore to identify and test methods for generating trajectories that respond to a set of constraints dictated by the interests, the modes of perception, and the behavior of these actors.

The first problem to study is the strategy to adopt during the game. The strategy consists in defining “the set of coordinated actions, skillful operations, and maneuvers with a view to achieving a specific objective”. In this framework, the main scientific issues are (i) how to formalize the problem (often as the optimization of a cost function) and (ii) how to define several possible strategies while keeping the same implementation tools (tactics).
The second problem to study is the tactics to follow during the game in order to implement the chosen strategy. The tactics consist in defining the tools to execute the strategy. In this context, we study techniques such as MPC (Model Predictive Control) and MPPI (Model Predictive Path Integral), which make it possible to predict the evolution of the system over a given horizon and therefore to take the best action based on the knowledge available at time t.
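To make the idea concrete, here is a minimal MPPI sketch for a 1-D double integrator driven toward a goal position. All names, dynamics and cost terms here are hypothetical illustrations of the general technique, not the project's actual models:

```python
import math
import random

def mppi_step(x0, v0, goal, horizon=20, samples=200, lam=1.0, dt=0.1, sigma=1.0, seed=0):
    """One MPPI update for a 1-D double integrator (state: position x, velocity v).

    Samples noisy control sequences, rolls each one out over the horizon,
    scores it with a running cost (distance to goal plus control effort),
    and returns the exponentially weighted first control, i.e. the action
    to apply now in a receding-horizon loop.
    """
    rng = random.Random(seed)
    rollouts = []
    for _ in range(samples):
        u_seq = [rng.gauss(0.0, sigma) for _ in range(horizon)]
        x, v, cost = x0, v0, 0.0
        for u in u_seq:
            v += u * dt                      # simple Euler-integrated dynamics
            x += v * dt
            cost += (x - goal) ** 2 * dt + 0.01 * u ** 2 * dt
        rollouts.append((cost, u_seq))
    # Path-integral weighting: low-cost rollouts dominate the average.
    best = min(c for c, _ in rollouts)
    weights = [math.exp(-(c - best) / lam) for c, _ in rollouts]
    total = sum(weights)
    u0 = sum(w * seq[0] for w, (_, seq) in zip(weights, rollouts)) / total
    return u0
```

In a closed loop, `mppi_step` is called at every time step with the current state, and only the first control of the averaged sequence is applied before re-planning, which is what makes the scheme predictive over a horizon while deciding at time t.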
The third problem is that of combining the proposed approaches with AI-based ones, in particular machine learning. Machine learning can intervene both in the choice of the strategy and in the development of the tactics. The possibility of simulating a large number of games could allow the training of a neural network whose architecture remains to be designed.

Usine du Futur (22-25)

Fundings: Naval-Group (2022-2025)

Description: The context is that of the factory of the future for Naval Group in Lorient, for submarines and surface vessels. As input, we have a digital model (for example of a frigate), the equipment assembly schedule and measurement data (images or lidar). Most of the components to be mounted are supplied by subcontractors. As output, we want to monitor the assembly site to compare the “as-designed” with the “as-built”. The challenge of the contract is the need for coordination on construction sites for planning decisions. It is necessary to be able to follow the progress of a real project and check its conformity using a digital twin. Currently, since checks must be made on board, inspection rounds are required to validate progress as well as the mountability of the equipment: for example, the cabin and the fasteners must be in place, with holes for the screws, etc. These rounds are time-consuming and accident-prone, not to mention the constraints of the site, for example the temporary lack of electricity or the numerous temporary assembly and safety fittings.

NXP CIFRE (24-27)


Embedded machine learning solutions for vision-based autonomous navigation

Fundings: NXP CIFRE

Description: The PhD thesis sets up a complete perception system based on a generic spatio-temporal multi-level representation of the scene (geometric, semantic, topological, …) that provides the information needed by an ontology of navigation tasks and directions originating from various modalities (sound, text, images, other systems). The geometric representation is provided by a state-of-the-art SLAM algorithm, while the PhD focuses on extracting semantic and topological information using a data-based approach; a graph-based abstraction toolbox is developed to make the connection with ontologies on one side and with the task to be performed on the other.
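As an illustration of such a graph-based multi-level representation, here is a minimal sketch of a scene graph linking geometric landmarks, semantic labels and topological places. All class and relation names are hypothetical, chosen only to illustrate the idea, not the toolbox actually developed in the thesis:

```python
class SceneGraph:
    """Minimal multi-level scene representation: nodes carry a level
    (geometric, semantic or topological) and edges carry a relation,
    so semantic labels and places can be attached to raw landmarks."""

    def __init__(self):
        self.nodes = {}   # node id -> {"level": ..., "data": ...}
        self.edges = []   # (source id, destination id, relation)

    def add_node(self, nid, level, data):
        self.nodes[nid] = {"level": level, "data": data}

    def add_edge(self, src, dst, relation):
        self.edges.append((src, dst, relation))

    def neighbors(self, nid, relation=None):
        """Nodes reachable from `nid`, optionally filtered by relation."""
        return [d for s, d, r in self.edges
                if s == nid and (relation is None or r == relation)]


# Hypothetical usage: a SLAM landmark labeled as a door, inside a corridor.
g = SceneGraph()
g.add_node("lm1", "geometric", {"xyz": (1.0, 2.0, 0.0)})
g.add_node("door", "semantic", {"label": "door"})
g.add_node("corridor", "topological", {"name": "corridor"})
g.add_edge("lm1", "door", "labeled_as")
g.add_edge("corridor", "lm1", "contains")
```

Queries over such a graph (e.g. "which landmarks does this place contain?") are what would connect the representation to an ontology on one side and to the navigation task on the other.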

SAFRAN CIFRE (24-27)


Dense SLAM using artificial intelligence approaches for inertial-vision data integration

Fundings: SAFRAN CIFRE

Description: The objective of the thesis is to study the capacity of deep neural networks to deal with the SLAM problem using several sensor modalities, in order to take advantage of each. The difficulty lies in finding a representation space common to the different modalities while maintaining a representation of the robot’s poses in SE(3). The architecture to be developed should take advantage of attention mechanisms (developed in Transformers) to weight the measurements coming from the different sensors (images, inertial data) according to the state of the robot (proprioceptive information: inertia) as well as the environment (exteroceptive information: vision). The balance between real-time operation and performance, as well as robustness to dynamic, uncertain and complex environments, are important elements to consider in the study.
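To illustrate the attention-based weighting of modalities, here is a minimal sketch of scaled dot-product attention over per-modality feature vectors, with a query derived from the robot state. All vectors and function names are hypothetical toy examples; the actual architecture is precisely what the thesis sets out to design:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def fuse_modalities(query, keys, values):
    """Scaled dot-product attention over modality tokens.

    `query` is a state embedding; `keys`/`values` hold one vector per
    sensor modality (e.g. visual and inertial features). Returns the
    fused feature and the per-modality attention weights, so a modality
    whose key aligns with the current state contributes more.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    fused = [sum(w * v[i] for w, v in zip(weights, values))
             for i in range(len(values[0]))]
    return fused, weights
```

In a learned architecture the queries, keys and values would be produced by trained projections of the raw features; this sketch only shows the weighting mechanism that lets the network down-weight, say, visual features when the scene is dynamic and lean on inertial ones instead.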