DARPA Subterranean Challenge
Team CoSTAR, formed by NASA-JPL, Caltech, MIT, KAIST, and LTU.
The DARPA Subterranean (SubT) Challenge is a robotics competition that seeks novel approaches to rapidly map, navigate, and search underground environments. The competition spans three years. CoSTAR is a DARPA-funded team participating in the systems track, developing and implementing physical systems tasked with traversing, mapping, and searching various subterranean environments, including natural caves, mines, and the urban underground. Team CoSTAR (Collaborative SubTerranean Autonomous Resilient Robots) is a collaboration between NASA's JPL, MIT, Caltech, KAIST, LTU, and several industry partners.
(CSIC 201850E103) MINNA: Precise positioning modules for aerial manipulation
The objective of MINNA is to consolidate the research on precise positioning for aerial robotics developed throughout the European projects ARCAS, AEROARMS, and GAUSS, and to provide the research group with a solid base from which to face the scientific challenges this topic will pose in upcoming project calls.
(EU H2020-GALILEO-GSA-2017-1-776293) Galileo-EGNOSS as an Asset for UTM Safety and Security
The GAUSS project aims at the fast and thorough achievement of acceptable levels of performance, safety, and security for both current RPAS and future UTM operations. The key element within GAUSS is the integration and exploitation of the exceptional features of Galileo-EGNOS for precise and secure positioning. GAUSS will increase resilience in UTM operations and, at the same time, ensure UTM coordination capabilities that increase the number of platforms sharing the same airspace. The UTM infrastructure will also benefit from the GAUSS Galileo-EGNOS-based ADS-B solution and encrypted air-ground communications. GAUSS includes the definition, negotiation, and execution of safe trajectories both in normal operation and in case safety or security is compromised. The GAUSS systems will be validated in two field trials (inland and at sea), operating four UTM-coordinated RPAS of different types (fixed- and rotary-wing) and EASA operational categories.
(SP DPI2017-89564-P) Event-based simultaneous localization and mapping
EB-SLAM aims at developing a high-speed, high-dynamic-range localization and mapping device that fuses inertial measurements with those of a dynamic vision sensor, commonly known as an event-based camera. Such a device could be used to estimate the motion of an autonomous vehicle or a UAV in environments without GPS readings, undergoing high dynamics, and under conditions of poor illumination or severe illumination changes.
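The fusion principle behind such a device can be illustrated with a minimal sketch (this is a hypothetical 1-D simplification for illustration, not the project's actual pipeline; all function names are invented): events are accumulated into a signed frame over a short time window, and a complementary filter blends the integrated gyroscope rate with a vision-derived orientation estimate.

```python
import numpy as np

def accumulate_events(events, shape, t0, t1):
    """Accumulate signed event polarities into a frame over [t0, t1).

    events: iterable of (t, x, y, p) tuples, with polarity p in {-1, +1}.
    Returns a float frame of the given (rows, cols) shape.
    """
    frame = np.zeros(shape)
    for t, x, y, p in events:
        if t0 <= t < t1:
            frame[int(y), int(x)] += p
    return frame

def complementary_filter(theta, gyro_rate, theta_vision, dt, alpha=0.98):
    """Blend a gyro-propagated orientation with a vision-derived one (1-D).

    High-rate gyro integration dominates short-term; the (slower, drift-free)
    vision estimate corrects long-term drift.
    """
    predicted = theta + gyro_rate * dt      # inertial propagation
    return alpha * predicted + (1.0 - alpha) * theta_vision
```

In an actual event-based visual-inertial system the state is a full 6-DoF pose and the fusion is typically done with a filter or sliding-window optimizer rather than this scalar blend, but the division of labor is the same: inertial measurements carry the high-dynamics motion, while events correct drift.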
AErial RObotics system integrating multiple ARMS and advanced manipulation capabilities for inspection and maintenance
AEROARMS proposes the development of the first aerial robotic system with multiple arms and advanced manipulation capabilities, to be applied in industrial inspection and maintenance (I&M) operations. The main objectives include the development of helicopter-based aerial manipulators able to grab and dock with one or more arms while performing dexterous, accurate manipulation with another. To achieve these objectives, AEROARMS will develop the first aerial telemanipulation system with advanced haptic capabilities able to exert significant forces with an industrial robotic arm, as well as autonomous control, perception, and planning capabilities. Special attention will be paid to design and system development with a view to future certification under ATEX and RPAS regulations.
Aerial Robotics Cooperative Assembly System
The ARCAS project proposes the development and experimental validation of the first cooperative free-flying robot system for assembly and structure construction. The project will pave the way for a large number of applications, including the building of platforms for evacuating people or landing aircraft, the inspection and maintenance of facilities, and the construction of structures in inaccessible sites and in space.
Instructing robots using natural communication skills
The ROBINSTRUCT project aims at developing the technology to instruct a general-purpose robot in a natural and human-like manner, so that it can pursue a number of assistance tasks in urban areas such as streets, a university campus, or a shopping mall. The project brings together tools from computer vision, machine learning, natural language processing, and robotics. Specifically, we will first develop parsers to represent both video and natural-language data at an intermediate level of abstraction. We will then investigate learning approaches to discover mappings between the visual/textual content and the robot action space. To bring these robots to outdoor scenarios, the human-to-robot communication skills will be combined with new algorithms to localize the robot in very large 3D maps over long periods of time, even in GPS-denied areas. For this purpose, we will integrate novel computer vision pose estimation algorithms with inertial sensors.
Perception and Action in Robotics Problems with Large State Spaces
This project is the continuation of the PAU project (DPI2008-06022), in which we mainly studied novel uncertainty parameterizations that allow efficient inference, new probabilistic hypothesis testing strategies with respect to information load, active exploration paradigms for scene and object reconstruction, and new algorithms for rigid and non-rigid object reconstruction from single images, for the application domains of mobile robot mapping and navigation, and perception and manipulation of deformable objects.
(CAT 2014 SGR 897) VIS - Consolidated Research Group