AUDEL: Autonomous package delivery in urban areas
(SP TED2021-131759A-I00)

In AUDEL, we will focus on the autonomous navigation of individual robotic delivery devices under challenging conditions and in realistic city scenarios. In particular, we will leverage the expertise of the research team to investigate robust ego-motion estimation, detection and tracking of vulnerable road users (VRU), semantic understanding (augmented mapping), and predictive control. In the course of AUDEL, results will be demonstrated on a number of robotic prototypes of increasing complexity. The work on motion estimation and scene representation (e.g., detection and tracking of VRU, or semantic understanding) will first be developed for hand-held sensor suites moving freely in 3D. Likewise, the control algorithms will first be developed in simulated settings, then for the non-holonomic motion of the ONA robot, and finally for the more complex case of the open-source quadruped robot SOLO.

. . .

LOGISMILE: Last mile logistics for autonomous goods delivery
(EU EIT-UM-2020-22140)

The amount of goods to be delivered in metropolitan areas will increase dramatically in the next few years. Deliveries are becoming more frequent and fragmented, largely because of the skyrocketing use of e-commerce. Today's logistics operations in city centres have very negative effects: increased traffic congestion; safety problems for pedestrians, cyclists, and deliverers; and air and noise pollution. To tackle these challenges, the LogiSmile partners will demonstrate in pilot cities a fully autonomous delivery system consisting of an autonomous hub vehicle that works in cooperation with smaller autonomous delivery devices. To control the robots and remotely coordinate fleet operations, a back-end control centre will be piloted as well. The robots and the remote back-end control centre will be tested in different urban environments. This autonomous delivery system will reduce delivery costs, parking problems, emissions, and congestion, while ensuring flexible, rapid, and convenient deliveries.

. . .

CANOPIES: A Collaborative Paradigm for Human Workers and Multi-Robot Teams in Precision Agriculture Systems
(EU H2020-ICT-2020-2-101016906)

In CANOPIES, our goal is to develop a novel collaborative human-robot paradigm addressing the challenges of human-robot interaction and human-robot collaboration in the unstructured, highly dynamic outdoor environment of permanent-crop farming (agri-food area). CANOPIES represents the first attempt to introduce a collaborative paradigm in the field of precision agriculture for permanent crops, where farmworkers can efficiently work together with teams of robots to perform agronomic interventions, such as harvesting or pruning in table-grape vineyards. The impact of CANOPIES will contribute to filling the current gap in the development of fully autonomous robotic solutions for permanent crops by introducing a novel concept of farming robots, in which effective interaction with human workers is leveraged to mitigate the greater complexity of permanent crops as compared with field crops.

. . .

EBCON: Motion estimation and control with event cameras
(SP PID2020-119244GB-I00)

In EBCON, we continue our efforts on SLAM for event cameras (EB-SLAM), now building the map automatically and in real time. Second, we resort to artificial neural networks to learn discriminative features from events. Moreover, we investigate the use of spiking neural networks to compute salient features from events, compute optical flow, and recover camera motion. In EBCON, we also explore the design of reactive controllers requiring fast reaction times, such as obstacle avoidance for UAVs. In particular, we explore techniques that, starting from the most modern nMPC methods, allow us to handle the fast dynamics observed by our event-based estimators. In the course of EBCON, results will be demonstrated on a number of robotic prototypes of increasing complexity.
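As an illustration of how an asynchronous event stream can be turned into a dense input for feature learning or fast reactive control, the sketch below builds a simple "time surface", a standard event-camera representation from the literature. It is a generic example under assumed conventions, not EBCON's actual pipeline; all names and parameters are illustrative.

```python
import numpy as np

def time_surface(events, width, height, t_now, tau=0.05):
    """Exponentially decayed time of the most recent event at each pixel.

    events: iterable of (t, x, y, polarity) tuples (illustrative format).
    tau: decay constant in seconds (illustrative value).
    """
    last_t = np.full((height, width), -np.inf)
    for t, x, y, _polarity in events:
        if t <= t_now:
            last_t[y, x] = max(last_t[y, x], t)
    # Pixels that fired recently map to values near 1; stale or silent
    # pixels decay toward 0 (exp(-inf) = 0 for pixels with no event).
    return np.exp((last_t - t_now) / tau)

events = [(0.00, 1, 1, +1), (0.04, 1, 1, -1), (0.05, 3, 2, +1)]
surf = time_surface(events, width=4, height=4, t_now=0.05)
print(round(float(surf[2, 3]), 3))  # pixel that just fired -> 1.0
```

Because the map decays continuously between events, a controller can sample it at any rate, which is one reason such representations pair well with fast reactive control.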

. . .

DARPA Subterranean Challenge
CoSTAR team formed by NASA-JPL, Caltech, MIT, KAIST and LTU.

The DARPA Subterranean (SubT) Challenge is a robotics competition that seeks novel approaches to rapidly map, navigate, and search underground environments. The competition spans a period of three years. CoSTAR is a DARPA-funded team participating in the systems track, developing and implementing physical systems tasked with the traversal, mapping, and search of various subterranean environments, including natural caves, mines, and the urban underground. Team CoSTAR (Collaborative SubTerranean Autonomous Resilient Robots) is a collaboration between NASA's JPL, MIT, Caltech, KAIST, LTU, and several industry partners.

. . .

MINNA
(CSIC 201850E103) Precise positioning modules for aerial manipulation

The objective of MINNA is to consolidate the research in precise positioning for aerial robotics developed throughout the European projects ARCAS, AEROARMS, and GAUSS, and to provide the research group with a solid base from which to address the scientific challenges involved in competing for new project calls on this topic.

. . .

GAUSS
(EU H2020-GALILEO-GSA-2017-1-776293) Galileo-EGNOS as an Asset for UTM Safety and Security

The GAUSS project aims at the fast and thorough achievement of acceptable levels of performance, safety, and security for both current RPAS and future UTM operations. The key element within GAUSS is the integration and exploitation of the exceptional features of Galileo-EGNOS for precise and secure positioning. GAUSS will increase the resilience of UTM operations and, at the same time, ensure UTM coordination capabilities that increase the number of platforms that can share the same airspace. The UTM infrastructure will also benefit from the GAUSS Galileo-EGNOS-based ADS-B solution and encrypted air-ground communications. GAUSS includes the definition, negotiation, and execution of safe trajectories, both in normal operation and in case security or safety is compromised. The GAUSS systems will be validated in two field trials (inland and at sea) with the operation of four UTM-coordinated RPAS of different types (fixed- and rotary-wing) and EASA operational categories.

. . .

EB-SLAM
(SP DPI2017-89564-P) Event-based simultaneous localization and mapping

EB-SLAM aims at developing a high-speed, high-dynamic-range localization and mapping device that fuses inertial measurements with those of a dynamic vision sensor, more commonly known as an event camera. Such a device could be used to estimate the motion of an autonomous vehicle or a UAV in environments without GPS readings, under high dynamics, and in conditions of poor illumination or severe illumination changes.
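For readers unfamiliar with event cameras: instead of frames, the sensor emits asynchronous events (timestamp, pixel, polarity) whenever the log-intensity at a pixel changes by a threshold, which is what gives it its speed and dynamic range. The sketch below, a purely illustrative example and not the project's actual method, accumulates such events over a short window into a signed frame, the simplest way to visualize an event stream.

```python
import numpy as np

def accumulate_events(events, width, height, t_start, t_end):
    """Sum signed event polarities in [t_start, t_end) into one frame.

    events: iterable of (t, x, y, polarity) tuples (illustrative format).
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for t, x, y, polarity in events:
        if t_start <= t < t_end:
            frame[y, x] += 1 if polarity > 0 else -1
    return frame

# Two positive events at (x=2, y=3), one negative at (x=5, y=1);
# the event at t=0.012 falls outside the 10 ms window and is ignored.
events = [(0.001, 2, 3, +1), (0.002, 2, 3, +1),
          (0.004, 5, 1, -1), (0.012, 0, 0, +1)]
frame = accumulate_events(events, width=8, height=8, t_start=0.0, t_end=0.01)
print(frame[3, 2], frame[1, 5], frame[0, 0])  # 2 -1 0
```

Because events carry microsecond timestamps, the window length can be made arbitrarily short, which is what makes such sensors attractive for the high-dynamics scenarios the project targets.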

. . .

AEROARMS
(EU H2020-ICT-2014-1-644271)
AErial RObotics system integrating multiple ARMS and advanced manipulation capabilities for inspection and maintenance

AEROARMS proposes the development of the first aerial robotic system with multiple arms and advanced manipulation capabilities, to be applied in industrial inspection and maintenance (I&M) operations. The main objectives include the development of systems (helicopter-based aerial manipulators) that are able to grab and dock with one or more arms and perform dexterous, accurate manipulation with another arm. To achieve these objectives, AEROARMS will develop the first aerial telemanipulation system with advanced haptic capabilities, able to exert significant forces with an industrial robotic arm, as well as autonomous control, perception, and planning capabilities. Special attention will be paid to the design and system development with a view to future certification, taking into account ATEX and RPAS regulations.

. . .

ARCAS
(EU FP7-ICT-287617)
Aerial Robotics Cooperative Assembly System

The ARCAS project proposes the development and experimental validation of the first cooperative free-flying robot system for assembly and structure construction. The project will pave the way for a large number of applications, including the building of platforms for the evacuation of people or the landing of aircraft, the inspection and maintenance of facilities, and the construction of structures in inaccessible sites and in space.

. . .

ROBINSTRUCT
(SP TIN2014-58178-R)
Instructing robots using natural communication skills

The ROBINSTRUCT project aims at developing the technology to instruct a general-purpose robot in a natural, human-like manner to pursue a number of assistance tasks in urban areas, such as streets, a university campus, or a shopping mall. In this project, we will bring together tools from computer vision, machine learning, natural language processing, and robotics. Specifically, we will first develop parsers to represent both video and natural-language data at an intermediate level of abstraction. We will then investigate learning approaches to discover mappings between the visual/textual content and the robot action space. To bring these robots to outdoor scenarios, the human-to-robot communication skills will be combined with new algorithms to localize the robot in very large 3D maps over long periods of time, even in GPS-denied areas. For this purpose, we will integrate novel computer-vision pose-estimation algorithms with inertial sensors.

. . .

PAU+
(SP DPI2011-27510)
Perception and Action in Robotics Problems with Large State Spaces

This project is the continuation of the PAU project (DPI2008-06022). In PAU, we mainly studied novel uncertainty parameterizations that allow efficient inference; new probabilistic hypothesis-testing strategies with respect to information load; active exploration paradigms for scene and object reconstruction; and new algorithms for rigid and non-rigid object reconstruction from single images, for the application domains of mobile-robot mapping and navigation, and the perception and manipulation of deformable objects.

. . .

VIS - Consolidated Research Group (Grup de Recerca Consolidat)
(CAT 2014 SGR 897)