Financial Grant from LKAB's Foundation for Research and Development

We express our appreciation to LKAB's Foundation for Research and Development for financially supporting our research in robotics for the exploration of subterranean environments, with the aim of increasing safety and efficiency.

#LKAB #Luleå #Sweden #Robotics #artificialintelligence #mining #automation #RAI

COSTAR Shafter Sensor Configuration from Luleå University Open Sourced in SubT

Our COSTAR Shafter flying platform has been included in the DARPA SubT Virtual Tech Repository, part of the DARPA Subterranean (SubT) Challenge.

The Shafter aerial platform is an autonomous system capable of being deployed in unknown environments. In this configuration, the platform is equipped with the following sensors: a 3D LiDAR, a forward-facing RGB-D camera, and an IMU.

Open Source Code: https://github.com/.../sub.../costar_shafter_sensor_config_1
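As a quick sanity check after loading the sensor configuration in simulation, a minimal ROS node such as the sketch below can confirm that the three sensor streams (3D LiDAR, RGB-D camera, IMU) are publishing. This is only an illustrative sketch; the topic names are assumptions and may differ from those used in the actual costar_shafter_sensor_config_1 repository.

```python
#!/usr/bin/env python
# Minimal sanity-check node: waits for one message on each expected sensor
# stream of the (simulated) Shafter platform and reports the result.
# NOTE: the topic names below are illustrative assumptions, not the
# confirmed topics of costar_shafter_sensor_config_1.
import rospy
from sensor_msgs.msg import PointCloud2, Image, Imu

# Assumed topic names mapped to their expected message types.
EXPECTED_TOPICS = {
    "/shafter/lidar/points": PointCloud2,        # 3D LiDAR point cloud
    "/shafter/rgbd/depth/image_raw": Image,      # forward-facing RGB-D depth stream
    "/shafter/imu/data": Imu,                    # inertial measurement unit
}


def main():
    rospy.init_node("shafter_sensor_check")
    for topic, msg_type in EXPECTED_TOPICS.items():
        try:
            # Block for up to five seconds waiting for a single message.
            rospy.wait_for_message(topic, msg_type, timeout=5.0)
            rospy.loginfo("OK: received %s on %s", msg_type.__name__, topic)
        except rospy.ROSException:
            rospy.logwarn("No %s message on %s within 5 s", msg_type.__name__, topic)


if __name__ == "__main__":
    main()
```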


Embedded Artificial Intelligence: The ARTEMIS Vision

Our white paper, “Embedded Artificial Intelligence: The ARTEMIS Vision”, which reflects our ideas on the future of embedded artificial intelligence, is now out.

https://ieeexplore.ieee.org/document/9237345

Read about the latest trends in embedded and cyber-physical systems from the ARTEMIS Scientific Council.

#ARTEMIS #AI #ArtificialIntelligence #Robotics #Robots #Autonomy #CPS #embedded #RAI

Horizon 2020 - DISIRE @ CORDIS News

The research results from the Horizon 2020 project DISIRE have been featured in a CORDIS news article summarising the impact of the project and the prestigious recognition from the Solar Impulse Foundation (https://solarimpulse.com/) as one of the Top 1000 solutions that will change the world.

More reading here: https://cordis.europa.eu/article/id/421407-developing-technologies-to-ensure-profitable-environmental-protection?WT.mc_id=exp

Special Issue: Visual Perception for Micro Aerial Robots

Journal of Intelligent and Robotic Systems

Special Issue on: “Visual Perception for Micro Aerial Robots”

---- Introduction

Over the last decades, aerial robots have evolved from a concept into a leading-edge technology with enormous potential to become a valuable tool in multiple applications, both for human safety and for task-execution efficiency. So far, the commercial use of aerial robots has been largely restricted to the photography and filming industry, but the field is growing rapidly, with increasing investment in applications that require autonomous inspection and environmental interaction. The vision of integrating aerial robotic platforms into industrial processes is still in its infancy, with quite a few open challenges remaining. One of the backbone functionalities that these platforms should possess to enable and support such tasks is advanced perception. Specifically, from a scientific point of view, reliable localization, navigation, mapping, and object perception are topics that have received a lot of attention, but they still require further development to realize autonomous aerial inspection and physical interaction.


---- Thematic Scope

The purpose of this special issue is to address theoretical and application-oriented problems in the general area of visual perception for micro-aerial robots and to identify and provide key perception solutions that meet the real-time constraints posed by aerial vehicles, following recent advances in computer vision and robotics. Topics of interest include (but are not limited to):

·   Vision-based control and visual servoing

·   Visual navigation, mapping, and SLAM

·   Cooperative perception using multiple platforms

·   Vision-assisted floating-base manipulation

·   Deep Learning for visual perception

·   Object recognition, tracking, semantic and 3D vision techniques

·   Fusion of vision with other sensing systems, e.g., laser scanner

·   Advanced visual sensors and mechanisms (event-based, solid-state sensors, LiDAR, RGB-D, time-of-flight cameras, etc.)

·   Aerial robot applications on key enabling perception technologies

·   Model predictive control for vision-based autonomous navigation

·   Reinforcement learning for visual perception

---- Manuscript Submission


Manuscripts should describe original and previously unpublished results that are not currently under consideration for publication in any other journal. All manuscripts shall be submitted electronically at http://www.editorialmanager.com/jint/ and will undergo a peer-review process.

IPC Committee at RSS Pioneers 2018


RSS Pioneers is a day-long invitation-only workshop for senior graduate students and postdocs, held in conjunction with Robotics: Science and Systems, that seeks to bring together a cohort of the world’s top early career researchers in all areas of robotics. The first annual RSS Pioneers will be held on Monday, June 25, 2018 at Carnegie Mellon University in Pittsburgh, PA, USA.

The goal of RSS Pioneers is to foster collaboration, networking, and professional development among a new generation of robotics researchers. The workshop will include a mix of research and career talks from top scholars in the field, research presentations from attendees, networking activities, and mentoring of junior students as part of the Inclusion@RSS initiative. To facilitate attendance, the workshop will provide partial support for travel and registration for RSS Pioneers and the main RSS 2018 conference. RSS Pioneers is inspired by the format of the prestigious HRI Pioneers workshop.

For further information please visit: https://sites.google.com/view/rsspioneers2018/