

UDRONE: Intelligent underwater robot for omnidirectional and immersive exploration of the benthos

Project reference: CTM2017-83075-R
Project Leader: Rafael Garcia
Budget: €226,270
Duration: 2017-2020

The oceans play a key role on our planet. They regulate the Earth’s climate and supply living and non-living resources, being a major source of food and many raw materials. Although they cover over 70% of the Earth’s surface, we know very little about this vast part of our planet. Underwater exploration is fundamental to improving our knowledge of the ocean. Exploration helps to ensure that ocean resources are well managed and that they will be preserved for future generations to enjoy. However, the ocean is a hostile environment for human beings, which makes its exploration difficult, especially at greater depths. The absence of light and the increasing pressure make the ocean a dangerous place for humans to operate. Therefore, sea exploration beyond the capacity of human divers requires the use of underwater vehicles.

In this project we continue our previous work on a radically novel paradigm for ocean exploration through the development of an intelligent underwater robot. Recent progress in information technology has opened compelling alternatives to traditional methods of exploration. In our previous project we successfully converted the untethered Girona500 Autonomous Underwater Vehicle (AUV) into a Hybrid Remotely Operated Vehicle (HROV). The promising results obtained so far lead us to pursue the same goals while improving the robot’s spatial awareness, providing a fully immersive telepresence experience to the scientists leading the exploration. This project will therefore take a step forward in two main directions.

Firstly, the development of novel real-time omnidirectional stereo algorithms, together with sensor fusion techniques, will make the robot more aware of its environment and of possible obstacles and hazards, automatically avoiding such situations and ensuring its safety while it is teleoperated by an unskilled pilot (e.g., a scientist exploring the ocean floor). This requires the ability to build 3D maps of the environment in real time, a goal we will achieve by equipping the robot with two omnidirectional camera systems.
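The text does not describe the mapping pipeline itself, but a minimal sketch can illustrate the idea: turning an omnidirectional (equirectangular) depth panorama, such as one derived from an omnidirectional stereo pair, into a 3D point cloud and a coarse occupancy set. The function names, projection model, and parameter values below are assumptions made for illustration, not the project’s implementation.

```python
import numpy as np

def panorama_depth_to_points(depth, min_range=0.3, max_range=20.0):
    """Convert an equirectangular depth panorama (H x W, metres) into a
    3D point cloud in the camera frame.

    Hypothetical helper: assumes each pixel stores the range along the
    viewing ray of a full 360 x 180 degree panorama; the real system's
    projection model and calibration are not described in the text.
    """
    h, w = depth.shape
    # Azimuth spans [-pi, pi) across columns, elevation [pi/2, -pi/2] down rows.
    az = (np.arange(w) + 0.5) / w * 2.0 * np.pi - np.pi
    el = np.pi / 2.0 - (np.arange(h) + 0.5) / h * np.pi
    az, el = np.meshgrid(az, el)

    valid = (depth > min_range) & (depth < max_range)
    r = depth[valid]
    # Spherical to Cartesian: x forward, y left, z up.
    x = r * np.cos(el[valid]) * np.cos(az[valid])
    y = r * np.cos(el[valid]) * np.sin(az[valid])
    z = r * np.sin(el[valid])
    return np.stack([x, y, z], axis=1)

def occupancy_from_points(points, voxel=0.25):
    """Quantise points into a sparse set of occupied voxel indices,
    a crude stand-in for the real-time 3D map mentioned above."""
    return {tuple(idx) for idx in np.floor(points / voxel).astype(int)}
```

A mapping loop would run this conversion on each new depth panorama and fuse the resulting occupied voxels, expressed in a common world frame, into the map used for obstacle detection.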

Secondly, immersive systems have made significant advances in the last two years, driven by the entertainment and game industries. These systems allow the observer to explore a virtual environment through head movements and to feel immersed in the simulated scene. In our application, the omnidirectional cameras will provide the data to build immersive panoramas of the explored area in real time, allowing the scientist to explore the seafloor using a virtual reality headset while safely driving the robot with a haptic joystick. The technology developed within this project will enable automatic obstacle detection and collision avoidance during navigation, allowing the scientist acting as pilot to navigate easily and freely in cluttered areas or areas of high 3D relief without worrying about the safety of the vehicle. When the robot detects that it is being teleoperated too close to a hazardous feature of the environment (e.g., a wall or the seafloor itself), its control system will override the pilot’s commands and provide visual feedback on the pilot’s headset, as well as force feedback on the joystick.
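As an illustration of the kind of safety supervisor described above, the following sketch filters a pilot’s velocity command against the distance to the nearest detected obstacle and returns a gain that could drive both the headset warning and the joystick force feedback. The thresholds, interfaces, and blending rule are assumptions for illustration; the project’s actual control system is not detailed in the text.

```python
import numpy as np

SAFE_DIST = 2.0   # metres: beyond this, pilot commands pass through (assumed value)
STOP_DIST = 0.5   # metres: inside this, motion towards the obstacle is fully blocked

def supervise_command(pilot_vel, obstacle_dir, obstacle_dist):
    """Filter a pilot velocity command (m/s, 3D) against the nearest obstacle.

    Illustrative only: the project's controller, sensor interface and haptic
    device API are not specified in the text.
    obstacle_dir  -- unit vector from the robot towards the nearest obstacle
    obstacle_dist -- distance to that obstacle, in metres
    Returns (safe_vel, feedback_gain), where feedback_gain in [0, 1] scales
    the visual warning and the joystick force feedback.
    """
    pilot_vel = np.asarray(pilot_vel, dtype=float)
    obstacle_dir = np.asarray(obstacle_dir, dtype=float)

    if obstacle_dist >= SAFE_DIST:
        return pilot_vel, 0.0

    # How hard the pilot is driving towards the obstacle.
    towards = max(np.dot(pilot_vel, obstacle_dir), 0.0)
    # Scale the override and the feedback as the robot closes in.
    gain = np.clip((SAFE_DIST - obstacle_dist) / (SAFE_DIST - STOP_DIST), 0.0, 1.0)

    # Remove the offending velocity component proportionally to the gain.
    safe_vel = pilot_vel - gain * towards * obstacle_dir
    return safe_vel, gain
```

A control loop would call supervise_command at each cycle, send safe_vel to the vehicle, and map feedback_gain onto the intensity of the visual and haptic cues.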

Finally, post-mission panoramic videos will be available for visualization using a virtual reality headset, thus playing a key role in science outreach for the general public.
