
AI4SEA – AI4AUV

UW vision and Robotics

AI4AUV, Artificial Intelligence for AUV-based underwater habitat restoration

Project reference: 2155163346-163346-41-925
PI: Pere Ridao Rodriguez and Narcis Palomeras
Budget: € —
Duration: 2025 – 2028

The AI4AUV subproject, led by the University of Girona (UdG), aims to develop an embedded artificial intelligence architecture that enables an Intervention Autonomous Underwater Vehicle (I-AUV) to perform complex ecological restoration tasks autonomously, safely, and efficiently. This work is part of the AI4SEA project, whose overall goal is to transform the management of marine ecosystems through intelligent robotic systems capable of mapping, monitoring, and restoring underwater habitats in a sustainable and adaptive manner.


Given the limitations of traditional marine restoration, typically based on manual diver operations, AI4AUV proposes a disruptive alternative. GAIIA (Girona Artificial Intelligence Intervention Architecture) is designed as a modular system integrating symbolic planning, behavior-based control, machine learning, and autonomous mission supervision. This architecture allows the I-AUV to interpret high-level commands expressed in natural language and convert them into symbolic plans executable through classical planning techniques (PDDL), deep reinforcement learning (DRL), and behavior trees.

A central focus of the project is the development of precise manipulation skills using a deep learning framework that combines DRL and Imitation Learning (IL). These capabilities are trained within the Stonefish simulator, allowing control policies to be optimized before they are transferred to the real-world environment. The key task is to enable the I-AUV to pick up Posidonia oceanica rhizomes from a tray and replant them accurately in selected areas. This species, essential for biodiversity and carbon sequestration in the Mediterranean, is the project's primary use case.
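The plan-to-execution pipeline described above can be illustrated with a minimal behavior-tree sketch. The node structure and the action names (detect, grasp, navigate, replant) are illustrative assumptions, not the GAIIA API; in the real system the leaf actions would wrap the learned DRL/IL manipulation policies.

```python
# Minimal behavior-tree sketch of a pick-and-replant skill.
# Node types and action names are assumptions for illustration only.

SUCCESS, FAILURE = "success", "failure"

class Action:
    """Leaf node: wraps a primitive skill returning SUCCESS or FAILURE."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn
    def tick(self, state):
        return self.fn(state)

class Sequence:
    """Composite node: ticks children in order, fails on the first failure."""
    def __init__(self, children):
        self.children = children
    def tick(self, state):
        for child in self.children:
            if child.tick(state) == FAILURE:
                return FAILURE
        return SUCCESS

# Primitive skills (placeholders for learned policies on the real vehicle).
def detect_rhizome(state):
    return SUCCESS if state.get("rhizome_in_tray") else FAILURE

def grasp_rhizome(state):
    state["holding"] = True
    return SUCCESS

def navigate_to_site(state):
    state["at_site"] = True
    return SUCCESS

def replant(state):
    if state.get("holding") and state.get("at_site"):
        state["replanted"] = True
        return SUCCESS
    return FAILURE

replant_tree = Sequence([
    Action("detect", detect_rhizome),
    Action("grasp", grasp_rhizome),
    Action("navigate", navigate_to_site),
    Action("replant", replant),
])

state = {"rhizome_in_tray": True}
print(replant_tree.tick(state))  # → success
```

A symbolic planner (e.g. one consuming PDDL) would assemble such trees from a library of skills; the tree then supervises execution and reports failures back for replanning.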

The system also incorporates an innovative shared autonomy scheme, in which the I-AUV collaborates with a remote control center through low-bandwidth, high-latency acoustic communication channels. To overcome these constraints, the vehicle converts its sensory perception into compressed symbolic representations that can be efficiently transmitted and used by both the human operator and a cloud-based Large Language Model (LLM) to update or replan missions in real time.

Finally, a digital twin of the I-AUV is developed using the Stonefish simulator. This virtual environment faithfully replicates the dynamics, sensors, and control architecture of the real vehicle, allowing mission rehearsal, plan validation, and operation simulation prior to real-world deployment, ensuring safety and reliability.
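The compression step can be sketched as follows: instead of transmitting raw imagery over the acoustic link, each detection is reduced to a few bytes drawn from a symbolic vocabulary shared by the vehicle and the control center. The field layout, object codes, and units below are assumptions for illustration; real acoustic modems impose their own framing and payload limits.

```python
# Sketch: encoding a perception result as a compact symbolic message
# suitable for a low-bandwidth acoustic channel. All codes and the
# 8-byte layout are illustrative assumptions.
import struct

# Symbolic vocabulary shared by vehicle and control center (assumed).
OBJECT_CODES = {"rhizome": 1, "tray": 2, "degraded_patch": 3}

def encode_detection(label, x, y, depth, confidence):
    """Pack one detection into 8 bytes:
    object code (1 B), confidence in % (1 B), x, y, depth in decimetres (2 B each)."""
    return struct.pack(">BBhhh",
                       OBJECT_CODES[label],
                       int(confidence * 100),
                       int(x * 10), int(y * 10), int(depth * 10))

def decode_detection(msg):
    code, conf, x, y, d = struct.unpack(">BBhhh", msg)
    label = {v: k for k, v in OBJECT_CODES.items()}[code]
    return label, x / 10, y / 10, d / 10, conf / 100

msg = encode_detection("rhizome", x=12.3, y=-4.5, depth=18.0, confidence=0.92)
print(len(msg))  # → 8 (bytes, versus megabytes for a camera frame)
```

The same symbolic messages can be rendered as text for the operator or the LLM (e.g. "rhizome at (12.3, -4.5), depth 18.0 m, confidence 0.92"), so both consume an identical, bandwidth-friendly world model.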

The planned experiments will be conducted in two phases: first, in a controlled tank environment to validate manipulation capabilities and refine models; and second, at sea, in collaboration with the OBSEA observatory. In this setting, the complete restoration cycle will be executed: detection of degraded areas, collection and replanting of Posidonia oceanica, and assessment of ecological impact.

