The MAVIS System: Towards the Use of Marsupial Robotic Networks for Automatic Sensing in Polar Regions


The study of robotic networks for exploration and environmental monitoring is currently a core research topic worldwide. In this context, robot heterogeneity, in terms of different sensing, motion, and actuation capabilities, is leveraged to perform complex missions. Furthermore, in harsh and dynamic environments, closer interactions among robots, such as the so-called “marsupial relationship”, have to be taken into account.

A marsupial relationship is a particular physical interaction in which a container robot carries one or more passenger robots. The container robot essentially acts as a carrier, enabling faster deployment of the passengers.

One of the main research topics of the Field and Interaction Robotics group at the Institute of Intelligent Systems for Automation of the National Research Council of Italy (ISSIA-CNR) arises in this scenario, where the development of new methodologies in perception, control, and communication for marsupial robotic networks is still an open challenge.


We consider a team of robots composed of:
1) an Unmanned Surface Vehicle (USV)
2) a multi-rotor Unmanned Aerial Vehicle (UAV)
3) an Unmanned Underwater Vehicle (UUV).

The proposed system features a marsupial robotic network, which we refer to as the Marsupial Autonomous Vehicles for Intelligent Sensing (MAVIS) system, in which the USV carries the UAV and the UUV to predefined zones of interest for data gathering. Under this configuration, the USV is assumed to act as a messenger between the two passengers and to provide support to the tasks the UAV and UUV have to execute.


Autonomous cooperation between the UAV and USV has to be guaranteed. In particular, a vision-based control strategy for autonomous precision landing of the UAV is currently under development and will be tested in different outdoor setups before the Arctic application.
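As a rough sketch of how such a strategy might be structured (under the assumption of a velocity-setpoint interface to the autopilot, and not the controller actually under development), horizontal velocity commands could be made proportional to the pad offset estimated by the vision system, combined with a slow constant descent:

# Minimal sketch of one possible vision-based landing loop (hypothetical,
# not the MAVIS controller): horizontal velocity commands proportional to
# the pad offset estimated by the camera, plus a slow constant descent.
# All gains and limits are illustrative placeholders.

import numpy as np

KP_XY = 0.8          # proportional gain on horizontal offset [1/s]
DESCENT_RATE = 0.3   # constant descent speed [m/s]
V_MAX = 1.0          # horizontal speed limit [m/s]

def landing_velocity_setpoint(pad_offset_xy, pad_visible):
    """Return a (vx, vy, vz) body-frame velocity command.

    pad_offset_xy: estimated (x, y) offset of the pad centre relative to
                   the UAV, in metres (from the vision pipeline).
    pad_visible:   True if the pad was detected in the current frame.
    """
    if not pad_visible:
        # Hold position (hover) while the target is lost.
        return np.zeros(3)

    v_xy = KP_XY * np.asarray(pad_offset_xy, dtype=float)
    speed = np.linalg.norm(v_xy)
    if speed > V_MAX:                      # saturate horizontal speed
        v_xy *= V_MAX / speed
    return np.array([v_xy[0], v_xy[1], -DESCENT_RATE])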

The limited flight time of the UAV will be overcome through a coupled USV-UAV system, in which the UAV is tethered to the USV. As far as the UUV is concerned, the USV will be exploited as a communication relay to make up for its limited communication capabilities.

The USV will also be used as a recharging station for the UAV and will carry a supplementary energy source for the UUV. Another key challenge of marsupial robotic networks regards mission control. Current mission control frameworks do not consider the constraints imposed by the physical connections of marsupial robots. Thus, in order to manage large-scale marsupial networks, novel mission control frameworks have to be developed.



The enclosed video shows the results of an image processing system able to estimate the pose of a target “HeliPad” using only the images acquired by a camera installed on the drone. The target consists of an external annulus with an outer diameter of 100 cm and an inner diameter of 80 cm. An additional target is an annulus with an outer diameter of 6 cm and an inner diameter of 3 cm. The “H” is made of segments of different lengths: 10 cm for the short ones, 20 cm for the medium ones, and 50 cm for the long ones.
The graphic symbols used in the video must be interpreted as follows:
1) A green dot in the center of the H shows that the search for the external annulus has been successful and that the 3D coordinates of its center are provided.
2) A blue or white dot in the center of the H shows that the search for the annulus inscribed in the H has been successful and that the 3D coordinates of its center are provided. The dot becomes white when the annulus is very close to the camera, since a different search algorithm is used in that case.
3) Eighteen green dots near the edges of the H show that the H search algorithm has been successful and is able to provide the pose of the H in terms of translation and rotation.
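As an illustrative sketch (not necessarily the method used in the video), the 3D coordinates of a detected annulus center, as reported by the green and blue dots, could be recovered by back-projecting the detected circle with a pinhole camera model, using the known metric diameter to fix the scale. The Hough-transform detection step and all parameter values below are assumptions.

# Hedged sketch: detect the outer circle of the pad and back-project its
# centre with the pinhole model, using the known 1.0 m outer diameter to
# fix the scale. Blur size and Hough thresholds are illustrative, not the
# tuned values used in the video.

import cv2
import numpy as np

OUTER_DIAMETER_M = 1.0   # outer annulus diameter stated for the pad

def detect_pad_circle(gray, fx, fy, cx, cy):
    """Return (X, Y, Z) of the pad centre in the camera frame, or None."""
    blurred = cv2.GaussianBlur(gray, (9, 9), 2)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=100, param1=100, param2=40,
                               minRadius=10, maxRadius=0)
    if circles is None:
        return None                       # detection failed
    u, v, r_px = circles[0][0]            # strongest circle: centre and radius [px]

    # Pinhole model: Z = f * D / d_px, then back-project the centre pixel.
    Z = fx * OUTER_DIAMETER_M / (2.0 * r_px)
    X = (u - cx) * Z / fx
    Y = (v - cy) * Z / fy
    return np.array([X, Y, Z])

The same relation also yields the altitude and X-Y position reported in the circle detection mode described below.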

This clip shows the helipad detection performed by means of a vision system mounted under a UAV (Unmanned Aerial Vehicle). The method, implemented on an onboard computer, is able to provide the pose of the drone with respect to the target. Two operating modes can be identified:

– Circle detection mode: When the UAV flies at high altitudes (above ~6 m), the algorithm detects the outer annulus, providing an estimate of the altitude and of the X and Y position of the drone. A green dot in the middle of the landing target is shown if the circles are correctly detected; a red dot appears if the recognition fails.

– H detection mode: When the drone operates in the altitude range of ~1.5-6 m, the 6-DOF pose estimate is returned to the control system if the “H” is completely framed. In this case, the related corners are highlighted in green.
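As an illustration of how the 6-DOF pose could be obtained from the matched H corners, the sketch below feeds the 2D-3D correspondences to a standard PnP solver (OpenCV's solvePnP). The corner model, camera matrix, and distortion coefficients are placeholders and not the calibration actually used onboard.

# Hedged sketch of the pose-estimation step: given the 2D image locations
# of the detected H corners and their known 3D positions on the pad (from
# the pad drawing), a PnP solver recovers the camera-to-target pose.
# Point ordering, camera matrix, and distortion values are placeholders.

import cv2
import numpy as np

def estimate_h_pose(h_model_points, h_image_points, camera_matrix, dist_coeffs):
    """Return (R, t): rotation matrix and translation of the H frame
    expressed in the camera frame, or None if the solver fails.

    h_model_points: (N, 3) array of corner coordinates on the pad plane,
                    in metres (z = 0 for a flat target).
    h_image_points: (N, 2) array of the matching pixel coordinates.
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(h_model_points, dtype=np.float32),
        np.asarray(h_image_points, dtype=np.float32),
        camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix
    return R, tvec

The returned translation gives the target position in the camera frame, and the rotation matrix can be converted to roll, pitch, and yaw for use by the landing controller.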