Constructor Robotics participated as a partner, at the time still under our institution's previous name, Jacobs University, in the EU FP7 project “Cognitive autonomous diving buddy (CADDY)”, which addressed the challenge of replacing a human buddy diver with an autonomous underwater vehicle and of adding a new autonomous surface vehicle to improve the monitoring, assistance, and safety of the diver’s mission. The resulting system plays a threefold role, similar to the roles a human buddy diver takes on:
- the buddy “observer” that continuously monitors the diver;
- the buddy “slave” that serves as the diver’s “extended hand” during underwater operations, performing tasks such as “do a mosaic of that area”, “take a photo of that”, or “illuminate that”; and
- the buddy “guide” that leads the diver through the underwater environment.
More information on the overall project can be found on the CADDY project’s website. Constructor Robotics was responsible for the underwater perception aspects of the project, i.e., tracking the diver, recognizing the diver’s gestures, and service tasks such as mapping the environment.
Multiple datasets and code repositories were also generated as part of the project, e.g., for underwater gesture recognition, diver pose estimation, and data augmentation for underwater vision.
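Purely as an illustration of what a data-augmentation step for underwater vision can look like, the following minimal sketch simulates the wavelength-dependent color attenuation and blue-green haze typical of underwater imagery. This is a hypothetical NumPy example, not the project’s actual code; all function names and parameters are assumptions.

```python
# Hypothetical sketch (not project code): a minimal "underwater-style"
# augmentation for RGB images, simulating color attenuation and haze.
import numpy as np

def underwater_augment(image, depth_factor=0.5, haze=0.2, rng=None):
    """Apply a simple synthetic underwater color cast and haze.

    image: HxWx3 uint8 RGB array.
    depth_factor: 0..1, strength of the color attenuation (assumed parameter).
    haze: 0..1, amount of blended blue-green veiling light (assumed parameter).
    """
    rng = np.random.default_rng() if rng is None else rng
    img = image.astype(np.float32) / 255.0

    # Red light is absorbed fastest under water, green second, blue least.
    attenuation = np.array([1.0 - 0.8 * depth_factor,   # R
                            1.0 - 0.4 * depth_factor,   # G
                            1.0 - 0.1 * depth_factor])  # B
    img *= attenuation

    # Blend in a blue-green veiling light to mimic backscatter/haze.
    veil = np.array([0.0, 0.35, 0.45])
    img = (1.0 - haze) * img + haze * veil

    # Mild random brightness jitter so augmented samples differ.
    img *= rng.uniform(0.85, 1.15)

    return (np.clip(img, 0.0, 1.0) * 255).astype(np.uint8)

if __name__ == "__main__":
    # Example usage with a random placeholder image.
    dummy = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
    augmented = underwater_augment(dummy, depth_factor=0.6, haze=0.25)
    print(augmented.shape, augmented.dtype)
```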
Publications
The following publications by Constructor Robotics appeared in the context of the project. The list also includes a few related publications that were published after the project ended.
[1] M. Pfingsthorn, A. Birk, F. Ferreira, G. Veruggio, M. Caccia, and G. Bruzzone, “Large-Scale Image Mosaicking using Multimodal Hyperedge Constraints from Multiple Registration Methods within the Generalized Graph SLAM Framework,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Chicago, USA, 2014. https://doi.org/10.1109/IROS.2014.6943209 [Preprint PDF]
[2] A. G. Chavez and A. Birk, “Underwater Gesture Recognition based on Multi-Descriptor Random Forests (MD-NCMF),” Progress Report, EU-FP7 Cognitive autonomous diving buddy (CADDY), grant 611373, 2015.
[3] A. G. Chavez, M. Pfingsthorn, A. Birk, I. Rendulic, and N. Miskovic, “Visual Diver Detection using Multi-Descriptor Nearest-Class-Mean Random Forests in the Context of Underwater Human Robot Interaction (HRI),” in IEEE Oceans, Genoa, Italy, 2015. https://doi.org/10.1109/OCEANS-Genova.2015.7271556 [Preprint PDF]
[4] I. Enchev, M. Pfingsthorn, T. Luczynski, I. Sokolovski, A. Birk, and D. Tietjen, “Underwater Place Recognition in Noisy Stereo Data using Fab-Map with a Multimodal Vocabulary from 2D Texture and 3D Surface Descriptors,” in IEEE Oceans, Genoa, Italy, 2015. https://doi.org/10.1109/OCEANS-Genova.2015.7271561 [Preprint PDF]
[5] N. Miskovic, A. Pascoal, M. Bibuli, M. Caccia, J. A. Neasham, A. Birk, M. Egi, K. Grammer, A. Marroni, A. Vasilijevic, and Z. Vukic, “CADDY Project, Year 1: Overview of Technological Developments and Cooperative Behaviours,” in IFAC Workshop on Navigation, Guidance and Control of Underwater Vehicles (NGCUV), 2015. https://doi.org/10.1016/j.ifacol.2015.06.020 [Preprint PDF]
[6] N. Miskovic, M. Bibuli, A. Birk, M. Caccia, M. Egi, K. Grammer, A. Marroni, J. Neasham, A. Pascoal, A. Vasilijevic, and Z. Vukic, “CADDY – Cognitive Autonomous Diving Buddy: Two Years of Underwater Human-Robot Interaction,” Marine Technology Society (MTS) Journal, vol. 50, pp. 1-13, 2016. https://doi.org/10.4031/MTSJ.50.4.11 [Open Access]
[7] N. Miskovic, A. Pascoal, M. Bibuli, M. Caccia, J. A. Neasham, A. Birk, M. Egi, K. Grammer, A. Marroni, A. Vasilijevic, and Z. Vukic, “CADDY Project, Year 2: The First Validation Trials,” in 10th IFAC Conference on Control Applications in Marine Systems (CAMS), Trondheim, Norway, 2016. https://doi.org/10.1016/j.ifacol.2016.10.440 [Preprint PDF]
[8] M. Pfingsthorn and A. Birk, “Generalized Graph SLAM: Solving Local and Global Ambiguities through Multimodal and Hyperedge Constraints,” International Journal of Robotics Research (IJRR), vol. 35, pp. 601-630, 2016. https://doi.org/10.1177/0278364915585395 [Preprint PDF]
[9] A. G. Chavez, C. A. Mueller, A. Birk, A. Babic, and N. Miskovic, “Stereo-vision based diver pose estimation using LSTM recurrent neural networks for AUV navigation guidance,” in IEEE Oceans, Aberdeen, UK, 2017. https://doi.org/10.1109/OCEANSE.2017.8085020 [Preprint PDF]
[10] N. Miskovic, A. Pascoal, M. Bibuli, M. Caccia, J. A. Neasham, A. Birk, M. Egi, K. Grammer, A. Marroni, A. Vasilijevic, D. Nad, and Z. Vukic, “CADDY project, year 3: The final validation trials,” in IEEE Oceans, Aberdeen, UK, 2017, pp. 1-5. https://doi.org/10.1109/OCEANSE.2017.8084715 [Preprint PDF]
[11] A. G. Chavez, A. Ranieri, D. Chiarella, E. Zereik, A. Babic, and A. Birk, “CADDY Underwater Stereo-Vision Dataset for Human-Robot Interaction (HRI) in the Context of Diver Activities,” Journal of Marine Science and Engineering (JMSE), special issue on Underwater Imaging, vol. 7, 2019. https://doi.org/10.3390/jmse7010016 [Open Access]
[12] A. G. Chavez, A. Ranieri, D. Chiarella, and A. Birk, “Underwater Vision-Based Gesture Recognition: A Robustness Validation for Safe Human-Robot Interaction,” IEEE Robotics and Automation Magazine (RAM), vol. 28, pp. 67-78, 2021. https://doi.org/10.1109/MRA.2021.3075560 [Preprint PDF]
[13] A. Birk, “A Survey of Underwater Human-Robot Interaction (U-HRI),” Current Robotics Reports, Springer Nature, vol. 3, pp. 199-211, 2022. https://doi.org/10.1007/s43154-022-00092-7 [Open Access]