Animal biometric systems can also be applied to the remains of once-living organisms, such as coral skeletons. These fascinating structures can be scanned via 3D tomography, yielding image stacks that computer vision scientists can then analyse.
In this project we investigated the efficacy of deep learning architectures such as U-Net at finding and measuring important features in coral skeletons automatically from such image stacks. One such feature is the colony's growth banding, which our system extracts and approximates; the image below shows the result superimposed on a coral slice. The project provides a first proof-of-concept that, given sufficiently clear samples, machines can perform similarly to humans in many respects when identifying the growth and calcification rates exposed by skeletal density banding. This is a first step towards automating banding measurements and related analyses.
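To illustrate the measurement step, the sketch below shows how growth rates follow from band positions once banding has been extracted: along the growth axis, the spacing between consecutive high-density bands gives the linear extension per banding cycle. This is a simplified NumPy illustration with a simulated density profile, not the project's code; the actual band extraction is performed by the deep network.

```python
import numpy as np

# Simulate a density profile along the coral growth axis: annual banding
# appears as a roughly periodic high/low density signal.
axis_mm = np.linspace(0, 50, 500)           # 50 mm core, ~0.1 mm sampling
profile = np.sin(2 * np.pi * axis_mm / 10)  # one band pair every ~10 mm

# Locate high-density bands as local maxima above the mean density.
interior = profile[1:-1]
is_peak = (interior > profile[:-2]) & (interior > profile[2:]) & (interior > 0)
peaks_mm = axis_mm[1:-1][is_peak]

# Linear extension per cycle = spacing between consecutive high-density bands.
extension_mm_per_band = np.diff(peaks_mm)
print(extension_mm_per_band.mean())  # close to the simulated ~10 mm period
```

In real cores the profile is noisier and the bands less regular, which is precisely why a learned segmentation of the bands is preferable to simple peak-picking on raw density.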
This work was supported by the NERC GW4+ Doctoral Training Partnership and is part of 4D-REEF, a Marie Sklodowska-Curie Innovative Training Network funded by the European Union's Horizon 2020 research and innovation programme under Marie Sklodowska-Curie grant agreement No. 813360.
Traditionally, animal biometric systems represent and detect the phenotypic appearance of species, individuals, behaviours, and morphological traits via passive camera settings, be these camera traps or other fixed camera installations.
In this line of work we implemented, for the first time, a full biometric pipeline onboard an autonomous UAV, giving the system complete autonomous agency that can be used to adapt acquisition to demanding settings such as identifying individuals in freely moving herds of cattle.
In particular, we have built a computationally enhanced M100 UAV platform with an onboard deep learning inference system for integrated computer vision and navigation, able to autonomously find and visually identify, by coat pattern, individual Holstein-Friesian cattle in freely moving herds. We evaluated the performance of the components offline, and also online via real-world field tests of autonomous low-altitude flight in a farm environment. The proof-of-concept system is a successful step towards autonomous biometric identification of individual animals from the air, in open pasture environments and inside farms, for tagless AI support in farming and ecology.
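The identification stage of such a pipeline typically matches each detected animal against a gallery of enrolled individuals in an embedding space. The following is a hypothetical sketch of that matching step only (the gallery, embedding dimension, and threshold are illustrative assumptions, not the system's actual parameters): a CNN maps each cow crop to a feature vector, and identity is assigned by highest cosine similarity.

```python
import numpy as np

# Hypothetical gallery of enrolled coat-pattern embeddings (stand-ins for
# CNN features; in practice these come from a trained network).
rng = np.random.default_rng(0)
gallery = {f"cow_{i:02d}": rng.normal(size=128) for i in range(5)}

def identify(query, gallery, threshold=0.5):
    """Return the best-matching individual, or None if no enrolled animal
    exceeds the cosine-similarity threshold (i.e. an unknown animal)."""
    best_id, best_sim = None, -1.0
    for cow_id, emb in gallery.items():
        sim = emb @ query / (np.linalg.norm(emb) * np.linalg.norm(query))
        if sim > best_sim:
            best_id, best_sim = cow_id, sim
    return best_id if best_sim >= threshold else None

# A query embedding close to cow_03's should match it; an unrelated
# random embedding should be rejected as unknown.
query = gallery["cow_03"] + rng.normal(scale=0.1, size=128)
print(identify(query, gallery))  # cow_03
```

The open-set threshold matters in the field: a herd may contain animals the system has never enrolled, and silently assigning them the nearest known identity would corrupt downstream records.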
W Andrew, C Greatwood, T Burghardt. Aerial Animal Biometrics: Individual Friesian Cattle Recovery and Visual Identification via an Autonomous UAV with Onboard Deep Inference. 32nd IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 237-243, November 2019. (DOI:10.1109/IROS40897.2019.8968555), (Arxiv PDF)
The problem of visually identifying the presence and locations of animal species filmed in natural habitats is of central importance for automating the interpretation of large-scale camera trap imagery. This is particularly challenging in scenarios where lighting is difficult, backgrounds are non-static, and major occlusions, image noise, and animal camouflage effects occur: filming great apes via camera traps in jungle environments constitutes one such setting. Finding animals under these conditions and classifying their behaviours are important tasks for exploiting the filmed material for conservation or biological modelling.
Together with researchers from various institutions, including the Max Planck Institute for Evolutionary Anthropology, we developed deep learning systems that first detect great apes in this challenging imagery and then identify the behaviours the animals exhibit in the camera trap clips.
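The detect-then-classify structure can be sketched as follows. This is an illustrative simplification, not the published system: the behaviour labels and the temporal averaging of per-frame class scores are assumptions chosen to show how clip-level behaviour recognition can build on per-frame outputs from a detector and classifier.

```python
import numpy as np

# Illustrative behaviour vocabulary (hypothetical, not the paper's label set).
BEHAVIOURS = ["walking", "sitting", "climbing", "camera_reaction"]

def classify_track(frame_scores):
    """Assign one behaviour to a detected ape track.

    frame_scores: (n_frames, n_behaviours) array of per-frame softmax
    outputs for crops produced by the detection stage. Temporal average
    pooling aggregates them into a single clip-level prediction.
    """
    clip_score = frame_scores.mean(axis=0)
    return BEHAVIOURS[int(clip_score.argmax())]

# Simulated per-frame scores for a 4-frame track, mostly "sitting".
scores = np.array([[0.1, 0.7, 0.1,  0.1],
                   [0.2, 0.6, 0.1,  0.1],
                   [0.1, 0.8, 0.05, 0.05],
                   [0.3, 0.4, 0.2,  0.1]])
print(classify_track(scores))  # sitting
```

Averaging over frames makes the clip-level label robust to individual frames where occlusion or motion blur misleads the per-frame classifier.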
F Sakib, T Burghardt. Visual Recognition of Great Ape Behaviours in the Wild. In press. Proc. 25th International Conference on Pattern Recognition (ICPR) Workshop on Visual Observation and Analysis of Vertebrate And Insect Behavior (VAIB), January 2021. (Arxiv PDF)