An underwater camera positioned near the Kermadec Islands, north of New Zealand, to capture footage of sharks and other marine life reeled in a big one recently when a great white took a special liking to the rigging.
According to a report in the New Zealand Herald, the 13-foot-long shark, named “Kermit,” was a surprise catch for researchers from Massey University, who were more accustomed to seeing grey reef sharks and whitetip reef sharks, among others.
The project, called Global FinPrint, is a worldwide effort funded by billionaire Microsoft co-founder Paul Allen. Launched in the summer of 2015, it uses baited remote underwater video surveys (BRUVs) to record sharks, rays, and other marine life on coral reefs.
The goal is to better understand the coral reef ecosystem and how humans impact species and their habitats, according to the FinPrint website. “Ultimately, the consolidation of this collaborative global research into one single analysis will aid management and conservation efforts for life on the reef.”
According to the NZ Herald, there’s no live video feed from the cameras, so the researchers analyzing footage at the end of the day were excited when they saw the great white.
The shark made multiple passes at the camera, giving the gear “curiosity bites” before picking up the entire BRUV rig three separate times and swimming with it to the surface before dropping it.
“Getting the great white at the end of the trip was definitely a highlight though, especially considering they’ve only been recorded at the Kermadecs a few times before,” said Dr Adam Smith, who led the expedition. “It’s likely that some great whites stopover while migrating between NZ and the tropics, like humpback whales do.”
In a post this summer on the company’s Tech Dev Blog, Allen’s Vulcan Inc. shared more about how machine learning is being used in the process.
The use of BRUVs to collect the videos eliminates the need for people on the reef, whose presence could affect the behavior of the sharks and rays, allowing for a more accurate record of the animals in the area. The BRUVs are left underwater for 60-90 minutes, after which each collected video requires annotation by two different humans, usually grad students, to identify the animals that appear. The annotations are then evaluated and verified by a third expert. These annotations will make up the final dataset that will be released.
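The dual-annotation and expert-verification step described above can be sketched in a few lines of Python. Everything here, the function name `reconcile`, the timestamp-keyed dictionaries, and the expert callback, is a hypothetical illustration of the workflow, not Vulcan's actual code.

```python
# Hypothetical sketch of a dual-annotation workflow: two annotators label
# each BRUV video independently, and a third expert resolves disagreements.
# All names and data structures are illustrative.

def reconcile(annotations_a, annotations_b, expert_review):
    """Merge two independent annotation dicts (timestamp -> species).

    Timestamps where both annotators agree are accepted directly;
    any disagreement is passed to a third expert for a final decision.
    """
    final = {}
    for t in sorted(set(annotations_a) | set(annotations_b)):
        label_a = annotations_a.get(t)
        label_b = annotations_b.get(t)
        if label_a == label_b and label_a is not None:
            final[t] = label_a  # both annotators agree
        else:
            final[t] = expert_review(t, label_a, label_b)  # expert verifies
    return final

# Example: two grad students annotate sightings by timestamp (seconds).
a = {12: "grey reef shark", 45: "whitetip reef shark"}
b = {12: "grey reef shark", 45: "great white"}

# Here the expert sides with annotator B on the disputed sighting.
merged = reconcile(a, b, lambda t, la, lb: lb)
print(merged)  # {12: 'grey reef shark', 45: 'great white'}
```

Only the reconciled annotations would enter the released dataset, which is why each video passes through two annotators before the expert check.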
Vulcan’s Technology team has developed ElasmoFinder, a machine learning tool designed to reduce the need for human analysis of the video captured by the BRUVs. ElasmoFinder, named after Elasmobranchii, the subclass of fish that contains sharks and rays, speeds up the annotation process by automatically identifying animals in the videos.
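One way a tool like this can speed up annotation is by scoring each sampled frame with a trained detector and collapsing runs of confident detections into candidate sightings for a human to verify. The sketch below assumes that general approach; the detector output format, the `candidate_sightings` function, and the threshold are all illustrative assumptions, not ElasmoFinder's actual implementation.

```python
# Illustrative sketch of ML-assisted annotation: per-frame detections from
# a trained model (simulated here as plain tuples) are collapsed into
# candidate sighting spans, so annotators review events, not every frame.
# This is NOT ElasmoFinder's actual code.

def candidate_sightings(frame_scores, threshold=0.8):
    """Collapse per-frame scores into (start, end, species) spans.

    frame_scores: list of (timestamp_sec, species, confidence) tuples.
    Consecutive confident detections of the same species are merged
    into one span; low-confidence frames end the current span.
    """
    spans = []
    current = None  # (start, end, species) of the span being built
    for t, species, conf in frame_scores:
        if conf >= threshold:
            if current and current[2] == species:
                current = (current[0], t, species)  # extend the span
            else:
                if current:
                    spans.append(current)
                current = (t, t, species)  # start a new span
        elif current:
            spans.append(current)
            current = None
    if current:
        spans.append(current)
    return spans

# Hypothetical detector output sampled at one frame per second:
scores = [(10, "grey reef shark", 0.91), (11, "grey reef shark", 0.88),
          (12, "grey reef shark", 0.40), (30, "great white", 0.95)]
print(candidate_sightings(scores))
# [(10, 11, 'grey reef shark'), (30, 30, 'great white')]
```

Under this scheme a human annotator only needs to confirm a handful of proposed spans per video instead of scrubbing the full 60-90 minutes of footage.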