Government scientists, academic researchers, and fishermen are working together to develop innovative monitoring tools to identify and measure fish from digital images. This technology could revolutionize the way fisheries data are collected.
Machine vision technology advances electronic monitoring systems on fishing vessels, which use cameras to record video of commercial catches. With this technology, scientists can automate image analysis at sea, eliminating manual data processing on land and providing quicker access to data for management decisions.
Catch Monitoring in Commercial Fisheries
Alaska’s commercial fisheries are the biggest and most valuable in the nation. Their successful and sustainable management relies on accurate information on fishing effort and catch. Traditionally, data have been collected by fishery observers, scientists who live and work aboard fishing vessels. However, deploying an observer is not feasible on all vessels -- smaller boats may not have enough bunk space, safety equipment, or space on deck to accommodate an observer.
Fisheries scientists and managers are increasingly turning to electronic monitoring (EM) to augment observer data by deploying camera systems to remotely monitor compliance and record catches on fishing boats. These systems produce vast amounts of video data that are reviewed back on land, delaying the availability of data that could be used for fishery management.
More Timely Management
“We will be sending managers data, not images, right from the dock,” explains Farron Wallace of the Alaska Fisheries Science Center (AFSC), who is leading the project. “It’s a big goal and a difficult challenge. But the end result will be more cost-effective, safe, and timely data collection.”
The EM Innovation Project started in 2013 in the Alaska Fisheries Science Center’s Fisheries Monitoring and Analysis Division and is funded by NOAA Fisheries’ Fisheries Information Systems and National Observer Program (FIS/NOP). Wallace’s EM Innovation Team is working in collaboration with numerous international, federal, and state agencies, as well as with Dr. Jenq-Neng Hwang of the Information Processing Lab at the University of Washington College of Electrical Engineering, who is leading the development of machine vision to automate length measurement and species identification.
The team is also collaborating with the fishing industry, a partnership that helped overcome one of the biggest hurdles to the project: developing equipment that can survive Alaskan weather and seas.
“Fishermen have been enormously helpful in creating hardware that can live on a boat. We’ve made great advances based on their suggestions for better methods, materials, and equipment,” says EM Innovation Team member Suzanne Romain. Romain also points out the project would not have been possible before now.
“We started developing machine vision for fisheries just when the hardware capable of running this powerful image analysis algorithm was becoming available -- that’s really recent. Other industries are using the same technology. We are applying it to fisheries management. We had the enormous luck of coming into the field just when that field is exploding.”
A Photo Booth for Fish
Other challenges the EM Innovation Team has tackled include developing an image library for training machine learning algorithms to measure and identify fish in different lighting conditions and positions. Development of a camera chute, or photo booth for fish, has been a key tool for addressing these problems. This system is aiding the development of fish shape models that will be used for machine vision analysis of images from stereo cameras deployed on the rails of longline sablefish and halibut fishing boats, where bycatch is released without being brought on board.
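As a rough illustration of the stereo measurement described above (a hypothetical sketch, not the team's actual software), fish length can be estimated by triangulating two landmark points, snout and tail, seen by both cameras of a calibrated stereo pair. The example below assumes Python with OpenCV and NumPy; the projection matrices would come from a prior stereo calibration, and the landmark pixel coordinates from an upstream detection step.

    # Hypothetical sketch: estimating fish length from a calibrated stereo pair.
    # P_left / P_right are 3x4 projection matrices from stereo calibration;
    # snout and tail are ((x, y) in left image, (x, y) in right image) pixel pairs.
    import numpy as np
    import cv2

    def triangulate(P_left, P_right, pt_left, pt_right):
        pts_l = np.array(pt_left, dtype=float).reshape(2, 1)
        pts_r = np.array(pt_right, dtype=float).reshape(2, 1)
        X = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)  # homogeneous 4x1 point
        return (X[:3] / X[3]).ravel()  # 3D point in the calibration's units (e.g., mm)

    def fish_length(P_left, P_right, snout, tail):
        snout_3d = triangulate(P_left, P_right, snout[0], snout[1])
        tail_3d = triangulate(P_left, P_right, tail[0], tail[1])
        return float(np.linalg.norm(snout_3d - tail_3d))  # straight-line snout-to-tail length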
Results so far have been encouraging. “EM is not as good as an observer, but we are getting closer. Machine vision is able to distinguish species, like some rockfish and sole that are difficult for an observer to differentiate. That’s very exciting,” says Wallace. “Our system parallels other intelligent monitoring systems such as security systems that automate facial recognition or highway transit systems that automate vehicle identification.” In addition to serving as a tool to develop machine vision, the chute is being used directly to collect halibut bycatch data on trawlers, where bycatch is sorted on deck and can be fed through the chute as the fish are released.
According to EM Team member Craig Rose, “In this application, the system currently produces actual data directly at sea. And it gives a census of halibut released, not just a sample.”
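To give a sense of how the species identification Wallace describes works in general (an illustrative sketch only, not the project's code), a convolutional network pretrained on generic images can be fine-tuned on labeled chute images. The species directory layout below is a placeholder, and the example assumes Python with PyTorch and torchvision.

    # Hypothetical sketch: fine-tuning a pretrained CNN to classify fish species
    # from chute images. The folder layout chute_images/<species>/*.jpg is assumed.
    import torch
    import torch.nn as nn
    from torchvision import datasets, models, transforms

    transform = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    dataset = datasets.ImageFolder("chute_images", transform=transform)  # placeholder path
    loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))  # one output per species

    optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    model.train()
    for images, labels in loader:  # one pass over the labeled images
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()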
A Foundation to Build On
An ultimate goal of the machine vision project is to produce open-source software and hardware that could be used nationwide for EM and will serve as a foundation for others to build their own systems. “For example, open-source products could empower fishermen to assemble their own systems. If they operate out of a remote Alaska port far from service areas, they could order online and build their own,” Romain says.
Rose puts the project in historical context: “I’ve been watching the development of this kind of tool for a long time -- since the 1980s. So many attempts made a certain amount of headway but didn’t get there. Then the next project had to start at zero because it was all proprietary. Our approach is to come up with open source software and hardware that other people can start from and move forward. Our hope is to put out something that can be built on.”
Watch this video at: https://www.youtube.com/watch?v=7gV9jRCcgH0&t=4s