U.S. Department of Energy
Energy and Environment Directorate

Research Highlights


PNNL Develops EyeSea, a Machine-Learning Tool to Automate Video Analysis of MHK Sites

New tool aids in determining the environmental impact of tidal, wave, and small hydropower

November 2017

EyeSea can "watch" underwater video footage and automatically identify when wildlife enters the frame.

Marine hydrokinetic (MHK) energy taps into the movement of waves, tides, and water currents to turn a turbine and produce electricity. Similar in principle to hydropower but without the dams, MHK has the potential to generate upwards of 1,850 TWh/year. That’s enough energy to power well over 150 million U.S. homes.

So, what’s the holdup?

One thing impeding the growth of MHK is its potential impact on the environment: we simply don’t know how MHK devices affect wildlife. To better understand those effects and accelerate MHK deployment, researchers at PNNL are developing new technologies to measure and evaluate the environmental performance of MHK systems.

PNNL scientists and engineers recently created EyeSea, a software tool that automates the analysis of underwater video footage. A common and simple means of seeing how fish and mammals interact with MHK systems is to set up an underwater camera and record. One hour of video, however, can take five or more hours to assess manually, and it’s not uncommon for there to be hundreds of hours of footage to review.

To mitigate the problem, operators and researchers often resort to sub-sampling: picking random one-hour intervals of footage to evaluate. While such an approach speeds up analysis, it reduces accuracy.

Funded by DOE’s Water Power Technologies Office, EyeSea uses machine vision algorithms to “watch” video footage for any incidents where a fish or mammal is near an MHK turbine. The tool automatically detects when a fish or mammal enters the frame and flags every event. The flagged events tell an operator which segments of footage should be evaluated, significantly reducing labor time.
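The article does not describe EyeSea’s algorithms in detail, but the workflow it outlines (detect when an animal enters the frame, then flag that segment for human review) can be sketched. The Python example below uses OpenCV background subtraction as a stand-in for EyeSea’s machine vision; the flag_events function, the MIN_AREA threshold, and the detection method are illustrative assumptions, not EyeSea’s actual implementation.

```python
# Illustrative sketch only: EyeSea's real algorithms are not described in this
# article. Background subtraction stands in for "detect when something enters
# the frame"; the output is a list of time segments worth reviewing manually.
import cv2

MIN_AREA = 500  # assumed minimum blob size (pixels) to count as an animal

def flag_events(video_path):
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
    events, active, start, frame_idx = [], False, 0.0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        # Drop shadow pixels (value 127) and noise, then find moving blobs.
        _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        detected = any(cv2.contourArea(c) >= MIN_AREA for c in contours)
        t = frame_idx / fps
        if detected and not active:
            active, start = True, t        # an object entered the frame
        elif not detected and active:
            active = False
            events.append((start, t))      # flag the segment for review
        frame_idx += 1
    cap.release()
    if active:
        events.append((start, frame_idx / fps))
    return events  # (start_sec, end_sec) segments for an operator to review
```

An operator would then review only the returned segments rather than the entire recording, which is what yields the labor savings described above.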

For video footage taken underwater, identifying an object and determining whether it is an animal can be especially challenging and time consuming.

PNNL recently developed and tested the tool using footage from an Ocean Renewable Power Company (ORPC) pilot project of an MHK unit in Igiugig, Alaska. For about two months, an ORPC turbine generated electricity during the middle of Alaska’s annual salmon run. Using data from the pilot project, PNNL staff developed EyeSea to analyze the underwater footage and detect when fish were around the turbine.

Researchers analyzed 43 hours of video footage, observing fewer than 20 fish interactions and no instances of fish injury. Equally important, PNNL assessed the accuracy of EyeSea and determined it was 85 percent accurate at detecting when wildlife was present. Based on these data, PNNL is refining the algorithms behind EyeSea. If successful, EyeSea will be made available to MHK operators and developers to streamline siting and permitting processes and to meet post-installation monitoring requirements at future MHK sites.
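The article does not say how the 85 percent accuracy figure was computed. One plausible approach is a frame-level comparison of the tool’s presence/absence calls against a human reviewer’s annotations; the sketch below is hypothetical, with made-up labels, and illustrates only that style of calculation.

```python
# Hypothetical sketch: compares per-frame detections against manual
# annotations. The data here are invented for illustration.
def detection_accuracy(predicted, ground_truth):
    """Fraction of frames where the tool's presence/absence call
    matches the human reviewer's. Both inputs are per-frame booleans."""
    assert len(predicted) == len(ground_truth)
    correct = sum(p == g for p, g in zip(predicted, ground_truth))
    return correct / len(ground_truth)

# Toy usage with made-up labels for 10 frames:
pred  = [False, False, True, True, True, False, False, True, False, False]
truth = [False, False, True, True, False, False, False, True, True, False]
print(f"accuracy = {detection_accuracy(pred, truth):.0%}")  # -> 80%
```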

PNNL Research Team: Genevra Harker-Klimes, Shari Matzner, and Garrett Staines


