Plainsight’s partnership with MarineSitu is already making waves. In interviews, our teams have detailed the ways durable submersibles combined with cutting-edge computer vision solutions are powering green innovation across emerging Blue Economy sectors like marine energy. In the latest episode of AI In Plainsight, I sat down with MarineSitu’s Co-Founders to discuss the origins of their business and some of the challenges their technology has already helped to address.
In Part 2 of the conversation, we dove into some of the most exciting MarineSitu projects, including initiatives to protect marine populations and serve communities. Check it out and stay tuned for next week’s conclusion.
BG: Welcome back to AI in Plainsight. I’m Bennett Glace and thanks for tuning into Part 2 of my conversation with James Joslin, Paul Murphy, and Mitchell Scott, the co-founders of MarineSitu. Last time, we discussed the team’s backgrounds and some of the challenges their technology was developed to address. In this Part 2, we’re taking a deeper dive into some cold Washington and Alaska waters to learn more about some of MarineSitu’s most exciting and impactful projects.
So you mentioned your vision for the future of the Blue Economy and the kind of space you’re operating in. But I’m wondering if we could look back and maybe you could speak to some of the early projects and applications you guys worked on and some of the lessons learned from those early wins.
JJ: Sure. We started off developing the AMP thinking mainly about the mechanical challenges, and in general we think of the development we went through as having three different phases.
There was the mechanical development phase, where we were just figuring out how to put all these different instruments into a single body and make sure it survived biofouling, corrosion, and all of that. Then there was the electrical and initial software challenge of getting all the instruments to work well together without crosstalk. Ultimately, the biggest challenge, and what we are still working on, is the data management and processing. You get instruments in the water and you start getting all this data, but then you need to be really smart about what you do with that data to actually get the information that you need. Initially, the deployments we were doing were really just to demonstrate each of those different steps.
And we got to do a whole series of deployments at the Pacific Northwest National Laboratory’s Marine and Coastal Research Lab out in Sequim Bay, Washington. They have a pretty cool little site there with access to a pier that sticks into the mouth of Sequim Bay and offers a moderate tidal channel. It’s been a great place to basically “test-deploy” a number of our systems. The initial version of the AMP was deployed there, followed by what we called the iAMP, which was the first AMP that really integrated all the instruments together software-wise. Then came the 3G AMP, the third-generation AMP capable of doing that first step of data processing and management in real time.
In addition, we’ve done a few different versions that have been deployed elsewhere. One was deployed out in Hawaii at what’s called the Wave Energy Test Site (or WETS). That’s off of the Marine Corps base and it’s another cool site to go to and do work. That was in conjunction with a wave energy converter called the Fred Olsen Lifesaver buoy. Basically, we installed one of our instrumentation systems on that device and had it in the water for almost eight months out in Hawaii. We’ve also had another version that has lived on one of UW’s research vessels and allows us to test turbines on the vessel, and another version that we deployed this last year with tidal turbines developed by the University of Washington.
BG: I’d love to know, what specific conditions are you monitoring for when you’re looking at these environments? What’s most important for marine energy providers when it comes to understanding what’s happening around their turbines? What are some of the impacts of not getting a good look? Can we unpack that a little bit?
JJ: For monitoring underwater turbines, there are a whole number of challenges, but what we’re interested in seeing is how the turbine affects marine life. If there are fish swimming through the area, are they being impacted by the turbine? You hear about wind turbines killing birds, and that can lead people to think that putting a turbine underwater will mean the same problems with fish. But that may not actually be the case. There’s a lot of evidence to suggest that fish are better at avoiding these turbines, and the turbines also move significantly slower than wind turbines do. Still, there are a number of challenges with actually trying to optically monitor these turbines because of the water quality, the size of the turbine, and how far you can see with a camera underwater.
We’ve had a couple of pretty unique deployments that we’ve been part of that have allowed us to kind of dig into that and start to see what we can do with underwater cameras. Paul, you’ve been most involved with the RivGen turbine up in Alaska. Anything you can say about that?
PM: The RivGen turbine is in the Kvichak River, which flows out of Lake Iliamna. It’s a very exciting project because the turbine is positioned to support Igiugig, a native village of maybe 50 residents who currently get all their power from diesel fuel that’s flown in. That’s obviously very expensive and not environmentally friendly, so this turbine represents a really great way to transition to a cheaper and more sustainable power source.
One of the challenges of this deployment is that the salmon that run through the river are essential to the local economy and the people who live there. They survive off the salmon; it’s one of their primary food sources. And these turbines are hydrokinetic, so they are spinning in the water, and there’s concern about whether or not the fish might be impacted. They might run into the turbine, get hit by the blades, or maybe just get disoriented as they pass over the turbine blades. That can create a risk of predation: if they’re disoriented moving through the water, a bird can fly down and easily pick them out. That’s one of the main reasons we want to study this particular turbine. We’ve been working with them since 2019 and helped them deploy some cameras, initially a pair, one upstream of the turbine and one downstream. So, you can imagine, we’re looking across the turbine to see what it looks like when fish come into its flow and pass out of its wake. That has so far been pretty enlightening. There’s been some really great data that a couple of biologists from the University of Alaska Fairbanks have been analyzing to get a statistical sense of how many fish are passing through and how many appear like they might be impacted by the turbine blades.
That’s been really great and really exciting. We’re continuing to work with them. In the coming month or so, we’ll be deploying our first machine learning-based detection algorithm with them to hopefully make this process a little easier. Currently, their collection strategy is to collect 10 minutes of data every 60 minutes, forever. This produces a massive amount of data that no one can reasonably look through. It’s a real challenge and we hope to help here by introducing this real-time detection and allowing them to very easily isolate the exact frames from each camera that have fish in them so they can do their counts and analysis more quickly.
One of the challenges we’re seeing is that the water quality is quite variable. Sometimes, when the water quality is excellent, you can really clearly see the definition of a fish passing through the wake. Other times, because of factors like ice melt, things can be blurry and the fish might look like a blob. That’s something we’ve been working on with Plainsight: training a machine learning-based detector to detect those anyway. The idea is to capture as many of these sorts of objects as we can, whether or not we know for sure they’re fish, and then pass that data off to the biologists to ascertain whether we’ve actually seen a fish.
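To make that workflow a little more concrete, here is a minimal sketch, not MarineSitu’s or Plainsight’s actual pipeline, of how a detection score could be used to winnow a recorded clip down to just the frames that appear to contain fish. The detect_fish function, the clip path, and the confidence threshold are all placeholders for illustration.

```python
# Hypothetical sketch: scan a recorded clip and keep only the frames where a
# fish detector fires above a confidence threshold. "detect_fish" is a stub
# standing in for whatever model is actually deployed.

import os
import cv2  # OpenCV for video decoding and frame export


def detect_fish(frame) -> float:
    """Placeholder detector: return a confidence score in [0, 1].

    In practice this would wrap a trained model; here it is only a stub so
    the surrounding pipeline is runnable.
    """
    return 0.0


def isolate_fish_frames(clip_path: str, out_dir: str, threshold: float = 0.5):
    """Write out only the frames whose detection score clears the threshold."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(clip_path)
    kept = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of clip
            break
        score = detect_fish(frame)
        if score >= threshold:
            out_path = f"{out_dir}/frame_{frame_idx:06d}.png"
            cv2.imwrite(out_path, frame)
            kept.append((frame_idx, score))
        frame_idx += 1
    cap.release()
    return kept  # (frame index, score) pairs for the biologists to review


if __name__ == "__main__":
    hits = isolate_fish_frames("upstream_camera_clip.mp4", "fish_frames")
    print(f"Kept {len(hits)} candidate frames from the clip")
```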
We’re very excited about that project. So far, we think it’s been really successful, and we’re really thrilled to be going forward with it this spring.
BG: And to pick your brains a little more, what do you do when the water quality is inconsistent? Do you rely on computer vision and machine learning for that out of the gate, or do you complement that with some of the acoustics and sonar that you’ve mentioned? Could you speak to that a little more?
JJ: It’s certainly not a challenge that’s unique to that location. Almost every site has variable water quality, and in some places it’s so bad that optical cameras are just not an option, not effective at all. Imaging sonars are one of the next best tools. They basically use sound to create a 2D image of what you would otherwise see. It looks pretty different from an optical image, as you can imagine, but you can apply a lot of the same data processing techniques. Mitchell, you’ve been most involved in that. Do you want to talk about the imaging sonar machine learning you’ve done?
MS: Imaging sonar, as James said, is one of the ways we can get around, or at least improve upon, really poor optical data. The problem with imaging sonar information is that it’s much, much lower resolution than what you get out of a good optical image. So a fish that would appear in an optical image to have distinct features such as a fin or a tail might, in the imaging sonar world, look more like a blob or just a high-intensity point.
There’s been a question of how accurate you can get with detection in imaging sonar, species-level identification, and things of that nature. What we’ve found in some past projects is that by applying common machine learning algorithms to imaging sonar data for the detection of things in the water, recognizing a fish versus some turbidity or a rock, we are able to do that. We’re able to say “this is a fish” or “this is not a fish.” Using those types of algorithms, we want to continue to advance what we can do when turbidity is really high and we’re not able to see through the water very well. I think we’ve had pretty good initial success.
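As a rough illustration of that fish / not-fish step, and not the model MarineSitu actually uses, a tiny binary classifier over single-channel sonar crops might look something like the sketch below; the architecture and the 64x64 input size are assumptions made purely for the example.

```python
# Hypothetical sketch: a small CNN that maps an imaging-sonar crop to a
# fish probability, in the spirit of the binary classification described above.
# Architecture, input size, and training data are all assumptions.

import torch
import torch.nn as nn


class SonarFishClassifier(nn.Module):
    """Tiny CNN: single-channel sonar crop in, one fish-vs-not-fish logit out."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),  # single logit: fish vs. not fish
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))


if __name__ == "__main__":
    model = SonarFishClassifier()
    # One fake 64x64 single-channel sonar crop, just to show the call pattern.
    crop = torch.rand(1, 1, 64, 64)
    prob_fish = torch.sigmoid(model(crop)).item()
    print(f"fish probability: {prob_fish:.2f}")
```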
BG: Before the conversation, we sent some questions back and forth and you guys had some notes on past projects. Two that we haven’t mentioned are the DAISY and the Micro Float. Could you tell me a little about those?
JJ: Sure. As mentioned, we came out of the University of Washington, the Applied Physics Lab and the Mechanical Engineering Department in particular. My research background was with the Adaptable Monitoring Package there, but we were also involved with the development of a couple of other technologies, including the DAISY, which stands for the Drifting Acoustic Instrumentation System, and the Micro Float, which is basically a buoyancy-driven instrumentation platform, a very small and low-cost system. Both of these are systems we’re working with our lab there to further develop and commercialize, so we see them as tools in our toolkit and products we can offer to expand our monitoring capabilities.
Specifically, both of these platforms have applications for acoustic monitoring of these devices. One of the big concerns around marine energy systems is that having these moving systems in the water creates a lot of noise and it might deter animals from being in that area. Animals like harbor porpoises or whales that use echolocation to navigate will often avoid noisy places because when they can’t hear, they can’t navigate.
One of the requirements for a lot of these devices is to go out and sample the acoustics of the device and try to see how loud it is and whether or not it’s gonna have an impact. Both the DAISY and the Micro Float have pretty unique capabilities for collecting data around these devices in very complex wave and tidal environments where they’re deployed.
BG: Thanks for listening to Part 2 of my conversation with the MarineSitu team and thanks again to James Joslin, Paul Murphy and Mitchell Scott for joining me. Make sure to tune into the next episode of AI in Plainsight for the conclusion to our discussion and details on how our partnership with MarineSitu got started.
Stay Tuned for the Conclusion
We’ll conclude the conversation and discuss how Plainsight and MarineSitu come together to support a broad range of Blue Economy use cases in the next episode of AI in Plainsight.