The final installment of our three-part conversation with the MarineSitu team focuses on the origins of our partnership, the exciting things made possible by our combined resources, and the ways computer vision powers Blue Economy innovation.
Check out the conclusion of the conversation below and revisit Parts 1 and 2.
Bennett Glace: Welcome back to AI in Plainsight and thanks for tuning into the conclusion of my three-part conversation with MarineSitu’s co-founders. In the first two installments, we covered MarineSitu’s origins, the ways its technology addresses common Blue Economy challenges, and some of their most exciting projects so far. Our final chat focuses on the partnership between MarineSitu and Plainsight and all the ways our combined powers enable customers.
And what made Plainsight an especially appealing computer vision partner?
James Joslin: As we were building up our company and trying to build our software pipeline, we realized we needed a lot of tools that we could either develop ourselves or find through a partner. We did a review of the providers out there and landed on Plainsight.
Mitchell Scott: I think there have been two main reasons we were looking for a computer vision partner and eventually ended up with Plainsight. The first is that we have lots and lots of archived data, and we need a tool to help us annotate that data and train models in a way that is scalable and easily deployable, then essentially download those models and push them onto edge devices.
In the past, this work – because it’s very research-based – has involved a lot of one-off implementations of computer vision algorithms. We needed something more centralized and buildable so we can iterate really easily, annotate using AutoLabel, and things of that nature. Plainsight was really key in that, providing a platform where we could do all of this online and have our data stored in the cloud. The second reason is that Plainsight has given us access to computer vision experts. We as a team are relatively proficient in the space, but it’s good to have people who are also proficient, and in some cases more proficient, particularly with some of the more advanced algorithms. It’s nice to have a team that’s able to help advise and build algorithms for specific detection cases. That’s been really key so far in building some of the projects we’re working on, and we think it’s going to be just as important over the next six months to a year as we work toward a couple of projects.
BG: And what are some of those projects that have helped kick off the relationship?
MS: So there have been a couple of projects so far, all focused on detecting fish in marine environments. One of those has been at the site Paul was alluding to earlier, where there are fish near a turbine in a river system. Using the labeling tool alongside the ability to build and deploy models has been really key there. I have also been focusing on a lot of data collected alongside the University of Washington for the detection of potential collision events at a pass near Sequim Bay, Washington. In that case, we’re doing something similar: trying to detect individual instances of fish near a hydrokinetic device and see how their behaviors are potentially influenced by its presence.
JJ: There are a couple of different levels of models that we’re trying to build. The initial one just detects the fish and decides which images to save and archive for review. The next level tries to understand or classify either the type of fish or what they’re doing. As Mitchell mentioned, there’s a study we’re doing with the University of Washington where there’s a lot of interest in what these fish do when they come into the realm of a turbine underwater. Are they trying to avoid it? Are they able to do that? Are they gathering behind it and aggregating around it like it’s a reef? Either way, it’s important for us to understand these behaviors so we can try to mitigate or even promote them in certain cases.
BG: You’ve mentioned a lot of ongoing work, but are there any goals or projects in mind that are especially exciting for the first year of this partnership in particular?
JJ: Well, we’re very excited to get these models out and deployed on our devices. That project with the turbine in Alaska is one we’re especially excited about. In addition, there’s going to be a deployment of a turbine by the University of Washington here in Puget Sound where we’ll be involved with data processing and management. We’re also working with a new customer right now doing fish passage monitoring for hydroelectric dams and getting into that space. In general, we see this technology as being pretty broadly applicable across the Blue Economy. This year, we’ve got at least five different deployments that we’ll be working on over the next six months. One of the things Mitchell alluded to as to why we chose this partnership, and why we’re excited to work together, is the scaling it’ll allow us to do. As a small group, the ability to leverage all of these tools and boost the computing power we have in our personal workstations gives us the opportunity to grow the business in a way we wouldn’t have been able to before.
BG: Well, I think that wraps it up for today and concludes our three-part conversation. Thanks to James, Paul, and Mitchell from MarineSitu for joining us, and thank you for listening to this multi-installment episode of AI in Plainsight.
For more updates from the Plainsight team and the worlds of AI, ML, and Computer Vision delivered directly to your inbox, make sure to subscribe to our AI Week in Review newsletter on LinkedIn.
If you think computer vision may be the answer to your sustainability questions, schedule a conversation to discuss your enterprise’s needs.
Thanks again for listening to AI in Plainsight. See you next time.