CWA Innovator Spotlight: AI & Machine Learning
[00:00:00] Louise Quigley: Hello everyone. As you're starting to join, we have a few more people that will be coming in, but I'd like to thank you all for joining and welcome to Cleveland Water Alliance's July edition of our Innovator Spotlight series. I am Louise Quigley and I'm going to be your host today. This month's spotlight focuses on innovators that are utilizing artificial intelligence and machine learning to provide solutions to the water industry.
[00:00:41] We'll be talking to experts from three companies who are currently innovating within the space, including BloomOptix, Satelytics, and AGS Scientific, representing their partners at Zweec. We do have
[00:01:00] polling that will be launched during the webinar, so please take a moment to participate in that.
[00:01:06] We also have a Q&A function at the bottom of your Zoom screen, so you are welcome to send your questions for the innovators as we go through the presentation, and we will address those in the discussion portion of the program.
[00:01:27] With that, before we get into the content, I would just like to say a brief thank you to our series sponsor. Badger Meter is our sponsor and has a family of technologies that enable faster insights for water management, including their innovative Blue Edge solution for real-time monitoring and system optimization.
[00:01:50] So thank you so much to Badger Meter for being a CWA supporter, and thank you to you all for being here and joining us today. Again, my name is Louise Quigley. For those of you that don't know me, I'm a strategic innovation executive with the Cleveland Water Alliance or CWA. I manage a number of initiatives, including our relationship with Great Lakes Renew, a National Science Foundation innovation engine.
[00:02:18] I have a background in corporate innovation at Moen, a division of Fortune Brands Innovations, and most recently at Salesforce. So thank you again. With that, I'll just say a few words about Cleveland Water Alliance, whether you're new to us or a returning guest. We are a US-based nonprofit, and we're located on the shores of Lake Erie in the Great Lakes region.
[00:02:47] CWA is a connector and an enabler of water tech innovations and innovators. We help to foster research, accelerate impactful technologies and drive economic impact. We typically are meeting with 250 to 300 innovators each year, and we're connecting over a hundred industry partners, over 30 utility partners and 23 research institutions.
[00:03:22] So, we are very involved in each of those areas. Some of the ways that we accelerate innovation include the following: we have open innovation challenges to help align innovation focus with market need. We have accelerator test beds and piloting options to provide validation, real-world conditions, and user feedback.
[00:03:50] We also provide funding, partner support, and services to support the businesses that are bringing technologies to market. Additionally, we provide market exposure to accelerate market familiarity and adoption for innovators, and we make connections with manufacturers and distributors to accelerate expansion and growth.
[00:04:17] So that's a little bit about us, but let's get to today's topic. AI and ML are pretty much the hottest topics in technology right now. But since we're all exposed to them, let's just do a little level setting and cover some high-level definitions and some real-world examples, in case it's not an area you're involved in every day.
[00:04:43] So artificial intelligence, or AI as it's known, is the ability of a computer or a machine to do tasks that usually require human intelligence, like understanding language, recognizing images, or making decisions. Some examples of that in the real world would be things like voice assistants, like Siri or Alexa, that understand your speech and respond with helpful information.
[00:05:10] Another one would be self-driving cars. We all know about those. They're making decisions on the road like when to stop, when to turn, or to avoid obstacles. Also spam filters. We all know about those. They're detecting and blocking unwanted emails and they're recognizing patterns in those messages to do so.
[00:05:31] When it comes to machine learning, or ML: this is actually a subset of AI where computers learn from data and improve their performance over time without being explicitly programmed for each task. So examples in our real world of that would be Netflix recommendations, where it's learning what you like to watch.
[00:05:53] It's suggesting shows or movies based on your viewing history. Email spam detection learns from past spam emails to better filter out new ones. And credit card fraud detection is where monitoring your spending patterns spots unusual or suspicious transactions. So today, we will be diving into these topics from a water technology standpoint.
[00:06:25] And with that, I'm going to introduce you to our expert panel of innovators. First up, we have Dr. Igor Mrdjen, the project and science lead at BloomOptix. Igor has focused on changing how we see HABs, or harmful algal blooms, for over a decade. He works with emerging technologies such as drones, advanced sensors, and AI, in research that focuses on improving how we study our environment and protect human health.
[00:07:01] Igor's been working with Lake Associations, academics and NGO partners to design, test, and launch BloomOptix solutions since 2019.
[00:07:16] Next up we have Jay Almlie, the Chief Marketing Officer at Satelytics. Jay leads a focused team charged with conveying the central message of Satelytics as the leader in geospatial analytics for the energy sector. Jay brings a unique perspective to marketing, having previously been a customer of Satelytics, where he managed the founding, formation, and execution of the nationally acclaimed industry consortium, the Intelligent Pipeline Integrity Program, or iPipe. With Jay at the helm, iPipe awarded
[00:07:57] three consecutive contracts to Satelytics to develop software tailored specifically to the needs of pipeline operators, placing Satelytics in the leadership position they hold today. And last but not least, we have Gene Warning, who is the sales manager at AGS Scientific, representing their partners at Zweec today.
[00:08:19] We'll be discussing Zweec's technology. Gene has worked both in the field and in the lab at different points in his career, representing instrumentation and service companies in the analytical market. In his role at AGS Scientific, he communicates daily with clients and prospects and identifies market trends.
[00:08:42] And with that, I'd like to hand it over to Igor to tell us a little bit about BloomOptix. Over to you, Igor.
[00:08:50] Igor Mrdjen: Thanks, Louise. So yeah, I'm here on behalf of BloomOptix, which is a startup that I helped found and am now the lead innovation person for. We basically set out to improve how we use emerging technologies to monitor for algal blooms in lakes, ponds, rivers, what have you.
[00:09:16] Next slide, please, if you could. So, one of the things that we realized in the early days of BloomOptix, when we were using drones to monitor for HABs and differentiate them from clean water, was that the biggest bottleneck when it comes to monitoring for HABs is quite often the turnaround time between when you send your sample to a laboratory and when you get results back.
[00:09:40] A lot of the folks that we worked with were telling us, you know: I have two options in the whole state that I can send my samples to for analysis, and then I'm waiting five, 10, maybe 20 days to hear back from them. And that's just not good enough
[00:09:59] from the standpoint of human health protection, lake management, and so forth. So what we really wanted to do was shrink that time down by using some advanced technologies; 30 seconds or less is the turnaround time we targeted. So next slide, please.
[00:10:20] And so the first thing that we did was partner up with a firm out of the UK called IO Light. They make these really cool, portable digital microscopes. They're linked up to your cell phone, which really enables us to tap into the power of the internet, allowing us to database, geolocate, store photos, communicate rapidly across vast distances, and so forth. And we also aimed for the entire setup to be run by a machine-learning program on the software end, so that we can make it accessible, reliable, and representative of what you're actually seeing in the field.
[00:11:01] So, next slide, please. All right. So, the way it works is basically on the front end, a user will take 10 or so photos, provide the location where those photos were taken, and then any other field data that they might be collecting, like depth, turbidity, sensor readings, anything like that.
[00:11:20] And they would upload that information into the BloomOptix app on their cell phone and hit send. In the backend, the AI or machine learning magic happens where the software will, basically take a look at those photos, trace any sign of bacterial colonies that it's detecting and communicate that data back to the user within about 30 seconds.
[00:11:43] While it's doing that, it's also calculating cell density, so we can get cell counts and colony counts for each sample uploaded and present them to the user, so they can make a call on whether or not the sample meets regulatory guidance in their region.
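The field-to-result workflow Igor describes (counts in, a regulatory call out) might be sketched as follows. All names, structures, and the threshold here are illustrative, not BloomOptix's actual API; the 20,000 cells/mL figure is a commonly cited WHO recreational guidance value, used only as an example.

```python
# Hypothetical sketch of the workflow described above: per-colony cell
# counts come back from the analysis, and the app compares overall cell
# density against a guidance threshold. Names and values are illustrative.

GUIDANCE_CELLS_PER_ML = 20_000  # example threshold (WHO recreational Alert Level 1)

def assess_sample(colony_cell_counts, sample_volume_ml):
    """Sum per-colony cell counts and compare density to a guidance threshold."""
    total_cells = sum(colony_cell_counts)
    density = total_cells / sample_volume_ml  # cells per mL
    return {
        "total_cells": total_cells,
        "density_cells_per_ml": density,
        "exceeds_guidance": density >= GUIDANCE_CELLS_PER_ML,
    }

# Example: three detected colonies in a 1 mL sample
result = assess_sample([12_500, 8_000, 4_300], sample_volume_ml=1.0)
```

The point of the sketch is the shape of the output: a user in the field gets a density number and a yes/no flag they can act on immediately, rather than a raw lab report weeks later.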
[00:12:05] And it's also databasing all of this data, so you can go back and do analytical assessments of historical trends, and download your data and put it into GIS or what have you. So next slide, please. And this is basically an example of how our machine learning algorithm sees and functions.
[00:12:28] It takes a look at each one of these colonies separately and compares it to the library that we have trained it on, and it produces the colored mask that you see on the right there for each colony, and provides a percentage number for each label it assigns. That percentage communicates its confidence in how closely those labels align with what we've trained the AI on, so that you can go back and double check its work.
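The per-label confidence review Igor describes can be sketched as a simple triage step. Everything here (field names, the 0.80 threshold) is hypothetical, intended only to show how a confidence score lets a user double-check the model's work.

```python
# Illustrative triage of labeled detections by confidence score, as in the
# workflow described above: confident calls are accepted, low-confidence
# calls are flagged for human review. All names and thresholds are made up.

def triage_detections(detections, review_below=0.80):
    """Split detections into accepted and flagged-for-review by confidence."""
    accepted = [d for d in detections if d["confidence"] >= review_below]
    flagged = [d for d in detections if d["confidence"] < review_below]
    return accepted, flagged

detections = [
    {"label": "Microcystis", "confidence": 0.96},
    {"label": "Dolichospermum", "confidence": 0.71},
    {"label": "Aphanizomenon", "confidence": 0.88},
]
accepted, flagged = triage_detections(detections)
# the 0.96 and 0.88 detections are accepted; the 0.71 one is flagged
```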
[00:12:59] And if you see any errors, you can take action accordingly and so forth. But yeah, these masks are also what we use to calculate cell counts for each type of cyanobacteria that the system detects. So, go ahead, Louise. Next slide. All right, there we go. So yeah, one of the things
[00:13:23] that we wanted to do was compare how the AI performed against what you would get from a human doing the same task. And the formatting seems to be a little messed up on the pie chart on the left here, but basically what we found was that as long as you're uploading clear and well-taken imagery to the AI, it has about a 94%
[00:13:50] accuracy rate that aligns exactly with taxonomy. And one of the issues that some folks were shown to run into was just that the photos they were providing to the AI were lower quality, and because of that, they saw some reduced accuracy.
[00:14:16] But what we saw from our users and our testing is that the more people used the system, the better the photo quality got overall. And then, when we compared our cell counts to the FluoroProbe readings in the laboratory, what we found was a pretty good correlation between the two technologies,
[00:14:35] but this is only for samples that haven't been introduced to shipping stress and long delay times before analysis. As soon as you introduce that variable of shaking the sample, or keeping it in the dark or in the cold, those two technologies diverge, which showcases another great reason to immediately test your samples on site, rather than shipping them to a laboratory and introducing them to all that stress.
[00:15:08] so yeah. With that, I think I'm done. I think my next slide is the outro. So, I'm really happy to be here and, and participate.
[00:15:16] Louise Quigley: Thank you so much, Igor. We're now gonna pass it over to Jay, who's going to tell us a bit about Satelytics. And as a reminder to everyone, you can submit questions for the panelists using the Q&A feature at the bottom of your screen.
[00:15:31] So over to you, Jay.
[00:15:34] Jay Almlie: Good morning everyone. Thanks for inviting me. Appreciate the time. Very briefly: Satelytics is a software company, and what we do is ingest large amounts of imagery data, usually satellite data because it's most effective, but we can use others. And what we're doing is looking at spectral signatures in that imagery to produce timely alerts,
[00:16:08] early indications of problems in the field. For us, our customers are industry customers who have vast geographies that they need to monitor. For the sake of today's discussion, that could be water bodies: rivers, lakes, even the Great Lakes. And what we're doing is analyzing that imagery; it's essentially spectroscopy.
[00:16:37] Using AI-powered algorithms, we're looking for certain fingerprints. Is it a HAB? We measure HABs by chlorophyll-a and phycocyanin, and we can measure those constituents. Is it nitrogen or phosphorus feeding those? We can measure chemicals. We can measure metals. All of those things have a specific spectral signature, but in every case, the name of the game is early indication, because if it takes you weeks to get that data, it's too late.
[00:17:14] So I applaud Igor and BloomOptix for getting the gist of that. They, too, want to do things very quickly. Next slide please. In a nutshell, here's how it works. Passively, the power of the sun bounces off a feature on the face of the Earth. In our case today, we're talking about a water body. Most likely, that reflected
[00:17:41] sunlight is caught by a sensor. That sensor can be on a UAV, an airplane, or a satellite. We use satellites for about 95% of what we do. Those photons are then converted into zeros and ones. That's data, especially in the near-infrared spectra; that's what we're looking at to distinguish chlorophyll-a, nitrogen, phosphorus, bathymetry,
[00:18:15] a number of different things that we can measure by analyzing that reflected sunlight. We can only do that at speed and scale by applying AI to automatically analyze all of that data for those spectral fingerprints I talked about. The ones and zeros go into the cloud, our algorithms meet that data in the cloud, and we can then convert that into not quite real-time, but
[00:18:49] fairly immediate results, within hours, indicating you've got a problem here and here, here's what your problem is, here's how big your problem is, and indicators of what you can do about it. We can deliver that in a mobile app, via Satelytics.io, our web-based interface, or via APIs, application programming interfaces,
[00:19:18] to existing workflow software. That is what makes all this quick: we can tell you within hours what problem you're dealing with. Next slide, please. Just a sampling of the various algorithms we have developed and ground-truthed. We can do simple things like change detection, but we can also do more complicated chemical analyses for the water world.
[00:19:48] We can look at phosphorus, chloride, nitrogen, barium, calcium, copper; you can read the list as well as I can. But all of these things are distinguishable by applying AI to sort out those spectral fingerprints, and that's the name of our game. Final slide. Next. Yes. Just a few examples. I know that's tiny, but if you'd like to see them in detail, shoot me a line and I'll be happy to share more with you.
[00:20:20] Either an email or a demonstration online. But yeah, the upper left, what you see is imagery of Lake Erie. That's where we're located, actually, at the west end of Lake Erie, with Cleveland two hours to the east. And we're on the nasty side of it; we're the ones putting phosphorus into the lake because of all the farmland.
[00:20:42] And there you can see it in red. The lower left is an example of work we've done for BP. BP had concerns on the North Slope of Alaska, where they were concerned about water quality in tens of thousands of water bodies. We could scan them all at once and give them a history of chemicals and metals in those water bodies.
[00:21:09] Upper right, we're looking at rivers. Lower right, we're looking at a bay near San Diego, and we're looking at bathymetry. So we can measure physical, chemical, or biological properties by picking apart those spectral signatures with AI-powered algorithms. That's all I have.
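The per-pixel spectral arithmetic Jay describes can be illustrated with one published chlorophyll index, the Normalized Difference Chlorophyll Index (NDCI), which compares red-edge and red reflectance. This is only an illustration of the general idea; Satelytics' actual algorithms are proprietary and certainly more sophisticated, and the reflectance values below are made up.

```python
# Minimal illustration of a spectral fingerprint: band arithmetic on one
# pixel's reflectances. NDCI (red-edge vs. red) is a published chlorophyll-a
# index; it is NOT Satelytics' algorithm, just an example of the concept.

def ndci(red_edge, red):
    """Normalized Difference Chlorophyll Index for a single pixel."""
    return (red_edge - red) / (red_edge + red)

# Higher NDCI suggests more chlorophyll-a; values are unitless, roughly -1..1.
# The reflectances below are invented for illustration.
clear_water = ndci(red_edge=0.02, red=0.04)   # negative: little chlorophyll
bloom_pixel = ndci(red_edge=0.08, red=0.04)   # positive: likely bloom
```

Run over every pixel of a scene, an index like this turns raw imagery into a map of where a constituent is concentrated, which is the kind of output that becomes an alert.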
[00:21:34] Louise Quigley: Great. Thank you, Jay. Next we'll be moving over to Gene, who's going to be talking about Zweec's innovative technology. A reminder to participants: place your questions in the Q&A section in Zoom. And over to Gene.
[00:21:53] Gene Warning: Alright, well thank you everyone for having me here. My colleagues in Singapore couldn't make it so you get me.
[00:22:00] So, what we're gonna discuss is AI in the Algapro, which is the main instrument that Zweec provides. Next, please. About us: Zweec has been around for over a decade, mainly in Southeast Asia and Australia, and has now transitioned over to the US. Next, please. Key features of the Algapro: surveying phytoplankton in fresh water and sea water, scanning for and detecting harmful phytoplankton, and monitoring harmful algal blooms. And, if everything's working right, you want to predict those, because that's what's really impactful to your watershed.
[00:22:40] It utilizes artificial intelligence, machine learning, and deep learning, and it reduces the biologist's workload in phytoplankton identification. So as you can see from the chart there, there are certain phytoplankton; the nice thing about using the AI is you can take a fragment, and it'll anticipate and enumerate the actual numbers and types of phytoplankton in there.
[00:23:02] But this comes at a cost. Next, please. Here's the Algapro. It's got three components: the microscope, the software, and the library. And with the library, we have to train the instrument, which is why we still need the biologist's input in order to create the database.
[00:23:24] Next please. How deep learning is used in the Algapro: phytoplankton detection and segmentation. We talked about this briefly a minute ago with the segments; it's really difficult for a biologist to, you know, identify this type of phytoplankton because of its position.
[00:23:44] That's where the AI comes in and helps to anticipate and count those properly. It uses deep learning models for instance segmentation of phytoplankton. Images of various phytoplankton types are labeled, like I said, by biologists prior to training. Neural networks help identify the visual features for accurate detection and segmentation, and it's especially effective for filamentous algae, where segmentation supports length-based cell counting.
[00:24:19] And then there's camera-view analysis for movement control: deep learning helps interpret the view because you're in a liquid matrix and everything's moving around a little bit, so it acts as a bit of a stabilizer. And then mosaic data augmentation is applied to enrich the training data, improving robustness in diverse conditions.
[00:24:42] Because typically you just don't have one phytoplankton; you have several, and through training you can get the numbers on those different ones. Next, please. So what this basically says is the more you train, the better your results are gonna end up, because you're gonna put it through a number of training cycles.
[00:25:04] Each training cycle takes about 20 minutes, and the average time to train a species of algae is three to four days. Next, please. And then this is the method that Zweec has used for comparison: the Sedgwick-Rafter counting chamber. What they did in Singapore is compare the output of the Algapro versus the biologist's counts.
[00:25:29] And it isn't perfect: with 25 algae, they had 83.7 and 83.35 for precision and recall, respectively. So there are five that they're still continuing to make improvements on. So, you know, it's a work in progress, but it's still, I think, easier; maybe easier isn't the best way to describe it.
[00:25:54] It's one of those things where it's a lot of work to do these kinds of surveys manually, and this just speeds up the process. And then there's some marine data as well. Next please. Thank you for your time. We appreciate you allowing us to be here. PUB, Singapore's national water agency, is what really helped bring this to the US.
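The precision and recall figures Gene cites come from standard definitions, which can be sketched as below. The true/false positive and false negative counts here are invented for illustration; they are not Zweec's actual evaluation data, though they produce numbers close to the ones quoted.

```python
# Standard precision/recall computation, as used to compare automated
# identification against a biologist's counts. The tp/fp/fn counts below
# are made up purely to illustrate the calculation.

def precision_recall(tp, fp, fn):
    """Precision: fraction of IDs that were correct. Recall: fraction of
    true organisms that were identified."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# e.g. 837 correct IDs, 163 wrong IDs, 167 missed organisms
p, r = precision_recall(tp=837, fp=163, fn=167)
# p = 0.837 and r is roughly 0.834, near the figures quoted in the talk
```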
[00:26:20] Thank you.
[00:26:22] Louise Quigley: Thanks Gene. And thank you all, for your overviews. We're gonna switch now into the Q&A section of the webinar, and as we're talking to the innovators please remember to place your questions that you have for them in the Q&A section. So I have an initial question that I would like each of you to answer.
[00:26:47] and I'm going to, I'm going to start with Igor, but I'm curious if there was a specific success or type of breakthrough that convinced you to invest more heavily in AI or machine learning. And if you could talk about that briefly.
[00:27:06] Igor Mrdjen: Yeah. I don't know that it was really a specific breakthrough or success, but we just slowly saw the technology kind of improve over time.
[00:27:17] And I think we were finally ready to try it out once we saw it evolving. This was early days, before everybody knew what AI was and before AI really evolved. But one of our colleagues developed an algorithm, similar to what we use in BloomOptix, for use in the wastewater treatment space.
[00:27:42] And we thought, man, it's such a pain in the butt to have to analyze all of these samples from lakes, and we just don't have a good solution that you can take out into the field with you for cyanobacteria. So why not try it? But really, it was kind of a snowball effect of the capabilities and our understanding of machine learning slowly accumulating over time.
[00:28:06] Louise Quigley: Yeah. That's great. How about you, Jay? Was there something specific that prompted you to invest more heavily in the space?
[00:28:14] Jay Almlie: Yeah, I can't tell you that there was something specific that prompted us to invest in AI per se, because we were built on AI from the get go. It's the only way to chew through that much data
[00:28:26] quickly. But I will say that the real breakthrough was when BP, British Petroleum, became interested in using us for their purposes. As I mentioned, those 10,000 bodies of water on the North Slope of Alaska, and them wanting to know: can you detect barium? They said, we don't think we're polluting, but some people have accused us of that.
[00:28:51] Can you detect barium and barium trends in those lakes? We applied the analytics, and certainly we could. And from there, BP grew more and more interested: can you do calcium? Can you do metals? Can you do this, that, and the other? And that's really where the breakthrough was: finding a big customer who wanted to pay to help us develop additional algorithms.
[00:29:17] Louise Quigley: That's really interesting. So not so much a breakthrough prompting it, but a problem or challenge to solve for a customer that really elevated your investment in AI. How about you, Gene? Do you have a story related to that?
[00:29:37] Gene Warning: Yeah, yeah. Real quick: Zweec worked on a fish activity monitoring system in Singapore that tracks subtleties in the fish's behavior and their survival status in bodies of water.
[00:29:50] And basically, it's what the other gentlemen said as well: it decreases the amount of manpower you need, and the ability to use the AI speeds up the process.
[00:30:05] Louise Quigley: Yeah, that's great. So another question that I will ask of each of you is really about your early expectations
[00:30:20] for what AI and ML could do for you and for your customers. How did those early expectations compare to what you actually encountered along the way? And I'll go back to Igor to start.
[00:30:36] Igor Mrdjen: Yeah. I think we kind of underestimated how much the model that we ended up applying could really do.
[00:30:45] At first we started off just targeting, for example, Microcystis and Dolichospermum, 'cause those are the two most popular, let's say, cyanobacterial types that you'll encounter out and about. But what we actually did to build our library was distribute a whole bunch of the IO Light digital microscopes to a bunch of volunteers across New York, Wisconsin, Oklahoma, and other states.
[00:31:12] And we had them provide data to us, rather than us going after specific types of cyanobacteria. And at the end of the experiment, instead of an expected two or three genera of cyanobacteria, we ended up being able to detect and classify six. And so that was really kind of eye-opening, where,
[00:31:34] honestly, we just underestimated how useful it was and how relatively little data you needed to get a working model out there for pilot testing and further data acquisition.
[00:31:49] Louise Quigley: Yeah, that's great. I'm gonna move next to Gene on this. Gene, were there any expectations you had early on that
[00:32:01] compared differently to what actually happened with implementation?
[00:32:04] Gene Warning: Yeah, yeah. It was much the same, where you have to kind of temper your expectations, and then once the AI starts taking over, you don't realize how fast the uptake is once you start training the software, you know, the artificial intelligence, for the recognition.
[00:32:23] And then it just starts growing and growing, much quicker than we had anticipated, like establishing a library or a database.
[00:32:34] Louise Quigley: Thank you. Yeah, I'm gonna move over to Jay. I don't know if you have anything to add on that topic, but since you are a software company and started out in the AI space, do you feel like being so embedded in AI changed your company's mindset in any way, in how you approach risk-taking or problem-solving?
[00:33:03] Jay Almlie: Yeah, I think that's a great way to frame the question, Louise. We started off with AI, but when we started in 2015, AI in theory existed; perhaps the computing power and storage didn't exist as it does today. So we've had to learn along the way.
[00:33:24] But let's look at it macroscopically. We're in the year 2025 now. Every one of us on this call is learning new ways to use AI daily, especially with the dawn of generative AI. Similarly, Satelytics has learned new ways to apply what we started with to get better results. Let me use an example: when we got into methane work, which I know probably isn't the central focus of our discussion today, but we can measure methane emissions using this same approach.
[00:34:00] And when we got into that work, we found we needed to synthetically generate some sample sets of data, because there just wasn't much out there. And so to do that, we learned to use convolutional neural nets to create synthetic data to train the models, which already were AI-powered. So now we're using AI to train AI, and I think that's the curve that we're probably all on; we're all adapting day by day, finding new ways to apply new AI techniques to get the job done better and quicker.
[00:34:38] Louise Quigley: Yeah, I think it's, it's an interesting revelation about AI as we know that it's constantly learning, but it, it impacts our approaches as well, because things that we may have assumed early on, we start to change our point of view based on what's developing through AI and the learning. I do have some audience questions for our speakers.
[00:35:02] So, I'm gonna read this to you. The first one is for Igor. It says: thanks for your presentation. Is there a minimum lower bound to the size range of the cells identified? Does the library also include coccoid green algae, which can be difficult to distinguish from coccoid cyanobacteria without a higher level of magnification? I hope I'm pronouncing that right.
[00:35:34] And then lastly, which genera of cyanobacteria has it been trained on? That's for you, Igor.
[00:35:46] Igor Mrdjen: Got it. Thank you. So let me see if I can cover all of the questions. So, the lower bound of the cells is kind of an interesting question. What we have limited the AI to do is
[00:36:05] ignore anything below 100 pixels in size. Any colony below 100 pixels in size, it will flat out ignore, because what we found from our training set and our experience is that it's too small, at the resolution that we're using in the field, to make an accurate and good ID on, right? So that's how we limit that.
[00:36:34] So anything smaller than a hundred pixels on our photos is going to be ignored. That is the equivalent of about two or three Dolichospermum cells in a chain, because really, if you have two or three cells, it's not enough morphology at the colony level to be identified.
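The minimum-size rule Igor describes is essentially a pixel-area filter, which might look like the sketch below. The data structure and field names are hypothetical; only the 100-pixel threshold comes from the talk.

```python
# Sketch of the minimum-size filter described above: detected colonies
# under 100 pixels are dropped because there is too little morphology to
# identify reliably. Structure and names here are illustrative only.

MIN_COLONY_AREA_PX = 100

def filter_small_colonies(colonies):
    """Keep only detected colonies at or above the pixel-area threshold."""
    return [c for c in colonies if c["area_px"] >= MIN_COLONY_AREA_PX]

colonies = [
    {"id": 1, "area_px": 4200},
    {"id": 2, "area_px": 63},   # roughly 2-3 cells in a chain: ignored
    {"id": 3, "area_px": 100},  # exactly at threshold: kept
]
kept = filter_small_colonies(colonies)  # colonies 1 and 3 survive
```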
[00:36:58] The question about the coccoid cells: we did not train the system on any green algae. Everything that it identifies, or attempts to identify, is from the cyanobacterial side of things, and it ignores all the other green algae. We did this because, one, it is really time-intensive and expensive to add
[00:37:19] additional parameters, and once you open the Pandora's box of other types of algae and organisms, it really balloons what you need to do. The way we, I don't wanna say got around this, but compromised on this, is that we don't identify cells on a cell-by-cell basis.
[00:37:40] We identify cyanobacterial colonies by their morphology. And, some of the features like, how the cells are clustered, how tight they are, what color, shape and size they are, and all those different things. so that really kind of lets us get a little bit away from having to resolve each individual cell and it reduces the magnification and resolution requirements of the hardware as well.
[00:38:06] And then, what genera has it been trained on? We have six genera that we classify and enumerate: Microcystis, Dolichospermum, Aphanizomenon, Woronichinia, Limnoraphis, and Gloeotrichia. That's right, I always miss the Gloeotrichia. And from there, for Microcystis and Dolichospermum, we developed their own cell-counting models.
[00:38:35] And for the other four, we use ones that have been established in the literature, and our paper on all of this methods development should be published in the next couple of weeks. So if you really wanna nerd out like we did, you can follow that whole thing along.
[00:38:52] Louise Quigley: I don't even know if I'd get past the first sentence, but Igor, I'm sure our audience appreciates that.
[00:38:59] I do have another question from the audience, and this one is for Jay. Does Satelytics provide realtime or near realtime monitoring of freshwater bodies? And what are the current limitations in latency or frequency of data refresh, especially for smaller inland water bodies like lakes, rivers, or reservoirs?
[00:39:25] Jay Almlie: What a great question, because it allows me to dive into the mechanics a little bit. I did say we focus mostly on satellites, so I'm going to dwell there to answer this question, but I think the audience can extrapolate if they wanted to use aerial data or drone data. you can extrapolate, but let's focus on satellite to make the point.
[00:39:48] For freshwater bodies, latency and frequency of data refresh are really two sides of the same coin. We have enough satellites in orbit now that we can monitor daily if we wish. With high currents on Erie, for example, there are currents that would bring HABs or any pollutant downstream, or to the east, fairly quickly.
[00:40:18] You might want to monitor daily, you might want to monitor weekly; those can be done. But now I get to talk a little bit about the size of the water body, resolution, and the interplay between those. We have a choice of satellites. We can use Landsat or Sentinel, which are government satellites with coarse resolution, 20 to 30 meters.
[00:40:43] But because those bodies of water are so large, we get enough pixels to give you a good picture of what's happening chemically, physiologically, or biologically in that water, even with those big pixels. For smaller bodies of water, rivers, smaller lakes like I mentioned with BP's work, those 20 to 30 meter pixels aren't going to do the job, because the lake or pond might not even be 20 meters wide.
[00:41:14] So now we have to use alternate satellites. Airbus makes a constellation of satellites called Pleiades. Maxar offers something similar, Legion or WorldView-3, and Planet Labs now operates in excess of 400 satellites that can do that job. They're all in the neighborhood of 30 to 50 centimeter pixels.
[00:41:45] Now we can measure those smaller bodies of water with the same algorithms, analyzing the same way; better, more refined, higher resolution data just lets us get to those smaller bodies of water. I think I answered the question on latency: it's up to daily, unless we use those coarse resolution satellites provided by the government, and then it's a few times a week. I can't say daily.
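Jay's resolution tradeoff can be made concrete with a little arithmetic: whether a satellite is usable for a given water body comes down to how many pixels span it. The pixel sizes below roughly match the figures mentioned in the talk; the 10-pixel minimum is an illustrative assumption, not a Satelytics parameter, and SkySat stands in for Planet Labs' high-resolution option.

```python
# Ground sample distance (pixel size) in meters, roughly matching the
# figures Jay cites in the discussion above.
SATELLITES = {
    "Landsat/Sentinel": 30.0,  # government satellites, coarse resolution
    "Pleiades": 0.5,           # Airbus constellation
    "WorldView-3": 0.31,       # Maxar
    "SkySat": 0.5,             # Planet Labs
}

MIN_PIXELS = 10  # assumed minimum pixels across a body for useful analysis

def usable_satellites(body_width_m: float) -> list[str]:
    """Satellites whose pixel size yields enough pixels across the body."""
    return [name for name, gsd in SATELLITES.items()
            if body_width_m / gsd >= MIN_PIXELS]

# A 15 m wide pond is invisible to 30 m pixels but easy for sub-meter ones.
print(usable_satellites(15.0))
# Lake-scale water (10 km across) is fine even at coarse resolution.
print(usable_satellites(10_000.0))
```

As Jay notes, the algorithms stay the same either way; only the input resolution changes with the choice of sensor.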
[00:42:15] Louise Quigley: That's great. Thank you to the audience for these great questions coming in. I do have another one from the audience. This one's for Gene, at Zweec. How many images were needed for each taxon for training?
[00:42:30] Gene Warning: I don't know right off the top of my head. What I do know is that when it's running, it's pretty continuous.
[00:42:40] Like it's what, 20 minutes per? But I can find that out for you; it's one of those things that I just don't know right off.
[00:42:48] Louise Quigley: That's a pretty specific question, but correct. Thank you. Okay, another audience question. This one is for Igor at BloomOptix: does BloomOptix offer a scalable solution for microalgae monitoring that can integrate with drones to cover large freshwater or coastal water bodies?
[00:43:10] If yes, how do you handle spatial resolution, calibration and data stitching at scale?
[00:43:18] Igor Mrdjen: Yeah. This goes back to our origins, when we started BloomOptix using drone imagery to monitor nearshore harmful algal blooms for human health purposes, for beachgoers and so forth. We moved away from that type of system mainly because it took too long to process data and turn things around.
[00:43:49] And one of the biggest prohibitive factors with collecting drone imagery is having FAA clearance to fly everywhere. You're greatly limited by the team you have around you, or the operators you have access to, when it comes to collecting the drone imagery.
[00:44:11] And like Jay said, there are resolution issues and so forth. So we found a lot of challenges in implementing that in a cost-effective manner that was really accessible to everybody. So we pivoted towards the AI microscopy aspect, and now what we do is actually fly out to set coordinate points, collect water samples using drones, bring them back on shore, and analyze them with the microscopy.
[00:44:40] And that allows you to go way offshore and know what the wind could be bringing in or pushing out, so you can get a much more dynamic sense of what's going on, while still using the microscopy and rapid assessment methodology. But we have, admittedly, moved a little bit away from the remote sensing side of things.
[00:45:06] So if you have more questions, I would probably refer you to Jay, honestly, who's much better equipped and much more up to date on those technologies than we are.
[00:45:21] Louise Quigley: Yeah, I'll pass it over to Jay. Do you wanna speak a bit about integration capabilities with other technologies?
[00:45:30] Jay Almlie: Yeah, sure. An analogy we use frequently is something from my pipeline world. We call it defense in depth. There is no one technology that's going to cover all the holes in the Swiss cheese. We have some weaknesses from a remote sensing point of view; some of those weaknesses Igor can address, and some of them Gene can address by doing more refined point sampling.
[00:45:58] They have some holes, respectfully, that we can address. So if you layer these pieces of Swiss cheese on top of each other, you'll eventually cover all the holes. In the defense world and in the pipeline world, we've adopted that approach, and we call it defense in depth: multiple technologies applied to get you a bulletproof solution. I think that's the answer. That's how you integrate together.
[00:46:31] Louise Quigley: Yeah. That's one of the challenges that's forever going to be there with any of these technologies. I don't know, Gene, if you wanna weigh in as well.
[00:46:42] Gene Warning: No, I agree. The collaboration between the different companies is crucial in building a better monitoring system altogether.
[00:46:57] It's like we all bring our components in. Sometimes you look at a competitor like you're competing, but sometimes you need that overlap because of your shortcomings versus theirs. And then it's for the betterment of the whole process.
[00:47:16] Louise Quigley: Yeah. And I appreciate that.
[00:47:18] And I do have another question for you, Gene, from the audience. And the question is about the number of different species of algae that Zweec is currently able to differentiate and how the technology performs in freshwater versus marine environments.
[00:47:39] Gene Warning: Okay, I think we covered that briefly. On the second part, freshwater versus marine, it's really comparable.
[00:47:45] We'll go ahead and do the last part first: because you're actually looking at the algae or the phytoplankton, if you can identify it, the system can recognize it. So it's probably more important that you can identify it first. You set it up in the software, and then it'll identify it every time, regardless of the angle or the fragment it's looking at, to get you the numbers.
[00:48:07] And really, if you can identify it, and that's why we use a biologist. I don't wanna say it's unlimited, because that's not the right way to look at it, but if a biologist can identify it and it's repeatable, there are no real limits on what you can look for.
[00:48:31] Louise Quigley: Great, thank you.
[00:48:32] I do have a more general, forward-looking question that I will ask each of you to answer. Thank you again to the audience for submitting your questions in the Q&A; if there are any we are not able to get to today, you will receive a response after the webinar from our team. Okay. So my question to each of you, as we're starting to get towards the end of this session, is: in the next few years, what emerging AI or ML trends do you see as potential game changers for the water industry? Igor, I'll start with you.
[00:49:16] Igor Mrdjen: Yeah. I think there are a lot of advances in the types of models we're using currently that are
[00:49:26] not so much reinventing the game or revolutionizing things, but are incremental advances that are increasing our ability to apply some of these models more broadly. And I think we live in such a niche world in terms of application, whether it's for HABs or water quality in general, that a lot of these more generic models are gonna start to trickle in and be much more applicable to each of us.
[00:50:05] So yeah, it'll be really interesting, especially with image processing and picking up on some of these more discrete patterns; the more power we get, the more details we can pick up on. It's difficult to say what new emerging technology is going to be directly relevant, but the increased power and ability of these models is nice to see, because it opens up so many different doors for us.
[00:50:38] Louise Quigley: It's super exciting. Gene, I'll ask you the same question. Are you seeing any trends or tools that are gonna impact the water industry? Sure, sure.
[00:50:49] Gene Warning: The folks in Singapore say they see AI-powered digital twins, federated learning for cross-site data modeling, and foundation models adapted to environmental science as the major game changers.
[00:51:02] Those tools will help with the ability to simulate, predict, and respond to water-related risks more intelligently and efficiently.
[00:51:10] Louise Quigley: Yeah, the digital twins topic is one of interest to me, because I think what that means is you're able to test things by simulating the real-world environment without actually directly impacting that environment.
[00:51:24] Is that right?
[00:51:25] Gene Warning: Correct. Correct.
[00:51:27] Louise Quigley: Yeah, that's, that's cool. How about you, Jay? What's your crystal ball showing you?
[00:51:33] Jay Almlie: I'll try and distill it down to two answers. Macroscopically, I'll say simply the growing comfort level of end users in applying AI. As we all have ChatGPT in our pockets now, we're becoming more comfortable.
[00:51:54] And so these solutions, from the three of us speaking today with AI backing, are becoming more accepted. Whereas ten years ago, even five years ago, it was seen as a little too Star Trek-y, too sciencey: I'm just gonna go do my grab sample, send it to the lab, and wait two weeks for my results; I'll do it that way.
[00:52:16] Now people are growing comfortable with the use of AI, and they're accepting that it's part of our solution. I think that's changing the way we do water monitoring, for sure. Now let me offer a wild forecast; if the weatherman can do it, I can do it. So, Nostradamus: I think that quantum computing, probably within five to seven years, is going to change
[00:52:45] how we look at water modeling and how we look at pollution transport in water. Because quantum offers something like 10,000 times the processing speed of anything we have right now, we're going to be able to apply that to develop better models that act faster and are more accurate for large systems.
[00:53:10] We're gonna find answers that we can't dream of today.
[00:53:16] Louise Quigley: Very wild.
[00:53:18] Jay Almlie: It is.
[00:53:19] Louise Quigley: Yeah. Thank you. I have one final question; this is a lightning round for each of you. In less than one minute, if you could offer one piece of advice to other water technology innovators that are exploring investing in AI and ML, what would that be?
[00:53:39] And we'll go Igor, Gene, Jay.
[00:53:43] Igor Mrdjen: I would say start looking into it now, and start keeping up with where the technology is going. You know, six months ago AI couldn't draw fingers, and now we have full video, voice, ads, and basically recreations of everything in AI, and some of the new agentic models we're seeing are layering AI capabilities and nesting them within other AI capabilities.
[00:54:11] So the growth is exponential. I think we're at an inflection point where either you start paying attention and integrating it in some way into your life, or you get left behind, and it's gonna be that much tougher to catch back up tomorrow. So my number one advice is: get in while the getting's good, I guess.
[00:54:35] Louise Quigley: Yeah. Before you're left behind. How about you, Gene? What advice do you have?
[00:54:39] Gene Warning: Yeah, I would say embrace it, because it is kind of the wave of the future; this is the way it's going. But don't forget you're gonna need more than just your group of people: you're gonna need domain experts, data scientists, systems engineers, things like that. You have the idea for the AI, but don't be afraid to collaborate with others.
[00:55:03] You have to be careful about it, of course. But just because you had the thought doesn't mean you shouldn't bring some other folks on to help you achieve whatever goals you have.
[00:55:16] Louise Quigley: Yeah, good advice. And Jay, you?
[00:55:18] Jay Almlie: I'm sorry to let you down, Louise, but I couldn't say it any better than Igor did, so I'm just gonna stand on his answer.
[00:55:24] Louise Quigley: Okay, thank you. That was definitely less than a minute; I appreciate that. With that, we will be wrapping up the session. I just wanna thank our speakers, Igor, Jay, and Gene, you've been wonderful. Thank you to the audience for your thoughtful questions. I think this has been a very informative series,
[00:55:48] and this session in particular; we could probably go all day on this topic, but we only had an hour. We will be looking forward to revisiting some of these topics in our series in the future. We are taking the month of August off from webinars (not off entirely, just from webinars), and we'll be back with a new series starting in September.
[00:56:13] Our plan for that is to do a deeper dive with Badger Meter, but more specifics on what that will actually be will be coming over the next month. With that, I am going to give all of you two minutes of your day back. Thank you again for joining us, thank you to Badger Meter for your support, and thank you to our innovators. Take care, everyone. Have a great day.