MachineryLink
At the Dawn of Sense and Act Technology: An Interview With Willy Pell
Progressive Farmer had the opportunity at the 2025 Consumer Electronics Show (CES), in Las Vegas, Nevada, to sit down with Willy Pell, CEO of Blue River Technology, to talk about the evolution of Blue River's LettuceBot sense and act technology, its acquisition by John Deere and the introduction of See & Spray. Today, See & Spray targets weeds. Tomorrow, the technology, Pell predicts, will be used for much more.
Here's the edited interview.
PF: See & Spray was first introduced to the farming market in 2021. Initially, Deere marketed this new machine-learning-based technology to small grains farmers who manage weed pressure on fallow acres as part of a regular rotation. Five years later, See & Spray technology detects weeds in green, growing crops -- in corn, soybeans and cotton. How did we get here?
Pell: One of the things about machine-learning systems is that they improve with use. There is no book, no database, that shows how every weed will appear at every time of day, in every climate condition, under every practice. With machine-learning systems, you look at the performance, then you look at the images where it didn't perform as well, and you improve the model so that it hits the weed every single time and saves the most chemical every single time.
PF: How does the machine learn?
Pell: There is basically a level of confidence for every single pixel gathered by the cameras. How confident are we that this is a weed class? How confident are we that this is a crop class? You basically get confidence scores that determine which images are uploaded to improve the model. But, we can't possibly upload all the images. So, onboard selection criteria take the images that most likely have the most value ... and the model improves.
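To make that idea concrete, here is a minimal Python sketch of how confidence-based triage might work onboard. The scoring rule, field names and upload budget are assumptions for illustration; Pell does not describe the actual selection criteria.

```python
# A minimal sketch of confidence-based image triage, assuming per-pixel
# weed/crop confidences are available onboard. The scoring rule, field
# names and upload budget are illustrative, not Blue River's actual criteria.
import numpy as np

def frame_ambiguity(weed_conf: np.ndarray, crop_conf: np.ndarray) -> float:
    """Score a frame by how uncertain its pixels are.

    weed_conf, crop_conf: HxW arrays of per-pixel confidence in [0, 1].
    Frames where neither class clearly wins are the most valuable to
    upload for labeling and the next training run.
    """
    ambiguity = 1.0 - np.abs(weed_conf - crop_conf)  # high when the scores are close
    return float(ambiguity.mean())

def select_for_upload(frames: list, budget: int = 10) -> list:
    """Keep only the `budget` most ambiguous frames; upload bandwidth is limited."""
    ranked = sorted(frames,
                    key=lambda f: frame_ambiguity(f["weed"], f["crop"]),
                    reverse=True)
    return ranked[:budget]
```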
PF: Blue River was working on this technology well before Deere purchased it for $305 million in 2017. How did it begin?
Pell: Blue River started with a lettuce-thinning system called LettuceBot. The germination rates of [lettuce] seeds are quite poor. But, the seeds are cheap, so farmers overplant, and then they thin the plants. They do that with hoe crews. But, the crews would often strike the plants, expose the roots to disease and things like that. So, we made this incredibly precise system that would thin the lettuce. It would spray a heavy dose of fertilizer on the lettuce to be removed.
PF: Tell us about the technology at the time.
Pell: The best technology at the time was what were called support vector machines. It is old-school machine learning where engineers would hand-tune algorithms. So, I'm going to look at a lettuce plant. It has this many leaves and this many interior angles and this brightness of green. You could break it down with your own mind and your own intellect. That was the state of the art. Then, the deep learning revolution hit, and rather than engineers hand-tuning algorithms, you use data. It took a lot of data. It went from 1,000 training images to 100,000 training images to train a model to predict crops versus weeds accurately.
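As a rough illustration of the hand-tuned era Pell describes, the sketch below feeds a couple of engineer-chosen features into a support vector machine. The features and the synthetic images are assumptions for illustration, not the actual LettuceBot pipeline; the deep-learning shift replaced this kind of feature engineering with representations learned from large image sets.

```python
# A toy contrast with the deep-learning era: engineer-chosen features feeding
# a support vector machine. Features and synthetic data are illustrative only.
import numpy as np
from sklearn.svm import SVC

def hand_crafted_features(image: np.ndarray) -> np.ndarray:
    """Reduce an HxWx3 RGB image to a few numbers an engineer chose by hand."""
    green_brightness = image[..., 1].mean()           # how green the plant looks
    foliage_fraction = (image[..., 1] > 128).mean()   # rough leaf coverage
    return np.array([green_brightness, foliage_fraction])

# Synthetic stand-ins for labeled training images (0 = weed, 1 = lettuce).
rng = np.random.default_rng(0)
images = [rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8) for _ in range(40)]
labels = rng.integers(0, 2, size=40)

features = np.stack([hand_crafted_features(img) for img in images])
classifier = SVC(kernel="rbf").fit(features, labels)
print(classifier.predict(features[:5]))
```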
PF: Tell us about the time John Deere met Blue River and how it changed things for Blue River.
Pell: We saw this future ... every single plant gets what it needs. No more. No less. Farmers were going to use fewer inputs and get better results. It was a pretty easy decision to join forces (with Deere). We went to self-propelled spraying ... operating in the full spectrum of daylight, fog, harsh shadows. We were going 6 mph with our pull-behind LettuceBot implement; then we were going 12 mph and, today, 15 mph. You used to have time to process a picture before you sprayed. On a self-propelled sprayer, you have about 30 milliseconds.
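A back-of-envelope calculation shows why that 30-millisecond window matters: it is roughly how far the sprayer travels while one picture is processed. The speeds come from the interview; applying the same 30-millisecond budget to all three is an assumption for illustration.

```python
# Back-of-envelope check: distance traveled during one ~30 ms processing window.
MPH_TO_MPS = 0.44704     # miles per hour -> meters per second
FRAME_BUDGET_S = 0.030   # roughly 30 milliseconds per picture

for speed_mph in (6, 12, 15):
    travel_cm = speed_mph * MPH_TO_MPS * FRAME_BUDGET_S * 100
    print(f"{speed_mph:>2} mph -> about {travel_cm:.0f} cm of travel per frame")
```

At 6 mph that is about 8 cm of travel per frame; at 15 mph it is roughly 20 cm, which is why processing time shrinks as the machine speeds up.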
PF: What does it take to consistently model a weed?
Pell: The cameras are RGB sensors. They see very similarly to how you and I see. Anything that would look different to you, any type of variation, is going to look different to the model. We test the model against known images, and we score ourselves. Anything that gives a less-than-good experience is outside the domain, and we expand the domain. We are perpetually expanding the domain.
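Here is a hedged sketch of that score-and-expand loop in Python. The model interface and the 0.9 threshold are assumptions for illustration; the interview does not describe the actual test harness.

```python
# Sketch of "score ourselves, then expand the domain": find the labeled test
# images the current model handles poorly. Interface and threshold are assumed.
from typing import Any, Callable, Iterable, List, Tuple

def find_hard_cases(score: Callable[[Any, Any], float],
                    labeled_images: Iterable[Tuple[Any, Any]],
                    min_score: float = 0.9) -> List[Tuple[Any, Any]]:
    """Return (image, label) pairs the current model handles poorly.

    score(image, label) compares the model's output to a known answer and
    returns a value in [0, 1]. Anything below min_score is treated as
    outside today's domain and becomes a candidate for the next model.
    """
    hard = []
    for image, label in labeled_images:
        if score(image, label) < min_score:
            hard.append((image, label))
    return hard
```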
PF: How often are you refining the model?
Pell: We can spin a model in about 36 hours. Testing procedures (require) another day or two after that.
PF: How specific is the model? Is there a Midwest model, a Southern model?
Pell: We started with very specific models for different regions. What we found was that systems performed better the more we merged models. Adding data from Brazil actually improves (performance) in the Midwest and vice versa.
PF: Today, See & Spray produces a weed map but not a weed species map.
Pell: Yes. It produces a map of where weeds are in the field. The obvious next step is species. It's in play. One hundred percent.
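For illustration only, here is one way the difference between the two maps might look as data. The field names are hypothetical.

```python
# Illustrative data shapes only: what a weed map records today versus what a
# species map would add. Field names are hypothetical, not a Deere data format.
from dataclasses import dataclass

@dataclass
class WeedDetection:
    lat: float           # where the weed was seen
    lon: float
    confidence: float    # model confidence that this is a weed

@dataclass
class SpeciesDetection(WeedDetection):
    species: str         # e.g. "Palmer amaranth", enabling species-specific treatment
```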
PF: This would be a highly valuable evolution.
Pell: Yes. Take Palmer amaranth, for example. It is more resistant [to chemical treatment] than other weed species. And, with Palmer and all its seeds, a field can suffer for years. A species map suggests different treatment strategies.
PF: Where does this go from here? Beyond treating weeds?
Pell: For the first time, farmers will have an actual sense of what their crop stand is after planting. Is it even? Is it uniform? All that correlates with the potential yield. You can drive certain decisions. Data can support a decision about inputs. Here's an example: A doctor cannot tell the gender of a person by their iris. But, an algorithm can. So, there is some subtle difference between a male and a female iris that is unknown to Western medicine, yet it is true. We let the data do the talking. If you need to ground truth an aphid infestation, is there a visual signature they make in a crop? Farmers will have (in season) actionable information. Machine learning gives us a clearer window into the production system.
For more on this topic, see "Sense and Act Tech for Precision Spray," https://www.dtnpf.com/…
Dan Miller can be reached at dan.miller@dtn.com
Follow him on social platform X @DMillerPF
(c) Copyright 2025 DTN, LLC. All rights reserved.