Trimble Talks About Its Autonomy Strategy

Trimble Plans Smart Automated Systems to Fuel Always-Learning Decision Engines

By Dan Miller, Progressive Farmer Senior Editor
Trimble added hardware to its autonomous systems with last year's purchase of French technology company Bilberry, which specializes in selective herbicide applications. Its camera and data-management system is shown here attached to the top of the sprayer booms. (DTN image courtesy of Bilberry.)

FORT MYERS, Fla. (DTN) -- Trimble is wrapping up its first Ag Partner Connections conference here with customers in a two-day presentation of its digital technologies -- collection, analysis and implementation -- and its hardware offerings.

Among the enterprise partners, as Trimble calls the large agricultural customers with which it works, are U.S. Sugar (nearly 100 years old, based in Clewiston, Florida; sugarcane, citrus and fresh vegetables such as sweet corn on 230,000 acres) and Dyson Farming (wheat, spring barley, potatoes, vining peas, strawberries and energy crops; carbon neutral on 35,000 acres in Lincolnshire, England).

Among the topics Trimble executives and systems designers presented was a discussion of automation and autonomy, led by Dave Britton, Trimble's vice president for product management, agriculture.

Britton pointed to the large amount of data collected through autonomous machines. For example, he said a self-driving car generates 40 terabytes of data an hour while driving on the road -- roughly as much as 1,000 to 2,000 iPhones can hold.
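
For a rough sense of the math behind that comparison, here is a back-of-the-envelope sketch in Python; the per-phone storage figures are assumptions chosen for illustration, not numbers from Britton or Trimble.

    # Back-of-the-envelope: how many phones' worth of storage is 40 TB per hour?
    DATA_RATE_TB_PER_HOUR = 40
    GB_PER_TB = 1_000

    # Assumed usable storage per phone (illustrative values only).
    for phone_storage_gb in (40, 20):
        phones = DATA_RATE_TB_PER_HOUR * GB_PER_TB / phone_storage_gb
        print(f"At {phone_storage_gb} GB per phone: about {phones:,.0f} phones per hour")
    # Prints roughly 1,000 and 2,000 -- the range Britton cited.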

"But it's [data] not really doing any work. That's what's exciting about autonomy in agriculture. You're going to combine self-driving with a lot of data," Britton said. "So, when you start thinking about the challenges and the ways that we're going to have to operate, in order to be successful in using this data, it's really dramatic to think about the impact [of] all the data that is going to be produced off the back of an implement."

In the broad scope of autonomous applications, Britton said tillage and spraying operations have floated to the top because, while not simple, those practices are not as complex or data-heavy as, for example, the harvest process. He said harvesting will be among the last farming functions brought into the world of autonomy because of its complexity -- a machine that would have to be self-driving and intelligent about varying harvest volumes, qualities and even grain types.

Britton offered additional insights into automation and autonomous farming operations. Here are four of them:

1. SELF-DRIVING

"Autonomy is an act [that is] a combination of self-driving and intelligence," he said. "The idea that the power unit (i.e. tractor) can move in the field itself is only going to be valuable if the operation off the back of the vehicle [i.e. an implement] is also able to run independently as well. Self-driving, [intelligence from the implement] both are very, very hard for us to do. But it's important distinction around autonomy that it creates real value [for agriculture] versus autonomy that seems really cool," Britton added.

2. AUTONOMY AS AUTOMATIONS

"When we get into automated workflows, we figured out how to get [a practice] to move through each step," Britton explained. "But it was a pretty dumb process, for lack of a better word. It would start its loop and it would just keep going. It allowed us to be much more efficient. But if there is a context [in that] you want it to drive a decision, you are pretty much out of luck. [With autonomy] the system can use contextual information to determine which automation link to apply. That's what becomes really exciting about this in my mind, because the decision-making engines become something that can keep getting better and better, we can continue to make the automated loops more fast, more effective. But the decision-making engines are going to be what really drives the value and the autonomy."

3. GEOFENCING

"In order to safely execute self-driving and autonomy in the field, we have to have a boundary with which the machine knows it can't exit," Britton said. "Once we have a boundary around the field that the machine knows it cannot exit, we can then move into path planning. [We look] within the boundary, to come up with a [path] plan that is the most efficient for the work that needs to get done. It's about the boundary. It's also about obstacles. One of the most interesting things about this is obstacles are not created equal. It's very easy to put a point on the map and say there's an obstacle. It's a very different thing [to say] that obstacle is a 20-foot telephone pole or it's just a rock. Is it something the boom [can travel] over? Or is it something that you're going to run into. But you have to have that additional information to be able to make the appropriate decisions. [Path planning] requires a lot of contextual information from the field for us to be able to design the most effective ways [in which] it's operated. We're also working on ways to make sure that operator preferences be included in the path. We all know that [operators] have preferences for the way [they make] turns and those types of things."

4. PERCEPTION

"So, we've got a boundary. We've got a plan. Now we need perception," Britton explained. "[Perception] is incredibly complex. There is a huge amount of information to ingest. It's 'I can identify whether that is something I want to drive over or not.' Our industry is one place where the decision around what to do with these insights is a little bit challenging. When you're driving a car on the road, there are very few situations where the car will make the decision to run something over," he added. "Nearly every time the car is figuring to stop, how it can get off the path it was on safely and then get back on the path. We don't really want to have to have tractors trying to figure out where to go to get around something. There's going to be situations where we purposely want to drive over things, [where] it's identified the problem and it needs to know it's okay to drive through that, like a harvest application. So, it isn't as simple as just stop, we're going to have to be able to provide enough contextual information for the machines to know whether it continues or whether to stop and call for help."

Dan Miller can be reached at dan.miller@dtn.com

Follow him on Twitter @DMillerPF

