“We’re generating a lot of data every day from a bunch of different systems – feed, milk and shipping – but none of those systems talk to each other,” Breunig said.
While Breunig has access to great data, he said he can’t use it the way he’d like. He’d like a daily report on his feed efficiency so he could adjust rations to improve profitability. But that’s difficult to calculate because it requires data from his feed-management software, written notes on tanker weight and reports via text from his milk buyer.
“You can enter it by hand, but you don’t have the time so you don’t do it for a week,” Breunig said. “Then you go back and enter the data and cram it in. Unless you’re doing it every day it’s hard to get it right. You’re always looking too far in the rearview mirror. Data is generated every day. We should be able to look at it every day.”
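The feed-efficiency figure Breunig wants is, at its core, a simple ratio of milk produced to dry matter consumed; the hard part is that its inputs live in three separate systems. A minimal sketch in Python, with all record names and numbers invented for illustration:

```python
# Hedged sketch: combine one day's figures from separate sources
# (feed software, written tanker weights) into a feed-efficiency number.
# Field names and values are illustrative, not from any real system.

def feed_efficiency(milk_lbs: float, dry_matter_lbs: float) -> float:
    """Pounds of milk produced per pound of dry matter consumed."""
    if dry_matter_lbs <= 0:
        raise ValueError("dry matter intake must be positive")
    return milk_lbs / dry_matter_lbs

feed_software = {"dry_matter_lbs": 5200.0}   # from feed-management software
tanker_note   = {"milk_lbs": 8100.0}         # from handwritten tanker weights

ratio = feed_efficiency(tanker_note["milk_lbs"], feed_software["dry_matter_lbs"])
print(f"feed efficiency: {ratio:.2f} lbs milk per lb dry matter")
```

The arithmetic is trivial; the point of the project is getting both numbers into the same place on the same day, automatically.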
There ought to be an app for that, and soon there could be. A multidisciplinary team of University of Wisconsin-Madison scientists has started to create a “virtual dairy-farm brain” that will collect and integrate all a farm’s data streams in real time and then use artificial intelligence to analyze the data to help farmers make better management decisions.
The dairy industry needs to reach this level in data management, said team leader Victor Cabrera, a UW-Madison dairy-science professor who develops software to help dairy farmers evaluate their management options.
“Dairy farmers have embraced a lot of technologies that generate vast amounts of data,” he said. “The problem is that they haven’t been able to integrate the information to improve whole-farm decision-making.”
The UW team of dairy scientists, agricultural economists and computer scientists is starting the project by streaming data on about 4,000 cows in three Wisconsin herds – including Breunig’s – to a campus-based server. That is no simple task because dairy operations generate so many types of data from so many sources – everything from pounds of feed consumed and pounds of milk produced to how many times a cow chews, how many steps she takes and her internal temperature. In addition, there are sire records, genomic tests and other data on each cow, as well as external data such as weather and the prices of milk and feed.
UW dairy scientists are no strangers to data management, but wrangling so many streams of disparate data in real time requires a specialized skill set. That’s why they’re collaborating with the UW Center for High Throughput Computing.
“It’s not just a matter of having access to systems that can handle big data sets,” Cabrera said. “We also need the expertise to filter it. We’re collecting a lot of data, but a lot of it is repetitious or irrelevant. We need to be able to filter the noise and attach identifiers to each type of data. To do this in real time is not a trivial thing.”
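The filtering step Cabrera describes, dropping repetitious readings and attaching an identifier to each type of data, might look roughly like this in Python (the record layout is an assumption for illustration, not the project’s actual schema):

```python
# Hedged sketch: de-duplicate a raw sensor stream and tag each reading
# with a source identifier. Record fields are hypothetical.

def filter_stream(raw_records, source_id):
    """Yield unique readings, each tagged with the source that produced it."""
    seen = set()
    for rec in raw_records:
        key = (rec["cow_id"], rec["timestamp"], rec["value"])
        if key in seen:          # drop exact repeats from chatty sensors
            continue
        seen.add(key)
        yield {**rec, "source": source_id}

raw = [
    {"cow_id": 101, "timestamp": "06:00", "value": 38.6},
    {"cow_id": 101, "timestamp": "06:00", "value": 38.6},  # duplicate reading
    {"cow_id": 102, "timestamp": "06:00", "value": 38.9},
]
clean = list(filter_stream(raw, "rumen_temp_sensor"))
print(len(clean))  # 2 unique, tagged readings survive
```

Doing this at farm scale, across dozens of device formats and in real time, is where the high-throughput-computing expertise comes in.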
Computer-science expertise also is key to the project’s second step – using artificial intelligence to predict more accurately the outcome of various management options. The computer scientists will devise algorithms that analyze what’s happening on the farms – which inputs result in which outcomes – and learn from that history to improve their predictions. The final step will be to apply what they’ve learned to create intuitive cloud-based decision-support tools that allow farmers to use real-time data from their farms to make smarter management decisions.
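In its simplest form, learning which inputs produce which outcomes means fitting a model to observed pairs and using it to evaluate an untried option. A toy least-squares sketch, with the feed and milk numbers invented purely for illustration:

```python
# Hedged sketch: fit a one-variable least-squares line relating an input
# (lbs of feed per cow) to an outcome (lbs of milk). Data are invented.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

feed = [48.0, 50.0, 52.0, 54.0]   # input per cow per day
milk = [70.0, 74.0, 78.0, 82.0]   # observed outcome

a, b = fit_line(feed, milk)
predicted = a + b * 53.0          # predict milk at an untried ration
print(f"slope {b:.1f}, predicted milk at 53 lbs feed: {predicted:.1f}")
```

The project’s actual models would be far richer, but the decision-support idea is the same: the farm’s own history supplies the training pairs, and the tool scores management options against them.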
In addition to Breunig’s Mystic Valley Farm near Sauk City, Wisconsin, the team is streaming data from Larson Acres near Evansville, Wisconsin, and the UW dairy science department’s own research herd. The team looked for farms close to campus that were already generating and using lots of data, including genomic information on every cow. They also wanted well-managed operations.
“We called this project the virtual dairy-farm brain because we’re trying to mimic the thinking of a very good dairy-farm manager,” Cabrera said. “We’re going to start by seeing what the manager decides to do with the data and then see what our system would come up with as potentially the best decision.”
When the two-year project is complete, Cabrera said he hopes to follow it with a larger study involving 100 to 200 farms, representing a variety of sizes and management styles.
“We think the methodology should apply to any farm,” he said. “It could be adjusted to suit whatever data are available. The basic approach would be very similar on a 100-cow farm or an 8,000-cow operation. The concept would not be different as long as there’s good-quality data. Every farm is generating data. It’s just a question of how it’s used.”