Data resilience in agriculture
Machine learning and deep learning methods have been applied successfully across many scientific fields (biology, economics, physics, …), so they also seem promising for agriculture. Applying them, however, requires access to long-term data. In countries such as the U.S.A., Canada, or Australia this data is publicly available, and for many computer scientists it was the starting point for testing their techniques in agriculture.
The rapid shift toward, and growing reliance on, automated sensing, together with concerns about security and reliability, requires new solutions. Automated data collection has many benefits over traditional approaches: it lets researchers gather precise, high-velocity data in near real time. Methods that support the resilience of this data collection, however, have lagged behind.
Data stream mining could provide the ability not only to learn complex behaviour with unsupervised machine learning but also to identify anomalies.
In the course of the doctoral program “DiLaAg” (Digitalization and Innovation Laboratory in Agricultural Sciences), the goal is to apply and optimise data stream mining and to utilise anomaly detection. These anomalies can be caused by failing sensors, data injection attacks, or other undefined behaviours. The ability to detect them comes at a cost: high power consumption and computational requirements, along with the need for rapidly updating data streams. These, however, are necessary for such a solution to work. The ability to optimise and reduce the overhead of such a technology will determine its effectiveness in an agricultural setting and may provide a much more efficient and resilient way to collect and process data in the future.
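To make the idea of lightweight streaming anomaly detection concrete, the following is a minimal sketch (not part of the DiLaAg project itself) of a rolling z-score detector: it keeps only a small window of recent sensor readings, so its memory and compute overhead stay constant, and it flags a reading as anomalous when it deviates strongly from the recent window. The class name, window size, and threshold are illustrative choices, not a prescribed method.

```python
from collections import deque
import math

class RollingZScoreDetector:
    """Flags a sensor reading as anomalous when it deviates from the
    mean of a small sliding window by more than `threshold` standard
    deviations. Constant memory: only `window` readings are stored."""

    def __init__(self, window=30, threshold=3.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def update(self, value):
        """Return True if `value` is anomalous relative to the window."""
        is_anomaly = False
        if len(self.window) >= 10:  # wait for a minimal history
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var)
            if std > 0 and abs(value - mean) / std > self.threshold:
                is_anomaly = True
        if not is_anomaly:
            # fold only normal readings back into the model, so a
            # stuck or injected value cannot shift the baseline
            self.window.append(value)
        return is_anomaly

# Example: a steady temperature stream with one injected spike,
# e.g. from a failing sensor or a data injection attack
detector = RollingZScoreDetector(window=30, threshold=3.0)
readings = [20.0 + 0.1 * (i % 5) for i in range(40)] + [85.0]
flags = [detector.update(r) for r in readings]
```

In this sketch, only the final spike is flagged; the normal daily-style variation passes through. Heavier streaming models (e.g. clustering or tree-based detectors) can capture more complex behaviour, but at exactly the power and compute cost the text describes, which is why reducing that overhead matters for field deployment.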