Periodic Reporting for period 1 - FLEXIGROBOTS (Flexible robots for intelligent automation of precision agriculture operations)
Reporting period: 2021-01-01 to 2022-06-30
FlexiGroBots aims to address these challenges because society stands to obtain important benefits from a new generation of solutions. First, as the population grows, increasing the productivity of the fields improves food security. In addition, early detection of diseases and targeted application of phytosanitary products would reduce the use of products that may be toxic if not managed carefully, increasing safety for the population. Finally, thanks to the efficiency gained by using multiple robots, society will benefit economically and in terms of decarbonization.
With these aspects in mind, FlexiGroBots proposes to develop a digital platform that can support the next generation of digital solutions for agriculture, providing tools for defining and testing AI-based models, horizontal services based on AI models, tools for managing fleets of multiple robots, and an infrastructure for managing a rich Data Space for Agriculture. These solutions are validated through three pilots focused on three different types of crops (grapevine, rapeseed and raspberry), using several kinds of robots (UAVs, UGVs) as well as other data sources (such as satellite images).
From the base infrastructure perspective, FlexiGroBots has set up a Kubeflow cluster, with access to a GPU and several example pipelines, as the base platform to design, develop and test AI models in the agriculture domain. Additionally, the infrastructure for a Data Space for Agriculture has been implemented based on IDSA components and connectors, together with MinIO and an instance of Open Data Cube for georeferenced datasets.
Several AI models that can support multiple pilots have already been implemented: object detection (and tracking), automatic dataset annotation (to facilitate the training of other models) and image anonymization (capable of replacing faces in videos).
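To illustrate the tracking part of the object detection service, the core step is associating each frame's new detections with existing tracks. A common approach, sketched below in pure Python, is greedy matching by intersection-over-union (IoU) of bounding boxes; the function names, box format and threshold are illustrative, not the project's actual implementation.

```python
# Minimal sketch of detection-to-track association via IoU matching.
# Boxes are axis-aligned tuples (x1, y1, x2, y2); names are illustrative.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def associate(tracks, detections, threshold=0.3):
    """Greedily match each existing track to the detection with highest IoU.

    tracks: {track_id: box}; detections: [box].
    Returns (matches {track_id: detection_index}, unmatched detection indices).
    """
    matches, unmatched = {}, list(range(len(detections)))
    for tid, tbox in tracks.items():
        best, best_iou = None, threshold
        for d in unmatched:
            score = iou(tbox, detections[d])
            if score > best_iou:
                best, best_iou = d, score
        if best is not None:
            matches[tid] = best
            unmatched.remove(best)
    return matches, unmatched  # unmatched detections would start new tracks
```

Production trackers refine this with motion prediction and optimal (rather than greedy) assignment, but the IoU association step is the common core.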
The implementation of a Mission Control Centre has also started, although only a few connectors for robotic protocols are available for the moment.
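Such connectors typically hide heterogeneous robotic protocols (e.g. MAVLink for UAVs or ROS topics for UGVs) behind one uniform interface that the Mission Control Centre can call. The sketch below shows that pattern with a hypothetical interface and an in-memory simulated connector; all class and method names are assumptions for illustration, not the project's API.

```python
# Hypothetical sketch of a uniform connector interface for the Mission
# Control Centre. Real connectors would wrap protocols such as MAVLink or
# ROS; here a simulated connector makes the sketch runnable without hardware.
from abc import ABC, abstractmethod

class RobotConnector(ABC):
    @abstractmethod
    def send_command(self, command: str, **params) -> None:
        """Forward a command to the robot over its native protocol."""

    @abstractmethod
    def telemetry(self) -> dict:
        """Return the latest state reported by the robot."""

class SimulatedConnector(RobotConnector):
    """In-memory stand-in used so the sketch runs without a real robot."""
    def __init__(self):
        self.log = []
        self.position = (0.0, 0.0)

    def send_command(self, command, **params):
        self.log.append((command, params))
        if command == "goto":  # illustrative command vocabulary
            self.position = (params["x"], params["y"])

    def telemetry(self):
        return {"position": self.position, "commands_sent": len(self.log)}
```

The value of the abstraction is that fleet-level logic is written once against `RobotConnector` and works unchanged for any robot a new connector is written for.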
The implementation of the pilots has also progressed. The grapevine pilot has demonstrated the use of UGVs supporting harvesting and field monitoring activities, while UAVs have been used to collect high-resolution images, used to implement a first version of a botrytis detection AI model. This information, together with data coming from other sources (meteorological models and IoT devices), is accessible from an agriculture platform.
The rapeseed pilot has also implemented use cases supporting activities such as silage harvesting, Rumex weeding and pest management. Several datasets have been collected, and different robots (an autonomous tractor, a weeding robot, etc.) have been prepared and used to show how they can improve these activities. Additionally, UAVs are used to support these use cases as well (e.g. following and tracking autonomous tractors in the field).
The raspberry pilot started by installing a weather station and multiple sensors in the field to enable data collection. A robot for soil sampling and a robotic sprayer were designed and developed. Additionally, UAVs were used to acquire further data, to be combined with the soil-related information, enabling field monitoring and optimising soil sampling.
These activities have been communicated in different forms (scientific publications, social networks, press releases, the project website, etc.) and form the basis for collaboration with other initiatives.
The new horizontal AI models provide interesting advantages. In the case of object identification and tracking, this technique had not previously been applied to identifying tractors from the viewpoint of a drone. The automatic annotation of datasets is a new combination of AI models that has proven to work very well for generating the datasets required for training, starting from unannotated datasets. The anonymization model is also innovative, enabling compliance with the GDPR and ELSE factors in AI. More horizontal AI services are under development (such as human action detection, a general disease detection model and pest detection in traps). These models will have a positive impact on the agriculture domain, as they support the application of robots in the field.
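The auto-annotation idea described above can be sketched as follows: a pretrained detector is run over an unannotated image set, and its outputs are written out in a standard annotation format (COCO-style JSON is used here as a common choice) so that another model can then be trained on them. The detector is stubbed and all names are illustrative; only the COCO field names (`images`, `annotations`, `categories`, `bbox`) follow the real format.

```python
# Sketch of automatic dataset annotation: detector outputs become
# COCO-style training annotations. The detector is passed in as a function
# so this stays self-contained; names are illustrative.

def detections_to_coco(images, detect, categories):
    """images: [(image_id, file_name)]; categories: {category_id: name};
    detect: file_name -> [(label, x, y, w, h)] (a pretrained model in practice)."""
    coco = {
        "images": [{"id": i, "file_name": f} for i, f in images],
        "categories": [{"id": c, "name": n} for c, n in categories.items()],
        "annotations": [],
    }
    name_to_id = {n: c for c, n in categories.items()}
    ann_id = 1
    for image_id, file_name in images:
        for label, x, y, w, h in detect(file_name):
            coco["annotations"].append({
                "id": ann_id,
                "image_id": image_id,
                "category_id": name_to_id[label],
                "bbox": [x, y, w, h],   # COCO convention: top-left x, y, width, height
                "area": w * h,
            })
            ann_id += 1
    return coco
```

In practice the detections would also be filtered by confidence and spot-checked by a human before training, which is far cheaper than annotating from scratch.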
Important work will be carried out on the Mission Control Centre, completing the implementation of the solution that will enable the management of multiple robots in the field, facilitating the automation of field tasks, optimizing routes and enabling self-healing mechanisms that can re-allocate tasks to other robots when some of the robots fail. Such implementations will have a great impact on reducing the risks of applying robotics to agriculture, increasing the adoption of robotics in the field and deploying platforms for long-term operation.
The pilots have also advanced the state of the art in several areas, from new models for disease detection using UAV images to the improvement and development of new robots for the field, showing new applications of robotics in the agriculture domain that support multiple activities (monitoring, disease treatment, weeding, etc.). The potential impact on the domain is huge, opening up greater applicability for digital technologies. Future work will improve the implementation of the use cases, including automatic management of tasks and multi-robot scenarios.