Geovisualization of big data for the supervision of agricultural autonomous robots

When:
31/01/2024 – 01/02/2024 all-day

Offer related to the Action/Network: – — –/– — –

Laboratory/Company: TSCF, INRAE
Duration: 5-6 months
Contact: sandro.bimonte@inrae.fr
Publication deadline: 2024-01-31

Context:
The main goal of agro-ecology is to provide new practices that respect the environment while maintaining good farming production. The Internet of Things (IoT) and robots play an important role in this context. Indeed, sensors can provide accurate pedo-climatic data, and robots can be employed for repetitive, precise agricultural tasks over long periods. Moreover, robots are usually powered by electric motors and are lightweight, which reduces soil compaction. Robots are now arriving on farms, where several types of machines coexist: tractors and robots of different kinds. The work of farmers and agricultural stakeholders is shifting more and more towards managing this equipment and analyzing agronomic and economic data by means of Farm Management Information Systems (FMIS). Existing FMIS lack tools dedicated to monitoring fleets of diverse robots, which is a major barrier to wider adoption of robots in the field and, therefore, to the development of agro-ecology. Hence the need for a system able to monitor the behavior of robots in the field in real time. TSCF (INRAE Clermont-Ferrand) has proposed an architecture, called LambAgrIoT, for robot monitoring and scheduling, based on a complex Big Data architecture (i.e., the Lambda architecture) [1]. This architecture allows effective management of real-time and historical data issued from sensors and robots. Although LambAgrIoT provides an effective data management framework for storing and analyzing IoT and robotic agricultural data, its Stream Layer, which is in charge of managing real-time data, is supported only by a simple web-based client that does not allow situation-aware monitoring of the ongoing execution of agricultural practices.

The data used by this system are Big Data. In particular, they comprise: (1) complex spatio-temporal data (e.g., robot trajectories, meteorological data); (2) stream data (e.g., from sensors deployed in fields) and multimedia data (e.g., video, images); (3) historical data (e.g., warehoused data). These data are also acquired at different spatial and temporal scales (such as plot vs. city, or second vs. hour). To benefit from these data in such a supervision system, an ad hoc geovisualization of these data must be provided.
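To make the typology above concrete, here is a purely illustrative sketch (not taken from the offer; all record fields and names are hypothetical) of the three data families, each tagged with its spatial and temporal acquisition scale, and a helper that groups records by temporal scale, e.g. to route real-time records to a Stream Layer and historical ones to a Batch Layer:

```javascript
// Hypothetical data model: one record per data family, tagged with the
// spatial and temporal scale at which it was acquired.
const samples = [
  { kind: "spatio-temporal",
    payload: { robotId: "r1", trajectory: [[3.1, 45.7], [3.2, 45.8]] },
    spatialScale: "plot", temporalScale: "second" },
  { kind: "stream",
    payload: { sensorId: "s42", soilMoisture: 0.31 },
    spatialScale: "plot", temporalScale: "second" },
  { kind: "historical",
    payload: { plotId: "p7", avgYield: 6.2 },
    spatialScale: "city", temporalScale: "hour" },
];

// Group records by temporal scale; in a Lambda architecture this split
// decides which layer (stream vs. batch) handles each record.
function groupByTemporalScale(records) {
  return records.reduce((acc, r) => {
    (acc[r.temporalScale] ??= []).push(r);
    return acc;
  }, {});
}

const grouped = groupByTemporalScale(samples);
console.log(Object.keys(grouped)); // keys: "second", "hour"
```

This is only a sketch of the data heterogeneity the intern would face; the real SuperRob/LambAgrIoT schemas are defined in [1].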

Subject:
The main goal of this project is to define a data-driven geovisualization method that allows effective situation awareness for the supervision of a fleet of robots. Since the data are voluminous, complex, and span different temporal and spatial scales, a new geovisualization method must be proposed that shows the end user “only” the data relevant to his/her supervision task at the right moment. This means that the system must automatically propose the visualization method based on the real-time data. To achieve this goal, a set of indicators/rules must be defined in order to choose the right geovisualization, and for each of them the most appropriate semiology must be used. Indicators/rules and data must therefore be presentable to the user as a set of aggregated data in a dynamic and interactive way.
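The indicator/rule idea can be sketched minimally as a prioritized rule list: each rule inspects the real-time fleet state and, when it matches, names the geovisualization (and its semiology) to display. Everything below is an illustrative assumption, not the offer's design; rule names, state fields, and view descriptors are invented for the example:

```javascript
// Hypothetical rules, checked in priority order: the first rule whose
// predicate matches the current real-time state decides the visualization.
const rules = [
  { name: "obstacle-alert",
    matches: (s) => s.obstacleDetected,
    view: { type: "detail-map", zoom: "plot", layer: "robot-camera" } },
  { name: "fleet-overview",
    matches: (s) => s.activeRobots > 5,
    view: { type: "aggregated-map", zoom: "farm", layer: "robot-density" } },
  { name: "default",
    matches: () => true,
    view: { type: "trajectory-map", zoom: "field", layer: "robot-tracks" } },
];

// Select the geovisualization for the current real-time state.
function selectVisualization(state) {
  return rules.find((r) => r.matches(state)).view;
}

const view = selectVisualization({ obstacleDetected: false, activeRobots: 8 });
console.log(view.type); // "aggregated-map"
```

The returned `view` descriptor could then drive a web-mapping library such as Mapbox or deck.gl; the intern's actual contribution would be defining the real indicators and their cartographic semiology.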

Planned work:

Study existing work on geovisualization in the context of agricultural robots

Define the indicators/rules for changing visualization

Define the most appropriate geovisualization for each “state”

Study the SuperRob supervision system developed by INRAE [1]

Implement the proposal in SuperRob

Write the M2 report

Candidate profile:
Master 2

Required education and skills:
Web Development (HTML, CSS, JavaScript)

Web mapping (Mapbox, deck.gl, etc.)

Work address:
9 Avenue Blaise Pascal, Aubière

Attached document: 202401131554_stageM2GeoVis (1).pdf