Vision-Based Robot Control for a Mobile Manipulator in Tomato Plant Disease Detection and Monitoring
PIPPOLINI, SIMONE
2024/2025
Abstract
Ensuring stable and reliable agricultural production is a major challenge, especially in contexts where human presence is limited or discontinuous. The need for continuous monitoring, combined with the unpredictability of certain crop diseases, makes it desirable to use autonomous systems capable of making decisions based on learned knowledge. A robotic agent equipped with artificial vision and decision-making capabilities can reduce reaction times, act locally without waiting for external instructions, and help manage resources more efficiently. These needs become even more relevant in remote or extreme environments, such as extraterrestrial exploration scenarios, where long distances, communication delays, and harsh environmental conditions make constant human supervision unfeasible. In such cases, the ability to respond autonomously and promptly can make the difference between the success and failure of a mission.

This thesis fits into this vision by presenting an integrated robotic and computer vision system for the automatic recognition of tomatoes affected by blossom-end rot. In a simulated environment, the hardware consists of an omnidirectional wheeled platform modeled in CoppeliaSim, topped by an anthropomorphic robotic arm. The end-effector carries an RGB-D camera (Kinect type), used to acquire scans of tomato rows. At the core of the system lies the object detection algorithm A.S.I.A. (Agricultural Smart Inspection Agent), developed from the YOLO architecture. The model was trained on a dataset composed of real and synthetic images, enabling it to robustly identify and classify fruits as healthy or diseased and to provide precise coordinates for further inspection or targeted intervention.

From a robotics perspective, inverse kinematics algorithms were implemented with singularity handling, and motion planning strategies were tested to optimize the sequential detection of multiple targets.
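The abstract does not detail how singularities are handled. One standard approach consistent with this description is damped least-squares (Levenberg-Marquardt) inverse kinematics, sketched below for a hypothetical 2-link planar arm; the link lengths, damping factor, and target are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def fk(q, l1=1.0, l2=1.0):
    """Forward kinematics of a planar 2-link arm: joint angles -> end-effector (x, y)."""
    x = l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1])
    y = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q, l1=1.0, l2=1.0):
    """Analytic Jacobian of fk with respect to the joint angles."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def ik_dls(target, q0, damping=0.1, iters=200):
    """Damped least-squares IK.

    The damping term lambda^2 * I keeps (J J^T + lambda^2 I) invertible even
    when J loses rank, so the joint update stays bounded near singularities.
    """
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        e = target - fk(q)  # task-space position error
        J = jacobian(q)
        # dq = J^T (J J^T + lambda^2 I)^-1 e
        dq = J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(2), e)
        q += dq
    return q

# Drive the arm toward a reachable target from an arbitrary initial guess.
q_sol = ik_dls(np.array([1.2, 0.8]), q0=[0.3, 0.3])
```

Compared with the plain pseudoinverse, the damped update trades a slower convergence rate for numerical robustness, which is why it is a common choice when a manipulator sweeps through near-singular configurations while scanning multiple targets.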
This allowed simulation of realistic scenarios, laying the groundwork for deployment on physical platforms. The long-term vision is to employ this system in autonomous agricultural missions, both in remote terrestrial environments and in extraterrestrial contexts, where the ability to detect plant diseases early and take corrective action without direct human intervention could prove crucial to ensuring sustainability and food security.

| File | Size | Format | |
|---|---|---|---|
| Vision_Based_Robot_Control_for_a_Mobile_Manipulator_in_Tomato_Plant_Disease_Detection_and_Monitoring.pdf (open access) | 28.33 MB | Adobe PDF | View/Open |
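As background to the RGB-D pipeline summarized in the abstract above: turning a detected pixel and its depth reading into a 3D point in the camera frame typically follows pinhole back-projection. The sketch below uses made-up placeholder intrinsics, not calibrated values from the thesis:

```python
import numpy as np

def deproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth Z into a camera-frame point:
    X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth."""
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Placeholder intrinsics (illustrative only, not a real Kinect calibration).
fx = fy = 525.0
cx, cy = 319.5, 239.5

# Center pixel of a hypothetical detection box, with a 0.8 m depth reading.
p = deproject(320.0, 240.0, 0.8, fx, fy, cx, cy)
```

A point obtained this way, once transformed into the robot's base frame, can serve as the target pose for the manipulator's inverse kinematics.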
Users may download and share the full-text documents available in UNITESI UNIPV in accordance with the Creative Commons CC BY-NC-ND license.
For further information, or to check whether a file is available, write to: unitesi@unipv.it.
https://hdl.handle.net/20.500.14239/33546