Why AI and Drones Will Shape the Future of Plant Disease Detection and Global Food Security
By Khawla Almazrouei, Robotics Engineer, Technology Innovation Institute


Ensuring a stable and sustainable food supply is one of the most pressing challenges of the 21st century, but innovation in plant disease detection can offer solutions to strengthen agricultural resilience.
With the global population projected to reach 10.3 billion by 2100, food security remains under constant threat from plant diseases, which cause significant crop losses, disrupt supply chains, and undermine agricultural sustainability.
Every year, up to 40% of global crop production is lost due to plant pests and diseases, costing the global economy an estimated $220 billion, according to the Food and Agriculture Organization.
Nations that rely heavily on food imports, such as the UAE, are particularly vulnerable to supply chain disruptions that can be caused by plant diseases. Advancing detection methods is crucial to mitigating these risks and ensuring food security.
Shortcomings of traditional methods
Traditional plant disease detection methods typically rely on visual inspection by experienced farmers and agricultural experts, spectral analysis that compares the light reflectance of healthy and infected plants, and molecular methods that allow the amplification and quantification of pathogen DNA within plant tissues.
While these methods can be effective, they are often inefficient, costly, and labor-intensive.
As research progresses, detection methods need to become more accessible, accurate, and scalable.
Recent research from the Technology Innovation Institute’s Autonomous Robotics Research Center in Abu Dhabi and the University of Sharjah highlights the potential of AI-based methods to improve detection.
The study, A Comprehensive Review on Machine Learning Advancements for Plant Disease Detection and Classification, identifies image-based analysis using machine learning, particularly deep learning, as the most promising approach.
More efficient models
Machine learning models can analyze leaf, fruit, or stem images to spot diseases based on characteristics such as color, texture, and shape. Among the most widely used techniques, Convolutional Neural Networks (CNNs) extract visual features with high accuracy, improving disease classification significantly.
Some models combine different techniques, such as Random Forest and Histogram of Oriented Gradients (HOG), to further enhance precision. However, CNNs require extensive datasets to be effective, posing a challenge for agricultural settings with limited labeled data.
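The combination described above can be sketched in a few lines. The following is a minimal illustration, not the pipeline from the study: the "images" are synthetic stand-ins for leaf photos, and the parameter choices (64x64 images, 16x16 HOG cells) are arbitrary assumptions.

```python
import numpy as np
from skimage.feature import hog
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in data: 64x64 grayscale "leaf" images in two classes.
def make_images(n, lesion):
    imgs = rng.random((n, 64, 64))
    if lesion:
        imgs[:, 16:48, 16:48] += 1.0  # bright blob mimicking a lesion
    return imgs

healthy = make_images(40, lesion=False)
diseased = make_images(40, lesion=True)
X_imgs = np.concatenate([healthy, diseased])
y = np.array([0] * 40 + [1] * 40)

# HOG turns each image into a histogram of local gradient orientations,
# capturing texture and shape cues a classifier can use.
X = np.array([hog(img, pixels_per_cell=(16, 16), cells_per_block=(2, 2))
              for img in X_imgs])

# A Random Forest then classifies the extracted descriptors.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.score(X, y))
```

In a real system, the hand-crafted HOG descriptors would be computed from labeled field images, and accuracy would be measured on a held-out test set rather than the training data.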
As innovation progresses, newer technologies like Vision Transformers (ViTs) have shown even greater potential. Originally designed for natural language processing, ViTs apply self-attention mechanisms to images, allowing them to process entire images as sequences of patches. Unlike CNNs, which focus on local image features, ViTs can capture global relationships across an entire image.
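The patch-sequence idea can be shown concretely. This is a rough NumPy sketch under illustrative assumptions (a 224x224 RGB image, 16x16 patches, a random projection standing in for learned weights), not an implementation of any particular ViT:

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((224, 224, 3))    # H x W x C input image
patch = 16                           # patch side length

# Split the image into non-overlapping 16x16 patches and flatten each.
h = w = 224 // patch                 # 14 x 14 = 196 patches
patches = image.reshape(h, patch, w, patch, 3).transpose(0, 2, 1, 3, 4)
patches = patches.reshape(h * w, patch * patch * 3)   # (196, 768)

# Linear projection to the model dimension (learned in a real ViT).
W_embed = rng.standard_normal((patch * patch * 3, 64)) * 0.02
tokens = patches @ W_embed           # (196, 64): the input "sequence"

# Self-attention: every patch attends to every other patch, which is
# what gives the model its global view of the image.
scores = tokens @ tokens.T / np.sqrt(tokens.shape[1])
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)
print(tokens.shape, attn.shape)
```

The 196x196 attention matrix is the key contrast with a CNN: each row mixes information from all patches at once, rather than only from a local neighborhood.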
ViTs present several advantages: they are highly accurate, they scale to vast datasets, and, unlike traditional deep learning models, they offer more transparency in their decision-making processes.
Hybrid models combining CNNs and ViTs have also shown they can significantly increase performance and accuracy. For example, CropViT, a lightweight transformer model, has achieved a reported accuracy of 98.64% in plant disease classification.
To enhance large-scale monitoring, drones equipped with AI-powered cameras present a promising solution for real-time disease detection. By capturing high-resolution images and analyzing them using machine learning, drones can detect diseases early, reducing the reliance on manual inspections and improving response times.
From research to real-world impact
Despite progress and innovation, several challenges remain in bringing AI-based plant disease detection to widespread adoption.
Many AI models are trained on limited datasets that don’t fully reflect real-world agricultural conditions.
Unlike controlled lab environments, real-world agricultural settings introduce unpredictable factors such as varying light conditions, soil quality, and weather patterns, which can affect AI model accuracy.
To improve further, AI models must be trained on diverse datasets encompassing various plant species, disease types, and environmental conditions, and must be optimized to perform reliably across diverse geographies, crop types, and farming practices.
To fully realize these advancements and contribute to global food security, all stakeholders, including researchers, agritech companies, and policymakers, must collaborate to develop standardized datasets for AI training, refine AI models, and integrate scalable solutions.
By promoting innovative methods and addressing existing challenges, AI-driven plant disease detection can transition from promising research to real-world impact, strengthening the resilience of global agriculture and securing the future of food production.
Eng. Khawla Almazrouei is a robotics engineer at the Autonomous Robotics Research Center (ARRC) under the Technology Innovation Institute (TII) in Abu Dhabi, specializing in perception, sensor fusion, and AI for unmanned ground vehicles. With a background in Computer Engineering and AI from the United Arab Emirates University and a master’s from the University of Sharjah, she focuses on dynamic obstacle avoidance, reinforcement learning for path planning, and sensor architecture. Her research, published in top journals and conferences, advances hardware acceleration, perception algorithms, and real-time sensor integration, improving UGV performance in challenging environments.

