Simple Science

Cutting edge science explained simply

# Computer Science # Computer Vision and Pattern Recognition # Artificial Intelligence # Robotics

Smart Drones in Crop Pest Monitoring

Tiny drones enhance pest detection in farming, promoting sustainability and efficiency.

― 5 min read


Drones Transform Pest Control: mini-drones optimize pest detection for modern farming.

Smart farming and precision agriculture are changing how we grow crops, making farming more efficient and sustainable. Small drones, about the size of a palm, can be used as smart sensors to check crops for early signs of pests. However, building a system that works requires a careful balance of hardware and software to ensure that the drones can detect pests accurately without using too much power or memory.

The Need for Pest Monitoring

Identifying pests quickly is crucial in farming. Timely action can prevent significant damage to crops, reduce economic losses, and lessen the environmental impact of pest control treatments. For instance, instead of spraying chemicals over an entire field, farmers can treat only affected areas or specific plants. Traditional methods rely on broad-brush pest control strategies, which are less effective and can harm non-target species.

Previously, monitoring for pests relied on traps that needed human experts to check them and record data, which was time-consuming and costly. The rise of Internet-of-Things (IoT) technologies and small, energy-efficient systems has transformed this process. Embedded devices can now automate monitoring by using small cameras and sensors that are powered by batteries.

Challenges with Existing Solutions

Despite improvements, current automated systems still need external support like servers and high-speed internet connections, leading to high operational costs and energy use. Small battery-powered devices have limited memory and processing power, which can restrict their capabilities. This is where the need for a new approach arises.

A New Approach: Pocket-sized Drones

Our current work introduces a system that uses tiny drones for pest monitoring. These drones can inspect plants by relying solely on their internal sensors and computing abilities. The key advantage of using mini-drones is their flexibility. They can reach places that larger equipment cannot, making them ideal for various settings, including greenhouses.

Yet, designing this system involves overcoming significant hurdles related to the limited resources of these small drones. The challenge is to create compact yet powerful deep learning models that can distinguish between harmful and harmless insects, especially since the two can look very similar in images.

Choosing the Right Hardware

To achieve this, we selected two types of small, efficient computing boards: the Arduino Portenta H7 and the Greenwaves Technologies (GWT) GAP9. The Portenta H7 carries a dual-core STM32H7 microcontroller, while the GAP9 is a multi-core chip, giving us two different levels of processing power.

We used two deep learning models suitable for detecting a specific pest, Popillia japonica. The first model, FOMO-MobileNetV2, is lightweight and works well with the Portenta H7. The second, SSDLite-MobileNetV3, is more complex and fits the GAP9’s capabilities.
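FOMO is an architecture distributed through Edge Impulse's tooling, so it is harder to sketch with standard libraries; the snippet below only illustrates the SSDLite-MobileNetV3 side. It is a minimal sketch using torchvision, not the authors' actual code, and the two-class setup (background plus the pest) is our assumption based on the single-species detection task.

```python
# Minimal sketch (not the authors' code): an SSDLite-MobileNetV3 detector
# configured for a single pest class using torchvision.
import torch
from torchvision.models import MobileNet_V3_Large_Weights
from torchvision.models.detection import ssdlite320_mobilenet_v3_large

NUM_CLASSES = 2  # background + Popillia japonica (assumed setup)

model = ssdlite320_mobilenet_v3_large(
    weights=None,                                                # detection head trained from scratch
    weights_backbone=MobileNet_V3_Large_Weights.IMAGENET1K_V1,   # pre-trained feature extractor
    num_classes=NUM_CLASSES,
)
model.eval()

# This SSDLite variant expects 320x320 RGB input.
dummy = [torch.rand(3, 320, 320)]
with torch.no_grad():
    detections = model(dummy)          # list with one dict: boxes, labels, scores
print(detections[0]["boxes"].shape)
```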

Training the Models

Our models started with pre-trained versions that already knew how to recognize objects. We then fine-tuned them using a custom set of images of the target pest. This set included over 3,300 carefully selected images, screened to remove duplicates.
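As a rough illustration, a fine-tuning loop for such a detector could look like the sketch below. It assumes a hypothetical `train_loader` that yields batches from the custom pest image set (lists of image tensors plus dicts with `boxes` and `labels`); the authors' actual training pipeline and hyperparameters are not described in this summary.

```python
import torch
from torchvision.models import MobileNet_V3_Large_Weights
from torchvision.models.detection import ssdlite320_mobilenet_v3_large

# Same single-class detector as in the earlier sketch: ImageNet-pretrained
# backbone, detection head learned on the pest images.
model = ssdlite320_mobilenet_v3_large(
    weights=None,
    weights_backbone=MobileNet_V3_Large_Weights.IMAGENET1K_V1,
    num_classes=2,  # background + Popillia japonica
)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device).train()
optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9, weight_decay=5e-4)

for epoch in range(20):                    # epoch count is purely illustrative
    for images, targets in train_loader:   # hypothetical DataLoader over the pest dataset
        images = [img.to(device) for img in images]
        targets = [{k: v.to(device) for k, v in t.items()} for t in targets]

        loss_dict = model(images, targets)  # torchvision detectors return a dict of losses in train mode
        loss = sum(loss_dict.values())

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```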

Training involved adjusting the models to work well with our images, ensuring they could accurately detect the targeted insect. It also included quantizing the models to 8-bit integers, shrinking them enough to fit within the memory limits of our devices.
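The summary does not name the quantization toolchain, so the sketch below uses TensorFlow Lite's post-training full-integer quantization purely to illustrate the 8-bit conversion step. The SavedModel path, `calibration_paths`, and `load_preprocessed_image` are hypothetical placeholders.

```python
# Illustrative post-training full-integer (int8) quantization with TensorFlow Lite.
# Not the authors' toolchain; it only shows what the 8-bit conversion step looks like.
import numpy as np
import tensorflow as tf

def representative_images(num_samples=100):
    """Yield a few preprocessed crop images so the converter can calibrate
    activation ranges. `load_preprocessed_image` is a hypothetical helper."""
    for path in sorted(calibration_paths)[:num_samples]:
        img = load_preprocessed_image(path)             # float32, HxWx3, scaled as in training
        yield [img[np.newaxis, ...].astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("pest_detector_savedmodel")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_images
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8    # int8 end to end for microcontroller deployment
converter.inference_output_type = tf.int8

tflite_int8 = converter.convert()
with open("pest_detector_int8.tflite", "wb") as f:
    f.write(tflite_int8)
print(f"Quantized model size: {len(tflite_int8) / 1024:.0f} KiB")
```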

Performance Evaluation

After training, we tested how well each model performed. The FOMO-MobileNetV2 model achieved a mean average precision score of 0.66 when detecting Popillia japonica, while the SSDLite-MobileNetV3 reached 0.79. These scores indicate the accuracy of the models and their ability to identify the targeted pest correctly.
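For reference, mean average precision scores like these can be computed offline with a metrics library such as torchmetrics. The tiny example below, with made-up boxes for a single image, only shows the mechanics; it is not the paper's evaluation code, and the paper's exact mAP definition may differ.

```python
# Sketch: computing a COCO-style mAP for single-class detections with torchmetrics.
import torch
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy")

# One image: predicted vs. ground-truth boxes for the single pest class (label 1).
preds = [{
    "boxes": torch.tensor([[48.0, 60.0, 112.0, 130.0]]),
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([1]),
}]
targets = [{
    "boxes": torch.tensor([[50.0, 62.0, 110.0, 128.0]]),
    "labels": torch.tensor([1]),
}]

metric.update(preds, targets)
print(metric.compute()["map"])   # mAP averaged over IoU thresholds
```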

In terms of speed, the FOMO-MobileNetV2 could process images at a rate of 16.1 frames per second, while the SSDLite-MobileNetV3 achieved 6.8 frames per second. Both models consumed little power, roughly 498 mW on the Portenta H7 and 33 mW on the GAP9, well within the budget of battery-powered devices.
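The frame rates and power figures above are measured on the devices themselves with their respective toolchains. As a rough desktop-side sanity check of a quantized export, one could time inference with the TensorFlow Lite interpreter as sketched below; the model filename is a placeholder, and host throughput will not match the on-device numbers.

```python
# Rough host-side throughput check of a quantized model; treat it only as a
# sanity check of the exported network, not as the on-device benchmark.
import time
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="pest_detector_int8.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
dummy = np.zeros(inp["shape"], dtype=inp["dtype"])

runs = 100
start = time.perf_counter()
for _ in range(runs):
    interpreter.set_tensor(inp["index"], dummy)
    interpreter.invoke()
elapsed = time.perf_counter() - start
print(f"~{runs / elapsed:.1f} inferences/s on this machine")
```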

Advantages of Mini-Drones

The use of mini-drones offers several advantages over traditional methods. They can operate without needing constant communication with external servers, allowing for quicker, localized pest detection. The lower power consumption also means that the drones can run for extended periods without needing frequent recharging.

By implementing our models in mini-drones, we create a system that can dynamically monitor crops and detect pests effectively. The goal is to use these drones to help farmers make informed decisions about pest control, reducing the overall impact on the environment.

Future Prospects

Looking forward, this technology paves the way for more autonomous systems in agriculture. These smart drones can be deployed in various settings, providing real-time data for farmers and allowing for more targeted pest control measures.

The potential applications are vast. Farmers can use these drones to inspect large fields quickly and efficiently, leading to reduced chemical use and better crop management. This approach aligns with sustainable farming practices, promoting healthier ecosystems while maintaining productivity.

Conclusion

In summary, our work focuses on developing a pest detection system using small drones powered by advanced deep learning models. This innovation offers a practical solution for modern farming, emphasizing efficiency and sustainability. By leveraging the capabilities of mini-drones and powerful neural networks, we are setting the stage for a new way to monitor and manage pest populations in agriculture. The future of farming could be brighter with these technologies, leading to healthier crops and a healthier planet.

Original Source

Title: A Deep Learning-based Pest Insect Monitoring System for Ultra-low Power Pocket-sized Drones

Abstract: Smart farming and precision agriculture represent game-changer technologies for efficient and sustainable agribusiness. Miniaturized palm-sized drones can act as flexible smart sensors inspecting crops, looking for early signs of potential pest outbreaking. However, achieving such an ambitious goal requires hardware-software codesign to develop accurate deep learning (DL) detection models while keeping memory and computational needs under an ultra-tight budget, i.e., a few MB on-chip memory and a few 100s mW power envelope. This work presents a novel vertically integrated solution featuring two ultra-low power System-on-Chips (SoCs), i.e., the dual-core STM32H74 and a multi-core GWT GAP9, running two State-of-the-Art DL models for detecting the Popillia japonica bug. We fine-tune both models for our image-based detection task, quantize them in 8-bit integers, and deploy them on the two SoCs. On the STM32H74, we deploy a FOMO-MobileNetV2 model, achieving a mean average precision (mAP) of 0.66 and running at 16.1 frame/s within 498 mW. While on the GAP9 SoC, we deploy a more complex SSDLite-MobileNetV3, which scores an mAP of 0.79 and peaks at 6.8 frame/s within 33 mW. Compared to a top-notch RetinaNet-ResNet101-FPN full-precision baseline, which requires 14.9x more memory and 300x more operations per inference, our best model drops only 15% in mAP, paving the way toward autonomous palm-sized drones capable of lightweight and precise pest detection.

Authors: Luca Crupi, Luca Butera, Alberto Ferrante, Daniele Palossi

Last Update: 2024-04-02 00:00:00

Language: English

Source URL: https://arxiv.org/abs/2407.00815

Source PDF: https://arxiv.org/pdf/2407.00815

Licence: https://creativecommons.org/licenses/by/4.0/

Changes: This summary was created with assistance from AI and may have inaccuracies. For accurate information, please refer to the original source documents linked here.

Thank you to arXiv for use of its open access interoperability.
