Simple Science

Cutting-edge science explained simply

Computer Science | Computer Vision and Pattern Recognition | Artificial Intelligence

AI Advancements in Poultry Management Using SAM

Examining SAM's capabilities in chicken segmentation and tracking for better poultry practices.

― 5 min read


[Image: SAM improves chicken tracking and segmentation in agriculture, enhancing poultry management.]

In recent years, agriculture has seen exciting progress in artificial intelligence (AI), especially with large foundation models that help with many different tasks. One of these models is the Segment Anything Model (SAM), created by Meta AI Research. It is particularly good at identifying and separating different objects in pictures. While SAM has shown promise in many areas of agriculture, its application in the poultry industry, especially for cage-free hens, is still in its early stages. This article looks at how well SAM performs on chicken images and explores how useful it is for tracking their movements.

SAM and Its Role in Chicken Segmentation

The main goal of this study was to see how well SAM can segment, or pick out the shapes of, chickens in images. For this, two different types of chicken images were used: one set of regular camera images and another set of thermal images that show heat. We wanted to test SAM's performance in two main ways: first, how well it segments whole chickens and their body parts, and second, how well it supports tracking chicken movements.

To do this, we compared SAM with two other advanced methods called SegFormer and SETR. The results showed that SAM performed better than these other models in both whole chicken segmentation and part-based segmentation. This was especially true when using a complete set of prompts that guide SAM in identifying objects. Furthermore, regular images were easier for SAM to work with than thermal images. The thermal images' colors made it hard for SAM to clearly see the chickens against their backgrounds.
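To make the prompting idea concrete, here is a minimal sketch of prompt-based segmentation with Meta's publicly released segment_anything package. The checkpoint file, image path, and the point and box coordinates are illustrative placeholders rather than values from the study.

```python
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

# Load a pretrained SAM backbone (ViT-H shown here as an example).
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

# SAM expects an RGB image; OpenCV loads BGR, so convert.
image = cv2.cvtColor(cv2.imread("chicken.jpg"), cv2.COLOR_BGR2RGB)
predictor.set_image(image)

# Prompt SAM with one foreground point on the bird and a rough bounding box.
point_coords = np.array([[320, 240]])   # (x, y) somewhere on the chicken body
point_labels = np.array([1])            # 1 marks a foreground point
box = np.array([200, 120, 460, 400])    # x1, y1, x2, y2 around the bird

masks, scores, _ = predictor.predict(
    point_coords=point_coords,
    point_labels=point_labels,
    box=box,
    multimask_output=True,              # return several candidate masks
)
best_mask = masks[scores.argmax()]      # keep the highest-scoring mask
```

In practice, richer prompt sets tend to produce cleaner masks, which matches the study's finding that a complete set of prompts improved SAM's results.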

Visual Results from Chicken Segmentation

The results from the segmentation tasks show a clear difference in performance. SAM outperformed both SegFormer and SETR at identifying chickens across the various datasets, and it did even better when it was given more prompts to work with. It was also easier for SAM to segment the entire chicken body than smaller parts like the tail, because the colors of those parts blend in with the rest of the bird and the background under different lighting conditions.
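Comparisons like this are usually scored with overlap metrics such as Intersection over Union (IoU) and the Dice coefficient. The article does not list the exact measures used, so the snippet below is only a generic illustration of how such scores are computed from binary masks.

```python
import numpy as np

def iou_and_dice(pred, gt):
    """Overlap scores between a predicted and a ground-truth binary mask."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    total = pred.sum() + gt.sum()
    iou = intersection / union if union else 1.0
    dice = 2 * intersection / total if total else 1.0
    return float(iou), float(dice)
```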

Tracking Chickens Using SAM

While SAM was not initially designed for tracking, we managed to adapt it to perform this task. We created a custom dataset focused on broiler chickens to see how well SAM could follow their movements over time. The combination of SAM with another model called YOLOX and a tracking tool called ByteTracker proved effective. This new method allows us to keep track of individual chickens as they move around in real-time videos.

As an example, when a chicken moves from one spot to another, SAM first identifies it and places a bounding box around it. The YOLOX model then confirms that the box contains a chicken, while ByteTracker keeps following it from frame to frame. This combination gives a clearer view of how chickens behave and move, which is important for improving poultry production operations.
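The sketch below shows what such a per-frame pipeline can look like. Only the SamPredictor calls mirror the real segment_anything API; detect_chickens and SimpleTracker are hypothetical stand-ins for the YOLOX detector and ByteTracker described above, and the video path and checkpoint file are placeholders.

```python
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

def detect_chickens(frame_rgb):
    """Hypothetical wrapper: return chicken boxes as [x1, y1, x2, y2, score].
    Replace with a trained detector such as YOLOX."""
    return []

class SimpleTracker:
    """Hypothetical stand-in that assigns persistent IDs to boxes across frames,
    standing in for an association tracker like ByteTracker."""
    def update(self, boxes):
        return list(enumerate(boxes))

sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b_01ec64.pth")
predictor = SamPredictor(sam)
tracker = SimpleTracker()

cap = cv2.VideoCapture("broilers.mp4")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    boxes = detect_chickens(rgb)                  # per-frame detections
    predictor.set_image(rgb)
    for track_id, box in tracker.update(boxes):   # persistent IDs over time
        masks, _, _ = predictor.predict(
            box=np.array(box[:4]), multimask_output=False
        )
        # masks[0] is the segmented chicken linked to track_id in this frame
cap.release()
```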

Challenges in Chicken Segmentation and Tracking

Despite SAM's impressive performance, we discovered several challenges that could affect its effectiveness. These include:

  1. Flock Density: When there are too many chickens in one area, their bodies can overlap, making it hard for SAM to identify individual birds. This often happens when there are more than nine birds in a square meter.

  2. Occlusion: In cage-free environments, chickens can be hidden behind objects like feeders or nesting boxes. This makes it tough for SAM to detect them accurately.

  3. Behavioral Changes: When chickens change their posture, for example when they are resting and huddled together, SAM can struggle to identify them correctly. It may mistake them for different chickens because of their distorted shapes.

Future Directions for Research

Given the encouraging results with SAM in identifying chicken bodies, there are exciting possibilities for future research. One of the next steps could be to look into segmenting other parts of the chicken, such as legs or wings. With this information, we could create models that help predict wing weight for sorting purposes or monitor the overall weight of chickens in real time.
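As a rough illustration of that idea, a segmentation mask could be turned into a weight estimate by calibrating the bird's segmented area against known weights. Everything in the sketch below, including the function name and the regression coefficients, is hypothetical and not taken from the paper.

```python
import numpy as np

def estimate_weight_from_mask(mask, cm2_per_pixel, a=0.012, b=1.1):
    """Hypothetical weight proxy: assumes a power-law fit weight ~ a * area**b
    obtained from calibration data.

    mask          -- binary segmentation mask of one bird
    cm2_per_pixel -- camera calibration factor (area covered by one pixel)
    a, b          -- placeholder regression coefficients, not from the paper
    """
    area_cm2 = mask.astype(bool).sum() * cm2_per_pixel
    return a * area_cm2 ** b  # estimated weight (illustrative units)
```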

Another area of focus could be to improve tracking capabilities by using SAM alongside other computer vision models. This could help us monitor chicken behaviors like eating, drinking, and moving around, giving a better picture of their health and wellbeing.

Multimodal Models in Agriculture

Furthermore, multimodal models, which combine different types of data, can be beneficial in agriculture. By drawing on several kinds of inputs, these models can become more effective, adapting to different poultry species and the unique conditions of their environments.
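One simple way to combine such inputs is early fusion, where aligned RGB and thermal images are stacked into a single multi-channel array before being fed to a model. The sketch below illustrates this idea; the fusion strategy and array shapes are assumptions made for illustration, not details from the study.

```python
import numpy as np

def fuse_rgb_thermal(rgb, thermal):
    """Early fusion: stack a 3-channel RGB image and a 1-channel thermal map
    into one 4-channel input. Both inputs are assumed to be spatially aligned
    and the same height and width."""
    thermal = thermal.astype(np.float32)
    thermal = (thermal - thermal.min()) / (np.ptp(thermal) + 1e-6)  # scale to [0, 1]
    rgb = rgb.astype(np.float32) / 255.0
    return np.dstack([rgb, thermal])  # shape: (H, W, 4)
```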

Conclusion

This article has covered the capabilities of the Segment Anything Model (SAM) in the context of poultry science, particularly for chicken segmentation and tracking. SAM was shown to outperform traditional methods in identifying both whole chickens and their specific parts. Combining SAM with other tools enabled real-time tracking of chicken movements, which can lead to better management in poultry production.

However, the study also highlighted some limitations that could affect the model’s performance, such as issues with high bird density, occlusion by objects, and changes in behavior. Moving forward, research can focus on addressing these challenges and further refining SAM’s capabilities. Overall, this work underscores the significant potential SAM holds for improving chicken welfare and optimizing operations in the poultry industry.

Original Source

Title: SAM for Poultry Science

Abstract: In recent years, the agricultural industry has witnessed significant advancements in artificial intelligence (AI), particularly with the development of large-scale foundational models. Among these foundation models, the Segment Anything Model (SAM), introduced by Meta AI Research, stands out as a groundbreaking solution for object segmentation tasks. While SAM has shown success in various agricultural applications, its potential in the poultry industry, specifically in the context of cage-free hens, remains relatively unexplored. This study aims to assess the zero-shot segmentation performance of SAM on representative chicken segmentation tasks, including part-based segmentation and the use of infrared thermal images, and to explore chicken-tracking tasks by using SAM as a segmentation tool. The results demonstrate SAM's superior performance compared to SegFormer and SETR in both whole and part-based chicken segmentation. SAM-based object tracking also provides valuable data on the behavior and movement patterns of broiler birds. The findings of this study contribute to a better understanding of SAM's potential in poultry science and lay the foundation for future advancements in chicken segmentation and tracking.

Authors: Xiao Yang, Haixing Dai, Zihao Wu, Ramesh Bist, Sachin Subedi, Jin Sun, Guoyu Lu, Changying Li, Tianming Liu, Lilong Chai

Last Update: 2023-05-17

Language: English

Source URL: https://arxiv.org/abs/2305.10254

Source PDF: https://arxiv.org/pdf/2305.10254

Licence: https://creativecommons.org/licenses/by/4.0/

Changes: This summary was created with assistance from AI and may have inaccuracies. For accurate information, please refer to the original source documents linked here.

Thank you to arxiv for use of its open access interoperability.
