Drones: Your New Guide in the Sky
Smart drones help visually impaired individuals navigate safely and independently.
Suman Raj, Radhika Mittal, Harshil Gupta, Yogesh Simmhan
― 6 min read
In a world where drones are buzzing around like bees, there’s a push to make these flying machines do more than just take pretty pictures. They’re being transformed into helpful companions for people who can’t see well. These drones, equipped with smart technology, can guide visually impaired individuals (VIPs) through their surroundings, allowing them to live more independently and safely.
What’s the Buzz About?
Imagine a drone as a buddy, keeping a watchful eye on a VIP, ensuring that they navigate through parks, streets, and even grocery stores without tripping over anything. The drones use cameras and special software to analyze video in real time, identifying obstacles and guiding the VIP along the way. Talk about a wingman!
How Do These Drones Work?
These drones can either do their calculations locally, on or near the drone (that’s called edge computing), or send the data to a cloud service (think of it as a distant brain) for more heavy-duty processing. The choice between these two options depends on various factors, like how fast a task needs to be done and how busy the drone is at that moment.
Let’s say the drone spots a curb ahead. It must quickly decide whether to analyze that information locally or send it off to the cloud. Making the right choice is crucial because it can affect how quickly and effectively the drone helps the VIP.
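To make that concrete, here is a minimal sketch of such a placement decision, assuming the drone can estimate the compute time at each location and the network transfer cost. The helper names and every latency figure are illustrative assumptions, not the paper’s algorithm:

```python
# A minimal sketch of the edge-vs-cloud placement decision described above.
# All latency figures are illustrative assumptions, not values from the paper.

def estimate_latency(compute_ms: float, transfer_ms: float) -> float:
    """Total time to get a result back: processing plus any network transfer."""
    return compute_ms + transfer_ms

def choose_placement(task_deadline_ms: float,
                     edge_compute_ms: float,
                     cloud_compute_ms: float,
                     uplink_ms: float) -> str:
    """Pick whichever placement returns a result before the deadline,
    preferring the faster one; return 'drop' if neither can make it."""
    edge_total = estimate_latency(edge_compute_ms, transfer_ms=0.0)
    cloud_total = estimate_latency(cloud_compute_ms, transfer_ms=uplink_ms)
    feasible = {name: t for name, t in
                (("edge", edge_total), ("cloud", cloud_total))
                if t <= task_deadline_ms}
    if not feasible:
        return "drop"  # neither placement meets the deadline
    return min(feasible, key=feasible.get)

# Example: a curb-detection frame with a 200 ms deadline.
print(choose_placement(task_deadline_ms=200,
                       edge_compute_ms=180,   # busy local processor
                       cloud_compute_ms=60,   # faster cloud inference...
                       uplink_ms=90))         # ...but the transfer costs time
# -> 'cloud' (60 + 90 = 150 ms beats 180 ms locally)
```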
The Problem of Timing
When a VIP is out and about, they don’t have time to waste. If the drone can’t process its video fast enough to warn them about a hazard, it’s not much help. Drones must meet strict deadlines for many tasks simultaneously, and that’s no small feat due to the ever-changing conditions they operate in.
For instance, if a drone is too slow to recognize a quickly approaching car, it could lead to a dangerous situation. Therefore, it’s essential to have a smart scheduling system that ensures these tasks are executed on time, whether on the drone itself or sent to the cloud.
Making Smart Decisions
To tackle these challenges, the drones use an intelligent scheduling system. This system looks at the urgency and requirements of various tasks, such as detecting obstacles or tracking the VIP. By prioritizing tasks and deciding whether to handle them on the edge or in the cloud, the drone can maximize its efficiency.
Think of it like a restaurant kitchen. The chef must decide whether to cook a meal quickly or take the time to prepare something more complex. The quicker decisions often lead to satisfied guests, just like swift drone processing leads to safer navigation for VIPs.
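A toy version of this idea is an earliest-deadline-first queue that drops tasks it can no longer finish on time, echoing the task-dropping strategy the underlying paper mentions. The task names and timings below are made up for illustration:

```python
import heapq
import itertools
from dataclasses import dataclass, field

@dataclass(order=True)
class Task:
    deadline_ms: float
    seq: int                              # tie-breaker for equal deadlines
    name: str = field(compare=False, default="")
    compute_ms: float = field(compare=False, default=0.0)

def run_edf(task_specs, now_ms=0.0):
    """Run tasks in earliest-deadline-first order, dropping any task that
    can no longer finish before its deadline."""
    counter = itertools.count()
    queue = [Task(t["deadline_ms"], next(counter), t["name"], t["compute_ms"])
             for t in task_specs]
    heapq.heapify(queue)                  # most urgent deadline at the top
    completed, dropped = [], []
    while queue:
        task = heapq.heappop(queue)
        if now_ms + task.compute_ms > task.deadline_ms:
            dropped.append(task.name)     # would miss its deadline: drop it
            continue
        now_ms += task.compute_ms         # "execute" the task
        completed.append(task.name)
    return completed, dropped

done, lost = run_edf([
    {"name": "obstacle_detect", "deadline_ms": 100, "compute_ms": 40},
    {"name": "vip_tracking",    "deadline_ms": 150, "compute_ms": 60},
    {"name": "scene_caption",   "deadline_ms": 120, "compute_ms": 80},
])
print("completed:", done, "| dropped:", lost)
```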
Testing the Waters
The effectiveness of this system is tested in two ways: emulated environments and real-life scenarios. In controlled settings, simulations with multiple drones establish a performance baseline. Drones are put through their paces, handling a variety of tasks while coping with varying conditions, such as network speed and task complexity.
Then comes the fun part – real-world testing! Drones are put into actual use, assisting VIPs while being monitored for performance. This not only validates the technology but also helps refine the scheduling algorithms based on actual data.
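The emulation idea can be sketched in a few lines: replay task arrivals under randomly fluctuating network latency and count how often deadlines are met. All numbers here are invented for illustration; the paper’s actual emulation uses real DNN models on Jetson edges and AWS Lambda cloud functions:

```python
import random

# Toy emulation: count deadline hits under randomly varying link latency.
# Every figure below is an assumption for illustration only.

random.seed(42)

def emulate(n_tasks=1000, deadline_ms=200, compute_ms=80):
    met = 0
    for _ in range(n_tasks):
        network_ms = random.uniform(20, 180)  # fluctuating link latency
        if compute_ms + network_ms <= deadline_ms:
            met += 1
    return met / n_tasks

print(f"deadline hit rate under variable latency: {emulate():.0%}")
```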
The Drones in Action
These buddy drones are designed to assist VIPs in numerous ways:
- Obstacle Detection: If there’s a bump in the road or a dog on a leash, the drone will spot it and alert the VIP.
- Navigation Assistance: The drone can help guide the VIP to their desired location, ensuring they don’t stray off course.
- Emergency Help: In case of a sudden fall or distress, the drone can call for assistance or fetch help.
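One hypothetical way to encode these task types in software is to give each a deadline that reflects its urgency. The deadline values and fields below are assumptions for illustration, not figures from the paper:

```python
from dataclasses import dataclass

# Hypothetical encoding of the assistance task types listed above.

@dataclass(frozen=True)
class TaskType:
    name: str
    deadline_ms: float   # how quickly a result must reach the VIP (assumed)
    alert_user: bool     # whether the result triggers an audio alert

TASK_TYPES = [
    TaskType("obstacle_detection", deadline_ms=100, alert_user=True),
    TaskType("navigation_assist",  deadline_ms=500, alert_user=False),
    TaskType("emergency_response", deadline_ms=250, alert_user=True),
]

for t in sorted(TASK_TYPES, key=lambda t: t.deadline_ms):
    print(f"{t.name:<20} deadline={t.deadline_ms:>5.0f} ms  alert={t.alert_user}")
```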
No Two Drones Are Alike
Drones come in different shapes and sizes, each with unique capabilities. Some might be lightweight and perfect for quick flights in urban areas, while others are equipped for heavier tasks, such as carrying supplies or remote medical assistance.
For instance, consider a small quadcopter designed for carrying a camera versus a sturdier drone capable of delivering a package. Each type has specific uses, which can enhance the experience for VIPs based on their needs.
The Drones’ Playground
These developments have led to the creation of many exciting use cases for drones in urban settings. From monitoring safety in crowded environments to aiding disaster response, the potential is vast. The drones could even assist in healthcare settings by delivering supplies directly to hospitals during emergencies.
Strengthening Connectivity
For these drones to function optimally, they must communicate effectively with the base stations where their data is processed. Whether over Wi-Fi or cellular networks, the connection must move data quickly and reliably enough to meet deadlines.
With urban environments full of obstacles, maintaining a strong and consistent connection can be a challenge. Think of it as trying to hold a phone conversation while walking through a busy market, dodging people and carts. It can get tricky!
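As a rough sketch, the drone’s software might probe the link before offloading a video segment and fall back to onboard processing when the link looks slow. The host, port, and threshold here are all hypothetical:

```python
import socket
import time

# Minimal link probe before offloading. Host, port, and the RTT threshold
# are hypothetical placeholders, not values from the paper.

def probe_rtt_ms(host: str, port: int, timeout_s: float = 1.0) -> float | None:
    """Measure a TCP connect round trip; return None if unreachable."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            pass
    except OSError:
        return None
    return (time.perf_counter() - start) * 1000.0

def link_is_healthy(host: str, port: int, max_rtt_ms: float = 50.0) -> bool:
    rtt = probe_rtt_ms(host, port)
    return rtt is not None and rtt <= max_rtt_ms

# Example: only offload if the link to a (hypothetical) edge box looks good.
if link_is_healthy("192.168.1.42", 8000):
    print("offload segment to edge")
else:
    print("process onboard: link too slow or down")
```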
Improving the Experience
Beyond task completion, ensuring that the VIP has a quality experience is vital. This means that the system must provide timely and accurate information to the user. If a drone is slow to respond, it could create frustration, which defeats its purpose.
By measuring how well the system performs, engineers can make adjustments to enhance both the quality of service and the overall experience for the user. After all, nobody wants to feel like they’re just a bunch of tasks on a conveyor belt!
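The underlying paper’s Quality of Experience (QoE) metric values how frequently each task type succeeds, so that no single capability (say, obstacle alerts) silently starves. Here is a rough sketch of that idea; the aggregation below, a mean of per-type success rates, is an illustrative assumption, not the paper’s exact utility formula:

```python
from collections import defaultdict

# Sketch of a per-task-type QoE score: track success frequency per type,
# then average. The averaging step is an assumption for illustration.

def qoe_utility(outcomes):
    """outcomes: iterable of (task_type, succeeded) pairs."""
    done = defaultdict(int)
    total = defaultdict(int)
    for task_type, ok in outcomes:
        total[task_type] += 1
        done[task_type] += int(ok)
    rates = {t: done[t] / total[t] for t in total}
    return rates, sum(rates.values()) / len(rates)

rates, score = qoe_utility([
    ("obstacle_detection", True), ("obstacle_detection", True),
    ("obstacle_detection", False), ("navigation_assist", True),
    ("navigation_assist", False), ("emergency_response", True),
])
print(rates)              # per-type success frequencies
print(f"QoE ~ {score:.2f}")
```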
Building a Smart Future
As more drones are developed, the hope is that they will be able to function seamlessly in urban environments, working together like a well-oiled machine. Drones can share data among themselves, allowing for improved decision-making and quicker response times.
Imagine a swarm of drones flying together, analyzing information in real time, and communicating with each other to deliver the best possible assistance to VIPs. This coordinated effort could make a significant difference in the lives of many people.
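One ingredient of such coordination, mentioned in the underlying paper, is work stealing: an idle worker takes queued tasks from a busy peer. Here is a toy single-process sketch; a real system would steal over the network, with proper locking:

```python
from collections import deque

# Toy work stealing between two task queues. Purely illustrative.

def steal_if_idle(my_queue: deque, peer_queue: deque):
    """If my queue is empty and a peer has backlog, take from its tail."""
    if not my_queue and len(peer_queue) > 1:
        my_queue.append(peer_queue.pop())  # steal the least-urgent item

edge_a = deque(["frame_1", "frame_2", "frame_3"])  # busy worker
edge_b = deque()                                    # idle worker
steal_if_idle(edge_b, edge_a)
print("edge_a:", list(edge_a), "| edge_b:", list(edge_b))
```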
Conclusion
The future of drones as assistive technologies for visually impaired individuals is bright. With advancements in scheduling algorithms, connectivity, and intelligent processing, these flying buddies can make a meaningful impact on the lives of many, helping users navigate their surroundings safely and efficiently.
So, the next time you see a drone flying overhead, just know it might be more than just a pretty gadget. It could be your buddy, keeping a watchful eye and making the world a little easier to navigate.
Original Source
Title: Adaptive Heuristics for Scheduling DNN Inferencing on Edge and Cloud for Personalized UAV Fleets
Abstract: Drone fleets with onboard cameras coupled with computer vision and DNN inferencing models can support diverse applications. One such novel domain is for one or more buddy drones to assist Visually Impaired People (VIPs) lead an active lifestyle. Video inferencing tasks from such drones can help both navigate the drone and provide situation awareness to the VIP, and hence have strict execution deadlines. We propose a deadline-driven heuristic, DEMS-A, to schedule diverse DNN tasks generated continuously to perform inferencing over video segments generated by multiple drones linked to an edge, with the option to execute on the cloud. We use strategies like task dropping, work stealing and migration, and dynamic adaptation to cloud variability, to guarantee a Quality of Service (QoS), i.e. maximize the utility and the number of tasks completed. We also introduce an additional Quality of Experience (QoE) metric useful to the assistive drone domain, which values the frequency of success for task types to ensure the responsiveness and reliability of the VIP application. We extend our DEMS solution to GEMS to solve this. We evaluate these strategies, using (i) an emulated setup of a fleet of over 80 drones supporting over 25 VIPs, with real DNN models executing on pre-recorded drone video streams, using Jetson Nano edges and AWS Lambda cloud functions, and (ii) a real-world setup of a Tello drone and a Jetson Orin Nano edge generating drone commands to follow a VIP in real-time. Our strategies present a task completion rate of up to 88%, up to 2.7x higher QoS utility compared to the baselines, a further 16% higher QoS utility while adapting to network variability, and up to 75% higher QoE utility. Our practical validation exhibits task completion of up to 87% for GEMS and 33% higher total utility of GEMS compared to edge-only.
Authors: Suman Raj, Radhika Mittal, Harshil Gupta, Yogesh Simmhan
Last Update: 2024-12-30
Language: English
Source URL: https://arxiv.org/abs/2412.20860
Source PDF: https://arxiv.org/pdf/2412.20860
Licence: https://creativecommons.org/licenses/by-nc-sa/4.0/
Changes: This summary was created with assistance from AI and may have inaccuracies. For accurate information, please refer to the original source documents linked here.
Thank you to arXiv for use of its open access interoperability.