We are excited to announce that Amazon Prime Air, in collaboration with Amazon Web Services (AWS), is hosting the First Workshop on Airborne Object Tracking (AOT) at the International Conference on Computer Vision (ICCV) 2021. The workshop is organized around a new challenge based on a large and unique dataset that Amazon Prime Air has released: the Airborne Object Tracking dataset, which benchmarks airborne-object detection and tracking in order to increase the safety of the airspace.
The challenge, which launched in April, is to track airborne objects across successive frames of video, with the ultimate goal of obstacle avoidance.
The top six finishers in the challenge will give short talks during the workshop. Amazon Prime Air has also sponsored $50,000 in prizes for challenge participants, including $15,000 to the first-place finisher and a $2,500 prize for the “most creative” safety solution, as determined by the judges.
The workshop, which will be held virtually on October 11, 2021, will also feature talks by invited speakers, including an Amazon researcher (Amir Navot) and academics (Laura Leal-Taixé of TU Munich, Pascal Fua of EPFL, and Andreas Geiger of the University of Tübingen).
Sense and avoid
In recent years, applications of drones have grown to include infrastructure inspection, emergency response support, agricultural and environmental surveys, and package delivery, among others. Against this backdrop, the safe operation of drones requires fully autonomous and robust sense-and-avoid systems. By sharing our dataset, defining the problem and evaluation criteria, and hosting a challenge, we hope to expose the computer vision community to a relatively fresh area of autonomous flight applications, while emphasizing the real-world requirements for safe autonomous flight.
To generate the flight sequences in the AOT dataset, two aircraft were equipped with sensors and flew planned encounters. Their trajectories were designed to create a wide distribution of distances, closing velocities, and approach angles. In addition to these planned aircraft, AOT contains other, unplanned airborne objects. The images have been manually annotated to indicate the locations of the airborne objects.
Airborne objects usually appear quite small at the distances that are relevant for early detection: 0.01% of the image size on average, down to a few pixels in area. In total, AOT includes close to 164 hours of flight data: 4,943 flight sequences of around 120 seconds each, collected at 10 Hz in diverse conditions.
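For a sense of scale, here is a quick back-of-the-envelope calculation in Python based on the figures quoted above; the 2,448 x 2,048 frame resolution used at the end is an assumption for illustration only, not a specification of the dataset.

num_sequences = 4943              # flight sequences in AOT
seconds_per_sequence = 120        # roughly 120 s each
frame_rate_hz = 10                # capture rate

total_seconds = num_sequences * seconds_per_sequence
print(f"~{total_seconds / 3600:.1f} hours of flight data")            # ~164.8 hours
print(f"~{total_seconds * frame_rate_hz / 1e6:.1f} million frames")   # ~5.9 million frames

# Assumed frame resolution, purely for illustration.
frame_pixels = 2448 * 2048
avg_object_area = 1e-4 * frame_pixels      # 0.01% of the frame
side = avg_object_area ** 0.5
print(f"average object area ~{avg_object_area:.0f} px^2 "
      f"(roughly a {side:.0f} x {side:.0f}-pixel box)")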
Participants may choose to take on either or both of two challenges. One is encounter-level detection, which means successfully tracking an airborne object for three seconds before the object comes within 300 meters of the drone. The tracker must also produce no more than one false alarm every two hours.
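The official scoring rules are defined by the challenge itself; the sketch below only illustrates the encounter-level criterion under stated assumptions, namely per-frame "tracked" flags and ground-truth ranges for a single intruder at the dataset's 10 Hz frame rate. The functions and their inputs are hypothetical, not the challenge's evaluation code.

from typing import Sequence

FRAME_RATE_HZ = 10          # AOT capture rate
MIN_TRACK_SECONDS = 3.0     # must hold a track this long...
RANGE_THRESHOLD_M = 300.0   # ...while the object is still farther than this

def encounter_detected(tracked: Sequence[bool], range_m: Sequence[float]) -> bool:
    """Hypothetical check: is there a run of at least 3 s of consecutive
    tracked frames, completed while the intruder is still more than
    300 m away?"""
    needed = int(MIN_TRACK_SECONDS * FRAME_RATE_HZ)
    run = 0
    for is_tracked, dist in zip(tracked, range_m):
        run = run + 1 if is_tracked else 0
        if run >= needed and dist > RANGE_THRESHOLD_M:
            return True
    return False

def false_alarm_rate_ok(num_false_alarms: int, hours_of_footage: float) -> bool:
    """No more than one false alarm every two hours of footage."""
    return num_false_alarms <= hours_of_footage / 2.0

# Example: a 5-frame toy clip needs 30 consecutive tracked frames at 10 Hz,
# so this returns False.
print(encounter_detected([True] * 5, [500.0] * 5))   # False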
The other challenge is frame-level object detection. The goal is to maximize the ratio of airborne objects detected, counted frame by frame, to the airborne objects that should have been detected. For this challenge, there is a budget of false detections in each frame.
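Again as a rough sketch rather than the official metric: assuming per-frame counts of matched detections, annotated objects, and false positives, the frame-level detection rate and a per-frame false-positive budget could be checked along these lines (the budget value here is a placeholder, not the challenge's actual number).

def frame_level_detection_rate(true_positives_per_frame, objects_per_frame,
                               false_positives_per_frame, fp_budget_per_frame=1):
    """Hypothetical frame-level summary: the fraction of annotated airborne
    objects that were detected, aggregated over all frames, plus a flag
    saying whether every frame stays within the (placeholder) budget of
    false detections."""
    detected = sum(true_positives_per_frame)
    total = sum(objects_per_frame)
    rate = detected / total if total else 1.0
    within_budget = all(fp <= fp_budget_per_frame
                        for fp in false_positives_per_frame)
    return rate, within_budget

# Example: three frames with one annotated object each; two are detected and
# one frame contains a single spurious box.
rate, ok = frame_level_detection_rate([1, 1, 0], [1, 1, 1], [0, 1, 0])
print(rate, ok)   # 0.666..., True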
The organizers envision this workshop as the first in an annual series that will both catalyze and publicize research on autonomous flight, ultimately leading to even safer skies for drones. We invite you to challenge the baseline and submit your results and papers by the deadlines listed below.
Important dates
[Revised 7/13/21]
- August 2 (revised from July 15): Final submissions deadline and paper submission deadline
- July 20: Posting of private leaderboard
- July 25: Announcement of winners
- August 10 (revised from August 5): Notification to authors
- August 17 (revised from August 15): Camera-ready papers deadline
- September 1: Challenge submission deadline