Vision-only UAV State Estimation for Fast Flights Without External Localization Systems
In this video, we present our approach to vision-only UAV state estimation for fast and aggressive flights without external localization systems. We develop a fully onboard estimation pipeline, using only an IMU and a single monocular camera, capable of reliable operation during agile flight in GPS-denied environments. Visual-inertial odometry (VIO) is the standard method for onboard state estimation using only a camera and an IMU in GPS-denied environments. However, VIO suffers from significant drift and delays during aggressive maneuvers. Therefore, we also incorporate a landmark detector to correct VIO drift using detectable landmarks in the environment.

At the start of the flight, VIO is initialized at the UAV's position and defines its own coordinate frame, which is connected to the world frame through a static transformation. As the UAV begins flying and performs fast, aggressive maneuvers, VIO starts to drift, and its estimated states diverge from the ground-truth states across all six degrees of freedom. Relying on VIO alone for state estimation often leads to crashes.
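The frame relationship described here can be illustrated with a planar toy example (all values below are made up for illustration, and a 2D rotation stands in for the full 6-DOF case): the world and VIO frames are linked by a fixed rotation and translation set at initialization, and any drift accumulated in the VIO frame maps directly into world-frame error.

```python
import numpy as np

def rot2d(theta):
    """Planar rotation matrix (a 2D stand-in for the full SE(3) case)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Static world <- VIO transform, fixed at initialization
# (illustrative values, not from the paper).
R_wv = rot2d(0.1)
t_wv = np.array([1.0, 0.0])

def vio_to_world(p_vio):
    """Map a VIO-frame position into the world frame."""
    return R_wv @ p_vio + t_wv

p_true_vio = np.array([2.0, 0.5])   # true position in the VIO frame
drift = np.array([0.3, -0.2])       # accumulated VIO position drift

# The drifted VIO output no longer maps onto the true world position;
# the world-frame error equals the drift magnitude (rotation preserves length).
err = np.linalg.norm(vio_to_world(p_true_vio + drift)
                     - vio_to_world(p_true_vio))
```

Without a drift estimate, this error grows over the flight, which is why the static initial transform alone is not enough.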
Current state-of-the-art methods either rely on inaccurate VIO estimates, such as linear and angular velocities or the UAV's attitude, or require more complex hardware, including stereo cameras and rangefinders. In contrast, our approach compensates for VIO drift across all UAV states while using only an RGB camera and an IMU.

Here is our estimation pipeline.
VIO uses IMU and camera data to provide drifting UAV states, which are fused with camera measurements from the landmark detector to estimate the VIO drift. We then correct the VIO odometry using the estimated drift and fuse it with IMU data to reduce delay and capture aggressive UAV motion. Finally, the estimated states are used by the controller to track the pre-planned trajectory.
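As a rough illustration of this data flow, here is a deliberately simplified one-dimensional sketch. All function names, the fixed correction gain, and the numbers are hypothetical stand-ins, not the paper's implementation:

```python
def run_vio(true_pos, drift):
    """Toy VIO: returns the true position corrupted by accumulated drift."""
    return true_pos + drift

def estimate_drift(vio_pos, landmark_pos, gain=0.8):
    """Fuse a landmark measurement with the VIO output to estimate drift.
    The fixed gain is a stand-in for a proper filter update."""
    return gain * (vio_pos - landmark_pos)

def fuse_imu(corrected_pos, imu_vel, dt):
    """Propagate the corrected estimate with IMU data to reduce delay."""
    return corrected_pos + imu_vel * dt

true_pos, drift = 10.0, 0.5
vio_pos = run_vio(true_pos, drift)                  # drifting odometry
d_hat = estimate_drift(vio_pos, landmark_pos=true_pos)
corrected = vio_pos - d_hat                         # drift-corrected odometry
state = fuse_imu(corrected, imu_vel=2.0, dt=0.01)   # low-delay final estimate
# state = 10.5 - 0.4 + 0.02 = 10.12, closer to true_pos than the raw VIO 10.5
```

The point of the sketch is only the ordering: landmark fusion estimates the drift, the drift corrects the odometry, and IMU propagation keeps the final estimate responsive between camera frames.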
In our paper, we propose a novel model of VIO drift, which is incorporated into a Kalman filter to estimate the drift. We then fuse data from VIO, the estimated VIO drift, and the IMU to produce the final UAV state estimate, as shown in the equations on screen.
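The paper's specific drift model and filter equations are not reproduced here, but the general idea of estimating a slowly varying drift with a Kalman filter can be sketched with a generic scalar random-walk model (the noise values and the random-walk assumption are illustrative, not the paper's):

```python
class ScalarDriftKF:
    """1D Kalman filter treating VIO drift as a random walk.
    Noise values are illustrative assumptions, not from the paper."""
    def __init__(self, q=1e-4, r=1e-2):
        self.d = 0.0   # drift estimate
        self.P = 1.0   # estimate covariance
        self.q = q     # process noise (drift random walk)
        self.r = r     # measurement noise (landmark detector)

    def predict(self):
        # Random-walk model: the drift mean is unchanged, uncertainty grows.
        self.P += self.q

    def update(self, vio_pos, landmark_pos):
        z = vio_pos - landmark_pos        # measured drift (VIO vs. landmark)
        K = self.P / (self.P + self.r)    # Kalman gain
        self.d += K * (z - self.d)
        self.P *= (1.0 - K)
        return self.d

kf = ScalarDriftKF()
true_drift = 0.5
for _ in range(50):                       # repeated landmark sightings
    kf.predict()
    est = kf.update(vio_pos=10.0 + true_drift, landmark_pos=10.0)
# est converges toward the true drift of 0.5
```

A scalar random walk is the simplest possible choice; the paper's contribution is a drift model tailored to VIO, which this sketch does not capture.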
Our approach was successfully deployed at the A2RL Drone Racing Challenge 2025 in Abu Dhabi, where we advanced through the quarterfinals and semi-finals to reach the final round among the top four teams out of a total of 210. The goal of each round was to complete two laps through a predefined sequence of 11 gates, and we completed multiple two-lap runs at speeds of up to 45 km/h.

Here you can see one of our flights. The three-dimensional plot in the top left corner shows that the VIO estimate, shown in gray, is insufficient for agile flight in cluttered, GPS-denied environments. In contrast, our approach provides accurate state estimates, shown by the blue-to-red trajectory, where color indicates speed from slowest in blue to fastest in red.

We also performed real-world experiments on an outdoor track to compare our approach against ground-truth values obtained from RTK. The outdoor track consisted of six gates, and the UAV was required to complete two laps. The three-dimensional plot in the top left corner shows ground-truth data from RTK, estimates from VIO, and values from our approach, where color indicates speed. Our approach tracks the ground truth smoothly, while VIO exhibits significant drift.

We conducted numerous flights and performed a statistical evaluation comparing our method with state-of-the-art approaches and RTK values. Here is the table presenting the statistical evaluation of our approach against ground-truth values and state-of-the-art methods across all UAV states, including position, orientation, linear velocity, and angular velocity. Compared to state-of-the-art methods, our approach reduces the root mean square error of linear velocity by 16%, orientation by 70%, and angular velocity by 88%.
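For reference, the root mean square error behind these percentages is the standard definition; the numbers below are toy values, not the paper's data:

```python
import math

def rmse(estimates, ground_truth):
    """Root mean square error between an estimated and a reference signal."""
    sq = [(e - g) ** 2 for e, g in zip(estimates, ground_truth)]
    return math.sqrt(sum(sq) / len(sq))

# Toy example: four angular-velocity samples vs. RTK-derived ground truth.
est = [1.0, 2.1, 2.9, 4.2]
gt = [1.0, 2.0, 3.0, 4.0]
err = rmse(est, gt)
```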
Our novel approach for vision-only UAV state estimation provides an accurate onboard pipeline for fast and aggressive flights using only a monocular camera and an IMU. It achieves significant improvements in linear velocity, orientation, and angular velocity estimation accuracy, in terms of root mean square error, compared to current state-of-the-art methods. Additionally, it incorporates a novel drift model and directly fuses IMU data into the final UAV state estimate.