PROJECT ARTEMIS
VISUAL NAVIGATION FOR FLYING ROBOTS
Mohammed Kabir
PROJECT ARTEMIS
STATE OF THE INDUSTRY
Highly dependent on GPS in assisted modes. Requires sufficient piloting skills in non-GPS-assisted modes. Chance of ...
PROJECT ARTEMIS
MAVs are inherently unstable systems and require active control strategies for stable flight.
Limited payload constrains the computing and sensing hardware that can be carried.
GPS cannot be relied on in all environments.
PROJECT ARTEMIS
Navigation:
Localisation
Perception
Planning
Control
Operator interface
PROJECT ARTEMIS
We use a visual SLAM (Simultaneous Localisation and Mapping) technique on our robot. It is centimetre-level accurate, unlike GPS, and works indoors and outdoors. Fusing the visual estimates with inertial measurements greatly increases robustness and accuracy.
PROJECT ARTEMIS
Stereo cameras provide rich information about the environment. The cameras and the IMU (Inertial Measurement Unit) are synchronised in time with respect to each other. The stereo image pairs are used to compute depth images in realtime. From these depth images, a 3D map of the environment is built incrementally.
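The depth computation rests on standard stereo geometry: depth = focal length x baseline / disparity. A minimal sketch of that relation (the focal length and baseline values below are illustrative, not the actual Artemis camera parameters):

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Convert a disparity map (pixels) to a depth map (metres).

    depth = focal_length * baseline / disparity
    Pixels with zero disparity (no stereo match) are set to inf.
    """
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# Example: 425 px focal length, 12 cm baseline, 10 px disparity -> ~5.1 m
d = depth_from_disparity(np.array([[10.0, 0.0]]), focal_px=425.0, baseline_m=0.12)
```

Note the inverse relation: accuracy degrades quadratically with distance, which is why a wider baseline helps at range.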
PROJECT ARTEMIS
PROJECT ARTEMIS
[Figure: sensor drift speed vs. framerate. Credit: Stephan Weiss, PhD thesis 2012]
Sensor                   Drift speed (dynamics)   Framerate
GPS, Compass             None                     Low
Cameras, Laser Rangers   Medium (spatial)         Medium
IMU                      Fast (temporal)          High
Ideal sensor             None                     High
The low-drift sensors provide drift correction for the high-rate IMU.
PROJECT ARTEMIS
The robot can exploit multiple sensors in the environment - GPS, Vision and Lidar. A sensor-fusion approach using a hybrid Kalman filter with fault detection is used. Navigation continues even when individual sensors are compromised.
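The deck names a hybrid Kalman filter with fault detection but gives no equations. A minimal 1-D sketch, with innovation gating as a simple stand-in for the fault detection (all class names and thresholds here are illustrative, not the Artemis implementation):

```python
class GatedKalman1D:
    """Minimal 1-D Kalman filter with innovation gating.

    Measurements whose normalised innovation squared exceeds a
    chi-square-style threshold are rejected as faulty.
    """
    def __init__(self, x0, p0, q, gate=9.0):
        self.x, self.p, self.q, self.gate = x0, p0, q, gate

    def predict(self):
        self.p += self.q          # state assumed constant; variance grows

    def update(self, z, r):
        innov = z - self.x        # innovation (measurement residual)
        s = self.p + r            # innovation variance
        if innov * innov / s > self.gate:
            return False          # measurement rejected: fault suspected
        k = self.p / s            # Kalman gain
        self.x += k * innov
        self.p *= (1.0 - k)
        return True

kf = GatedKalman1D(x0=0.0, p0=1.0, q=0.01)
kf.predict()
ok = kf.update(0.5, r=0.25)       # plausible measurement: accepted
bad = kf.update(50.0, r=0.25)     # wild outlier: rejected by the gate
```

The same gating idea scales to the multi-sensor case: a GPS fix that disagrees wildly with the vision/IMU estimate is simply not fused, which is what lets navigation continue when one sensor degrades.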
PROJECT ARTEMIS
The planner uses the 3D map to continuously compute a collision-free trajectory for the vehicle. The trajectory is re-planned whenever newly mapped obstacles would lead to a possible collision. This allows safe flight in a changing environment.
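The replanning trigger can be sketched as a check of the current path against the latest occupancy map (a hypothetical grid-cell sketch; the actual Artemis planner is not described in this detail):

```python
# Hypothetical sketch: validate a planned path against an occupancy
# grid and trigger replanning when a newly mapped obstacle blocks it.

def path_is_collision_free(path, occupied):
    """path: list of (x, y) grid cells; occupied: set of blocked cells."""
    return all(cell not in occupied for cell in path)

def follow(path, occupied, replan):
    """Re-plan whenever the current path would lead to a collision."""
    if not path_is_collision_free(path, occupied):
        path = replan(occupied)
    return path

# A new obstacle at (1, 0) invalidates the straight-line path:
straight = [(0, 0), (1, 0), (2, 0)]
detour   = [(0, 0), (1, 1), (2, 0)]
new_path = follow(straight, {(1, 0)}, replan=lambda occ: detour)
```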
PROJECT ARTEMIS
A long-range wireless link serves as the primary air-to-ground datalink. The tablet runs a live FPV (first-person-view) display; the operator can use this high-definition feed to fly.
PROJECT ARTEMIS
Intel i7 computer
Stereo cameras (time-synchronised)
PX4 Autopilot
Rocket M5 (wireless link)
Propulsion system
PROJECT ARTEMIS
The platform is the result of multiple iterations, building on top of our previous visual MAVs. One configuration carries a separate monocular camera. The motor controllers are interfaced via the CAN bus.
PROJECT ARTEMIS
The software follows a split-system model for maximal reliability. The flight core is isolated from the application-level processing to ensure stability of the core vehicle functions. Low-level flight control and attitude estimation run on the PX4 Middleware on the NuttX RTOS. High-level perception, localisation and planning run on the companion computer, on top of the ROS (Robot Operating System) Middleware.
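The value of that split can be sketched in plain Python (an illustrative toy, not the Artemis code): the flight-core loop and the application layer only exchange messages, so a crash in application-level code cannot stall flight control.

```python
import queue

def flight_core(cmd_queue, state, steps=5):
    """Stand-in for the isolated flight-control loop: always keeps running."""
    for _ in range(steps):
        try:
            cmd = cmd_queue.get_nowait()    # consume setpoints if any arrived
            state["last_cmd"] = cmd
        except queue.Empty:
            pass                            # no new command: hold behaviour
        state["ticks"] += 1                 # control loop keeps ticking

def application_layer(cmd_queue):
    """Stand-in for high-level planning: may fail without harming the core."""
    cmd_queue.put("goto_waypoint_1")
    raise RuntimeError("planner crashed")   # simulated application fault

state = {"ticks": 0, "last_cmd": None}
commands = queue.Queue()

try:
    application_layer(commands)
except RuntimeError:
    pass                                    # fault contained at the app level

flight_core(commands, state)                # the core still runs every cycle
```

On the real vehicle the boundary is a hardware one (autopilot vs. companion computer), which is a stronger guarantee than this software sketch, but the design principle is the same.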
PROJECT ARTEMIS
Visit www.uasys.io/research for more!
Our open-source software stack is available at www.github.com/ProjectArtemis