Using line-based wind turbine representations for UAV localisation - PowerPoint PPT Presentation


SLIDE 1

Using line-based wind turbine representations for UAV localisation during autonomous inspection

Dr Oliver Moolan-Feroze – University of Bristol

SLIDE 2

Overview

  • 1. Why are we using a line-based model for localisation during wind turbine inspection?
  • 2. Defining the line-based turbine model
  • 3. Extracting line-based representations from images
  • 4. Integrating the line model into an optimiser
  • 5. Results
SLIDE 3

Why Lines?

  • Wind turbines are quite similar:
  • Tower
  • Nacelle (hub)
  • 3 blades
  • A model-based tracking approach makes sense over full SLAM
  • However...
  • What model do we use?
  • How do we associate model features with image locations?

SLIDE 4

Why Lines?

  • Typically, we find correspondences between the model and the image using distinguishable features
  • Enough correspondences allow us to estimate pose

SLIDE 5

Why Lines?

  • Typically, we find correspondences between the model and the image using distinguishable features
  • Enough correspondences allow us to estimate pose
  • In certain views we can use point-based features

SLIDE 6

Why Lines?

  • Typically, we find correspondences between the model and the image using distinguishable features
  • Enough correspondences allow us to estimate pose
  • In certain views we can use point-based features
  • However, a lot of the time, no features are in view, especially when close up

SLIDE 7

Why Lines?

  • We extend lines connecting the point features
  • This enables us to incorporate image measurements in between feature points

SLIDE 8

Turbine Model - definition

The model is defined using a set of points Q ⊂ ℝ³:

  • Turbine base: q_b
  • Tower top: q_t
  • Blade centre: q_c
  • Blade tips: q_Bj, where j ∈ {1, 2, 3}

and the set of connecting lines L:

  • Turbine tower: m_t = {q_b, q_t}
  • Nacelle: m_c = {q_t, q_c}
  • Blades: m_Bj = {q_c, q_Bj}
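The point and line sets above can be written down directly as a small topology table. This is a minimal sketch; the Python names are illustrative, not from the slides:

```python
# Minimal sketch of the line-based turbine model topology:
# named points Q and the lines L that connect them (names are illustrative).
POINTS = ["base", "tower_top", "blade_centre", "tip_1", "tip_2", "tip_3"]

LINES = {
    "tower":   ("base", "tower_top"),          # m_t = {q_b, q_t}
    "nacelle": ("tower_top", "blade_centre"),  # m_c = {q_t, q_c}
    "blade_1": ("blade_centre", "tip_1"),      # m_Bj = {q_c, q_Bj}
    "blade_2": ("blade_centre", "tip_2"),
    "blade_3": ("blade_centre", "tip_3"),
}

# Every line endpoint must be a defined model point.
assert all(p in POINTS for line in LINES.values() for p in line)
```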
SLIDE 9

Turbine Model - parameters

The model is parameterised with the following values ι:

  • The (x, y) location of the base: d ∈ ℝ²
  • The height of the tower: h
  • The heading of the turbine: θ
  • The length of the nacelle: s
  • The rotation of the blades: φ
  • The length of the blades: c
SLIDE 10

Turbine Model - instantiation

  • Given a 'unit' version of the model M̂ = {Q̂, L̂} and parameters ι, we instantiate a model with the following functions ω:

q_b = ω_b(q̂_b, ι)
q_t = ω_t(q̂_t, ι)
q_c = ω_c(q̂_c, ι)
q_Bj = ω_Bj(q̂_Bj, ι)
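A rough sketch of what the instantiation functions ω might compute, mapping parameters ι to 3D points. The axis conventions here (z-up, heading in the x-y plane, blades 120° apart in the rotor plane) are my assumptions for illustration, not taken from the slides:

```python
import numpy as np

def instantiate(params):
    """Map parameters (iota) to 3D model points: a hedged sketch of the
    omega functions. Axis conventions are assumptions for illustration."""
    d, h = params["d"], params["h"]
    th, s = params["heading"], params["s"]
    rot, c = params["rot"], params["c"]
    q_b = q_b = np.array([d[0], d[1], 0.0])        # turbine base on the ground
    q_t = q_b + np.array([0.0, 0.0, h])            # tower top
    fwd = np.array([np.cos(th), np.sin(th), 0.0])  # heading direction
    q_c = q_t + s * fwd                            # blade centre at nacelle end
    up = np.array([0.0, 0.0, 1.0])
    side = np.cross(up, fwd)                       # second in-rotor-plane axis
    q_B = [q_c + c * (np.cos(rot + j * 2 * np.pi / 3) * up +
                      np.sin(rot + j * 2 * np.pi / 3) * side)
           for j in range(3)]                      # three tips, 120 deg apart
    return q_b, q_t, q_c, q_B
```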

SLIDE 11

CNN line model feature extraction

  • To enable matching, we extract a representation of the reprojection of the line model from the images using a CNN
  • RGB images as well as 'prior' reprojection images are fed in as inputs
  • Reprojection estimates of the line model and the point model are the outputs

(Architecture diagram: encoder-decoder with convolutional layers, max-pooling layers, linear upsampling layers, and a sigmoid layer)
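One way to picture the 'prior' reprojection input is to rasterise the projected model lines into an image channel that is stacked with the RGB input. This is my own simplified stand-in, not the authors' code:

```python
import numpy as np

def render_line_prior(lines_2d, shape, n_samples=100):
    """Rasterise projected 2D model lines into a single-channel 'prior'
    image (a simplified stand-in for the CNN's reprojection input).
    lines_2d: list of ((x0, y0), (x1, y1)) projected line endpoints."""
    img = np.zeros(shape, dtype=np.float32)
    t = np.linspace(0.0, 1.0, n_samples)
    for (p0, p1) in lines_2d:
        p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
        # sample the segment densely and mark each hit pixel
        pts = p0[None, :] + t[:, None] * (p1 - p0)[None, :]
        for x, y in pts:
            xi, yi = int(round(x)), int(round(y))
            if 0 <= yi < shape[0] and 0 <= xi < shape[1]:
                img[yi, xi] = 1.0
    return img
```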

SLIDE 12

CNN line model feature extraction

(Network outputs with priors vs. without priors)

SLIDE 13

CNN line model feature extraction

  • Examples of the network output
  • (Top) the extracted line model; (bottom) the extracted point model

SLIDE 14

Integration with the optimiser

  • The extracted line reprojection doesn't provide enough information to fully estimate pose
  • We combine the image information with a keyframe pose graph optimiser
  • The graph is constrained using image measurements and pose estimates obtained from IMU / GPS

  • H = {W, F}
  • W = {w_1, …, w_N}
  • w_i = {r, U}
  • F = {f_{1,2}, …, f_{N−1,N}}
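The graph structure H = {W, F} can be sketched as follows: each keyframe w_i holds a pose r and its image measurements U, and factors f_{i,i+1} link consecutive keyframes (e.g. relative-pose constraints from IMU / GPS). The function and field names are illustrative, not the authors' API:

```python
# Hedged sketch of the keyframe pose graph H = {W, F}.
def build_pose_graph(keyframes):
    """keyframes: list of (pose, image_measurements) tuples.
    Returns the graph as keyframe nodes W and consecutive-frame factors F."""
    W = [{"r": r, "U": U} for (r, U) in keyframes]
    F = [(i, i + 1) for i in range(len(W) - 1)]  # f_{i,i+1} factor indices
    return {"W": W, "F": F}
```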
SLIDE 15

Integration with the optimiser

  • Model lines are split into a series of points and projected into the image to generate constraints
  • Image constraints can be generated in two different ways:

1. Using a perpendicular line search and establishing a 3D → 2D correspondence
  • Restricts the movement of the model point during optimisation

2. Using direct image interpolation
  • Allows the model point to move freely over the image

Example matching using perpendicular line search
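A hedged sketch of option 1: starting from a projected model point on a model line, search the CNN's line-response image along the line's perpendicular for the strongest response, which yields a 2D correspondence for that 3D point. The function name and the fixed search radius are my assumptions:

```python
import numpy as np

def perpendicular_search(response, point, line_dir, max_dist=10):
    """Search the line-response image along the perpendicular of the
    projected line direction for the strongest response (sketch only).
    Returns the best-matching pixel (x, y), or None if nothing is in bounds."""
    d = np.asarray(line_dir, float)
    perp = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # unit perpendicular
    best, best_val = None, -np.inf
    for t in range(-max_dist, max_dist + 1):
        x, y = np.round(np.asarray(point, float) + t * perp).astype(int)
        if 0 <= y < response.shape[0] and 0 <= x < response.shape[1]:
            if response[y, x] > best_val:
                best_val, best = response[y, x], (x, y)
    return best
```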

SLIDE 16

Experiments – real inspection data

SLIDE 17

Experiments – flight using synthetic data

SLIDE 18

Pose and Model Joint Optimisation

  • Previously, turbine model parameters were estimated and set at the beginning of the flight
  • Error in the parameters leads to error in the estimated poses
  • We now jointly optimise both the set of poses and the turbine parameters ι

SLIDE 19

Pose and Model Joint Optimisation

  • The set of functions ω is designed to be differentiable
  • When we project model points into the images, they are transformed using ω
  • Parameters ι are now estimated during optimisation
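A toy 1-D illustration of why differentiability of ω matters: a residual that depends on a model parameter (here the tower height h) can be minimised by gradient descent alongside the poses. This is purely illustrative, not the authors' optimiser:

```python
def fit_height(observed_top_z, h_init=60.0, lr=0.4, iters=100):
    """Minimise a reprojection-style residual over the tower height h.
    Because the tower-top point depends differentiably on h, we can take
    gradient steps on the squared residual (toy example, not the real system)."""
    h = h_init
    for _ in range(iters):
        residual = h - observed_top_z  # predicted minus observed tower-top height
        grad = 2.0 * residual          # d(residual^2)/dh
        h -= lr * grad
    return h
```

In the real system the same idea applies jointly: the optimiser takes gradients of the image residuals with respect to both the keyframe poses and all of the turbine parameters ι at once.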
SLIDE 20

Experiments – real data joint optimisation

SLIDE 21

Experiments – synthetic joint optimisation

SLIDE 22

Experiments – synthetic joint optimisation

(Plots: final vs. initial position error (m), and final vs. initial orientation error (rad), comparing 'Pose and model' against 'Pose only')

  • Using synthetic data, we evaluated the performance of joint optimisation
  • Improvement in both position and orientation estimation when doing joint optimisation
SLIDE 23

Thanks!

The work presented today is based on two papers:

  • Improving drone localisation around wind turbines using monocular model-based tracking - Oliver Moolan-Feroze, Konstantinos Karachalios, Dimitrios N. Nikolaidis, and Andrew Calway
  • Simultaneous drone localisation and wind turbine model fitting during autonomous surface inspection - Oliver Moolan-Feroze, Konstantinos Karachalios, Dimitrios N. Nikolaidis, and Andrew Calway