CS 309: Autonomous Intelligent Robotics FRI I, Lecture 17: Starting the Robot, Coordinate Frames, TF2, Forming Project Groups


SLIDE 1

CS 309: Autonomous Intelligent Robotics FRI I Lecture 17: Starting the Robot Coordinate Frames TF2 Forming Project Groups Instructor: Justin Hart

http://justinhart.net/teaching/2019_spring_cs309/

SLIDE 2

Pick a BWIBot

V2 V4

SLIDE 3

V2 Startup

  • Find the back of the robot

– Yes, the screen faces the BACK

SLIDE 4

V2 Startup

  • The switch should be in the “charge” position when you find the robot.

SLIDE 5

V2 Startup

  • Set the switch to the neutral position
SLIDE 6

V2 Startup

  • Unplug the charging cable
  • Set the switch to the “battery” position
SLIDE 7

V2 Startup

  • Go to the front of the robot
SLIDE 8

V2 Startup

  • Press the green button.

– It should light up.

SLIDE 9

V2 Startup

  • Press the yellow button.

– The yellow button and the blue indicator light should light up.

SLIDE 10

V2 Startup

  • Hit the power button on the laptop
SLIDE 11

V4 Startup

  • The V4 is different

– This is actually the front of the robot.

SLIDE 12

V4 Startup

  • Disconnect the power supply.

– First, undo the screws on the connector

SLIDE 13

V4 Startup

  • Disconnect the power supply.

– Then disconnect the plug.

SLIDE 14

V4 Startup

  • Locate the Emergency Stop and the Power Button

– On top, at the front of the unit, on the V4

SLIDE 15

V4 Startup

  • The Emergency Stop may be depressed

– If it is, gently twist it to the right and it will lift.

SLIDE 16

V4 Startup

  • Press the power button.
SLIDE 17

V4 Startup

  • It will illuminate.

– This indicates that the robot is now on.

SLIDE 18

V4 Startup

  • Reach back to the right inside the chassis.

– You can feel the power button to turn it on.

SLIDE 19

V4 Startup

  • Note that the V4 has a wireless keyboard

– If the batteries are dead, obtain assistance

SLIDE 20

System Startup

  • From here, both systems work the same
SLIDE 21

System Startup

  • Select Ubuntu on this screen using the arrow keys and hit “Enter,” or wait for it to boot

SLIDE 22

System Startup

  • Log in as FRI

– Using the FRI password. A mentor can help you.

SLIDE 23

Doing your homework

  • For your homework, you will do this on a real robot.
  • These screenshots come from doing this in simulation.

– Installing the simulators is easy if you want to try.
– Installation instructions for BWI are here:

  • https://github.com/utexas-bwi/bwi

– Tutorials on running the robot & simulators are here:

  • https://github.com/utexas-bwi/documentation/wiki/Software
SLIDE 24

System Startup

  • Open several terminals
  • cd catkin_ws; source devel/setup.bash in each.
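Why every terminal needs the `source` step: sourcing `devel/setup.bash` sets environment variables in the *current* shell only, so each new terminal starts without them. A tiny illustration using a stand-in file (not the real, catkin-generated setup.bash, which exports `ROS_PACKAGE_PATH` and friends):

```shell
# Stand-in for devel/setup.bash, just to show what sourcing does.
mkdir -p /tmp/demo_ws/devel
echo 'export DEMO_WS_ROOT=/tmp/demo_ws' > /tmp/demo_ws/devel/setup.bash

cd /tmp/demo_ws
source devel/setup.bash   # runs in the current shell, so the export persists
echo "$DEMO_WS_ROOT"      # prints /tmp/demo_ws
```

Running a script with `bash devel/setup.bash` instead would set the variables in a child shell and lose them, which is why `source` (or `.`) is required.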
SLIDE 25

System Startup

  • roscore
SLIDE 26

System Startup

  • Startup your robot

– roslaunch bwi_launch segbot_v2.launch (or v4)

SLIDE 27

Pick your floor

  • A screen will pop up asking what floor you are on.
  • The AI floor is the 3rd floor, and you are almost certainly there at this step of the process.

SLIDE 28 (screenshot)
SLIDE 29 (screenshot)
SLIDE 30 (screenshot)

Next, you want to move the robot

  • The next step is to move the robot into the hallway, where you want to start.
  • I generally do this before localizing the robot, just to get it out of people’s way.

SLIDE 31

System Startup

  • rosrun segbot_bringup teleop_twist_keyboard
SLIDE 32

System Startup

  • Slow down the robot before moving. Hit z.
SLIDE 33

Next, you want to move the robot

  • The robot will move more quickly than you expect.
  • Press the ‘k’ key to stop its motion.
SLIDE 34

System Startup

  • Localize the robot.
  • Hit “2D Pose Estimate”
  • Click on where the front of the robot is on the map in rviz (the screen is on the BACK), and drag the cursor FORWARD.

– Note that the robot graphic may not be there, because the real robot does not know where it is yet!!

  • A green arrow will appear showing what you believe the robot’s pose to be.

SLIDE 35 (screenshot)

SLIDE 36

System Startup

  • The system uses a probabilistic method to find the robot’s pose, so your localization just aids in this process.
  • Instruct the robot to move a bit by using “2D Nav Goal”
  • The robot’s localization will improve as it moves.
  • 2D Nav Goal works the same as “2D Pose Estimate”

SLIDE 37

System Startup

  • rosrun bwi_tasks visit_door_list
SLIDE 38

System Startup

  • The robot will now start driving around visiting the doors in the hallway.

SLIDE 39

System Shutdown

  • ctrl-c in the terminal to kill visit_door_list
  • Use teleop_twist_keyboard to drive back into the lab
  • Do the start-up steps in reverse to power the robot off
  • Plug back into the charger, and go back into charge mode

SLIDE 40

TF Classes & Functions

  • tf::TransformListener listener

– Special class that listens on behalf of the TF library so it can compute transforms between frames.

  • tf::StampedTransform transform;

– A spatial transformation

  • listener.lookupTransform("odom", "base_link", ros::Time::now(), transform);

– Looks up the transformation from “base_link” into “odom”
– (Tip: passing ros::Time(0) instead requests the latest available transform, which avoids extrapolation errors when no data is available for “now” yet.)

SLIDE 41

One more..

  • listener.waitForTransform("odom", "base_link", ros::Time(0), ros::Duration(4));

– Wait for the transform to be available
– TF may not have heard enough data to make this transform work, and it will throw an error in that event.

SLIDE 42

You can also send frames

  • tf::TransformBroadcaster br
  • br.sendTransform( tf::StampedTransform( transform, ros::Time::now(), fromFrame, toFrame));

  • StampedTransform - simply adds a timestamp to the transform
  • ros::Time::now() - makes that timestamp now
  • fromFrame, toFrame - the names of the frames.
SLIDE 43

Poses & Transformation

  • Pose

– The position and orientation of an object in space
– Generally expressed as a position and a rotation matrix into the correct pose

  • Transformation

– The relationship between two poses
– Generally expressed as a translation (a position) and a rotation

  • ROS has types for both!

– But they contain basically the same data

SLIDE 44

Poses & Transformation

  • geometry_msgs::Pose pose

– pose.position
  • pose.position.x (y, z)
– pose.orientation
  • pose.orientation.x (y, z, w)
– Orientation is expressed as a quaternion
  • We have software that handles it.

  • tf::Transform transform

– getOrigin()
  • getOrigin().x() (y(), z())
– tf::Vector3 origin(x,y,z)
– setOrigin(origin)
– getRotation()
– tf::Quaternion q(x,y,z,w)
– setRotation(q)
– The tf::Quaternion class also supports
  • Euler angles
  • Axis & angle
SLIDE 45

Example: In simulation

  • roscore
  • roslaunch bwi_launch simulation_v2.launch
  • If we select “Global Options” on the left, we can change “Fixed Frame” to base_footprint, telling rviz to keep its view fixed on the robot

SLIDE 46

TF view_frames

  • rosrun tf view_frames

– This will generate a PDF of the current TF tree

SLIDE 47 (screenshot)

SLIDE 48

A few of these frames

  • BWI uses a special map service to tell the robot which floor it is on.
  • “map” is our “global frame,” sometimes called the “inertial frame”

– It is the top level; all frames are relative to it

  • “3rdFloor_map” and “level_mux_map” are from the BWI map service.

– You can safely ignore them

  • “odom”

– Short for “odometry.”
– When we track the robot’s motion, it is relative to “odom.”

SLIDE 49 (screenshot)

SLIDE 50

Where is the robot?

  • odom

– The robot’s motion is tracked relative to odom
– odom is generally in a fixed position

  • base_footprint

– Where the robot is
– Remember, base_footprint is at (0,0,0) in base_footprint’s own frame
– So we compare to odom to know where the robot is

SLIDE 51

Example

  • We can watch this from the command line
  • rosrun tf tf_echo /base_footprint /odom
  • rosrun segbot_bringup teleop_twist_keyboard
  • If we watch tf_echo, we can see the translation changing.

– This is how our system knows where the robot is!

SLIDE 52 (screenshot)

SLIDE 53

More frames

  • The robot has MANY frames
  • base_link

– The physical base of the robot’s mechanics

  • Rather than base_footprint, where it is on the floor
  • Decoupling the two simplifies modeling
  • chasis_base_plate_link

– Where parts are mounted on the BWIBot

  • laser_obstacle

– Where the laser sees the nearest obstacle

  • caster_wheel_link, base_link_left_wheel, base_link_right_wheel

– The robot’s wheels

SLIDE 54

AR Tags

  • Let’s take a look at tracking an object and finding its coordinate frame.
  • Alvar

– An open source library for tracking AR Tags

SLIDE 55

Virtual Reality

  • The simulation of a 3D world that can be experienced first-hand
  • 3D

– Video games
– Virtual worlds
– Headsets

SLIDE 56

Augmented Reality

  • Adds 3D content to images and perception of the real world
  • The photo on the right merges real content and virtual content in a real-estate application

SLIDE 57

Augmented Reality Toolkit

  • What is an AR Tag?
  • Augmented Reality Tag

– The image on the right comes from a demonstration of Augmented Reality Tool Kit
– Provides a coordinate frame to render 3D content onto
– We will use a similar package called Alvar

SLIDE 58

How does this work?

  • Recall our transforms

– Translation
– Rotation

  • Augmented Reality uses projective transformations

– Homographies
– Projections

SLIDE 59

Project Group Formation

  • This group will

– Do homework 5 together
– Do homework 6 together
– Do the final project together

  • Take 15 minutes to choose a team

– Email me you + your teammates + your EIDs and email addresses, 1 per team

SLIDE 60

For homework 5

  • Work out the Direct Linear Transformation (DLT) of a homography to resolve its position in space

– Just kidding

  • From our perspective, AR Tags provide us with

– Translation
– Rotation
– And work with TF2

SLIDE 61

ar_track_alvar

  • ROS package for using AR Tags!
  • Needs to be configured for your camera

– This happens in a .launch file

  • Generate AR Tags
  • Print them
  • Attach them to a rigid surface
  • Your system can now track an AR Tag
  • Example
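The "configured in a .launch file" bullet typically looks like the sketch below, modeled on the package's sample launch files for the `individualMarkersNoKinect` node. The camera topic names and output frame here are assumptions you must replace with your own camera's topics; `marker_size` is the side length of the printed tag in centimeters.

```xml
<launch>
  <!-- Side length of the printed tag, in cm: measure your printout. -->
  <arg name="marker_size" default="4.4" />
  <arg name="max_new_marker_error" default="0.08" />
  <arg name="max_track_error" default="0.2" />
  <!-- Assumed topic/frame names: substitute your camera's. -->
  <arg name="cam_image_topic" default="/camera/image_raw" />
  <arg name="cam_info_topic" default="/camera/camera_info" />
  <arg name="output_frame" default="/camera_link" />

  <node name="ar_track_alvar" pkg="ar_track_alvar"
        type="individualMarkersNoKinect" respawn="false" output="screen"
        args="$(arg marker_size) $(arg max_new_marker_error)
              $(arg max_track_error) $(arg cam_image_topic)
              $(arg cam_info_topic) $(arg output_frame)" />
</launch>
```

Once running, the node publishes a TF frame per detected tag, so the lookupTransform/waitForTransform calls from the earlier slides work on tags exactly as they do on robot frames.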