CS 309: Autonomous Intelligent Robotics FRI I
Lecture 17: Starting the Robot, Coordinate Frames, TF2, Forming Project Groups
Instructor: Justin Hart
http://justinhart.net/teaching/2019_spring_cs309/
Pick a BWIBot
V2 V4
V2 Startup
- Find the back of the robot
– Yes, the screen faces the BACK
V2 Startup
- The switch should be in the “charge” position
when you find the robot.
V2 Startup
- Set the switch to the neutral position
V2 Startup
- Unplug the charging cable
- Set the switch to the “battery” position
V2 Startup
- Go to the front of the robot
V2 Startup
- Press the green button.
– It should light up.
V2 Startup
- Press the yellow button.
– Yellow button and blue indicator light should light.
V2 Startup
- Hit the power button on the laptop
V4 Startup
- The V4 is different
– This is actually the front of the robot.
V4 Startup
- Disconnect the power supply.
– First, undo the screws on the connector
V4 Startup
- Disconnect the power supply.
– Then disconnect the plug.
V4 Startup
- Locate the Emergency Stop and Power Button
– On the top at the front of the unit on the V4
V4 Startup
- The Emergency Stop may be depressed
– If it is, gently twist it to the right and it will lift.
V4 Startup
- Press the power button.
V4 Startup
- It will illuminate.
– This indicates that the robot is now on.
V4 Startup
- Reach back to the right inside the chassis.
– You can feel the power button to turn it on.
V4 Startup
- Note that the V4 has a wireless keyboard
– If the batteries are dead, obtain assistance
System Startup
- From here, both systems work the same
System Startup
- Select Ubuntu on this screen using the arrow
keys and hit “Enter” or wait for boot
System Startup
- Log in as FRI
– Using the FRI password. A mentor can help you.
Doing your homework
- For your homework, you will do this on a real
robot.
- These screenshots come from doing this in
simulation.
– Installing the simulators is easy if you want to try.
– Installation instructions for BWI here:
- https://github.com/utexas-bwi/bwi
– Tutorials on running the robot & simulators here:
- https://github.com/utexas-bwi/documentation/wiki/Software
System Startup
- Open several terminals
- cd catkin_ws; source devel/setup.bash in each.
System Startup
- roscore
System Startup
- Startup your robot
– roslaunch bwi_launch segbot_v2.launch (or v4)
Pick your floor
- A screen will pop up asking what floor you are
on.
- The AI floor is the 3rd floor, and you are almost
certainly there at this step of the process.
Next, you want to move the robot
- The next step is to move the robot into the
hallway, where you want to start.
- I generally do this before localizing the robot,
just to get it out of people’s way.
System Startup
- rosrun segbot_bringup teleop_twist_keyboard
System Startup
- Slow down the robot before moving. Hit z.
Next, you want to move the robot
- The robot will move quicker than you expect.
- Press the ‘k’ button to stop its motion.
System Startup
- Localize the robot.
- Hit “2D Pose Estimate”
- Click on where the front of the robot is on the map
in rviz (the screen is on the BACK), and drag the cursor FORWARD.
– Note that the robot graphic may not be there, because
the real robot does not know where it is yet!!
- A green arrow will appear showing what you believe
the robot’s pose to be.
System Startup
- The system uses a probabilistic method to find
the robot’s pose, so your localization just aids in this process.
- Instruct the robot to move a bit by using “2D Nav
Goal”
- The robot’s localization will improve as it moves.
- 2D Nav Goal works the same as “2D Pose
Estimate”
System Startup
- rosrun bwi_tasks visit_door_list
System Startup
- The robot will now start driving around visiting
the doors in the hallway.
System Shutdown
- ctrl-c in the terminal to kill visit_door_list
- teleop_twist_keyboard to drive back into the lab
- Do the power-on steps in reverse to power the
robot off
- Plug back into the charger, and go back into
charge mode
TF Classes & Functions
- tf::TransformListener listener
– Special class that listens on behalf of the TF library
so it can compute transforms between frames.
- tf::StampedTransform transform;
– A spatial transformation
- listener.lookupTransform("odom", "base_link",
ros::Time::now(), transform);
– Looks up the transform from “base_link” into “odom”
One more..
- listener.waitForTransform("odom", "base_link",
ros::Time(0), ros::Duration(4));
– Wait for the transform to be available
– TF may not have heard enough data to make this
transform work, and it will throw an error in that event.
You can also send frames
- tf::TransformBroadcaster br
- br.sendTransform(
tf::StampedTransform( transform, ros::Time::now(), fromFrame, toFrame));
- StampedTransform - simply adds a timestamp to the transform
- ros::Time::now() - makes that timestamp now
- fromFrame, toFrame – The names of the frames.
Poses & Transformation
- Pose
– The position and orientation of an object in space
– Generally expressed as a position and a rotation matrix into the
correct pose
- Transformation
– The relationship between two poses
– Generally expressed as a translation (a position) and a rotation
- ROS has types for both!
– But they contain basically the same data
Poses & Transformation
- geometry_msgs::Pose pose
– pose.position
- pose.position.x (y,z)
– pose.orientation
- pose.orientation.x (y,z,w)
– Orientation is expressed
as a quaternion
- We have software that
handles it.
- tf::Transform transform
– getOrigin()
- getOrigin().x() (y(), z())
– tf::Vector3 origin(x,y,z)
– setOrigin(origin)
– getRotation()
– tf::Quaternion q(x,y,z,w)
– setRotation(q)
– The tf::Quaternion class also supports
- Euler angles
- Axis & angle
Example: In simulation
- roscore
- roslaunch bwi_launch simulation_v2.launch
- If we select “Global Options” on the left, we can
change “Fixed Frame” to base_footprint, telling rviz to keep the view centered on the robot
TF view_frames
- rosrun tf view_frames
– This will generate a PDF of the current TF tree
A few of these frames
- BWI uses a special map service to tell the robot which floor it is
on
- “map” is our “global frame,” sometimes called the “inertial frame”
– It is the top level; all frames are relative to it
- “3rdFloor_map” and “level_mux_map” are from the BWI map
service.
– You can safely ignore them
- “odom”
– Short for “odometry.”
– When we track the robot’s motion, it is relative to “odom.”
Where is the robot?
- odom
– The robot’s motion is tracked relative to odom
– odom is generally in a fixed position
- base_footprint
– Where the robot is
– Remember, base_footprint is at (0,0,0) in
base_footprint’s frame
– So we compare to odom to know where the robot is
Example
- We can watch this from the command line
- rosrun tf tf_echo /base_footprint /odom
- rosrun segbot_bringup teleop_twist_keyboard
- If we watch tf_echo, we can see the translation changing.
– This is how our system knows where the robot is!
More frames
- The robot has MANY frames
- base_link
– The physical base of the robot’s mechanics
- Rather than base_footprint, where it is on the floor
- Decoupling the two simplifies modeling
- chasis_base_plate_link
– Where parts are mounted on the BWIBot
- laser_obstacle
– Where the laser sees the nearest obstacle
- caster_wheel_link, base_link_left_wheel, base_link_right_wheel
– The robot’s wheels
AR Tags
- Let’s take a look at tracking an object and
finding its coordinate frame.
- Alvar
– An open source library for tracking AR Tags
Virtual Reality
- The simulation of a 3D world that can be experienced first-hand
- 3D
– Video games
– Virtual worlds
– Headsets
Augmented Reality
- Adds 3D content to
images and perception of the real world
- The photo on the right
merges real content and virtual content in a real-estate application
Augmented Reality Toolkit
- What is an AR Tag?
- Augmented Reality Tag
– The image on the right
comes from a demonstration of Augmented Reality Tool Kit
– Provides a coordinate frame to render 3D content onto
– We will use a similar package called Alvar
How does this work?
- Recall our transforms
– Translation
– Rotation
- Augmented Reality uses projective
transformations
– Homographies
– Projections
Project Group Formation
- This group will
– Do homework 5 together
– Do homework 6 together
– Do the final project together
- Take 15 minutes to choose a team
– Email me your name and your teammates’ names, EIDs, and
email addresses, one email per team
For homework 5
- Work out the Direct Linear Transformation
(DLT) of a homography to resolve its position in space
– Just kidding
- From our perspective, AR Tags provide us with
– Translation
– Rotation
– And work with TF2
ar_track_alvar
- ROS package for using AR Tags!
- Needs to be configured for your camera
– This happens in a .launch file
- Generate AR Tags
- Print them
- Attach them to a rigid surface
- Your system can now track an AR Tag
- Example