Hands-on your TurtleBot 2

16-311 Qatar - Gianni A. Di Caro∗

October 9, 2017

Abstract This document is a step-by-step guide to learn how to use your TurtleBot 2 robot. It must be used in conjunction with the companion document that describes the use of ROS.

Contents

1 Let's switch it on!
  1.1 OS fixes
  1.2 Basic tests
  1.3 Move it on teleoperation
  1.4 Keep running with closed notebook lid
2 Network configuration
3 Let's check the camera

1 Let’s switch it on!

Your TurtleBot 2 is composed of a mobile base, in our case a Kobuki base from Yujin Robot (S. Korea), plus a rigid frame structure that holds a computer (an Asus notebook in our case), which is connected to the base through USB, and one or more sensors and actuators. The default sensor is a depth camera, in our case an Orbbec Astra (https://orbbec3d.com/product-astra). Moreover, our configuration also features a robotic arm, a PhantomX Reactor Robot Arm, placed on top of the robot platform (http://www.trossenrobotics.com/p/phantomx-ax-12-reactor-robot-arm.aspx). The robot arm is connected to the notebook via a USB hub. The robot also comes with a docking station that allows it to (autonomously) recharge. The robot comes with all cables in the right places. However, it is always useful to give the wiring a look and get an understanding of it.

∗The content of this document is based on a number of Internet sources, including ROS wiki pages.


Figure 1: Components of our TurtleBot 2.

Let’s focus on the mobile base, as shown in the figure, with an indication for the meaning of the different components. More precisely:

  • 19V/2A: Laptop power supply
  • 12V/5A: Arm power supply
  • 12V/1.5A: Depth camera power supply
  • 5V/1A: General power supply
  • Status LED: Indicates Kobuki's status
    – Green: Kobuki is turned on and battery at high voltage level
    – Orange: On - low battery voltage level (please charge soon)
    – Green blinking: On - battery charging
    – Off: Kobuki is turned off
  • LED1/2: Programmable LEDs
  • USB: Data connection
  • B0/1/2: Programmable buttons
  • Firmware switch: Enables/disables the firmware update mode

Now, locate the On/Off button on the side of the Kobuki base and turn the TurtleBot on; it is ready to go (the notebook should also be switched on, in order to inspect what's going on, of course). The batteries should already have been charged. Keep in mind that, by default, when you dock at the charging station the Kobuki base will charge but the notebook will not. The Kobuki base includes a 19V/2A power rail that would allow you to autonomously charge the notebook as well, but some adaptations would be needed.

1.1 OS fixes

The following actions may be required to get the system working properly.


Figure 2: Kobuki base, front and top views.

Connect the notebook to the Kobuki base and log in to the notebook (username: turtlebot, password: ros). First, check that the kobuki device has been created; in a shell, execute:

$ ls -n /dev | grep kobuki

If this returns an empty list, then run the following:

$ roscd kobuki_ftdi
$ rosrun kobuki_ftdi create_udev_rules
$ sudo udevadm trigger

and check again that everything is now fine. The network might be unstable: try disabling IPv6. In the network menu: Edit connections → Your network → Edit → IPv6 Settings → Method: Ignore.
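For reference, the rule that create_udev_rules installs looks roughly like the following (file name and attribute values are taken from the kobuki_ftdi package and may differ between versions; treat this as illustrative, not authoritative):

```
# Illustrative content of /etc/udev/rules.d/57-kobuki.rules, as installed
# by `rosrun kobuki_ftdi create_udev_rules`.
# It matches the FTDI USB-serial chip on the Kobuki base by its serial
# string and creates the /dev/kobuki symlink with open permissions.
SUBSYSTEM=="tty", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6001", \
  ATTRS{serial}=="kobuki*", MODE:="0666", GROUP:="dialout", SYMLINK+="kobuki"
```

This is why `sudo udevadm trigger` is needed afterwards: it re-evaluates the rules against the already-plugged-in device without requiring a replug.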

1.2 Basic tests

Open a terminal shell and start ROS:

$ roscore

Check that everything went fine. Now, connect to the Kobuki base and start a few basic nodes to access its functionalities (--screen provides verbose output on the screen):

$ roslaunch kobuki_node minimal.launch --screen

Let's take a look at the health status of the robot:

$ rosrun rqt_robot_monitor rqt_robot_monitor

which should show that everything is fine and report a good deal of useful information about sensors and status.


✍ Get the same info from topic information (i.e., use rostopic). From the result, you can observe that the data I/O follows a structure:

  • <node_name>/sensors/: Continuous streams of the sensors' data (note that /odom and /joint_states are remapped by default, since they are widely used at the highest level/namespace);
  • <node_name>/events/: If a sensor changes its state, e.g. bumper pressed/released, the new state is published here;
  • <node_name>/commands/: Use these topics to make Kobuki do things like switching on LEDs, playing sounds, and driving around;
  • <node_name>/debug/: Outputs detailed info about a lot of Kobuki's internals.
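The events topics carry small messages whose fields are integer codes. As a concrete example, a plain-Python sketch (no ROS needed) of how the two fields of a kobuki_msgs/BumperEvent message decode; the constants below follow the BumperEvent message definition:

```python
# Decode the integer fields of a kobuki_msgs/BumperEvent message,
# as seen on /mobile_base/events/bumper with `rostopic echo`.
# Constants as defined in the kobuki_msgs/BumperEvent message.
BUMPER = {0: "LEFT", 1: "CENTER", 2: "RIGHT"}
STATE = {0: "RELEASED", 1: "PRESSED"}

def describe_bumper_event(bumper, state):
    """Turn the raw integers of a bumper event into readable text."""
    return "%s bumper %s" % (BUMPER[bumper], STATE[state].lower())

# e.g. the message {bumper: 1, state: 1} printed by rostopic echo:
print(describe_bumper_event(1, 1))  # CENTER bumper pressed
```

Reading the raw codes with rostopic echo and decoding them mentally this way is often enough; a node would normally import the message class and compare against its constants instead.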

Now, let’s run some tests in order to check that everything is in place as expected (keep the minimal node launch up):

  • Battery status:

$ rosrun kobuki_testsuite test_battery_voltage.py

  • Events:

$ rosrun kobuki_testsuite test_events.py
Hit the robot's bumpers, move it, . . . , and check whether the events are detected and the measurements change accordingly.

  • Sensors:

$ rosrun kobuki_testsuite test_gyro.py

  • Simple actuators:

$ rosrun kobuki_testsuite test_led_array.py
$ rosrun kobuki_testsuite test_sounds.py

✍ Now, run the same tests for the different sensors and events, but using ROS topics directly to read out what's going on, for instance:

$ rostopic echo /mobile_base/sensors/imu_data
$ rostopic echo /mobile_base/events/bumper

✍ Let's make the robot do something by publishing on topics:

  • From the command line, switch on/off or change the color of a LED. The available colors/commands are:
    – 0 - off,
    – 1 - green,
    – 2 - orange,
    – 3 - red.
    Check with rosmsg the message types that are published on the Led topics and publish data using the correct data types/fields. For instance, in the case of LEDs:


$ rostopic pub /mobile_base/commands/led1 kobuki_msgs/Led "value: 1"

  • As before, check the data types and play a sound, where the available sequences are:

    – 0 - on,
    – 1 - turn off,
    – 2 - recharge start,
    – 3 - pressed button,
    – 4 - error sound,
    – 5 - start cleaning,
    – 6 - cleaning end.

  • Move the turtlebot by publishing linear x,y,z and angular x,y,z velocities, such as:

$ rostopic pub -r 1 /mobile_base/commands/velocity geometry_msgs/Twist \
  '{linear: {x: 0.05, y: 0.0, z: 0.0}}'
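The publish commands above all follow the same pattern, so they can be generated. A small plain-Python sketch of helpers that build the rostopic pub strings for LEDs, sounds, and velocities (topic names are those of a default Kobuki bringup; the helper names are made up for illustration):

```python
# Build `rostopic pub` command strings for common Kobuki actions.
# Topic names are those of a default kobuki_node bringup; message types
# are kobuki_msgs/Led, kobuki_msgs/Sound and geometry_msgs/Twist.
# The helper functions themselves are illustrative, not part of any API.

LED_COLORS = {"off": 0, "green": 1, "orange": 2, "red": 3}

def led_cmd(led, color):
    """Command to set LED 1 or 2 to a named color."""
    return ('rostopic pub /mobile_base/commands/led%d '
            'kobuki_msgs/Led "value: %d"') % (led, LED_COLORS[color])

def sound_cmd(value):
    """Command to play one of the predefined sound sequences (0-6)."""
    return ('rostopic pub /mobile_base/commands/sound '
            'kobuki_msgs/Sound "value: %d"') % value

def velocity_cmd(x=0.0, yaw=0.0, rate=1):
    """Command to publish a Twist with forward speed x and yaw rate."""
    return ("rostopic pub -r %d /mobile_base/commands/velocity "
            "geometry_msgs/Twist "
            "'{linear: {x: %.2f, y: 0.0, z: 0.0}, "
            "angular: {x: 0.0, y: 0.0, z: %.2f}}'" % (rate, x, yaw))

print(led_cmd(1, "green"))
print(velocity_cmd(x=0.05))
```

Note that only linear x and angular z matter for a differential-drive base like the Kobuki: the other Twist components are ignored.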

1.3 Move it on teleoperation

Let's move the TurtleBot by controlling the Kobuki base with the keyboard. Open in one shell:

$ roslaunch kobuki_node minimal.launch --screen

Open in a second shell:

$ roslaunch kobuki_keyop safe_keyop.launch --screen

✍ Read the instructions and move your robot around. In other shells, read the relevant topics to monitor what's going on, including odometry estimates.

Note: there's also a "non-safe" keyop.launch, which is a "raw" version of the "safe" one: it doesn't smooth velocities out and doesn't check bumper or cliff events.

A slightly more sophisticated node for keyboard teleoperation is the following, which however requires bringing up more functionalities:

$ roslaunch turtlebot_bringup minimal.launch
$ roslaunch turtlebot_teleop keyboard_teleop.launch

Read the instructions and move the robot around. Since moving the robot while pressing keys on a keyboard is not handy, let's use a joystick; therefore, instead of the keyboard teleop launcher, use:

$ roslaunch turtlebot_teleop logitech.launch

✍ Move the robot around. Is it difficult to control it precisely? Find the launcher files for both keyboard and joystick teleoperation and modify the scaling parameters for the linear and angular velocities. The launcher file for keyboard teleop looks like:


<launch>
  <!-- turtlebot_teleop_key already has its own built in velocity smoother -->
  <node pkg="turtlebot_teleop" type="turtlebot_teleop_key"
        name="turtlebot_teleop_keyboard" output="screen">
    <param name="scale_linear" value="0.5" type="double"/>
    <param name="scale_angular" value="1.5" type="double"/>
    <remap from="turtlebot_teleop_keyboard/cmd_vel" to="cmd_vel_mux/input/teleop"/>
  </node>
</launch>

Note that the command <remap from="turtlebot_teleop_keyboard/cmd_vel" to="cmd_vel_mux/input/teleop"/> remaps the topic of the velocity command to cmd_vel_mux/input/teleop, which is the topic that the TurtleBot uses to receive velocity commands. In fact, in turtlebot_teleop_key the programmers used the topic turtlebot_teleop_keyboard/cmd_vel to publish velocity commands, but the TurtleBot listens on cmd_vel_mux/input/teleop. The remap command redirects turtlebot_teleop_keyboard/cmd_vel to the topic defined in the TurtleBot. Without the remap, keyboard teleoperation would not work, as the topic used for publishing velocity commands would differ from the one the robot uses to receive them. The mux part in the topic name reflects the fact that the robot can get velocity inputs from different sources and performs a multiplexing operation to prioritize them.
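The effect of the scale_linear and scale_angular parameters can be sketched in plain Python: raw unit inputs from the keyboard are multiplied by the scale factors before being published. The key-to-velocity bindings below are made up for illustration; the real ones are printed by the teleop node itself:

```python
# Sketch of how teleop scaling parameters shape the published command.
# Key bindings here are illustrative, not the actual teleop mapping.

SCALE_LINEAR = 0.5    # from <param name="scale_linear">
SCALE_ANGULAR = 1.5   # from <param name="scale_angular">

KEYS = {               # key -> (raw linear, raw angular), illustrative
    "i": (1.0, 0.0),   # forward
    ",": (-1.0, 0.0),  # backward
    "j": (0.0, 1.0),   # turn left
    "l": (0.0, -1.0),  # turn right
}

def scaled_cmd(key, scale_linear=SCALE_LINEAR, scale_angular=SCALE_ANGULAR):
    """Return the (linear, angular) velocity a keypress would publish."""
    raw_lin, raw_ang = KEYS[key]
    return raw_lin * scale_linear, raw_ang * scale_angular

print(scaled_cmd("i"))  # (0.5, 0.0)
```

Lowering scale_linear and scale_angular in the launch file therefore directly slows the robot down, which makes precise joystick or keyboard control much easier.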

1.4 Keep running with closed notebook lid

It's a lot safer/more practical to close the lid of the TurtleBot's notebook when the robot drives around (while being controlled from a remote node, as shown in the next section). To prevent it from turning off: System Settings (gear icon on the left finder bar) → Search (top right) for "power" → Power → set "When lid is closed" to "Do Nothing" for both "On battery power" and "When plugged in".

2 Network configuration

First, be sure that ssh is there and works. With ifconfig, get the IP address of the notebook; with hostname, get its host name. First try to ping it, and then check that ssh works by connecting to it from your main laptop:

$ ping TURTLEBOT_IP
$ ssh turtlebot@TURTLEBOT_IP

or,

$ ping TURTLEBOT_HOSTNAME
$ ssh turtlebot@TURTLEBOT_HOSTNAME

If ssh doesn't work, then remove it and reinstall it (and check it again):

$ sudo apt-get purge openssh-server
$ sudo apt-get remove openssh-server
$ sudo apt-get install openssh-server

In order to create a ROS network, a ROS Master needs to be set up. The ROS Master is started with the roscore command, and there is only one master, which we will run on the robot's notebook. The address of the Master must be specified in the .bashrc files, in order to let all machines connect to it.


In the .bashrc of the robot's notebook, add the following lines (if they are not there yet):

export ROS_MASTER_URI=http://localhost:11311
export ROS_HOSTNAME=TURTLEBOT_IP (or TURTLEBOT_HOSTNAME)

In the .bashrc of your main laptop (let's call it the workstation), add the following lines:

export ROS_MASTER_URI=http://TURTLEBOT_IP:11311
export ROS_HOSTNAME=(result from hostname or ifconfig)

✍ Now, test the configuration. On the TurtleBot's notebook:

$ roscore

and on the workstation:

$ rostopic list

If everything went fine, you should get (assuming that no nodes are running on the TurtleBot):

/rosout
/rosout_agg

✍ Publish something on a /hello topic from the workstation, and check that it is received on the master side:

$ rostopic pub -r 10 /hello std_msgs/String "hello"
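The two .bashrc fragments differ only in which host runs the master and how each machine names itself, so they can be generated from the two addresses. A small plain-Python sketch (the addresses below are placeholders, as in the text above):

```python
# Generate the ROS network environment lines for robot and workstation.
# ROS_MASTER_URI points at the host running roscore (the robot here);
# ROS_HOSTNAME is how each machine advertises itself to the others.
# This helper is illustrative; the addresses are placeholders.

def ros_env_lines(master_host, own_host, port=11311):
    """Lines to append to a machine's .bashrc for this ROS network."""
    return [
        "export ROS_MASTER_URI=http://%s:%d" % (master_host, port),
        "export ROS_HOSTNAME=%s" % own_host,
    ]

# On the robot's notebook (the master runs locally):
print("\n".join(ros_env_lines("localhost", "TURTLEBOT_IP")))
# On the workstation:
print("\n".join(ros_env_lines("TURTLEBOT_IP", "WORKSTATION_IP")))
```

Port 11311 is the default roscore port; a mismatch between ROS_MASTER_URI on the workstation and the robot's actual address is the most common cause of rostopic list hanging.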

3 Let’s check the camera

We need to bring up more node functionalities in the TurtleBot, in particular those that expose camera data:

$ roslaunch astra_launch astra.launch

On the robot or on the remote workstation:

$ rosrun image_view image_view image:=/camera/rgb/image_raw
$ rosrun image_view image_view image:=/camera/depth/image_raw

or,

$ rosrun rviz rviz

In this case, in order to view the camera data (RGB and depth), additional steps are necessary. First, the camera must be included in the view: use the Add button to add a Camera display.


It is necessary to select a topic being published by the camera sensor, for example /camera/rgb/image_color, and to select a Fixed Frame, for example camera_depth_frame.


In order to also see a point cloud, a PointCloud2 display has to be added as before, selecting /camera/depth_registered/points as its topic.
