Building a 3D Printed Mini Rescue Robot (based on OARK)

Most robot kits on the market, such as LEGO Mindstorms, fischertechnik, and Bioloid, are limited in flexibility and durability. There may be no parts available that suit a particular need, and these robots cannot traverse the irregular, cluttered terrain that is common, for example, in rescue areas. The Open Academic Robot Kit (OARK) is an open source robot kit that exploits the advent of 3D printer technology. It was created by Dr. Raymond Sheh (Curtin University) to lower the barrier for anyone who wants to enter the field of robotics research, particularly rescue robots. The robot also has a nickname, Emu Mini 2, as it is a mini version of the Emu, an autonomous rescue robot in the School of Computer Science and Engineering, UNSW. The official webpage of OARK is http://oarkit.intelligentrobots.org/.

Here is a compilation of the Emu Mini 2, which I demonstrated on a mini rescue arena at RoboCup 2015 in Hefei, China.

I will describe the main components of the robot that we built here at CSE – UNSW based on OARK:

  • 7 Dynamixel AX-12 servo motors from Robotis
    • 4 motors drive the mobile base and the other 3 move the arm
  • USB2Dynamixel from Robotis
    • Used to connect the motors to the RaspberryPi via its USB port (not its GPIO pins, as is usual)
    • An OpenCM 9.04 board can also be used, see here
  • RaspberryPi 2 and a micro SD card
    • Although the first version can be used, the RaspberryPi 2 has a much faster processor and 4 USB ports, which is useful in our system
  • RaspberryPi camera and a long camera cable (50 cm)
    • A long cable is needed as the camera is located on an articulated arm
  • Wi-Pi dongle from Element 14
    • A Wi-Fi dongle for the RaspberryPi
  • Wireless gamepad Logitech F710
    • A PS3 DualShock controller can also be used, see here
  • Rechargeable LiPo battery (output: 12 V, 1750 mAh)
    • Supplies the servo motors
  • Power bank (output: 5 V, 1 A), or attach a voltage regulator to convert 12 V (from the battery) to 5 V
    • Supplies the Pi

Here is a not-so-complete list of the robot’s materials and where to get them.

There are 7 steps to build this robot.

  1. Preparing the mechanical robot parts
  2. Preparing the RaspberryPi 2
  3. Preparing connection between RaspberryPi and Dynamixel AX12 motor
  4. Controlling Dynamixel AX12 using PyDynamixel Library
  5. Interfacing a wireless joystick and RaspberryPi
  6. Intuitive motor control using a wireless joystick
  7. Video streaming via PiCamera

Step 1 – preparing the mechanical robot parts

In OARK, mechanical robot parts include the chassis of the mobile base, the chassis of the arm, and the wheels. All of these parts are 3D printed.

[Image: the 3D printed parts]

There are 4 main stages in preparing the mechanical robot parts for OARK:

  1. Designing the parts
  2. 3D printing the parts
  3. Post-processing the parts
  4. Assembling all of the mechanical parts

Here is each stage in detail.

1. Designing

We may design the robot ourselves, or just use the open source design of OARK (Raymond’s Thingiverse page). The good thing about using an open source design is that we do not have to start designing from scratch; we can build on the previous results of other robot enthusiasts. If we do modify the design, it is good practice to share it back to the community via a website such as Thingiverse. This is the very spirit of the maker movement.

2. 3D printing

Using a 3D printer, in my experience, is not straightforward. I use a somewhat old Solidoodle 3 printer in our robotics laboratory. Here is the overall guide to using this printer.

Some takeaways from our experience are:

  • Putting the 3D printer in an area with a fairly strong air-conditioning breeze can cause problems, as the plastic may cool too quickly.
  • The position of the plastic roll may also create problems. When you print a large object, the filament tightens and loosens quickly, which in my experience can ruin the print.
  • You will be in trouble if a print takes more than 4 hours. Most of our prints were successful when they finished in less than 4 hours. Instead of printing 2 wheels at once (which may take 7–8 hours), it is better to print 1 wheel at a time. I also once broke the glass plate of the 3D printer during a long print.
  • Before printing, always apply a layer of acetone to the base plate. Applying another layer of acetone after the first several layers may also be useful, especially if you print a large object that takes a long time.
  • If the object is too big to be printed in one piece, it can be divided using software (e.g. SketchUp) and the pieces glued together afterwards (e.g. using superglue).
  • Be careful when picking the finished object off the base plate. It is easier, and less harmful, to remove it once the base is cold. Use a tool to pry it off if necessary.

3. Post-processing the parts

A 3D printer does not always give a perfect result. In that case we must give the part a final touch to make it completely ready. We can use a slotted (flat-blade) screwdriver to remove the support material from the object. A file can be used to smooth some sections, and a drill can be used to make holes. A slurry of acetone and ABS scraps can be used to fill the small gaps between the material layers (see here for other ideas).

4. Assembling all the mechanical parts

When all the parts are ready, the assembly process is quite easy and straightforward. We use screws (and washers) from the Robotis Bioloid Kit, namely S3, S4, and S5, which are similar to normal screws of size M2 x 8 mm, M2 x 10 mm, and M2 x 12 mm. Here are the assembly stages:

  • Assemble the base
    • Attach the RaspberryPi on the base
    • Attach the motors on the base
    • Attach the wheels on the motors
  • Assemble the arm
    • Attach the motors and the arm’s chassis
  • Connect the base and the arm

Step 2 – Preparing the RaspberryPi 2

In this step, we will configure the RaspberryPi 2 (shortened to Pi from here on) so it is ready to use as our robot’s controller. Here is a glance at the Pi and its components from its official website.

[Image: Pi diagram] source: https://www.raspberrypi.org/blog/new-quick-start-guide/

Installing the Operating System for Pi

If you are a beginner with the Pi, you may prefer to use NOOBS (New Out Of the Box Software), an easy operating system installer that contains Raspbian (a Debian-based Linux OS optimised for the Pi). The easiest way is to buy a pre-installed NOOBS SD card. You can also use an empty micro SD card: download NOOBS and follow the guide to install it on your SD card. The official Pi website contains very good materials to get started.

The network configuration for Pi

The next important step is network configuration for the Pi, both wired and wireless. Follow this useful guide by Simon Monk.

It is also useful to be able to connect to the Pi remotely, so you do not have to access it with an external monitor, keyboard, and mouse; instead you can just use your laptop. The 2 most popular ways to connect to the Pi remotely are SSH (Secure SHell) and VNC (Virtual Network Computing). Personally, I prefer VNC as it gives us the Pi’s full graphical desktop environment, while SSH only gives us the command line.

Access Pi locally using an ethernet cable

Whenever you have a wired or Wi-Fi network, you can always connect remotely to your Pi. But what if you have neither? I experienced this situation at the RoboCup 2015 competition in Hefei, China. In this competition, Wi-Fi networks were forbidden and it was not always possible to connect via a wired connection. It was also not possible to get a monitor just to access the Pi. It turns out that the ability to connect to the Pi without a network (and without an external monitor) is precious.

A tutorial on connecting locally to the Pi using an ethernet cable can be found here. I tried it, without success. After discussing with a friend, I was able to do it with the following configuration. Firstly, we need to give the Pi a static IP on the wired network. On your terminal, edit this file:

sudo nano /etc/network/interfaces

by adding this configuration (for example)

iface eth0 inet static
address 192.168.2.10
netmask 255.255.255.0
gateway 192.168.2.1

Then reboot your Pi.

Secondly, we configure the computer that will access the Pi locally. We must also give its wired network a static IP. This means we cannot access the internet via the wired network, as it only connects to the Pi; we can still access the internet via the wireless network. In your wired network settings (IPv4 section), change the method from DHCP to manual, and use this configuration:

address : 192.168.2.2
netmask : 255.255.255.0
gateway : 0.0.0.0

Once you are connected to Pi, you can access it via VNC or SSH.
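
Once both sides have static addresses, it helps to verify the wired link before launching VNC or SSH. Below is a small sketch using only the Python standard library; the address 192.168.2.10 is the static IP given to the Pi in the example above.

```python
import socket

def pi_reachable(host, port=22, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 22 is the SSH port; use 5900 instead if you only run a VNC server.
    print("Pi reachable:", pi_reachable("192.168.2.10"))
```

If this prints False, check the cable and the static IP configuration on both machines before blaming VNC or SSH.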

Step 3 – Preparing connection between RaspberryPi and Dynamixel AX12 motor

The motor used in OARK is the Dynamixel AX-12 from Robotis. We use a USB2Dynamixel as the interface between the Pi and the AX-12 motors via the Pi’s USB port. The motors are then connected to each other in a daisy chain configuration. The official manual of this device can be found here.

[Image: USB2Dynamixel usage] source: http://support.robotis.com/en/product/auxdevice/interface/usb2dxl_manual.htm

To connect to the AX-12, we must set the USB2Dynamixel to TTL communication mode (number 1) and use its 3P connector. See the figures from the manual below.

[Images: USB2Dynamixel 3P connector and mode-select switch] source: http://support.robotis.com/en/product/auxdevice/interface/usb2dxl_manual.htm

The 3P connection of the USB2Dynamixel should be prepared carefully. There are 3 pins: GND, not connected, and DATA (see the 3-pin cable table below).

[Image: 3-pin cable table] source: http://support.robotis.com/en/product/auxdevice/interface/usb2dxl_manual.htm

If we want to power the motors, we must connect pin 2 to + and pin 1 to – of an external 12 V supply (e.g. a battery).

[Image: powering the 3P connector] source: http://support.robotis.com/en/product/auxdevice/interface/usb2dxl_manual.htm

To do this, I took a 3-pin cable from a Bioloid kit (Robotis), cut two of its wires, and soldered them to cables that go to the + and – terminals of a battery. See my “untidy” customised cable connection below. If the connection is correct, the AX-12 motor will flash a red light once it is connected. A more proper connection can be made by adding a power switch and connecting the USB2Dynamixel to the battery while the switch is off.

[Image: customised USB2Dynamixel cable]

Then we can connect the USB port of the USB2Dynamixel to a USB port of the Pi. The 3P connector is connected to a series of AX-12 servo motors (in a daisy chain configuration).

Step 4 – Controlling Dynamixel AX12 using PyDynamixel Library

This section assumes that you are familiar with Python programming. If not, you can start learning it from here.

To control the Dynamixel AX-12 motor from the Pi using Python, one can use the existing PyDynamixel library created by Ian Danforth. Firstly, we have to check the prerequisites for this library.

Afterwards, we can install PyDynamixel by entering this command in the terminal.

sudo pip install dynamixel

After the library installation succeeds, make example.py executable with this command.

chmod +x example.py

Then execute it, with the USB2Dynamixel connected to the 12 V supply and one or more Dynamixel AX-12 motors, using this command.

python example.py

If we answer its questions accordingly, we will get the motors’ IDs as a result and the motors will move to their neutral position. Here is a snippet of useful code from example.py for controlling the motors:

# Set up the servos
for actuator in net.get_dynamixels():
    # set the speed of the motor
    actuator.moving_speed = 50
    # set the desired goal position
    actuator.goal_position = 512
    # torque control settings
    actuator.torque_enable = True
    actuator.torque_limit = 800
    actuator.max_torque = 800

# Send all the commands to the servos at once
net.synchronize()

Try to move each motor individually, then some of them together, to get the robot’s basic movements (forward, backward, turn left, and turn right). Another thing to note is that a value of 0–1023 moves the motor in one direction, while 1024–2047 moves it in the reverse direction.
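
The direction encoding above can be wrapped in a tiny helper so the rest of the program can think in signed speeds. This is a sketch of my own, not part of example.py; the helper itself is plain Python.

```python
def ax12_speed(signed_speed):
    """Convert a signed speed in [-1023, 1023] to the AX-12 register value.

    The AX-12 treats 0-1023 as one rotation direction and 1024-2047 as the
    reverse, with the low 10 bits giving the speed magnitude.
    """
    magnitude = min(abs(int(signed_speed)), 1023)
    return magnitude if signed_speed >= 0 else 1024 + magnitude
```

For example, `actuator.moving_speed = ax12_speed(-200)` commands reversed rotation at magnitude 200; on a base whose left and right motors are mirror-mounted, one side will need the negated sign.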

Step 5 – Interfacing a wireless joystick and RaspberryPi

The next step is interfacing a wireless joystick, the Logitech F710, with the Pi. There are some excellent tutorials (with code examples) by Eric Goebelbecker on this. It is very good to follow the tutorials in order to gain a deeper understanding. Although the tutorials are written for the GoPiGo robot, they are still relevant to the OARK robot as it also uses a Pi as its controller.

[Image: Emu Mini 2 with the joystick]

Read this tutorial : Tutorial 1: Reading the Device

In this tutorial, we learn how to read general input from the joystick as a device connected to the Pi’s USB port. It then guides us to install and use the “evdev” package to interpret (only) the gamepad’s buttons, and provides a simple code example.

Read this tutorial : Tutorial 2: controlling the robot

Next, we learn to control the robot using the buttons, including performing basic movements (forward, backward, turn left, turn right, stop) and changing the motor speeds. The GoPiGo robot is driven by DC motors connected to the Pi via its GPIO port, while the OARK robot uses smart servo motors (Dynamixel AX-12) connected to the Pi’s USB port via a USB2Dynamixel converter. This means we have to adapt the program before we can use it to control the AX-12 motors (see again OARK – Step 4). Having said that, we can still use the general structure of the program in this tutorial.
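
One way to do the adaptation is to keep the tutorial’s evdev event loop but replace the GoPiGo motor calls with a lookup from button names to per-side wheel speeds. The button names and speed values below are purely illustrative (they are not taken from the tutorial), and positive means forward here:

```python
# Hypothetical mapping from gamepad buttons to (left, right) signed wheel
# speeds for the 4 base motors; the names and values are illustrative only.
BUTTON_TO_SPEEDS = {
    "dpad_up":    ( 300,  300),   # forward
    "dpad_down":  (-300, -300),   # backward
    "dpad_left":  (-300,  300),   # spin left
    "dpad_right": ( 300, -300),   # spin right
    "button_b":   (   0,    0),   # stop
}

def speeds_for_button(name):
    """Return (left, right) wheel speeds for a button, or None if unmapped."""
    return BUTTON_TO_SPEEDS.get(name)
```

Inside the event loop, a matched button would then set `moving_speed` on each base servo (using the encoding from Step 4) and call `net.synchronize()`.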

Read this tutorial : Tutorial 3

The last tutorial is a bit less relevant to our project; however, it helps with accessing the joystick’s analog inputs, while the previous tutorials only utilise the joystick’s buttons.

Step 6 – Intuitive motor control using a wireless joystick

Intuitive control of mobile base

In the previous step, I only used simple control of the mobile base with the Logitech F710 gamepad. For example, when I press the “Forward” or “Backward” button, the robot moves forward or backward at a constant speed. Similarly, if I press the “Left” or “Right” button, the robot spins left or right at a constant speed. If I want the robot to turn right by 90 degrees, I press “Right” and release it manually when the robot seems to have reached it.

The control above is not intuitive for a human and not helpful when an operator must drive a rescue robot over cluttered terrain. A more useful control scheme for a mobile robot uses just 1 stick of the gamepad to control both direction and speed. The idea is that when I push the stick fully forward, I want the robot to move forward at full speed; when I push the stick gently, I want the robot to move a bit slower. The idea, and roughly how to implement it, can be found here.
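
A common way to implement this is so-called “arcade” mixing: treat the stick’s vertical axis as throttle and its horizontal axis as turn, then blend them into left and right wheel speeds. The sketch below assumes the axes are already normalised to [-1.0, 1.0]; it is one possible mapping, not necessarily the exact code used here.

```python
def arcade_mix(throttle, turn, max_speed=1023):
    """Blend one analog stick into (left, right) signed wheel speeds.

    throttle, turn: floats in [-1.0, 1.0]. Full forward stick gives full
    speed on both wheels; a gentle push gives a proportionally slower move.
    """
    left = throttle + turn
    right = throttle - turn
    # Rescale so neither side exceeds 1.0 when both inputs are large.
    biggest = max(1.0, abs(left), abs(right))
    return int(left / biggest * max_speed), int(right / biggest * max_speed)
```

The signed results can then be converted to AX-12 register values (see Step 4) before being written to the motors.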


Intuitive control of arm

To control the arm, we can use the gamepad’s buttons to control each motor of the arm. But, again, this is not intuitive. The arm has a camera at its end, so the camera should give the operator a sensible picture (by keeping the camera upright with respect to the ground) even as the arm moves freely. This can be achieved with a simple math equation. Although it is possible to use the gamepad’s second stick, it is more comfortable to use one button to raise the arm and another to lower it.
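
The “simple math equation” amounts to making the joint angles cancel: if the shoulder is tilted by θ1 and the elbow adds θ2 (both measured in the same rotational direction, relative to the previous link), the camera joint must take -(θ1 + θ2) to stay level. A minimal sketch under those assumptions:

```python
def wrist_compensation(shoulder_deg, elbow_deg):
    """Angle for the camera joint that keeps the camera level.

    Assumes each joint angle is measured relative to the previous link,
    all in the same rotational direction, so tilts simply add along the arm.
    """
    return -(shoulder_deg + elbow_deg)
```

To send this to the servo, the degree value still has to be converted to the AX-12 position range (0–1023 over roughly 300 degrees, centred at 512).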

Step 7 – video streaming via PiCamera

In a rescue area, it is very common for a robot to be teleoperated by a human operator. This is because a rescue area is very cluttered and unstructured, so it is very difficult for a robot to perform autonomously. Another reason is that there may be high-risk situations that are better evaluated by humans.

We can use PiCamera to perform video streaming so we can control the robot just by looking at the laptop.

1. General

To get to know the PiCamera, check the basics here [1], [2].

2. Display streaming video from Pi to computer

There is a good resource here (http://zacharybears.com/low-latency-raspberry-pi-video-streaming/). After following the tutorial exactly, it did not work for me. Then I tried some suggestions in the comments section, which fixed the problem. FYI, I use Ubuntu 14.04.

Here is how I fixed it. First, I installed mplayer2 and netcat.

Computer:

Firstly, we have to allow the incoming connection on the computer:

sudo /sbin/iptables -I INPUT 1 -p tcp --dport 5000 -j ACCEPT

Then we may start the video player:

nc.traditional -l -p 5000 | mplayer -noautosub -nosub -noass -nosound -framedrop -nocorrect-pts -nocache -fps 20 -demuxer h264es -vo x11 -

RaspberryPi:

Here are the setup commands (only done once):

mkfifo fifo.500

cat fifo.500 | nc.traditional <your ip> 5000 &

This is the main command, with horizontal and vertical flip (-vf -hf):

raspivid -v -w 640 -h 480 -b 10000000 -fps 20 -p '600,200,640,480' -t 0 -vf -hf -o - | nc.traditional 192.94.172.76 5000 &

Note :

  • Make sure the IP address belongs to the laptop that you use for VNC
  • To stop the background process: if starting it printed a process ID such as [2508], run kill 2508, then press Ctrl+C

I will put the whole code for this robot on GitHub soon.
