Support Vector Machine

Note: this post is meant to help clarify tutorial question 2 for COMP 9417 – Week 9, School of Computer Science and Engineering, UNSW (s1 – 2017).


The Support Vector Machine (SVM) is essentially an approach to learning linear classifiers that maximises the margin. Here is a picture, inspired by Flach (Fig. 7.6–7.7), that shows the difference between the decision boundary produced by an SVM and those produced by other linear classifiers (such as linear regression or the perceptron).

To achieve this, the SVM uses the objective function below: it looks for the values of alpha_1, ..., alpha_n that maximise the function.
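The formula itself is not reproduced in this excerpt; for reference, here is a sketch of the standard SVM dual objective that the description above refers to (my reconstruction following the usual textbook presentation, not a copy of the original figure):

```latex
% Standard SVM dual objective (a reconstruction, not copied from the original post).
% Maximise over alpha_1, ..., alpha_n:
\max_{\alpha_1,\dots,\alpha_n} \quad
  \sum_{i=1}^{n} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j \, y_i y_j \, \mathbf{x}_i \cdot \mathbf{x}_j
\qquad \text{subject to} \quad
  \alpha_i \ge 0 \;\; (i = 1,\dots,n), \qquad \sum_{i=1}^{n} \alpha_i y_i = 0.
```

The weight vector of the maximum-margin classifier is then w = sum_i alpha_i y_i x_i, and the training examples with alpha_i > 0 are the support vectors.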

Continue reading

Algorithm Independent Aspects of Machine Learning

Note: this post is meant to help clarify tutorial questions 1, 3, 4, and 5 for COMP 9417 – Week 11, School of Computer Science and Engineering, UNSW (s1 – 2017).

Regardless of the machine learning algorithm we choose, we may still want to know the following (slide – page 6):

  • How many training examples does a learner need before it converges to a correct hypothesis? (sample complexity; see the bound sketched after this list)
  • How large is the hypothesis space? How complex is a learner’s hypothesis? (hypothesis complexity)
  • How many mistakes is a learner allowed to make before it finally converges to a successful hypothesis? (mistake bounds)
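As a concrete example of the first point, a commonly quoted result for a consistent learner over a finite hypothesis space H (the standard PAC-style bound, e.g. in Mitchell's textbook; the tutorial's exact statement may differ) says that the number of training examples m should satisfy:

```latex
% Sample complexity for a consistent learner with a finite hypothesis space H:
% with probability at least 1 - \delta, every hypothesis consistent with the
% training examples has true error at most \epsilon, provided
m \;\ge\; \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right).
```

Notice that m grows only logarithmically in the size of the hypothesis space.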

Let’s take a look at the first aspect in more detail. Continue reading

How smart is a robot?

Recently, there has been a lot of hype about smart robots that will replace humans in many areas. These robots usually have a certain degree of artificial intelligence. But are most of these robots truly intelligent? What is intelligence, anyway? This post aims to give a simple explanation of intelligent robots from an artificial intelligence (AI) perspective.

An example of “an intelligent robot”

Consider the example of a robot that competes in a firefighting robot contest. In this competition, the robot must explore a maze, with many obstacles, to find a fire and extinguish it. Although the robot might seem intelligent, it is very limited. For instance, what if the source of the fire is not inside the maze, but outside of it? What if the robot does not see the fire, but sees a lot of smoke? Suddenly, the robot does not seem so smart anymore once its environment, or its task, is changed.

Continue reading

Getting started with Baxter Simulator

To learn robotics properly, one needs a robot to play with. Building a robot is costly and not simple, so a simulated robot can be used in place of a physical one. Most research robots currently use the Robot Operating System (ROS), because it is flexible, modular, and open source. Mastering ROS is a must for robotics researchers and enthusiasts nowadays. The Baxter robot, created by Rethink Robotics, is a good medium for learning ROS and robotics.

Although Baxter is a great robot with unique features, it is quite expensive. For learning purposes, it can be replaced with the Baxter simulator. Previously, the simulator was only available to organisations that had bought a Baxter, and it did not have any grippers attached to the arms. Since the end of 2015 the simulator has been updated (one of the biggest changes is the addition of the grippers), and it is now open source, so everyone can use it. This post will guide you through learning to use the Baxter simulator in a more systematic way.

Why Baxter simulator?

Here are several reasons to use the Baxter simulator:

  • It is a fully functioning, open-source simulator.
  • Baxter is ROS-ready. It provides a user-friendly Python API that wraps the ROS interfaces in Python classes. The Baxter simulator is built on Gazebo, another open-source program.
  • While one can learn about ROS by using the TurtleBot simulator, it is only a mobile robot. Baxter, on the other hand, has two manipulators, so one can learn to control the arms using simple joint control, inverse kinematics (IK) solving, or more complex trajectory planning (see the sketch after this list). Many other complex extensions can be built on top of Baxter.
  • The Baxter robot is well documented. This post just fills a small gap in that excellent documentation.
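To give a taste of that Python API, below is a minimal joint-control sketch. It assumes the simulator is already running, and it is based on my recollection of the baxter_interface SDK (the node name and the 0.5 rad target are arbitrary examples), so check the official Baxter SDK documentation for the exact calls.

```python
# Minimal joint-control sketch for Baxter (simulator or real robot).
# Assumes ROS and the Baxter SDK are set up and the simulation is running.
import rospy
import baxter_interface

rospy.init_node('simple_joint_control_example')  # arbitrary node name

# Enable the robot before sending any commands.
baxter_interface.RobotEnable().enable()

# Interface to the right arm; print its current joint angles.
limb = baxter_interface.Limb('right')
print(limb.joint_angles())

# Move one joint (the wrist roll, 'right_w2') to a new angle while
# keeping the other joints at their current positions.
target = limb.joint_angles()
target['right_w2'] = 0.5  # radians, arbitrary example value
limb.move_to_joint_positions(target)
```

The same Limb class also exposes endpoint poses, which is a natural stepping stone towards the IK and trajectory-planning exercises mentioned above.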

Continue reading

Building a 3D Printed Mini Rescue Robot (based on OARK)

Most robot kits available on the market, such as LEGO Mindstorms, fischertechnik, and Bioloid, are limited in flexibility and durability. There may be no available parts that suit a particular need. These robots are also unable to traverse the irregular and cluttered terrain that is common, for example, in rescue areas. The Open Academic Robot Kit (OARK) is an open-source robot kit that exploits the advent of 3D printing technology. It was created by Dr. Raymond Sheh (Curtin University) to lower the barrier for everyone who wants to enter the robotics research field, particularly rescue robotics. The robot also has a nickname, Emu Mini 2, as it is a mini version of the Emu, an autonomous rescue robot at the School of Computer Science and Engineering, UNSW. The official webpage of OARK is http://oarkit.intelligentrobots.org/.

Here is a compilation of the Emu Mini 2, which I demonstrated in a mini rescue arena at RoboCup 2015 in Hefei, China.

I will describe the main components of the robot that we built here at CSE – UNSW based on OARK:

  • 7 Dynamixel AX-12 servo motors from Robotis
    • 4 motors drive the mobile base and the other 3 move the arm
  • USB2Dynamixel from Robotis
    • It is used to connect the motors to the Raspberry Pi via its USB port (rather than its GPIO pins, as is more usual)
    • An OpenCM 9.04 board can also be used; see here
  • Raspberry Pi 2 and a micro SD card
    • Although the first version can be used, the Raspberry Pi 2 has a much faster processor and 4 USB ports, which is useful in our system
  • Wi-Pi dongle from Element 14
    • A Wi-Fi dongle for the Raspberry Pi
  • Raspberry Pi Camera and a long camera cable (50 cm)
    • A long cable is needed because the camera is located on the articulated arm
  • Wireless Gamepad Logitech F710
    • A PS3 DualShock controller can also be used; see here
  • Rechargeable LiPo battery (output: 12 V, 1750 mA)
    • To power the servo motors
  • Power bank (output: 5 V, 1 A), or a voltage regulator to convert 12 V (from the battery) to 5 V
    • To power the Pi

Here is a (not quite complete) list of the robot’s materials and where to get them.

There are 7 steps to building this robot.

  1. Preparing the mechanical robot parts
  2. Preparing the Raspberry Pi 2
  3. Preparing the connection between the Raspberry Pi and the Dynamixel AX-12 motors
  4. Controlling the Dynamixel AX-12 motors using the PyDynamixel library
  5. Interfacing a wireless joystick with the Raspberry Pi (see the sketch after this list)
  6. Intuitive motor control using a wireless joystick
  7. Video streaming via the PiCamera
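For steps 5 and 6, here is a rough sketch of the basic idea: read the gamepad and mix the left stick into left/right wheel speeds. The original post does not say which libraries it uses, so pygame and the set_wheel_speeds() helper below are my own placeholders, not the actual OARK code; the helper would be replaced by calls into whatever Dynamixel wrapper step 4 gives you.

```python
# Sketch: read a gamepad with pygame and map the left stick to differential-drive
# wheel speeds. pygame and set_wheel_speeds() are placeholders, not the OARK code.
import pygame

def set_wheel_speeds(left, right):
    # Hypothetical stand-in: replace with calls to your Dynamixel wrapper (step 4)
    # that drive the left and right wheel motors.
    print('left={:+.2f} right={:+.2f}'.format(left, right))

pygame.init()
pygame.joystick.init()
joystick = pygame.joystick.Joystick(0)  # the F710 receiver usually shows up as device 0
joystick.init()

clock = pygame.time.Clock()
try:
    while True:
        pygame.event.pump()              # refresh the joystick state
        forward = -joystick.get_axis(1)  # left stick vertical (pushing up reads negative)
        turn = joystick.get_axis(0)      # left stick horizontal
        # Simple "arcade drive" mixing, clamped to [-1, 1].
        left = max(-1.0, min(1.0, forward + turn))
        right = max(-1.0, min(1.0, forward - turn))
        set_wheel_speeds(left, right)
        clock.tick(20)                   # ~20 Hz control loop
except KeyboardInterrupt:
    pygame.quit()
```

The clamping keeps the mixed commands in a normalised [-1, 1] range, which can then be scaled to whatever speed units the motor wrapper expects.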

Continue reading

Edison Robot – a great way to learn the basics of robotics!

Robots are interesting for everyone! However, learning the basics of robotics is complicated and relatively expensive. You may use, for example, LEGO Mindstorms, which provides a wide range of mechanical blocks, several sensors and motors, a quite capable controller, and user-friendly graphical programming software. The only problem is its price: in Australia, its normal price is around 500 AUD. You may also create your own robot using an off-the-shelf robotics kit and a popular microcontroller like the Arduino. That will be cheaper, but it may put off children without previous experience in programming and electronics, as you have to use a command-line programming environment and build some (simple) electronic circuits. So you have a choice in learning robotics: simple but expensive, or cheap but somewhat complex.

This dilemma inspired Microbric, an Australian company, to create the Edison robot (http://meetedison.com/). Their vision is to share electronics, robotics, and programming with as many people as possible. They came up with the smart design of a compact and cheap robot that enables everyone (not just the rich or the geeks) to learn robotics.

These are the features of the Edison robot that I really like:

  • Cheap
    • It is only 65 AUD each, or 49 AUD each if you buy 10 at once. This means that for the price of 1 LEGO Mindstorms set, you can buy 10 Edison robots.
  • Built-in programs that can be activated by driving over barcodes
    • Children do not have to program anything to see what the Edison robot can do. This encourages them to explore what kind of cool things they can do with the robot.
  • Wide range of sensors
    • The Edison robot provides light sensors, infra-red sensors (for avoiding obstacles and receiving signals from a universal remote controller), a line sensor, and a sound sensor.
  • Graphical programming environment
    • It follows the graphical programming style of LEGO Mindstorms, so it is easier to understand for children who have never programmed before.
  • Compatibility with LEGO blocks
    • It is compatible with LEGO blocks, so children can decorate their robots with LEGO blocks and figures! (See my version of the EdDigger robot below.)
  • Open source (software and learning materials)
    • Its software is open source and free. Microbric also provides free books and robotics lessons which are very helpful for students (and teachers) who want to learn with this robot.

My decorated EdDigger robot

Of course, the Edison robot is not perfect. Its motors are not very strong or accurate; it is hard just to make it drive in a straight line. Its sensors are also a bit limited and have fixed locations (unlike LEGO Mindstorms, where you can put the sensors anywhere you like). It also does not come with a complete set of mechanical blocks like LEGO Mindstorms does.

Despite the above limitations, in my personal opinion the Edison robot is still very appealing for children who want to begin learning robotics but have only a limited budget. It is also a good way for them to try robotics first and see whether they really like robots, before jumping to an expensive kit like LEGO Mindstorms or dealing with the more complex system of an Arduino-based robot.