Big Wheel Bot – Gardening Robot

I have been working on a project for the last few months that I am referring to as the ‘Big Wheel Bot’. This robot is based around some 250mm diameter wheels that I purchased, and the project has evolved as it’s gone along. Initially I was planning to build a balancing robot, but this changed into a differential drive robot with a rear castor once I had a purpose in mind for it. That purpose was to help in the garden. I wanted to go a step further than a lawn mower robot: I wanted something that could navigate autonomously and check on the various plants in the garden. I am working towards that goal.

I have written a post on letsmakerobots, so I won’t repeat it all here, but here is the link.

I have also made a new series of videos showing the process of designing, building and programming the robot. I am still working on this project and I have got as far as producing the first occupancy grid map with the robot.
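For anyone curious what an occupancy grid actually is, it is just a 2D array of cells that accumulate evidence about whether each patch of floor is occupied or free. The sketch below is a minimal log-odds version of the idea; the cell size, grid extent and update weights are illustrative assumptions, not the values used on the robot.

```python
import numpy as np

CELL_SIZE = 0.05   # metres per cell (assumed)
GRID_SIZE = 200    # 200 x 200 cells, i.e. a 10m x 10m map (assumed)

# Log-odds occupancy grid; 0 means unknown
grid = np.zeros((GRID_SIZE, GRID_SIZE))

def world_to_cell(x, y):
    """Convert world coordinates in metres to grid indices (origin at centre)."""
    return (int(round(x / CELL_SIZE)) + GRID_SIZE // 2,
            int(round(y / CELL_SIZE)) + GRID_SIZE // 2)

def update_cell(x, y, occupied, hit=0.9, miss=-0.4):
    """Accumulate log-odds evidence for a single sensed point."""
    i, j = world_to_cell(x, y)
    if 0 <= i < GRID_SIZE and 0 <= j < GRID_SIZE:
        grid[i, j] += hit if occupied else miss

def is_occupied(x, y, threshold=2.0):
    """Treat a cell as occupied once enough positive evidence has built up."""
    i, j = world_to_cell(x, y)
    return grid[i, j] > threshold
```

Each sensor reading nudges a cell's value up or down, so a single noisy sonar ping never flips the map on its own; the cell only reads as occupied after repeated agreement.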

All the code for this project can be found here:

More Sensors! Sonar and MPU6050 module

I have added some sensing to the RC robot in the form of some sonar sensors and an MPU6050 IMU module. This project was always heading away from being a purely RC robot towards an automated mobile robot platform, and the addition of sensing is one of the steps in this process. Adding and interfacing the sensors was quite straightforward, and Part 11 of my Youtube series takes you through the process.

For anyone interested, here is the updated Arduino circuit I am now using.

I have also moved the HC05 bluetooth module from interfacing with the Arduino to being connected to the onboard Raspberry Pi. This was partly to see if it would work, but also because I want to be able to control more functions from the transmitter, and it seemed to make sense to have the Pi get the data from the transmitter and send it on to the Arduino. Time will tell if this is a good solution.

The next steps will be to improve the serial communications between the Arduino and Raspberry Pi as I’m not happy with how that is working at the moment. Then I want to log sensor data to a file, along with images captured from the webcam, as the robot is being driven around. I will then use this data to work on some mapping/SLAM solutions I would like to try out.
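As a sketch of the logging step, something like the following could sit on the Pi, parsing each telemetry line from the Arduino and appending it to a file. The line format here ("S," followed by three sonar ranges and roll/pitch/yaw) is purely hypothetical; the real protocol isn't settled yet.

```python
import time

def parse_sensor_line(line):
    """Parse one hypothetical CSV telemetry line, e.g. "S,34,120,87,1.2,-0.4,93.0"
    (three sonar ranges in cm, then roll/pitch/yaw in degrees).
    Returns None for malformed lines so a corrupted serial read never
    crashes the logger."""
    parts = line.strip().split(',')
    if len(parts) != 7 or parts[0] != 'S':
        return None
    try:
        values = [float(p) for p in parts[1:]]
    except ValueError:
        return None
    return {'sonar': values[0:3], 'rpy': values[3:6]}

def log_reading(logfile, reading):
    """Append a timestamped reading to a plain-text log file."""
    logfile.write('%.3f,%s,%s\n' % (time.time(), reading['sonar'], reading['rpy']))
```

In use, a loop would read lines over the serial link with pyserial (e.g. `serial.Serial('/dev/ttyACM0', 115200)` — port and baud rate are assumptions), call `parse_sensor_line` on each, and grab a webcam frame alongside every logged reading.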

Capturing video using a Raspberry Pi, OpenCV and Python

I decided it was time to add some sensing to the RC robot, the first of which was some vision in the form of a webcam connected to a Raspberry Pi. I had a Raspberry Pi 2 at hand so that is what I am currently using, along with a standard webcam. The aim to start with was to enable the capture of video as the robot drives around under remote control. Ultimately I plan to use the camera, along with other sensors, to automate the robot, but I wanted to start simple and build from there. To keep it simple I decided to make a small circuit with a button to start/stop recording and an RGB LED to indicate whether the Pi was recording video or not. I also 3D printed a simple mount for the camera. These components were attached to the Raspberry Pi case, resulting in a compact assembly that could be attached to the robot. One other component was required: an additional switch, mounted to the side of the case, that would allow the Raspberry Pi to be shut down when pressed.

Combined with a battery pack or some other form of power this would make quite a nice stand-alone project, maybe as a dashcam or any other device that needs to capture video. In my case I will be using power from the 24V batteries on the RC robot, via a UBEC connected to the GPIO pins.

The next job was to write a Python script that would start and stop video capture at the push of the button and store this video for later use. I used OpenCV to capture images from the webcam and store them as a video, with each video’s file name created from a time stamp. I also added the LED functionality so that the LED is green when ready to begin recording and red when recording. The last part of the code shuts down the Pi when the shutdown button is pressed, after flashing the LED a few times to indicate that the button press has registered. I set this script up to run on start-up of the Pi. The full code is shown below.

import time
import os
import numpy as np
import cv2
import RPi.GPIO as GPIO

print("Starting...")

GPIO.setmode(GPIO.BCM)
GPIO.setup(22, GPIO.OUT) #Red LED
GPIO.setup(27, GPIO.OUT) #Green LED
GPIO.setup(17, GPIO.IN, pull_up_down=GPIO.PUD_UP) #Button input for recording
GPIO.setup(21, GPIO.IN, pull_up_down=GPIO.PUD_UP) #Power off button

recording = False

print("Starting OpenCV")
capture = cv2.VideoCapture(0)

imagewidth = 640
imageheight = 480
capture.set(3, imagewidth)  #CAP_PROP_FRAME_WIDTH
capture.set(4, imageheight) #CAP_PROP_FRAME_HEIGHT

# Define the codec and create the VideoWriter object
fourcc = cv2.VideoWriter_fourcc(*'XVID')


def CaptureSaveFrame():
    ret, img = capture.read()
    ret, img = capture.read() #Get a couple of frames so the returned frame is the most recent
    return img

def LEDGreen(): #Ready to record
    GPIO.output(27, GPIO.HIGH)
    GPIO.output(22, GPIO.LOW)

def LEDRed(): #Recording
    GPIO.output(22, GPIO.HIGH)
    GPIO.output(27, GPIO.LOW)

def LEDOff():
    GPIO.output(22, GPIO.LOW)
    GPIO.output(27, GPIO.LOW)

def CreateFile():
    timestr = time.strftime("%Y%m%d-%H%M%S")
    print(timestr)
    out = cv2.VideoWriter('/home/pi/Video_Recorder/' + timestr + '.avi',
                          fourcc, 5.0, (imagewidth, imageheight))
    return out

def Shutdown(channel):
    print("Shutting Down")
    #Flash the LED to show the button press has registered
    for i in range(3):
        LEDRed()
        time.sleep(0.2)
        LEDOff()
        time.sleep(0.2)
    os.system("sudo shutdown -h now")

GPIO.add_event_detect(21, GPIO.FALLING, callback=Shutdown, bouncetime=2000)


while True:

    input_state = GPIO.input(17)
    if input_state == False:
        recording = not recording #Toggle bool on button press
        time.sleep(1) #Debounce
        if recording:
            out = CreateFile()

    if recording:
        LEDRed()
        out.write(CaptureSaveFrame())
    else:
        LEDGreen()
Part 10 of my Youtube video series shows the robot in action and capturing video as it drives around.

This set-up works great and I have already started using the captured video with OpenCV to see how I can get the robot driving around autonomously using the visual input. I will also be adding some sonar sensors to the robot for obstacle detection/avoidance as I don’t want to rely on the visual input alone to avoid crashes! I also intend to reconfigure the robot control so that the Raspberry Pi is the master of the system and the Arduino is the slave, taking commands from the Raspberry Pi. That’s it for now, thanks for taking the time to read this and I’ll be back soon with more updates to the project.

RC Robot

I have been busy over the last couple of months with a new project. Due to my lack of imagination I am calling it the RC Robot. The aim of the project was to build a sturdy, reliable mobile robot platform to use for future development. I am a bit of a purist and don’t really consider a remote-controlled vehicle a robot. However, to start with I wanted to make the robot remote controlled, as I thought this would be a fun place to begin. I have every intention of developing this project and adding autonomy at a later date. I also vowed to use some of the many parts that I have accumulated over my years of robot building, basing the drive system around some 12-24V brushed gear motors left over from a previous project. I also wanted to document the build in the form of a series of Youtube videos.

I kicked off the project by designing the drive assembly. As mentioned, the motor/gearboxes were in my parts box and are 12-24V Como Drills units, geared at 30:1. These were still a bit quick for my needs so I designed and built a simple gearbox, complete with bearings to support the output shaft. I initially attempted to design and build a belt drive system, but for various reasons abandoned this in favour of gears. I decided to use SLA batteries for this project, and two 12V, 2.1Ah batteries in series give a good solid 24V to work with. Check out the video below for a breakdown of the design and initial testing.



The next step of the project was to add some electronics to drive the two motors. An L298 based motor driver was ideal for the job. Add in some HC-05 bluetooth modules and an Arduino or two and I had myself a way to remotely control the speeds of the motors using a joystick. Part 2 below shows the development and testing of the electronics system along with circuit diagrams.



Once confident that the drive assemblies and electronics were up to the task, the next job was to design and fabricate the robot chassis itself. Some 2mm aluminium sheet served as a sturdy chassis plate for the robot and a strong castor was selected to serve as the 3rd wheel. My initial design for the base plate and the mounting for the castor was disappointing and not particularly aesthetically pleasing. After 3D printing a fancy castor mount I was much happier with the look of the robot. Part 3 of the video series covers the chassis build process and the first tentative test drive before the electronics were properly mounted.



The next job was to take the electronics from the prototype breadboard to a more permanent stripboard circuit, ready for mounting to the robot. All of the required electronics were mounted to the top plate of the robot and the final wiring completed, ready for the first proper test run! It was a sunny day and I had a fun hour test driving the robot in the sunshine. Part 4 shows the results of the test drive and an appraisal of the robot’s performance.



During the initial test run I found that the robot was a bit of a handful to control. I had the controls set up for a skid steer type arrangement, with the raw joystick values being sent to the robot and converted into motor speeds with very little additional processing. Whilst great fun and a good challenge, I wanted a bit more control of the robot when manually driving it. I decided that encoders would help the situation by allowing for some closed loop control of motor speeds. I knocked up some homemade incremental encoders to allow the motor speeds to be measured and set about adding these to the robot. Part 5 is a more tutorial type video to show you how I added encoders to the robot.
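The closed-loop idea is straightforward: measure each wheel's speed from its encoder counts and trim the motor PWM until it matches the joystick's target. In practice this runs on the Arduino; the Python sketch below just illustrates the logic, and the gains and counts-per-revolution figure are made-up numbers rather than the robot's actual values.

```python
def encoder_speed(delta_counts, dt, counts_per_rev=360):
    """Wheel speed in revolutions per second, from the change in encoder
    count over a dt-second interval (counts_per_rev is an assumed figure)."""
    return delta_counts / counts_per_rev / dt

class PISpeedController:
    """Simple proportional-integral loop producing a PWM value (0-255)
    for one motor channel. Gains are illustrative, not tuned values."""
    def __init__(self, kp=60.0, ki=25.0):
        self.kp, self.ki = kp, ki
        self.integral = 0.0

    def update(self, target_rps, measured_rps, dt):
        error = target_rps - measured_rps
        self.integral += error * dt          # accumulate steady-state error
        pwm = self.kp * error + self.ki * self.integral
        return max(0.0, min(255.0, pwm))     # clamp to the 8-bit PWM range
```

The integral term is what makes the two wheels hold matched speeds over bumps and battery sag, which is exactly the controllability problem the raw joystick values couldn't solve.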



And this brings us right up to date. I have promised myself to update this blog a bit more often, particularly when I have a new video to share. Stay tuned as I have just finished work on the controller and will have a new video to share very soon.

If anyone reading this would like more information on any of my robots, please feel free to leave a comment, either here or on the Youtube video and I will always do my best to help.

wxPython GUI

Part 5 of my video series following the development of a desktop robot head was uploaded a couple of weeks ago. The video covers more progress on the robot head project, including constructing a new circuit board for the Arduino Nano to replace the prototype breadboard circuit. This video shows the GUI, built using wxPython, that can now control the robot. OpenCV images and matplotlib plots have been embedded in the GUI and some initial image processing and robot modelling functionality is working.



I am now thinking carefully about where I go next with this project. What I really want to try next is coming up with a way for the robot to identify objects of interest in the environment and log them, adding them to some kind of map/plot. From there the robot can try to find these objects again, using the camera to locate itself within the world. This isn’t a new idea but I am not sure how to progress yet. OpenCV has built-in functions that can identify good features to track and several algorithms to match these points to what a camera is seeing. However, these are quite abstract points; corners, edges etc. I would like the robot to be able to pick out objects from the environment that a human could also identify. To do this I think there will need to be a training step, where a person looks at an image and tells the robot that an object is present. Then I can use something like template matching to identify the object in the future. In theory, as this is a static robot, the angle and distance to objects shouldn’t vary too much and this technique may work. It’s something I want to try, and I will be sure to let you know the outcome.

The next question is; What next? I always reach this stage with all of my robot projects. I really enjoy designing and building robots, and it’s a rewarding experience when the robot comes to life and starts moving around. But I am the first to admit that my creations are somewhat useless. As a learning experience and a fun hobby, they are a worthwhile endeavour, but they are never created with an end goal in mind.  Maybe this is something to address in my next robot project!


Latest robot project – A desktop social robot

I have been working on a new robot project for the last few months. I like desktop robots and, having read about social robots like Jibo and Buddy, I decided I would like to try to create a low cost version of one of these that can sit next to my computer and maybe interact with me and my children. I wanted to try designing some more complex components for 3D printing and thought that this project would be a good opportunity. I decided to give FreeCAD a shot as it looked like a promising open source 3D design package. After negotiating the expected learning curve I was impressed with the range of functionality that FreeCAD has and I was able to design some cool looking parts for my new robot. The idea was to design a desktop robot with a display and a camera integrated into the head, using servos to pan and tilt the head to enable the robot to look around the room. As is often the case, the project evolved as it progressed, and the robot ended up with an extra servo that enables the head to roll as well. I wanted to incorporate an accelerometer into the head to track its position and went with a cheap MPU6050. For the display I used an Adafruit 2.2″ TFT screen that I have used in previous projects. I also mounted in the head a webcam that I stole from my mobile robot. Finally, I used this project as an excuse to learn how to use KiCad for PCB design, designing a shield for the Arduino Mega to interface all of the sensors and servos.

Below is a picture of the finished robot.

Social robot

I was particularly pleased with the lettering on the bottom servo housing, easy to do with FreeCAD. I was also pleased with the bracket that connects the roll and tilt servos, as shown below.

Roll to tilt head bracket

The accelerometer mounts to the head and, using this library, is able to spit out roll, pitch and yaw angles.

MPU6050 mounted to head

After some calibration I have been able to control the servo positions using the accelerometer readings very reliably. A simple control loop for each servo is all that is required to position the head at any combination of roll, pitch and yaw angles.
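The per-servo loop is essentially proportional control: compare the angle the MPU6050 reports with the target and nudge the servo command in the direction that reduces the error. A toy version of one iteration is below; the gain and the pulse-width limits are illustrative, not the calibrated values on the robot.

```python
def servo_step(pulse_us, target_deg, measured_deg, gain=3.0):
    """One control iteration for a single axis: adjust the servo pulse width
    towards the target angle, clamped to a typical 1000-2000 microsecond
    servo range. Called repeatedly, it walks the head to the target pose."""
    error = target_deg - measured_deg
    return max(1000.0, min(2000.0, pulse_us + gain * error))
```

Running one of these loops per servo, each fed the relevant roll, pitch or yaw angle from the IMU, lets the head settle at any commanded combination of the three angles.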

I have a lot of work to do on the software, but I wanted to share the project now that the mechanical and electronic build is complete. I was going to use the Raspberry Pi for this project but have decided to use my desktop computer for now; I may switch to the Pi at a later date. Previously I was driving the TFT screen from the Pi but I am now writing to the screen from the Arduino. The screen will display a face, probably just simple shapes for now, to allow the robot to display some emotions.

I am also planning to design a robot arm at some point, and I would like this robot to be able to control the robot arm. I am thinking of possibly having a few modular parts, like arms or other sensors, that can work together. I am not sure how this will happen at the moment but it’s fun to think about.

Come back soon as I hope to have a video of the head moving around and controlling its position up here when it’s working.



New battery and some design revisions. BFR4WD MK2

Work was progressing with BFR4WD and I was experimenting with software for object/landmark recognition as well as developing BFR-Code for controlling the robot. However, I started having problems with the old Ni-MH battery pack I was using to power the servos. It wasn’t holding its charge for long and the servos were constantly struggling to move the robot. Even when it was working well, this battery pack had sometimes struggled to deliver the current required when all the wheel servos were working hard. I decided it was time for a new battery. This led to a succession of design revisions, and many parts of BFR4WD have been redesigned and rebuilt.

Let’s start with the choice of a new battery. I looked into the various options available and settled on a 6V SLA battery to power all the servos. SLA batteries are cheap, easy to charge and able to supply a lot of current if required. The downside is that they are heavy! During the original design of BFR4WD I made the decision to gear the servos to increase the top speed of the robot at the expense of torque at the wheels. This was fine with the previous battery, but the additional weight of the SLA battery meant this gear ratio would no longer be suitable. I decided to redesign the drive train for each wheel so that the required torque could be delivered. This meant designing some more 3D printed gears with a ratio that increases torque at the expense of speed. I ended up with a 20 tooth gear attached to the servo and a 24 tooth gear on the wheel axle.
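The trade-off is just the tooth ratio: a 20-tooth gear on the servo driving a 24-tooth gear on the axle multiplies wheel torque by 24/20 = 1.2 and divides wheel speed by the same factor. A quick sanity check (the servo speed here is an example figure, not a measurement):

```python
servo_teeth, wheel_teeth = 20, 24
ratio = wheel_teeth / servo_teeth      # 1.2:1 reduction from servo to wheel

servo_speed_rpm = 60.0                 # example servo output speed (assumed)
wheel_speed_rpm = servo_speed_rpm / ratio   # speed drops by the ratio
torque_multiplier = ratio              # wheel torque is 1.2x the servo torque
```

The original design ran the ratio the other way (speed up, torque down), so swapping which gear is larger is all it takes to trade top speed for the extra torque the heavier battery demands.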

I used this redesign as an opportunity to review the design of the wheel encoders as well. Whilst the previous design worked well, the encoders were mounted in a place that meant most of the robot had to be dismantled in order to adjust, repair or replace them. Looking into various options, I settled on sticking with an optical encoder, but instead of a slotted disk I would use a printed pattern. This design involves printing the encoder disk pattern onto transparency film using a laser printer. A slotted opto-interrupter looks at this disk and switches depending on whether a printed line is present or not. The final design ended up with the patterned disk attached to the servo gear, meaning I could also do away with the third gear/slotted disk of the previous design. This has the advantages of fewer parts to print, fewer holes to drill and fewer moving parts. It all adds up to a simpler design which is lighter and runs more quietly. The picture below shows an encoder disk mounted to the servo gear.

Encoder disk

With the alterations to the gear ratio and the new design for the encoders I was also able to increase the encoder resolution from 180 pulses per revolution to 218.

A redesign of the encoder circuit was also required and I decided to use Sharp GP1S093HCZ0F opto-interrupters. These devices are very small and allow a very compact encoder to be built. A test on a breadboard allowed me to find the correct resistor values to use (10K and 220 Ohm) and test that the device could read the encoder disks as required. The pictures below show the breadboard version and the final encoder circuit which was built on stripboard.

Encoder circuit on breadboard

Encoder circuit

To allow the new design drive train to fit in the aluminium box section I’m using to house it, I had to space the servos off from the box section by 3mm. I designed and printed a spacer that would hold both of the servos required for one side of the robot chassis.

Servo Spacer

A 3D printed bracket was designed that would mount the encoder circuits in the correct position and allow for some adjustment. These encoders are mounted in the centre of the chassis rail as opposed to the ends as in the previous design. This configuration makes them a lot easier to adjust. Shown below is the finished encoder arrangement mounted in the chassis.

Mounted encoder

This encoder design and gear configuration also meant that I could reduce the overall size of the robot. This design is now 40mm shorter in length but 10mm wider. This has the advantage of setting the wheels closer together along the length of the robot but spacing them further apart across the width of the robot. This helps when the robot is turning on the spot.

I had to fabricate all new chassis parts to fit the new design, one of which is shown below.

Chassis rail

The picture below shows one of the chassis rails with the servos, gears and encoders mounted. I reused the end joining pieces from the previous design. I also changed the bushes that the axles run in from a push fit design to ones that glue into place.


Due to reducing the overall size of the robot and the addition of a larger battery, space on the chassis to mount everything had decreased significantly. I was forced to start building upwards! I designed some 3D printed stand-offs that would allow me to mount another sheet of HDPE above the first, as shown below.

3D printed stand-offs

This meant that the two batteries could sit on the bottom layer, which allowed good access to connect and disconnect power to them. The Raspberry Pi, Arduino and USB hub could go on the top layer, again giving good access to connect to them. I also added a compass module to the robot: a CMPS03 that I have had lying around for a long time. I designed and printed a mount for the compass and used a length of aluminium box section to raise the compass into the air. In my experience, the further the compass is from any other components the better, as it reduces interference.

The picture below shows the completed robot. So many things have changed with this redesign that I think it will have to be referred to as BFR4WD MK2. Initial testing confirms that I have much more torque at the wheels due to the change in gear ratio and the better current delivery of the new battery. The encoders are working perfectly and the increased resolution means more accurate positioning and speed control. The compass works and will be used for navigation. I am very keen to try some more map building using a mobile robot, with the addition of data from visual images.


