Big Wheel Bot – Gardening Robot

I have been working on a project for the last few months that I am referring to as the ‘Big Wheel Bot’. This robot is based around some 250mm diameter wheels that I purchased, and the project has evolved as it has gone along. Initially I was planning to build a balancing robot, but this changed into a differential drive robot with a rear castor because I had a purpose in mind for it. That purpose was to help in the garden. I wanted to go a step further than a lawn mower robot: something that could navigate autonomously and check on the various plants in the garden. I am working towards that goal.

I have written a post on letsmakerobots, so I won’t repeat it all here, but here is the link: https://www.robotshop.com/community/robots/show/big-wheel-bot

I have also made a new series of videos showing the process of designing, building and programming the robot. I am still working on this project and I have got as far as producing the first occupancy grid map with the robot.

All the code for this project can be found here: https://github.com/BigFace83/Big-Wheel-Bot

More Sensors! Sonar and MPU6050 module

I have added some sensing to the RC robot in the form of some sonar sensors and an MPU6050 IMU module. This project was always heading away from being a purely RC robot towards an automated mobile robot platform, and the addition of sensing is one of the steps in this process. Adding and interfacing the sensors was quite straightforward and Part 11 of my Youtube series takes you through the process.

For anyone interested, here is the updated Arduino circuit I am now using.

I have also moved the HC-05 Bluetooth module from interfacing with the Arduino to being connected to the onboard Raspberry Pi. I did this partly to see if it would work, but also because I want to be able to control more functions from the transmitter, and it seemed to make sense to have the Pi get the data from the transmitter and send it on to the Arduino. Time will tell if this is a good solution.
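For anyone curious what the relay looks like on the Pi side, here is a minimal Python sketch using pyserial. The device paths and the idea of passing lines straight through are assumptions for illustration, not my exact code.

import serial

#Assumed device paths: HC-05 bound to an RFCOMM port, Arduino on USB serial
bt = serial.Serial('/dev/rfcomm0', 9600, timeout=1)      #data from the transmitter
arduino = serial.Serial('/dev/ttyUSB0', 9600, timeout=1) #link to the Arduino

while True:
    line = bt.readline() #e.g. "X12Y-200" from the controller
    if line:
        #The Pi can act on some commands itself here and forward the rest
        arduino.write(line)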

The next steps will be to improve the serial communications between the Arduino and Raspberry Pi as I’m not happy with how that is working at the moment. Then I want to log sensor data to a file, along with images captured from the webcam, as the robot is being driven around. I will then use this data to work on some mapping/SLAM solutions I would like to try out.
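As a rough idea of the kind of logging I have in mind, the sketch below pairs each webcam frame with a line of sensor data under a shared timestamp. The file names and the sensor line format are assumptions for illustration.

import time
import cv2
import serial

capture = cv2.VideoCapture(0)
arduino = serial.Serial('/dev/ttyUSB0', 9600, timeout=1) #assumed port name

with open('sensor_log.csv', 'w') as log:
    while True:
        stamp = time.strftime("%Y%m%d-%H%M%S")
        sensors = arduino.readline().decode(errors='ignore').strip() #e.g. sonar/IMU values
        ret, img = capture.read()
        if ret and sensors:
            cv2.imwrite(stamp + '.jpg', img)        #image named by timestamp
            log.write(stamp + ',' + sensors + '\n') #matching sensor row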

Capturing video using a Raspberry Pi, OpenCV and Python

I decided it was time to add some sensing to the RC robot. The first step was to add some vision in the form of a webcam connected to a Raspberry Pi. I had a Raspberry Pi 2 at hand so that is what I am currently using, along with a standard webcam. The aim to start with was to enable the capture of video as the robot drives around under remote control. Ultimately I plan to use the camera, along with other sensors, to automate the robot, but I wanted to start simple and build from there. To keep it simple I decided to make a small circuit with a button to start/stop recording and an RGB LED to indicate whether the Pi was recording video or not. I also 3D printed a simple mount for the camera. These components were attached to the Raspberry Pi case, resulting in a compact assembly that could be attached to the robot. One other component was required: an additional switch, mounted to the side of the case, that allows the Raspberry Pi to be shut down when pressed.

Combined with a battery pack or some other form of power this would make quite a nice stand-alone project, maybe as a dashcam or any other device that needs to capture video. In my case I will be using power from the 24V batteries on the RC robot, via a UBEC connected to the GPIO pins.

The next job was to write a Python script that would start and stop video capture at the push of the button and store the video for later use. I used OpenCV to capture images from the webcam and save them as a video, with each file named using a timestamp. I also added the LED functionality so that the LED is green when ready to begin recording and red when recording. The last part of the code shuts down the Pi when the shutdown button is pressed, after flashing the LED a few times to indicate that the press has registered. I set this script to run on start-up of the Pi. The full code is shown below.

import time
import os
import numpy as np
import cv2
import RPi.GPIO as GPIO

print "Starting..."

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(22,GPIO.OUT) #Red LED
GPIO.setup(27,GPIO.OUT) #Green LED
GPIO.setup(17, GPIO.IN, pull_up_down=GPIO.PUD_UP) #Button Input for recording
GPIO.setup(21, GPIO.IN, pull_up_down=GPIO.PUD_UP) #Power off button

recording = False

print "Starting OpenCV"
capture = cv2.VideoCapture(0)

imagewidth = 640
imageheight = 480
capture.set(3,imagewidth) #1024 640 1280 800 384
capture.set(4,imageheight) #600 480 960 600 288

# Define the codec and create VideoWriter object
fourcc = cv2.VideoWriter_fourcc(*'XVID')

cv2.waitKey(50)

def CaptureSaveFrame(outfile):
    ret,img = capture.read()
    ret,img = capture.read() #get a few frames to make sure current frame is the most recent
    outfile.write(img)
    cv2.waitKey(1)
    return img

def LEDGreen(): #The RGB LED is wired active-low: LOW turns an element on
    GPIO.output(22,GPIO.HIGH) #Red off
    GPIO.output(27,GPIO.LOW)  #Green on

def LEDRed():
    GPIO.output(22,GPIO.LOW)  #Red on
    GPIO.output(27,GPIO.HIGH) #Green off

def LEDOff():
    GPIO.output(22,GPIO.HIGH) #Both off
    GPIO.output(27,GPIO.HIGH)

def CreateFile():
    timestr = time.strftime("%Y%m%d-%H%M%S")
    print(timestr)
    out = cv2.VideoWriter('/home/pi/Video_Recorder/'+ timestr +'.avi',fourcc, 5.0, (imagewidth,imageheight ))
    return out

def Shutdown(channel):
    print("Shutting Down")
    LEDOff()
    time.sleep(0.5)
    LEDGreen()
    time.sleep(0.5)
    LEDOff()
    time.sleep(0.5)
    LEDGreen()
    time.sleep(0.5)
    LEDOff()
    time.sleep(0.5)
    LEDRed()
    os.system("sudo shutdown -h now")

GPIO.add_event_detect(21, GPIO.FALLING, callback=Shutdown, bouncetime=2000)

LEDGreen()

while True:

    input_state = GPIO.input(17)
    if input_state == False:
        recording = not recording #Toggle bool on button press
        time.sleep(1) #Debounce
        if recording:
            LEDRed()
            out = CreateFile()
        else:
            LEDGreen()
            out.release() #Finalise the video file when recording stops

    if recording:
        CaptureSaveFrame(out)

Part 10 of my Youtube video series shows the robot in action and capturing video as it drives around.

This set-up works great and I have already started using the video and OpenCV to see how I can get the robot driving around autonomously from the video input. I will also be adding some additional sonar sensors to the robot for obstacle detection/avoidance as I don’t want to rely on the visual input alone to avoid crashes! I also intend to reconfigure the robot control so that the Raspberry Pi is the master of the system and the Arduino is the slave, taking commands from the Raspberry Pi. That’s it for now, thanks for taking the time to read this and I’ll be back soon with more updates to the project.

3D Printed worm drive gearboxes

With the aim of slowing the robot down a bit and combating the issue that the robot would not stop straight away when the joystick was released, I started looking at alternative gearbox options. The one option I knew would resolve both of these issues and give a very compact gearbox was a worm drive. Worm drives enable high reduction ratios and are very difficult to back-drive, meaning that when the motor stops turning, the wheel stops very quickly and won’t run on.
I had a look around online and knew that it was possible to 3D print a worm drive gearbox but I wasn’t sure how practical or long lasting these would be. I decided to give it a shot and set about designing and printing a prototype. Part 8 of the RC Robot video series shows the design, build and testing of the gearbox.


The gears were generated using this OpenSCAD generator: https://github.com/chrisspen/gears. I modified the gears to include hubs for attaching them to a shaft. I also designed and printed a custom housing for the gearbox, complete with bearings to support the drive shaft and one end of the worm gear. The worm gear was tricky to print well and I ended up printing it in one piece, stood on its end. I had to print it very slowly and still the print was not perfect, but it was good enough. I had a few failures of the worm gear early on and had to go through a few design iterations to add strength where it was needed to get a functional part. I also had to adjust the gear spacing a couple of times, once by modifying the gearbox housing and a second time by altering the size of the spur gear. I found that if the gears were meshed too tightly it put too much force on the worm gear and caused damage; if meshed too loosely, the backlash in the gearbox was excessive.

The video shows the assembly and testing of the gearbox and it works really well. It’s a bit noisy but some lubrication helped a lot. I am hoping that as the gearbox is used, the gears will wear in slightly and it will operate more smoothly and quietly. How well the gears wear over the long term will need to be gauged as the gearbox is used.

I went on to build a second gearbox that needed to be a mirror of the first so that I had a gearbox for each side of the robot. At this stage I decided that I would strip the RC Robot down and build a new chassis to mount the worm drive gearboxes to and make a few more improvements along the way. Part 9 of the video series shows the rebuilt robot and details the design changes.


I’m really pleased with this version of the RC Robot platform. It drives nicely and steadily with plenty of torque. The original spur gear gearboxes provided a reduction of around 3:1; the worm drive gearboxes give a reduction of 7:1 in a very compact unit. This slows the wheels down considerably and gives a good amount of torque at the wheels. I have tested the robot quite a bit now and it is easy to control and stops immediately when the joystick is released.

I now have a nice sturdy platform to work with and I will be continuing this project by adding more functionality to the robot. My aim has always been to automate this robot, even though I am calling it the RC Robot (I may need to rename it at some point in the future). The first step will be the addition of some more sensors so come back soon to check on the progress.

Homemade RC Controller and PID wheel control

With the RC robot test drive completed, it was time to make a more permanent solution for the hand-held controller. I designed and built a controller with two analogue control sticks and a TFT screen, powered by rechargeable NiMH batteries. Inside there is an Arduino Nano with an HC-05 module for Bluetooth communications to the robot. I used some expanded PVC sheet along with 3D printed parts to make a case. Part 6 of the RC Robot video series shows the build of the controller.

I was really pleased with how the controller turned out. It works really well and fits in the hands nicely.

With the controller build completed, I turned my attention to the software for controlling the robot’s wheel speeds. Initially I just had the wheel speeds controlled proportionally to the joystick positions. This worked OK but I wanted to implement closed loop speed control with feedback from the incremental encoders. I also wanted to be able to control the robot using only one of the analogue joysticks, which turned out to be trickier than I had first thought.
Part 7 of the video series covers the PID control and converting the control to using only one analogue joystick, along with some fun testing of the robot in the garden.
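The speed control itself runs on the Arduino, but the idea is easy to show in a few lines of Python. This is a minimal sketch of the PID update step, with placeholder gains rather than the values I tuned for the robot.

class SpeedPID:
    #Minimal PID for wheel speed control from encoder feedback
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured, dt):
        error = target - measured #speed error from the encoder measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = SpeedPID(kp=1.0, ki=0.5, kd=0.05) #placeholder gains, not my tuned values
#Each control cycle: measure the wheel speed from encoder counts over dt,
#then nudge the motor PWM by the PID output:
#pwm += pid.update(target_speed, measured_speed, dt)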

I really struggled to work out how to control the two wheel speeds and the turning using just a single analogue joystick, until I found a great explanation here: http://home.kendra.com/mauser/Joystick.html
This page explains the theory, and the equations that pop out at the end enable the wheel speeds and directions to be calculated from the single analogue joystick input.
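For reference, here is the essence of those equations as a minimal Python sketch, with the joystick axes normalised to -1..1. This is my reading of the page linked above rather than a copy of my Arduino code.

def joystick_to_tank(x, y):
    #x is turn (-1 left to 1 right), y is throttle (-1 back to 1 forward)
    v = (1.0 - abs(x)) * y + y   #forward/backward component
    w = (1.0 - abs(y)) * x + x   #rotation component
    left = (v + w) / 2.0
    right = (v - w) / 2.0
    return left, right           #each in -1..1, scale to PWM as needed

print(joystick_to_tank(0.0, 1.0)) #(1.0, 1.0) straight ahead
print(joystick_to_tank(1.0, 0.0)) #(1.0, -1.0) spin on the spot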

After some more testing of the robot with the improved control software, it became clear that the robot had a few design issues. The main one was that the robot was a real handful to control accurately. It is great fun to drive, but I need to be aware that I am making a robot platform, not an RC car. One issue was that the robot was simply a bit too quick. This was easy enough to remedy by limiting the PWM output to the motor driver to cap the top speed. I also noticed that when the joystick was released, the robot had a tendency to continue rolling for a bit due to its momentum. Often this didn’t matter too much, but sometimes one wheel would continue while the other didn’t, putting the robot off course. I turned on motor braking on the motor driver when the joystick was centred and this helped a bit, but didn’t cure the problem.

Therefore I had some decisions to make about the next steps of the project. I will go into more details in my next blog.

EDIT: I have been asked to share the design and code for the controller, so below is the circuit I am using.

[Circuit diagram: Nano_HC_05_bb]

I have also been asked to share the code. Other than writing to the TFT screen, the code is pretty straightforward. The joystick positions are read using an analogue read and the data is then formatted into a string to be sent via the serial port. I am using a software serial port to send data through the HC-05 module as this keeps the main serial port free for debugging. I wanted to keep the data sent from the transmitter as simple as possible; the work of decoding and using the data is performed by whatever receives it.

#include "SPI.h"
#include "Adafruit_GFX.h"
#include "Adafruit_ILI9340.h"

#include <SoftwareSerial.h>
SoftwareSerial BTSerial(8, 9); // RX, TX (SoftwareSerial takes the RX pin first)

#if defined(__SAM3X8E__)
    #undef __FlashStringHelper::F(string_literal)
    #define F(string_literal) string_literal
#endif

// These are the pins used for the UNO
// for Due/Mega/Leonardo use the hardware SPI pins (which are different)
#define _sclk 13
#define _miso 12
#define _mosi 11
#define _cs 7
#define _dc 5
#define _rst 6

#define XCENTRE 506
#define YCENTRE 528

Adafruit_ILI9340 tft = Adafruit_ILI9340(_cs, _dc, _rst);

const int LeftButton = A2;     // left joystick pushbutton pin
const int RightButton = A5;    // right joystick pushbutton pin

String X = "X";
String Y = "Y";

const int LeftXin = A1;  // Analog input pin for left joystick X
const int LeftYin = A0;  // Analog input pin for left joystick Y
const int RightXin = A6;  // Analog input pin for right joystick X
const int RightYin = A7;  // Analog input pin for right joystick Y

int prevLXDisplay = 0;
int prevLYDisplay = 0;
int prevRXDisplay = 0;
int prevRYDisplay = 0;

void setup() {

  tft.begin();
  delay(300);
  tft.setRotation(3);
  tft.fillScreen(ILI9340_BLACK);
  delay(300);
  tft.setCursor(20, 60);
  tft.setTextColor(ILI9340_BLUE);  tft.setTextSize(6);
  tft.println("BIG FACE");
  tft.setCursor(20, 120);
  tft.println("ROBOTICS");
  delay(500);

  Serial.begin(9600);
  BTSerial.begin(9600); //Bluetooth software serial
  
  pinMode(LeftButton, INPUT_PULLUP);
  pinMode(RightButton, INPUT_PULLUP);
 

  while(digitalRead(RightButton) == HIGH){ //Wait right here until right joystick button is pressed
  }
  tft.fillScreen(ILI9340_BLACK);
}

void loop(void) {

  tft.fillCircle(prevLXDisplay, prevLYDisplay, 10, ILI9340_BLACK);
  tft.fillCircle(prevRXDisplay, prevRYDisplay, 10, ILI9340_BLACK);
  drawGuides();
  
  int LXValue = analogRead(LeftXin);
  int LXDisplay = map(LXValue, 1023, 0, 20, 140);
  int LYValue = analogRead(LeftYin);
  int LYDisplay = map(LYValue, 0, 1023, 60, 180);
  
  int RXValue = analogRead(RightXin);
  int RXDisplay = map(RXValue, 1023, 0, 180, 300);
  int RYValue = analogRead(RightYin);
  int RYDisplay = map(RYValue, 0, 1023, 60, 180);

  tft.fillCircle(LXDisplay, LYDisplay, 10, ILI9340_RED);
  tft.fillCircle(RXDisplay, RYDisplay, 10, ILI9340_RED);
  prevLXDisplay = LXDisplay;
  prevLYDisplay = LYDisplay;
  prevRXDisplay = RXDisplay;
  prevRYDisplay = RYDisplay;

  int XValue = (XCENTRE-RXValue)/2;
  if (XValue < -255){
    XValue = -255;}
  if (XValue > 255){
    XValue = 255;}
  
  int YValue = (YCENTRE-RYValue)/2;
  if (YValue < -255){
    YValue = -255;}
  if (YValue > 255){
    YValue = 255;}
 

  // print the results to the serial monitor:
  String XString = X + XValue;
  String YString = Y + YValue; 
  Serial.print(XString);
  Serial.println(YString);

  BTSerial.print(XString);
  BTSerial.println(YString);


  delay(100);


}

void drawGuides(){
  //tft.drawLine(x1, y1, x2, y2, color);

  int LeftCentX = 80;
  int LeftCentY = 120;
  int RightCentX = 240;
  int RightCentY = 120;
  
  tft.drawLine(LeftCentX, LeftCentY, LeftCentX-60, LeftCentY, ILI9340_WHITE);
  tft.drawLine(LeftCentX, LeftCentY, LeftCentX+60, LeftCentY, ILI9340_WHITE);
  tft.drawLine(LeftCentX, LeftCentY, LeftCentX, LeftCentY-60, ILI9340_WHITE);
  tft.drawLine(LeftCentX, LeftCentY, LeftCentX, LeftCentY+60, ILI9340_WHITE);

  tft.drawLine(RightCentX, RightCentY, RightCentX-60, RightCentY, ILI9340_WHITE);
  tft.drawLine(RightCentX, RightCentY, RightCentX+60, RightCentY, ILI9340_WHITE);
  tft.drawLine(RightCentX, RightCentY, RightCentX, RightCentY-60, ILI9340_WHITE);
  tft.drawLine(RightCentX, RightCentY, RightCentX, RightCentY+60, ILI9340_WHITE);
  
}

RC Robot

I have been busy over the last couple of months with a new project. Due to my lack of imagination I am calling it the RC Robot. The aim of the project was to build a sturdy, reliable mobile robot platform to use for future development. I am a bit of a purist and don’t really consider a remote controlled vehicle a robot. However, to start with I wanted to make the robot remote controlled, as I thought this would be a good fun place to begin. I have every intention of developing this project and adding autonomy at a later date. I also vowed to use some of the many parts that I have accumulated over the years of robot building, basing the drive system around some 12-24V brushed gear motors, of which I have a number from a previous project. I also wanted to document the build in the form of a series of Youtube videos.

I kicked off the project by designing the drive assembly. As mentioned, the motor/gearboxes were in my parts box; they are 12-24V Como Drills units, geared at 30:1. These were still a bit quick for my needs so I designed and built a simple gearbox, complete with bearings to support the output shaft. I initially attempted to design and build a belt drive system, but for various reasons abandoned this in favour of gears. I decided to use SLA batteries for this project; two 12V, 2.1Ah batteries in series give a good solid 24V to work with. Check out the video below for a breakdown of the design and initial testing.


The next step of the project was to add some electronics to drive the two motors. An L298-based motor driver was ideal for the job. Add in some HC-05 Bluetooth modules and an Arduino or two, and I had a way to remotely control the speeds of the motors using a joystick. Part 2 below shows the development and testing of the electronics system along with circuit diagrams.


Once confident that the drive assemblies and electronics were up to the task, the next job was to design and fabricate the robot chassis itself. Some 2mm aluminium sheet served as a sturdy chassis plate for the robot and a strong castor was selected to serve as the third wheel. My initial design for the base plate and the castor mounting was disappointing and not particularly aesthetically pleasing; after 3D printing a fancy castor mount I was much happier with the look of the robot. Part 3 of the video series covers the chassis build process and the first tentative test drive before the electronics were properly mounted.


The next job was to take the electronics from the prototype breadboard to a more permanent stripboard circuit, ready for mounting to the robot. All of the required electronics were mounted to the top plate of the robot and the final wiring completed, ready for the first proper test run!! It was a sunny day and I had a fun hour test driving the robot in the sunshine. Part 4 shows the results of the test drive and an appraisal of the robot’s performance.


During the initial test run I found that the robot was a bit of a handful to control. I had the controls set up for a skid steer type arrangement, with the raw joystick values being sent to the robot and converted into motor speeds with very little additional processing. Whilst great fun and a good challenge, I wanted a bit more control of the robot when manually driving it. I decided that encoders would help the situation by allowing for some closed loop control of the motor speeds. I knocked up some homemade incremental encoders to allow the motor speeds to be measured and set about adding them to the robot. Part 5 is a more tutorial-style video that shows how I added encoders to the robot.


And this brings us right up to date. I have promised myself to update this blog a bit more often, particularly when I have a new video to share. Stay tuned as I have just finished work on the controller and will have a new video to share very soon.

If anyone reading this would like more information on any of my robots, please feel free to leave a comment, either here or on the Youtube video and I will always do my best to help.

Population coding neural network

This was an interesting weekend project that I completed a few months ago. I challenged myself to write some slightly different code for my robot head that would lead to an interesting visual representation of the head position. Partly inspired by real neural networks in the brain, I used population coding across a large number of neurons to represent the head pan and tilt position. I was able to extend this further to actually control the head position. Check out the video below for a more in-depth description.
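To give a flavour of the idea, below is a small Python sketch of population coding for a single axis: each neuron has a preferred angle and a Gaussian tuning curve, and the position is decoded as the activity-weighted average of the preferred angles. The numbers here are illustrative, not the ones used in the video.

import numpy as np

n = 100
preferred = np.linspace(-90, 90, n) #each neuron's preferred pan angle in degrees
sigma = 10.0                        #width of each tuning curve

def encode(angle):
    #Population activity for a given head angle (Gaussian tuning curves)
    return np.exp(-0.5 * ((angle - preferred) / sigma) ** 2)

def decode(activity):
    #Read the angle back out as the activity-weighted mean of preferences
    return np.sum(activity * preferred) / np.sum(activity)

print(decode(encode(30.0))) #approximately 30.0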


Template matching and tracking

I have been playing around recently with template matching and locking on to a target with the Robot Head MK 2. Parts 6 and 7 of the video series about this robot are available on Youtube.


After a training session where the user manually identifies and names a template, the robot can now match the template in the current image from the camera. The head then moves to centre the detected template in the robot’s field of view. The 3D position of the object is then calculated using the robot model and the reading from the sonar sensor. I have got as far as plotting these positions in a 3D matplotlib plot.
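For anyone who hasn’t used it, the matching step itself is only a few lines of OpenCV. This sketch finds the best match position and the pixel offset from the image centre that the head then tries to drive to zero; the file names are placeholders.

import cv2

frame = cv2.imread('frame.jpg')       #current camera image (placeholder file)
template = cv2.imread('template.jpg') #template saved during the training session

result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(result)

th, tw = template.shape[:2]
cx = max_loc[0] + tw // 2             #centre of the best matching region
cy = max_loc[1] + th // 2
err_x = cx - frame.shape[1] // 2      #pan error in pixels
err_y = cy - frame.shape[0] // 2      #tilt error in pixels
#Move the head a step proportional to (err_x, err_y) to centre the target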

It was at this point that I noticed some problems. It’s well known that sonar sensors have a wide beam angle, and this is particularly apparent when the robot is looking at something far away. The issue manifests itself as objects being detected as closer than they actually are, because the wide sonar beam picks up objects to either side of the target that are closer than it. I could combat this with a different, probably more expensive type of sensor, but I am going to try a different approach.

As explained in Part 7 above, I don’t really need a 3D model to show the robot’s head position, as this can be represented in pan/tilt coordinates. Whilst I have learned quite a bit from playing with template matching, it’s not the best method for matching scenes that the robot sees: it doesn’t work at different scales and it’s susceptible to false detections. I am going to try something different and, I’ll be honest, I’m not sure what just yet! I have always been interested in how mapping works in the brain, and I think what I am trying to achieve is similar to RatSLAM. On with some more reading and research, and I will be back with an update again soon.


wxPython GUI

Part 5 of my video series following the development of a desktop robot head was uploaded a couple of weeks ago. The video covers more progress on the robot head project, including constructing a new circuit board for the Arduino Nano to replace the prototype breadboard circuit. It also shows the GUI built using wxPython that can now control the robot. OpenCV images and matplotlib plots have been embedded in the GUI, and some initial image processing and robot modelling functionality is working.
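The main trick to embedding OpenCV images in wxPython is converting the image from BGR to RGB and wrapping the buffer in a wx.Bitmap. A minimal sketch of a panel that does this might look like the following (layout and sizing details omitted).

import cv2
import wx

class CameraPanel(wx.Panel):
    def __init__(self, parent):
        super(CameraPanel, self).__init__(parent)
        self.bitmap = None
        self.Bind(wx.EVT_PAINT, self.on_paint)

    def show_frame(self, bgr):
        rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB) #OpenCV uses BGR, wx expects RGB
        h, w = rgb.shape[:2]
        self.bitmap = wx.Bitmap.FromBuffer(w, h, rgb)
        self.Refresh() #trigger a repaint with the new frame

    def on_paint(self, event):
        if self.bitmap:
            wx.PaintDC(self).DrawBitmap(self.bitmap, 0, 0)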


I am now thinking carefully about where I go next with this project. What I really wanted to try next was coming up with a way for the robot to identify objects of interest in the environment and log these, adding them to some kind of map/plot. From here the robot can then try to find these objects again, using the camera to locate itself within the world. This isn’t a new idea but I am not sure how to progress yet. OpenCV has built-in functions that can identify good features to track and several algorithms to match these points to what a camera is seeing. However, these are quite abstract points: corners, edges etc. I would like the robot to be able to pick out objects from the environment that a human could also identify. To do this I think there will need to be a training step, where a person looks at an image and tells the robot that an object is present. Then I can use something like template matching to identify the object in the future. In theory, as this is a static robot, the angle and distance to objects shouldn’t vary too much and this technique may work. It’s something I want to try, and I will be sure to let you know the outcome.

The next question is: what next? I always reach this stage with all of my robot projects. I really enjoy designing and building robots, and it’s a rewarding experience when the robot comes to life and starts moving around. But I am the first to admit that my creations are somewhat useless. As a learning experience and a fun hobby they are a worthwhile endeavour, but they are never created with an end goal in mind. Maybe this is something to address in my next robot project!


Robot Head Mark 2

Around 3 months ago I started a new robot project. I decided that I wanted to build another robot head that could sit on a desktop. I had a few goals in mind when starting the project. I wanted to 3D print the parts and try out some finishing techniques on them. Nothing too fancy, just sanding them smooth and painting, to see how good a finish I could achieve. I also wanted the robot to be low-cost and simple. I mean really simple: just two servos and a couple of sensors. This was very intentional. I want to expand on the work of my previous project, the desktop robot head and arm, by concentrating on the software. I found I was too often dealing with mechanical issues or limitations, and these were an all too welcome distraction from the software.

I like making things and having something physical to show for the hard work, whilst I find that with software there isn’t always that sense of satisfaction of having created something. However, the software is where the robot really comes to life, and I need to get better and more focused at writing it and implementing the ideas that I have. To that end I wanted a simple robot with very little to go wrong that can serve as a platform to develop some interesting functions. This project will hopefully pull together a lot of the work from previous projects into one robot, and I intend to craft the software with care. I want to squeeze as much functionality out of this simple robot platform as possible. I also wanted to take the opportunity of a new project to switch from using Tkinter for the GUI to wxPython.

I started the process by designing the robot in FreeCAD. I did more design work on this than in previous projects and really spent my time on the modelling. I even put the parts together into an assembly to help decide on the colour scheme. I have started another Youtube series following the design and build of the robot. Part 1 is below.

This video shows the design and printing of the robot, and at the end of it I had a pile of 3D printed parts ready for finishing. The design is a simple pan/tilt robot head, with a camera and a sonar sensor mounted in the head. I want to build on my previous work with point clouds, so I wanted a sonar mounted on the head. I don’t think I will end up creating point clouds with this robot, but I will likely use the techniques to come up with a different way to represent the environment. However, I didn’t want the sonar sensor looking like the robot’s ‘eyes’ as they often do, so I tried something new with this project. I printed a mesh to mount in front of the sonar and, combined with a piece of filter fabric, made a cover for the sensor. Later testing would reveal whether this was successful or not. The only other feature of the head is the robot’s actual ‘eye’. In place of the TFT screen used in the last project I opted for an RGB LED in a custom enclosure, furnishing the robot with a single colour-changing eye that it can use to attract interest or convey emotions.

Part 2 shows the finishing of the 3D printed parts.

This involved a lot of sanding and priming before finally painting the parts. Overall I was really pleased with the finished parts. It became clear from this process that the quality of the final finish is directly proportional to the amount of finishing work put in. I also took away a few other lessons about designing parts for finishing: sharp corners and deep recesses make it difficult to sand away all of the build lines without removing too much material, whereas smooth, rounded surfaces or large flat surfaces are easier to sand and look good when painted.

Part 3 of the video series documents the assembly of the robot.

With the robot assembled I was able to do some testing. I decided to use an Arduino Nano for this project, in keeping with the minimalist and simplicity goals of the project. I knocked up a breadboard circuit for testing and set about testing the RGB LED eye and the sonar sensor. The good news was that the sonar still seemed to function just fine from behind its cover, and the RGB LED worked as expected.

I initially connected the RGB LED to digital outputs, thinking that the few colours that this yields would suffice. I subsequently decided to use PWM outputs, so that a wider range of colours could be generated. Given that the robot was so simple, I thought this was a reasonable extravagance.

Part 4 shows testing of the RGB connected to PWM outputs and the first test of the servos.

This video also represents my first simple programs written using wxPython. So far, I like it. The GUIs look better than their Tkinter counterparts and overall it is not much more complex, if at all. I created a couple of test programs: one that sends commands via serial to set R, G and B values for the LED, and another that sends servo positions. Code for the Arduino has also been written to receive these commands and set the RGB outputs and servo positions accordingly. I can also send ‘get’ commands to the Arduino and it will return values as appropriate.
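On the desktop side, those test programs boil down to writing short command strings over serial with pyserial. Something along these lines, although the exact command format shown here is made up for illustration and is not my actual protocol.

import serial

arduino = serial.Serial('/dev/ttyUSB0', 9600, timeout=1) #assumed port name

def set_eye_colour(r, g, b):
    arduino.write(('C%d,%d,%d\n' % (r, g, b)).encode()) #hypothetical set-colour command

def set_head(pan, tilt):
    arduino.write(('S%d,%d\n' % (pan, tilt)).encode()) #hypothetical servo command

def get_sonar():
    arduino.write(b'G\n') #hypothetical get command
    return arduino.readline().decode().strip() #Arduino replies with a value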

This brings us almost up to date on this project. There will be a new video soon, so be sure to subscribe to my Youtube channel. I have been working on embedding OpenCV images and matplotlib plots in wxPython windows, as well as making a more permanent circuit for the robot’s electronics.
