3D Printed worm drive gearboxes

With the aim of slowing the robot down a bit and fixing the issue of the robot not stopping straight away when the joystick was released, I started looking at alternative gearbox options. The one option I knew would resolve both of these issues in a very compact package was a worm drive. Worm drives enable high reduction ratios and are very difficult to back-drive, meaning that when the motor stops turning, the wheel stops very quickly and won't run on.

I had a look around online and knew that it was possible to 3D print a worm drive gearbox, but I wasn't sure how practical or long-lasting it would be. I decided to give it a shot and set about designing and printing a prototype. Part 8 of the RC Robot video series shows the design, build and testing of the gearbox.

 

 

The gears were generated using this OpenSCAD generator: https://github.com/chrisspen/gears I modified the gears to include hubs for attaching them to a shaft. I also designed and printed a custom housing for the gearbox, complete with bearings to support the drive shaft and one end of the worm gear.

The worm gear was tricky to print well and I ended up printing it in one piece, stood on its end. I had to print it very slowly and even then the print was not perfect, but it was good enough. I had a few failures of the worm gear early on and went through a few design iterations to add strength where it was needed to get a functional part. I also had to adjust the gear spacing a couple of times, once by modifying the gearbox housing and a second time by altering the size of the spur gear. I found that if the gears were meshed too tightly, too much force was put on the worm gear and it was damaged; if they were meshed too loosely, the backlash in the gearbox was excessive.

The video shows the assembly and testing of the gearbox, and it works really well. It's a bit noisy, but some lubrication helped a lot. I am hoping that as the gearbox is used, the gears will wear in slightly and it will run more smoothly and quietly. How well the gears wear over the long term will need to be gauged as the gearbox is used.

I went on to build a second gearbox, a mirror image of the first, so that I had one for each side of the robot. At this stage I decided to strip the RC Robot down and build a new chassis to mount the worm drive gearboxes to, making a few more improvements along the way. Part 9 of the video series shows the rebuilt robot and details the design changes.

 

 

I'm really pleased with this version of the RC Robot platform. It drives nicely and steadily with plenty of torque. The original spur gear gearboxes provided a reduction of around 3:1, while the worm drive gearboxes give a reduction of 7:1 in a very compact unit. This slows the wheels down considerably and gives a good amount of torque at the wheels. I have tested the robot quite a bit now and it is easy to control and stops immediately when the joystick is released.

I now have a nice sturdy platform to work with and I will be continuing this project by adding more functionality to the robot. My aim has always been to automate this robot, even though I am calling it the RC Robot (I may need to rename it at some point in the future). The first step will be the addition of some more sensors so come back soon to check on the progress.

Homemade RC Controller and PID wheel control

With the RC robot test drive completed it was time to make a more permanent solution for the handheld controller. I designed and built a controller with two analogue control sticks and a TFT screen, powered by rechargeable NiMH batteries. Inside there is an Arduino Nano with an HC-05 module for Bluetooth communication with the robot. I used some expanded PVC sheet along with 3D printed parts to make a case. Part 6 of the RC Robot video series shows the build of the controller.

I was really pleased with how the controller turned out. It works really well and fits in the hands nicely.

With the controller build completed, I turned my attention to the software for controlling the robot's wheel speeds. Initially I just had the wheel speeds controlled proportionally to the joystick positions. This worked OK, but I wanted to implement closed loop speed control with feedback from the incremental encoders. I also wanted to be able to control the robot using only one of the analogue joysticks, which turned out to be trickier than I had first thought.
Part 7 of the video series covers the PID control and converting the control to using only one analogue joystick, along with some fun testing of the robot in the garden.
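To illustrate the idea, below is a minimal sketch of a PID speed loop for one wheel. The gains, the dt handling and the function name are illustrative rather than the exact code from the video; the target speed comes from the joystick and the measured speed from the encoder.

// Minimal PID speed controller for one wheel (illustrative gains and names).
// targetSpeed comes from the joystick, measuredSpeed from the encoder, dt in seconds.
float Kp = 2.0, Ki = 0.5, Kd = 0.1;
float integral = 0, prevError = 0;

int pidUpdate(float targetSpeed, float measuredSpeed, float dt) {
  float error = targetSpeed - measuredSpeed;
  integral = constrain(integral + error * dt, -255, 255);  // basic anti-windup
  float derivative = (error - prevError) / dt;
  prevError = error;
  float output = Kp * error + Ki * integral + Kd * derivative;
  return constrain((int)output, -255, 255);  // PWM demand, sign gives direction
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Example call: aim for 2.0 revs/sec, pretend we measured 1.5 from the encoder
  int pwm = pidUpdate(2.0, 1.5, 0.1);
  Serial.println(pwm);
  delay(100);
}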

I really struggled to work out how to control the two wheel speeds and the turning using just a single analogue joystick, until I found a great explanation here: http://home.kendra.com/mauser/Joystick.html
This page explains the theory, and the equations that pop out at the end allow the wheel speeds and directions to be calculated from the single analogue joystick input.
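For anyone who just wants the gist, the sketch below shows a simplified version of the idea: mixing a single stick's throttle and turn values into left and right wheel speeds. It is not the exact set of equations from the linked page, just the basic arcade-style mix; the -255 to 255 range matches the values sent by my controller.

// Mix a single joystick into left/right wheel speeds (simplified arcade-style mix).
// x = turn (-255..255), y = throttle (-255..255).
void mixJoystick(int x, int y, int &leftSpeed, int &rightSpeed) {
  leftSpeed  = constrain(y + x, -255, 255);
  rightSpeed = constrain(y - x, -255, 255);
}

void setup() {
  Serial.begin(9600);
  int left, right;
  mixJoystick(100, 200, left, right);  // e.g. forward with a turn to the right
  Serial.print(left); Serial.print(" "); Serial.println(right);
}

void loop() {}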

After some more testing of the robot with the improved control software it became clear that the robot had a few design issues. The main one was that the robot was a real handful to control accurately. It is great fun to drive, but I need to be aware that I am making a robot platform, not an RC car. One issue was that the robot was simply a bit too quick. This was easy enough to remedy by limiting the PWM output to the motor driver, which limits the top speed. I also noticed that when the joystick was released, the robot had a tendency to continue rolling for a bit due to its momentum. Sometimes this didn't matter too much, but occasionally one wheel would continue while the other didn't, putting the robot off course. Turning on motor braking on the motor driver when the joystick was centred helped a bit, but didn't cure the problem.
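Both fixes were simple at the code level. Below is a rough sketch for one motor, assuming an L298-style driver with two direction inputs and a PWM enable pin; the pin numbers and the MAX_PWM cap are illustrative, not the exact values used on the robot.

// One motor on an L298-style driver (illustrative pin numbers).
const int IN1 = 4, IN2 = 3, ENA = 5;
const int MAX_PWM = 150;  // cap the top speed (full scale would be 255)

void setup() {
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
  pinMode(ENA, OUTPUT);
}

void setMotor(int speed) {  // speed: -255..255, 0 = brake
  speed = constrain(speed, -MAX_PWM, MAX_PWM);
  if (speed == 0) {
    // Both inputs the same with the enable pin driven = fast motor braking on the L298
    digitalWrite(IN1, LOW);
    digitalWrite(IN2, LOW);
    analogWrite(ENA, 255);
  } else {
    digitalWrite(IN1, speed > 0 ? HIGH : LOW);
    digitalWrite(IN2, speed > 0 ? LOW : HIGH);
    analogWrite(ENA, abs(speed));
  }
}

void loop() {
  setMotor(0);  // joystick centred: brake
}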

Therefore I had some decisions to make about the next steps of the project. I will go into more details in my next blog.

EDIT:  I have been asked to share the design and code for the controller so below is the circuit I am using.

Arduino Nano and HC-05 controller circuit (Fritzing breadboard diagram)

I have also been asked to share the code. Other than writing to the TFT screen, the code is pretty straightforward. The joystick positions are read using an analogue read and the data is then formatted into a string to be sent via the serial port. I am using a software serial port to send data through the HC-05 module, as this keeps the main serial port free for debugging. I wanted to keep the data sent from the transmitter as simple as possible; the work of decoding and using the data is performed by whatever is receiving it.

#include "SPI.h"
#include "Adafruit_GFX.h"
#include "Adafruit_ILI9340.h"

#include <SoftwareSerial.h>
SoftwareSerial BTSerial(8, 9); // RX, TX (Arduino pins, wired to the HC-05 TX and RX)

#if defined(__SAM3X8E__)
    #undef __FlashStringHelper::F(string_literal)
    #define F(string_literal) string_literal
#endif

// These are the pins used for the UNO
// for Due/Mega/Leonardo use the hardware SPI pins (which are different)
#define _sclk 13
#define _miso 12
#define _mosi 11
#define _cs 7
#define _dc 5
#define _rst 6

#define XCENTRE 506 // ADC reading with the right joystick X axis centred
#define YCENTRE 528 // ADC reading with the right joystick Y axis centred

Adafruit_ILI9340 tft = Adafruit_ILI9340(_cs, _dc, _rst);

const int LeftButton = A2;     // left joystick pushbutton pin
const int RightButton = A5;    // right joystick pushbutton pin

String X = "X";
String Y = "Y";

const int LeftXin = A1;  // Analog input pin for left joystick X
const int LeftYin = A0;  // Analog input pin for left joystick Y
const int RightXin = A6;  // Analog input pin for right joystick X
const int RightYin = A7;  // Analog input pin for right joystick Y

int prevLXDisplay = 0;
int prevLYDisplay = 0;
int prevRXDisplay = 0;
int prevRYDisplay = 0;

void setup() {

  tft.begin();
  delay(300);
  tft.setRotation(3);
  tft.fillScreen(ILI9340_BLACK);
  delay(300);
  tft.setCursor(20, 60);
  tft.setTextColor(ILI9340_BLUE);  tft.setTextSize(6);
  tft.println("BIG FACE");
  tft.setCursor(20, 120);
  tft.println("ROBOTICS");
  delay(500);

  Serial.begin(9600);
  BTSerial.begin(9600); //Bluetooth software serial
  
  pinMode(LeftButton, INPUT_PULLUP);
  pinMode(RightButton, INPUT_PULLUP);
 

  while(digitalRead(RightButton) == HIGH){ //Wait right here until right joystick button is pressed
  }
  tft.fillScreen(ILI9340_BLACK);
}

void loop(void) {

  tft.fillCircle(prevLXDisplay, prevLYDisplay, 10, ILI9340_BLACK);
  tft.fillCircle(prevRXDisplay, prevRYDisplay, 10, ILI9340_BLACK);
  drawGuides();
  
  int LXValue = analogRead(LeftXin);
  int LXDisplay = map(LXValue, 1023, 0, 20, 140);
  int LYValue = analogRead(LeftYin);
  int LYDisplay = map(LYValue, 0, 1023, 60, 180);
  
  int RXValue = analogRead(RightXin);
  int RXDisplay = map(RXValue, 1023, 0, 180, 300);
  int RYValue = analogRead(RightYin);
  int RYDisplay = map(RYValue, 0, 1023, 60, 180);

  tft.fillCircle(LXDisplay, LYDisplay, 10, ILI9340_RED);
  tft.fillCircle(RXDisplay, RYDisplay, 10, ILI9340_RED);
  prevLXDisplay = LXDisplay;
  prevLYDisplay = LYDisplay;
  prevRXDisplay = RXDisplay;
  prevRYDisplay = RYDisplay;

  // Convert the right joystick reading into a -255..255 value for the robot
  int XValue = (XCENTRE-RXValue)/2;
  if (XValue < -255){
    XValue = -255;}
  if (XValue > 255){
    XValue = 255;}
  
  int YValue = (YCENTRE-RYValue)/2;
  if (YValue < -255){
    YValue = -255;}
  if (YValue > 255){
    YValue = 255;}
 

  // Send the joystick values to the serial monitor (debugging) and over Bluetooth to the robot
  String XString = X + XValue;
  String YString = Y + YValue; 
  Serial.print(XString);
  Serial.println(YString);

  BTSerial.print(XString);
  BTSerial.println(YString);


  delay(100);


}

void drawGuides(){
  //tft.drawLine(x1, y1, x2, y2, color);

  int LeftCentX = 80;
  int LeftCentY = 120;
  int RightCentX = 240;
  int RightCentY = 120;
  
  tft.drawLine(LeftCentX, LeftCentY, LeftCentX-60, LeftCentY, ILI9340_WHITE);
  tft.drawLine(LeftCentX, LeftCentY, LeftCentX+60, LeftCentY, ILI9340_WHITE);
  tft.drawLine(LeftCentX, LeftCentY, LeftCentX, LeftCentY-60, ILI9340_WHITE);
  tft.drawLine(LeftCentX, LeftCentY, LeftCentX, LeftCentY+60, ILI9340_WHITE);

  tft.drawLine(RightCentX, RightCentY, RightCentX-60, RightCentY, ILI9340_WHITE);
  tft.drawLine(RightCentX, RightCentY, RightCentX+60, RightCentY, ILI9340_WHITE);
  tft.drawLine(RightCentX, RightCentY, RightCentX, RightCentY-60, ILI9340_WHITE);
  tft.drawLine(RightCentX, RightCentY, RightCentX, RightCentY+60, ILI9340_WHITE);
  
}
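
For completeness, here is a rough sketch of how the receiving Arduino on the robot might decode the "X<value>Y<value>" strings sent above. The code actually running on the robot also handles the motor driving and speed control, but the parsing idea is the same; the SoftwareSerial pins here are illustrative.

#include <SoftwareSerial.h>
SoftwareSerial BTSerial(8, 9);  // RX, TX to the robot's HC-05 (illustrative pins)

void setup() {
  Serial.begin(9600);
  BTSerial.begin(9600);
}

void loop() {
  if (BTSerial.available()) {
    // Each packet looks like "X<value>Y<value>", e.g. "X-120Y255"
    String packet = BTSerial.readStringUntil('\n');
    int yIndex = packet.indexOf('Y');
    if (packet.startsWith("X") && yIndex > 0) {
      int xValue = packet.substring(1, yIndex).toInt();
      int yValue = packet.substring(yIndex + 1).toInt();
      Serial.print("X: "); Serial.print(xValue);
      Serial.print("  Y: "); Serial.println(yValue);
      // xValue and yValue would then be turned into wheel speed demands
    }
  }
}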

RC Robot

I have been busy over the last couple of months with a new project. Due to my lack of imagination I am calling it the RC Robot. The aim of the project was to build a sturdy, reliable mobile robot platform to use for future development. I am a bit of a purist and don't really consider a remote controlled vehicle a robot. However, to start with I wanted to make the robot remote controlled, as I thought this would be a good fun place to begin. I have every intention of developing this project and adding autonomy at a later date. I also vowed to use some of the many parts that I have accumulated over the years of robot building, basing the drive system around some 12-24V brushed gear motors left over from a previous project. I also wanted to document the build in the form of a series of YouTube videos.

I kicked off the project by designing the drive assembly. As mentioned, the motor/gearboxes were in my parts box and are 12-24V Como Drills units, geared at 30:1. These were still a bit quick for my needs, so I designed and built a simple gearbox, complete with bearings to support the output shaft. I initially attempted to design and build a belt drive system, but for various reasons abandoned this in favour of using gears. I decided to use SLA batteries for this project, and two 12V, 2.1Ah batteries in series give a good solid 24V to work with. Check out the video below for a breakdown of the design and initial testing.

 

 

The next step of the project was to add some electronics to drive the two motors. An L298-based motor driver was ideal for the job. Add in some HC-05 Bluetooth modules and an Arduino or two and I had myself a way to remotely control the speeds of the motors using a joystick. Part 2 below shows the development and testing of the electronics system along with circuit diagrams.

 

 

Once confident that the drive assemblies and electronics were up to the task, the next job was to design and fabricate the robot chassis itself. Some 2mm aluminium sheet served as a sturdy chassis plate for the robot and a strong castor was selected to serve as the 3rd wheel. My initial design for the base plate and the mounting for the castor was disappointing and not particularly aesthetically pleasing. After 3D printing a fancy castor mount I was much happier with the look of the robot. Part 3 of the video series covers the chassis build process and the first tentative test drive before the electronics were properly mounted.

 

 

The next job was to take the electronics from the prototype breadboard to a more permanent stripboard circuit, ready for mounting to the robot. All of the required electronics were mounted to the top plate of the robot and the final wiring was completed, ready for the first proper test run!! It was a sunny day and I had a fun hour test driving the robot in the sunshine. Part 4 shows the results of the test drive and an appraisal of the robot's performance.

 

 

During the initial test run I found that the robot was a bit of a handful to control. I had the controls set up for a skid-steer type arrangement, with the raw joystick values being sent to the robot and converted into motor speeds with very little additional processing. Whilst great fun and a good challenge, I wanted a bit more control of the robot when manually driving it. I decided that encoders would help the situation by allowing for some closed loop control of the motor speeds. I knocked up some homemade incremental encoders to allow the motor speeds to be measured and set about adding these to the robot. Part 5 is a more tutorial-style video showing how I added encoders to the robot.
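To give an idea of how the encoder feedback is used, here is a minimal sketch that counts encoder pulses with an interrupt and turns them into a speed estimate. The interrupt pin, the pulses-per-revolution figure and the update interval are illustrative; the homemade encoders on the robot have their own values.

// Count pulses from one incremental encoder and estimate wheel speed.
const byte ENCODER_PIN = 2;         // must be an external interrupt pin on an Uno/Nano
const float PULSES_PER_REV = 20.0;  // illustrative figure for a homemade encoder disc

volatile unsigned long pulseCount = 0;

void countPulse() {
  pulseCount++;
}

void setup() {
  Serial.begin(9600);
  pinMode(ENCODER_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(ENCODER_PIN), countPulse, RISING);
}

void loop() {
  static unsigned long lastTime = 0;
  unsigned long now = millis();
  if (now - lastTime >= 100) {  // update the speed estimate every 100 ms
    noInterrupts();
    unsigned long pulses = pulseCount;
    pulseCount = 0;
    interrupts();
    float revsPerSec = (pulses / PULSES_PER_REV) * (1000.0 / (now - lastTime));
    Serial.println(revsPerSec);  // this measurement feeds the closed loop speed control
    lastTime = now;
  }
}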

 

 

And this brings us right up to date. I have promised myself to update this blog a bit more often, particularly when I have a new video to share. Stay tuned as I have just finished work on the controller and will have a new video to share very soon.

If anyone reading this would like more information on any of my robots, please feel free to leave a comment, either here or on the Youtube video and I will always do my best to help.

Robot Head Mark 2

Around 3 months ago I started a new robot project. I decided that I wanted to build another robot head that could sit on a desktop. I had a few goals in mind when starting the project. I wanted to 3D print the parts and try out some finishing techniques on them. Nothing too fancy, just sanding them smooth and painting, to see how good a finish I could achieve. I also wanted the robot to be low-cost and simple. I mean really simple, just two servos and a couple of sensors. This was very intentional. I want to expand on the work of my previous project, the desktop robot head and arm, by concentrating on the software. I found I was too often dealing with mechanical issues or limitations, and these were an all too welcome distraction from the software. I like making things and having something physical to show for the hard work, whereas with software there isn't always that sense of satisfaction of having created something. However, the software is where the robot really comes to life, and I need to get better and more focused at writing it and implementing the ideas that I have. To that end I wanted a simple robot with very little to go wrong that can serve as a platform for developing some interesting functions. This project will hopefully pull together a lot of the work from previous projects into one robot, and I intend to craft the software with care. I want to squeeze as much functionality out of this simple robot platform as possible. I also wanted to take the opportunity of a new project to switch from using Tkinter for the GUI to wxPython.

I started the process by designing the robot in FreeCAD. I did more design work on this than in previous projects and really spent my time on the modelling. I even put the parts together into an assembly to help decide on the colour scheme. I have started another YouTube series following the design and build of the robot. Part 1 is below.

This video shows the design and printing of the robot, and at the end of it I had a pile of 3D printed parts ready for finishing. The design is a simple pan/tilt robot head, with a camera and a sonar sensor mounted in the head. I want to build on my previous work with point clouds, so I wanted a sonar mounted on the head. I don't think I will end up creating point clouds with this robot, but I will likely use the techniques to come up with a different way to represent the environment. However, I didn't want the sonar sensor looking like the robot's 'eyes' as they often do, so I tried something new with this project. I printed a mesh to mount in front of the sonar and, combined with a piece of filter fabric, made a cover for the sensor. Later testing would reveal whether this was successful or not. The only other feature of the head is the robot's actual 'eye'. In place of the TFT screen used in the last project I opted for an RGB LED in a custom enclosure, furnishing the robot with a single colour-changing eye that it can use to attract interest or convey emotions.

Part 2 shows the finishing of the 3D printed parts.

This involved a lot of sanding and priming before finally painting the parts. Overall I was really pleased with the finished parts. It became clear from this process that the quality of the final finish is directly proportional to the amount of finishing work put in. I also took away a few lessons about designing parts with finishing in mind. Sharp corners and deep recesses make it difficult to sand away all of the build lines without removing too much material. Smooth, rounded surfaces or large flat surfaces are easier to sand and look good when painted.

Part 3 of the video series documents the assembly of the robot.

With the robot assembled I was able to do some testing. I decided to use an Arduino Nano for this project, in keeping with the minimalist and simplicity goals of the project. I knocked up a breadboard circuit for testing and set about testing the RGB LED eye and the sonar sensor. The good news was that the sonar still seemed to function just fine from behind its cover, and the RGB LED worked as expected.

I initially connected the RGB LED to digital outputs, thinking that the few colours that this yields would suffice. I subsequently decided to use PWM outputs, so that a wider range of colours could be generated. Given that the robot was so simple, I thought this was a reasonable extravagance.
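A minimal example of driving the eye this way is below. It assumes a common-cathode RGB LED on three PWM pins through current-limiting resistors; the pin numbers are illustrative, and a common-anode LED would need each value inverted (255 minus the value).

// Drive a common-cathode RGB LED from three PWM pins (illustrative pins).
const int RED_PIN = 9, GREEN_PIN = 10, BLUE_PIN = 11;

void setEyeColour(int r, int g, int b) {  // 0..255 per channel
  analogWrite(RED_PIN, r);
  analogWrite(GREEN_PIN, g);
  analogWrite(BLUE_PIN, b);
}

void setup() {
  setEyeColour(255, 80, 0);  // e.g. an orange eye
}

void loop() {}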

Part 4 shows testing of the RGB connected to PWM outputs and the first test of the servos.

This video also represents my first simple programs written using wxPython. So far, I like it. The GUIs look better than their Tkinter counterparts and overall it is not much more complex, if at all. I created a couple of test programs, one that sends commands via serial to set R, G and B values for the LED and another that sends servo positions. Code for the Arduino has also been written to receive these commands and set the RGB outputs and servo positions accordingly. I can also send get commands to the Arduino and it will return values as appropriate.
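To give a flavour of the Arduino side, below is a rough sketch of a set/get command handler. The single-letter command format, the pins and the ranges are all illustrative; the actual protocol and code on the robot may differ.

#include <Servo.h>

Servo panServo, tiltServo;
const int RED_PIN = 3, GREEN_PIN = 6, BLUE_PIN = 11;  // illustrative PWM pins
int panPos = 90, tiltPos = 90;

void setup() {
  Serial.begin(9600);
  panServo.attach(4);   // illustrative servo pins
  tiltServo.attach(5);
}

void loop() {
  if (Serial.available()) {
    // Example set commands: "R128", "G0", "B255", "P90", "T45"
    // Example get commands: "p" returns the pan position, "t" the tilt position
    char cmd = Serial.read();
    if (cmd == 'p') { Serial.println(panPos); return; }
    if (cmd == 't') { Serial.println(tiltPos); return; }
    int value = Serial.parseInt();
    if (cmd == 'R') analogWrite(RED_PIN, constrain(value, 0, 255));
    if (cmd == 'G') analogWrite(GREEN_PIN, constrain(value, 0, 255));
    if (cmd == 'B') analogWrite(BLUE_PIN, constrain(value, 0, 255));
    if (cmd == 'P') { panPos = constrain(value, 0, 180); panServo.write(panPos); }
    if (cmd == 'T') { tiltPos = constrain(value, 0, 180); tiltServo.write(tiltPos); }
  }
}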

This brings us almost up to date on this project. There will be a new video soon, so be sure to subscribe to my YouTube channel. I have been working on embedding OpenCV images and Matplotlib plots in wxPython windows, as well as making a more permanent circuit for the robot's electronics.

YouTube series Parts 2 and 3 now available

Part 2 of my YouTube series following the development of my latest robot project, a desktop robot head and arm, has been up for a week or so now and I have just finished Part 3. Part 3 covers the testing of the hand switch and how I am starting to develop the code for both the Arduino and the controlling PC.

 

 

 

 

 

If you enjoy the videos, please subscribe as I plan to continue making these as often as time allows. If you would like more information on any aspects of the robot, drop me a message and I can go into more detail in a future video.

YouTube video series

Happy Robot

 

I am managing to find a bit more time lately to progress some of my projects, in particular the desktop robot. I have started work on a robot arm to accompany the robot head, as I suggested in the last post. So far the base and lower joint of the robot arm have been designed in FreeCAD and 3D printed. I am currently working towards completing the robot arm. At the moment I don't intend to fit a gripper to the end of the arm, but instead use a custom designed touch sensor to allow the arm to get feedback from the environment. But you can bet that a gripper will be on the cards at some point in the future! To document and share the work on the project, I am making a series of YouTube videos to show you what I am up to. I admit, I would have liked to have started doing this at the beginning of the project, but better late than never, right? I will also admit that I am not a fan of talking in my videos, or the sound of my own voice, but it's the easiest way to explain what I am doing. I will get used to it and I'm sure I will get more comfortable as time goes on.

If anyone out there would like more information on any part of the build, let me know via a comment on here or on YouTube. I do intend to focus on particular parts of the project in future videos, with a sort of tutorial feel to show you how I implemented various features.

Below is the first video of the series, which gives an introduction to the project and shows some of the work that I have been doing. My aim is to release a video every week, but time will tell how realistic this aim is. I hope you enjoy the videos and I am working on the next one now, which should be available tomorrow.

 

Latest robot project – A desktop social robot

I have been working on a new robot project for the last few months. I like desktop robots and, having read about social robots like Jibo and Buddy, I decided I would like to try and create a low-cost version of one of these that can sit next to my computer and maybe interact with me and my children. I wanted to try and design some more complex components for 3D printing and thought that this project would be a good opportunity. I decided to give FreeCAD a shot as it looked like a promising open source 3D design package. After negotiating the expected learning curve I was impressed with the range of functionality that FreeCAD has, and I was able to design some cool looking parts for my new robot.

The idea was to design a desktop robot with a display and a camera integrated into the head, and use servos to pan and tilt the head to enable the robot to look around the room. As the project progressed, as is often the case, it evolved and the robot ended up with an extra servo that enables the head to roll as well. I wanted to incorporate an accelerometer into the head to track its position and went with a cheap MPU-6050. For the display I used an Adafruit 2.2″ TFT screen that I have used in previous projects. I also mounted a webcam in the head, which I stole from my mobile robot. I also used this project as an excuse to learn how to use KiCad for PCB design, which I used to design a shield for the Arduino Mega to interface all of the sensors and servos.

Below is a picture of the finished robot.

Social robot

I was particularly pleased with the lettering on the bottom servo housing, easy to do with FreeCAD. I was also pleased with the bracket that connects the roll and tilt servos, as shown below.

Roll to tilt head bracket

The accelerometer mounts to the head and, using this library, is able to spit out roll, pitch and yaw angles.

MPU-6050 mounted to head

After some calibration I have been able to control the servo positions using the accelerometer readings very reliably. A simple control loop for each servo is all that is required to position the head at any combination of roll, pitch and yaw angles.
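As an illustration of the kind of loop involved, the sketch below nudges a single servo towards a target angle using the measured angle as feedback. The gain, the servo pin and the getMeasuredRoll() placeholder are all assumptions for the example; on the real robot the measured angles come from the MPU-6050 library's roll, pitch and yaw output.

#include <Servo.h>

Servo rollServo;
const float Kp = 0.5;      // proportional gain (illustrative)
float targetRoll = 0.0;    // desired roll angle in degrees
int servoPos = 90;         // current servo command

// Placeholder: on the real robot this value comes from the MPU-6050 library
float getMeasuredRoll() {
  return 0.0;
}

void setup() {
  rollServo.attach(5);     // illustrative pin
  rollServo.write(servoPos);
}

void loop() {
  float error = targetRoll - getMeasuredRoll();
  servoPos = constrain(servoPos + (int)(Kp * error), 0, 180);
  rollServo.write(servoPos);  // apply a small correction each cycle
  delay(20);
}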

I have a lot of work to do on the software, but I wanted to share the project now that the mechanical and electronic build is complete. I was going to use a Raspberry Pi for this project, but I have decided to use my desktop computer for now; I may switch to the Pi at a later date. Previously I was driving the TFT screen from the Pi, but I am now writing to the screen from the Arduino. The screen will display a face, probably just simple shapes for now, to allow the robot to display some emotions.

I am also planning to design a robot arm at some point, and I would like this robot to be able to control the robot arm. I am thinking of possibly having a few modular parts, like arms or other sensors, that can work together. I am not sure how this will happen at the moment, but it's fun to think about.

Come back soon, as I hope to have a video up here of the head moving around and controlling its position once it's working.

 


 
