Replaying sequences, and some thoughts…

Part 4 of my YouTube series on my desktop robot head and arm build is now available. This episode shows the robot replaying some sequences of movements, along with some new facial expressions. I was exploring how well the robot can convey emotions and, even with quite basic movements and facial expressions, I think it does a reasonable job. Check out the video here.

Now for some thoughts…

It’s at this point in a project that I normally reach a crossroads. On the one hand, the mechanical build is complete and the electronics, although there is more to come in this project, are working as required. These are the aspects of robot building that I enjoy the most. However, I really want the robot to do something cool. I find myself shying away from the software in favour of starting a new project where I can fire up the 3D printer and the mill. I often put this down to not having an end goal in mind for the robot and the associated software. So I am going to use this space to jot down some thoughts that may help me keep on track. I have made notes about some features that I would like to implement in the project, which I list below. Some are quick and fairly easy; others are going to take some time. Whether I ever get them all completed remains to be seen, but the list will prove a helpful reminder should I need to come back to it.

  • Program and replay poses/sequences from the GUI
  • 3D model of the robot on the GUI for offline programming or real-time monitoring
  • Face detection and tracking
  • Face expression detection (smile or frown) and react accordingly
  • Automatically stop the arm when the hand switch is activated
  • Detect someone and shake their hand
  • Gripper?
  • Remote control/input to the robot via Bluetooth (I have a module on the way), maybe an Android app?
  • Program the robot to play simple games
  • Object detection and recognition
  • Voice input and sound/speech output
  • Mapping of visual surroundings and reacting to changes
  • Use the robot as a platform for AI development. I have worked on this in the past, trying to use neural networks to allow the robot to have some hand-eye coordination
  • Sonar/IR sensors to sense the area in front of the robot and react to changes

This is just a preliminary list of features that I think would be interesting. I will certainly return to this list myself as a reminder to keep me on track. If anyone has any other suggestions, please leave a comment as I am interested in what other people would consider useful/fun.
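
Of everything on that list, face detection is probably the quickest win, since OpenCV ships with ready-made Haar cascades. Below is a minimal sketch of the idea; the camera index and the choice of cascade are assumptions for illustration, not final design decisions.

```python
# Minimal face-detection sketch using OpenCV's bundled Haar cascade.
# Camera index 0 and the frontal-face cascade are assumptions.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # The face centre's offset from the frame centre could later
        # drive the pan/tilt servos to track the face.
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```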

My ultimate goal for this project is to have a robot that can sit on my desk or in my lounge and interact with me and my family. It may sit idle, waiting for someone to activate it, before offering a suggested game or activity. It may begin moving under its own volition, learning its environment, building a model of its world and of itself, ready to react to any sensed changes. It may get bored and lonely, and make noises until someone comes to see what the robot wants and plays a game. I am not sure yet, but this is where I want the project to head. Ultimately, I will want all processing to be done on-board, so that the robot can be moved around (a Raspberry Pi is a likely candidate to achieve this).

I will keep you all updated on the progress of this project. I think small steps are required to implement each of the features above in turn. I am hoping that eventually I will be able to join all of the features together into an interesting final product.  Until next time, thanks for visiting!

EDIT: I have put my code onto GitHub here. It is early days in this project but I like to share, especially if it helps someone out!

YouTube series Parts 2 and 3 now available

Part 2 of my YouTube series following the development of my latest robot project, a desktop robot head and arm, has been up for a week or so now and I have just finished Part 3. Part 3 covers the testing of the hand switch and how I am starting to develop the code for both the Arduino and the controlling PC.

If you enjoy the videos, please subscribe as I plan to continue making these as often as time allows. If you would like more information on any aspects of the robot, drop me a message and I can go into more detail in a future video.

YouTube video series

Happy Robot

I am managing to find a bit more time lately to progress some of my projects, in particular the desktop robot. I have started work on a robot arm to accompany the robot head, as I suggested in the last post. So far the base and lower joint of the robot arm have been designed in FreeCAD and 3D printed, and I am currently working towards completing the arm. At the moment I don’t intend to fit a gripper to the end of the arm, but will instead use a custom-designed touch sensor to allow the arm to get feedback from the environment. But you can bet that a gripper will be on the cards at some point in the future! To document and share the work on the project, I am making a series of YouTube videos to show you what I am up to. I admit I would have liked to start doing this at the beginning of the project, but better late than never, right? I will also admit that I am not a fan of talking in my videos, or the sound of my own voice, but it’s the easiest way to explain what I am doing. I will get used to it and I’m sure I will get more comfortable as time goes on.

If anyone out there would like more information on any part of the build, let me know via a comment on here or on YouTube. I do intend to focus on particular parts of the project in future videos, with a sort of tutorial feel to show you how I implemented various features.

Below is the first video of the series, which gives an introduction to the project and shows some of the work that I have been doing. My aim is to release a video every week, but time will tell how realistic this aim is. I hope you enjoy the videos and I am working on the next one now, which should be available tomorrow.

Latest robot project – A desktop social robot

I have been working on a new robot project for the last few months. I like desktop robots and, having read about social robots like Jibo and Buddy, I decided I would like to try to create a low-cost version of one of these that can sit next to my computer and maybe interact with me and my children. I wanted to try designing some more complex components for 3D printing and thought that this project would be a good opportunity. I decided to give FreeCAD a shot as it looked like a promising open-source 3D design package. After negotiating the expected learning curve I was impressed with the range of functionality that FreeCAD has, and I was able to design some cool-looking parts for my new robot. The idea was to design a desktop robot with a display and a camera integrated into the head, using servos to pan and tilt the head to enable the robot to look around the room. As is often the case, the project evolved as it progressed, and the robot ended up with an extra servo that enables the head to roll as well. I wanted to incorporate an accelerometer into the head to track its position and went with a cheap MPU-6050. For the display I used an Adafruit 2.2″ TFT screen that I have used in previous projects. I also mounted a webcam in the head, which I stole from my mobile robot. I used this project as an excuse to learn KiCad for PCB design as well, designing a shield for the Arduino Mega to interface all of the sensors and servos.

Below is a picture of the finished robot.

Social robot

I was particularly pleased with the lettering on the bottom servo housing, which was easy to do with FreeCAD. I was also pleased with the bracket that connects the roll and tilt servos, shown below.

Roll to tilt head bracket

The accelerometer mounts to the head and, using this library, is able to spit out roll, pitch and yaw angles.

MPU-6050 mounted to head

After some calibration I have been able to control the servo positions using the accelerometer readings very reliably. A simple control loop for each servo is all that is required to position the head at any combination of roll, pitch and yaw angles.
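
The control loop itself is nothing exotic: read the angle, compare it to the target, and nudge the servo command in proportion to the error. The real loop lives in the Arduino sketch; below is just the shape of the logic in Python, with get_angle() and set_servo() as hypothetical stand-ins for the MPU-6050 read and the servo write.

```python
# Shape of the per-axis control loop, sketched in Python for illustration.
# The real version runs on the Arduino; get_angle() and set_servo() are
# hypothetical stand-ins for the sensor read and the servo write.
import time

KP = 0.8          # proportional gain, tuned by experiment
DEADBAND = 1.0    # degrees of error to ignore, to stop the servo hunting

def track(target_deg, get_angle, set_servo, command_deg=90.0):
    """Drive one servo until the measured angle matches the target."""
    while True:
        error = target_deg - get_angle()
        if abs(error) < DEADBAND:
            return command_deg
        command_deg += KP * error                        # proportional step
        command_deg = max(0.0, min(180.0, command_deg))  # stay in servo range
        set_servo(command_deg)
        time.sleep(0.02)                                 # ~50Hz loop
```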

I have a lot of work to do on the software, but I wanted to share the project now that the mechanical and electronic build is complete. I was going to use the Raspberry Pi for this project but have decided to use my desktop computer for now; I may move to the Pi at a later date. Previously I was driving the TFT screen from the Pi but I am now writing to the screen from the Arduino. The screen will display a face, probably just simple shapes for now, to allow the robot to display some emotions.

I am also planning to design a robot arm at some point, and I would like this robot to be able to control it. I am thinking of possibly having a few modular parts, like arms or other sensors, that can work together. I am not sure how this will happen at the moment but it’s fun to think about.

Come back soon, as I hope to have a video of the head moving around and controlling its position up here when it’s working.

BFRCode

With the redesign of my robot BFR4WD complete I have moved back to developing the software to control the robot. As I have alluded to in previous posts, I have been working on a protocol for sending commands from the Raspberry Pi to the Arduino. The idea is that the Pi carries out the high-level control, i.e. move forward 50cm, turn 30 degrees, etc., as well as image processing, and the Arduino is in charge of the low-level control associated with these commands. I took inspiration from Gcode and started developing what I’m now calling BFRCode. I’m sure this isn’t a new idea but it is my take on it and I can tailor it to meet the requirements for my projects. BFRCode consists of a list of alphanumeric command strings that can be sent by the Pi and interpreted and executed by the Arduino. The current list of commands (BFRCode_commands.xls), along with all of the other code I am working on, can be found on GitHub here. The command to move the robot forward 50cm, for example, would look like W1D50; a turn anti-clockwise of 30 degrees would be W3D30. I have also added functions for driving an arc-shaped path and turning the robot to face a given direction as measured by the compass. The code also allows the head to be moved, sensor readings to be returned, and power to the servos to be turned on and off. Currently all move commands return a status code to indicate whether the move was completed successfully or not, as may be the case if an obstacle was encountered during the movement. The Arduino is in charge of detecting obstacles during movements.
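
On the Pi side, issuing one of these commands and collecting the status code takes only a few lines with pyserial. A minimal sketch; the port name, baud rate and the exact form of the status reply are assumptions (the real codes are in BFRCode_commands.xls):

```python
# Minimal BFRCode sender. Assumes newline-terminated commands and replies;
# the port name, baud rate and status strings are assumptions.
import serial

ser = serial.Serial("/dev/ttyACM0", 115200, timeout=5)

def send_command(cmd):
    """Send one BFRCode string and return the Arduino's status reply."""
    ser.write((cmd + "\n").encode("ascii"))
    return ser.readline().decode("ascii").strip()

print(send_command("W1D50"))   # drive forward 50cm
print(send_command("W3D30"))   # turn 30 degrees anti-clockwise
```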

This control scheme has the benefit of separating the high-level control from the low-level. Functions can be developed on the Arduino and tested in isolation to make sure they do what they should, then simply called by the Python script running on the Pi. Likewise, when developing the high-level code on the Raspberry Pi, very little thought needs to be put into moving the robot: just issue a command and check that it was executed correctly. Complex sequences of movements can be created by putting together a list of commands and storing them as a text file, like you would expect from a Gcode file. A Python script can then read through the file and issue the commands one at a time, checking each has completed successfully before issuing the next. I have found that sending strings with a newline termination is a very reliable method of exchanging data and can be done at a reasonably high baud rate. The other advantage of controlling the robot like this is that data is only sent between the Raspberry Pi and the Arduino when a command is issued or data is required. This is in contrast to previous approaches I have taken, where data was constantly being sent back and forth.
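
Replaying a stored sequence is then just a loop over the file, reusing the send_command() helper from the sketch above. The "OK" status string and the file name are assumptions:

```python
# Replay a BFRCode file one command at a time, stopping on the first
# failed move. Assumes send_command() from the previous sketch and that
# a successful move replies "OK"; the real status codes may differ.
def run_file(path):
    with open(path) as f:
        for line in f:
            cmd = line.strip()
            if not cmd or cmd.startswith("#"):  # skip blanks and comments
                continue
            status = send_command(cmd)
            if status != "OK":
                print("Stopped at", cmd, "- status:", status)
                return False
    return True

run_file("patrol_route.txt")   # hypothetical sequence file
```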

Up until now, to send commands I have been using a Python script I wrote that takes typed commands from the command line and sends them to the Arduino. This was OK, but I decided I wanted a more user-friendly and fun way to control the robot manually, for testing purposes and to show people what the robot can do whilst I’m working on more autonomous functions. I have started making a GUI in Tkinter that will send commands at the touch of a button. If I use VNC to connect to the Raspberry Pi, it means I can control the robot manually using any device I choose (laptop, phone or tablet). I have also set the Raspberry Pi up as a Wi-Fi access point so I can access it without connecting to a network, ideal if I take the robot anywhere to show it off. Below is a screenshot of the GUI I am working on.

BFRGui

I created some custom graphics, saved as .gif images so that a Tkinter canvas can display them. There are controls for moving and turning the robot, and pan and tilt controls for the head. The compass graphic shows the current compass reading; if it is clicked, the user can drag a line to a new bearing and, on release of the mouse button, a command will be issued to turn the robot to the new heading. I have buttons for turning servo power on and off and a display showing the current sonar reading. I have also incorporated a display for the image captured by the webcam, using OpenCV to grab the image and then converting it to be displayed on a Tkinter canvas. I’m really pleased with the way that BFRCode and the GUI are turning out. My 3-year-old boy has had his first go at manually controlling a robot with the GUI, and that is a success in itself!
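
The fiddliest part of the webcam display is the image plumbing: OpenCV hands back BGR arrays, Tkinter wants a PhotoImage, and Pillow bridges the two. Below is a stripped-down sketch of that pattern, with a single command button standing in for the full control layout; send_command() is the helper from the BFRCode sketches above.

```python
# Stripped-down sketch of the GUI pattern: one BFRCode button plus a live
# webcam view on a Tkinter canvas. Assumes send_command() from earlier.
import tkinter as tk
import cv2
from PIL import Image, ImageTk

cap = cv2.VideoCapture(0)
root = tk.Tk()
root.title("BFRGui sketch")

tk.Button(root, text="Forward 50cm",
          command=lambda: send_command("W1D50")).pack()
canvas = tk.Canvas(root, width=320, height=240)
canvas.pack()

def update_frame():
    ok, frame = cap.read()
    if ok:
        rgb = cv2.cvtColor(cv2.resize(frame, (320, 240)), cv2.COLOR_BGR2RGB)
        photo = ImageTk.PhotoImage(Image.fromarray(rgb))
        canvas.create_image(0, 0, image=photo, anchor=tk.NW)
        canvas.photo = photo      # keep a reference or Tk garbage-collects it
    root.after(50, update_frame)  # refresh at roughly 20 frames per second

update_frame()
root.mainloop()
```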

I have made a very quick video of me controlling the robot using the GUI after connecting to the Raspberry Pi using VNC from a tablet.

Something I would like to develop is a BFRCode generator that allows a path to be drawn on the screen and then turned into a BFRCode file. The generated file could then be run by the robot. Head moves and image capture could be incorporated into the instructions. This could be useful for security robots that patrol an area in a fixed pattern. I am still very keen to develop some mapping software so the robot can plot a map of its environment autonomously. The map could then be used in conjunction with the BFRCode generator to plot a path that relates to the real world.
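
The generator itself boils down to simple geometry: walk along the drawn points, turn to face the next one, then drive the straight-line distance. Here is a sketch of that core, using only the two commands described above and normalising every turn to an anti-clockwise W3 (taking anti-clockwise as positive, the usual maths convention):

```python
# Turn a list of (x, y) waypoints in cm into BFRCode, using only the two
# commands described above: W3D<deg> (anti-clockwise turn) and W1D<cm>
# (drive forward). Every turn is normalised to 0-360 anti-clockwise.
import math

def path_to_bfrcode(points, start_heading=0.0):
    lines, heading = [], start_heading
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        target = math.degrees(math.atan2(y1 - y0, x1 - x0))
        turn = (target - heading) % 360      # anti-clockwise turn amount
        if round(turn):
            lines.append(f"W3D{round(turn)}")
        dist = math.hypot(x1 - x0, y1 - y0)
        if round(dist):
            lines.append(f"W1D{round(dist)}")
        heading = target
    return lines

print(path_to_bfrcode([(0, 0), (100, 0), (100, 50)]))
# -> ['W1D100', 'W3D90', 'W1D50']
```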

New battery and some design revisions. BFR4WD MK2

Work was progressing with BFR4WD: I was experimenting with software for object/landmark recognition as well as developing BFR-Code for controlling the robot. However, I started having problems with the old Ni-MH battery pack I was using to power the servos. It wasn’t holding its charge for long and the servos were constantly struggling to move the robot. Even when it was working well, this battery pack had sometimes struggled to deliver the current required when all the wheel servos were working hard. I decided it was time for a new battery. This led to a succession of design revisions, and many parts of BFR4WD have been redesigned and rebuilt.

Let’s start with the choice of a new battery. I looked into the various options available and settled on a 6V SLA battery to power all the servos. SLA batteries are cheap, easy to charge and able to supply a lot of current if required. The downside is that they are heavy! During the original design of BFR4WD I made the decision to gear the servos to increase the top speed of the robot at the expense of torque to the wheels. This was fine with the previous battery, but the additional weight of the SLA battery means this gear ratio would no longer be suitable. I decided to redesign the drive train for each wheel so that the required torque could be delivered. This meant designing some more 3D printed gears with a ratio that increases torque at the expense of speed. I ended up with a 20-tooth gear attached to the servo and a 24-tooth gear on the wheel axle.
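
For the record, that is a 24/20 = 1.2:1 reduction, so ignoring friction the wheels see roughly 20% more torque at 20/24 ≈ 83% of the old speed.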

I used this redesign as an opportunity to review the design of the wheel encoders as well. Whilst the previous design worked well, the encoders were mounted in a place that meant most of the robot had to be dismantled in order to adjust, repair or replace them. Looking into various options, I settled on sticking with an optical encoder, but instead of a slotted disk I would use a printed pattern. This design involves printing the encoder disk pattern onto transparency film using a laser printer. A slotted opto-interrupter looks at this disk and switches depending on whether a printed line is present or not. The final design ended up with the patterned disk attached to the servo gear, meaning I could also do away with the third gear/slotted disk of the previous design. This has the advantages of fewer parts to print, fewer holes to drill and fewer moving parts. It all adds up to a simpler design which is lighter and runs more quietly. The picture below shows an encoder disk mounted to the servo gear.

Encoder disk

With the alterations to the gear ratio and the new design for the encoders I was also able to increase the encoder resolution from 180 pulses per revolution to 218.

A redesign of the encoder circuit was also required, and I decided to use Sharp GP1S093HCZ0F opto-interrupters. These devices are very small and allow a very compact encoder to be built. A test on a breadboard let me find the correct resistor values to use (10K and 220 Ohm) and confirm that the device could read the encoder disks as required. The pictures below show the breadboard version and the final encoder circuit, which was built on stripboard.

Encoder circuit on breadboard

Encoder circuit
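
For anyone copying the circuit, the 220 Ohm value is the usual LED current-limit calculation: assuming a 5V supply and a typical IR LED forward voltage of around 1.2V, (5 - 1.2) / 220 ≈ 17mA through the emitter, with the 10K presumably acting as the load resistor on the phototransistor side.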

To allow the new drive train design to fit in the aluminium box section I’m using to house it, I had to space the servos off from the box section by 3mm. I designed and printed a spacer that holds both of the servos required for one side of the robot chassis.

Servo Spacer

A 3D printed bracket was designed to mount the encoder circuits in the correct position and allow for some adjustment. These encoders are mounted in the centre of the chassis rail, as opposed to at the ends as in the previous design. This configuration makes them a lot easier to adjust. Shown below is the finished encoder arrangement mounted in the chassis.

Mounted encoder

This encoder design and gear configuration also meant that I could reduce the overall size of the robot. This design is now 40mm shorter in length but 10mm wider. This has the advantage of setting the wheels closer together along the length of the robot but spacing them further apart across the width of the robot. This helps when the robot is turning on the spot.

I had to fabricate all new chassis parts to fit the new design, one of which is shown below.

Chassis rail

The picture below shows one of the chassis rails with the servos, gears and encoders mounted. I reused the end joining pieces from the previous design. I also changed the bushes that the axles run in from a push fit design to ones that glue into place.

Chassis rail with servos, gears and encoders mounted

With the overall size of the robot reduced and a larger battery added, space on the chassis to mount everything had decreased significantly. I was forced to start building upwards! I designed some 3D printed stand-offs that allow me to mount another sheet of HDPE above the first, as shown below.

3D printed stand-offs

This meant that the two batteries could sit on the bottom layer, which allows good access to connect and disconnect power to them. The Raspberry Pi, Arduino and USB hub could go on the top layer, again giving good access to connect to them. I also added a compass module to the robot, a CMPS03 that I have had lying around for a long time. I designed and printed a mount for the compass and used a length of aluminium box section to raise the compass into the air. My experience is that the further the compass is from any other components the better, as this reduces interference.

The picture below shows the completed robot. So many things have changed with this redesign that I think it will have to be referred to as BFR4WD MK2. Initial testing confirms that I have much more torque at the wheels, due to the change in gear ratio and the better current-delivery capacity of the battery. The encoders are working perfectly, and the increased resolution means more accurate positioning and speed control. The compass works and will be used for navigation. I am very keen to try some more map building using a mobile robot, with the addition of data from visual images.

BFR4WD MK2

3D printing a new robot

I was very pleased with the way BFRMR1 turned out, but it had some design flaws that I needed to address. The servos used for the drive wheels were a bit too slow. The drive wheels were positioned near the centre of the robot to help limit the size of the turning circle, but this meant that the robot would tip forwards when stopping. It also meant that I couldn’t mount anything to the front of the robot, such as a gripper. Wheel encoder resolution was also a bit limited. An idea started forming in my mind for a new robot.

The idea was to make a four-wheeled robot, with each wheel driven independently. I wanted to stick with using servos to drive the wheels. I love servos. They are cheap and very easy to control. But they can be slow! My idea evolved into making a gearbox to speed the servos up a bit, whilst taking the hit of losing a bit of torque. For this new robot it wouldn’t matter too much, as I was doubling the number of drive wheels. I thought of several ways of gearing the servos. A drive belt and pulleys was the first option, but I decided to go for gears instead. I could have bought the required gears, but I thought that this project was as good an excuse as any to invest in a new piece of equipment: a 3D printer!

After a bit of research I decided on a Prusa i3 printer, bought as a kit. I painted the aluminium frame and, after a few days and a couple of long nights, I had my printer assembled and working.

Prusa i3 3D printer

After calibration and a number of test prints, I set about designing some gears to form a gearbox. I used OpenSCAD to design all the parts for the robot. To design the gears I downloaded a gear generator from Thingiverse http://www.thingiverse.com/thing:3575. I started with a 25-tooth gear that would be connected directly to a servo horn, then a 14-tooth gear connected to the wheel drive shaft. Attached to the 14-tooth gear is a 45-tooth gear with a finer pitch to drive an encoder disc with a 14-tooth gear attached. All of this together would increase the top speed of the servo and give me an encoder resolution of 180 pulses per wheel revolution. It took a few tries to get each of these gears right, and some of the prototypes are shown in the picture below.

3D printed gear prototypes
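
Those gear numbers are also where the 180 pulses per revolution comes from: the 45-tooth gear spins the encoder disc (via its 14-tooth gear) 45/14 times per wheel revolution, and with a 28-slot disc (described further down) that gives 28 × 45/14 = 90 slot passes per wheel revolution. The numbers work out if both the rising and falling edge of each slot are counted, doubling that to 180.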

With all of the gears designed I made a trial gearbox. I wanted to use aluminium rectangular box section to house the gearbox. This means all of the gears are hidden and contained, and also means that the gearbox could form part of the robot’s chassis. The prototype gearbox just used a short section of the aluminium box as a test. The picture below shows the gearbox from the end, with the encoder disc nearest the camera.

Prototype gearbox

The final gearbox design used a long length of aluminium box section with two servos attached and two gearboxes within. This forms the drive for one side of the robot. Access holes were cut into the box section to allow assembly and adjustment of the gearbox, and the picture below shows the view into one of the access holes. You can see the two servos mounted with gears attached and the drive shaft passing through the box section with its gear attached. The encoder discs are hidden.

One completed gearbox

I also designed and printed some bushes for the drive shaft to run in that clipped into holes drilled in the aluminium box section. The hole through the middle of these is slightly undersized so that they can be drilled out to exactly the right size for the shaft to fit in.

3D printed bushes

Each encoder consists of a 28-slot encoder disc and a photo-interrupter to detect the slots as the disc turns. I decided on using Sharp GP1A52LRJ00F slotted optical switches. These have photologic outputs, so only a minimum of external circuitry is required to interface them with the Arduino. In fact, only one resistor is needed, so I used stripboard to make four encoder circuits that were then mounted inside the aluminium box section with the encoder discs turning between the sensors.

With two gearbox/chassis sections made I had two sides of the chassis. To join these together and make a complete chassis I needed to design some brackets. These brackets attached aluminium box section cross members to the gearbox sections to make a rectangular chassis. These are shown in the picture below.

Chassis bracket

One feature I wanted for this robot was the ability to separate the electronics and sensors from the chassis easily. To achieve this I decided to mount the Arduino Mega, the Raspberry Pi, the batteries and the USB hub on a sheet of HDPE plastic that would then be bolted to the chassis with four bolts. Should I need to work on the chassis in the future, I can just undo these four bolts, disconnect the encoders and drive servos from the Arduino, and remove the electronics board. I also decided to mount the head pan/tilt mechanism to this board. The picture below shows the chassis with the electronics board attached.

Assembled chassis and electronics

The head pan/tilt mechanism consists of two regular servos and some 3D printed brackets. The picture below shows the bracket that attaches the pan servo to the electronics board.

Servo bracket for head pan/tilt

Attached to the pan servo is the tilt servo, via another 3D printed bracket. I designed a further piece that fixes to the tilt servo and that the head can be bolted to, all shown in the picture below.

Head tilt servo bracket

The head of the robot houses a sonar sensor and a webcam. See the picture below showing the assembled head attached to the pan/tilt mechanism.

3D printed head

With all of this done, the robot is almost mechanically complete. I need to design and print some mounts for two IR sensors, which will probably mount to the electronics board either side of the pan/tilt mechanism. The other job is to design and print a housing for a small screen and some buttons, for controlling the robot without having to connect to it from another PC.

BFR4WD almost complete

I have been developing software for the new robot alongside the mechanical build. I have modified the wheel control loop software from my previous robot to now control four wheels at the same time. A lot of the software from BFRMR1 can be used in this project, but one thing that I knew needed work was the communication between the Arduino and the Raspberry Pi. I was using serial communications, but I never really liked the protocol I was using, which I developed myself so I can’t even blame anyone else for it. I am sticking with serial comms but wanted an improved protocol. Inspired by G-code as used on 3D printers, I decided to come up with my own protocol to send commands in the form of strings to the robot. I’m calling it BFR-Code for now! The basic idea is that movement commands or requests for data can be sent to the robot along with some data to determine how to move. A move command string will start with a capital letter M, followed by a number to determine the type of move, and then any data required, preceded by a capital D. So the command M1 D200 would drive the robot forward 200 encoder ticks. Error codes and data can be returned to the Raspberry Pi in a similar manner. This whole thing is a work in progress and I will make a blog post in the future with full details if it works out well.
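
For a feel of the format, the parse itself is tiny. The real interpreter is C on the Arduino; the Python below just shows the shape of it (one letter, a command number, and an optional D value):

```python
# Shape of the BFR-Code parse, sketched in Python for illustration; the
# real interpreter runs in the Arduino sketch. Accepts strings such as
# "M1 D200": a command letter, a command number, and an optional D value.
import re

CMD = re.compile(r"^([A-Z])(\d+)(?:\s*D(-?\d+))?$")

def parse(cmd):
    m = CMD.match(cmd.strip())
    if not m:
        raise ValueError(f"bad command: {cmd!r}")
    letter, number, data = m.groups()
    return letter, int(number), None if data is None else int(data)

print(parse("M1 D200"))   # -> ('M', 1, 200): move type 1, 200 encoder ticks
```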

For now I am continuing work on the software but I am near to making a video of the robot in action so check in again soon!
