Modelling the robot arm – Denavit-Hartenberg parameters

Part 7 of my YouTube series, documenting the build and software development of my desktop robot head and arm, is now available.

I have designed a new attachment for the end of the robot arm. For the time being I have removed the touch-sensitive ‘hand’ and replaced it with a sonar sensor, mounted via an additional micro-servo. The idea is that the sonar can measure the environment and, in conjunction with the camera mounted in the head, build up a visual and spatial model of its surroundings. I’ll be honest, I’m still a bit fuzzy on how this is going to work, but it should keep me busy for a while.

This episode also demonstrates the progress that I have made in modelling the robot arm. With the model in place, I have been able to embed a live mimic of the robot arm in the GUI, using matplotlib to display the model on a 3D plot.

The first step in modelling the arm was to find its Denavit-Hartenberg parameters. There are lots of great resources online that detail how this is done, so I will only cover it briefly. I assigned a reference frame to each of the robot arm joints, namely the base servo, lower arm servo, elbow servo and end effector (sonar) servo. From these reference frames, and some measurements taken from the robot, I found the Denavit-Hartenberg parameters shown in the table below.

Description  Link  a               α (°)  d      θ
Base         0     15mm            90     102mm  θBase
Lower        1     150mm           0      0      θLower
Elbow        2     161mm           0      0      θElbow
End          3     38mm            0      0      θEnd
Sonar        4     Sonar distance  0      0      0

The variables in this case are the θ values, which are the joint angles. You will notice the addition of the Sonar ‘link’ in the table; I will explain more about this in a moment. With these values identified, it is then a case of plugging them into the Denavit-Hartenberg transformation matrices to find the Cartesian coordinates of each joint. Each matrix transforms coordinates from one joint’s reference frame to the next, so multiplying the homogeneous coordinates of a joint by the successive Denavit-Hartenberg matrices gives the coordinates of each subsequent joint along the arm.
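
As a concrete illustration, here is a minimal Python sketch of how the parameters in the table can be chained together with NumPy. The function and variable names are my own placeholders, not taken from the project code.

```python
import numpy as np

def dh_matrix(a, alpha, d, theta):
    """Homogeneous transform from one joint frame to the next (radians)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def joint_positions(joint_angles_deg, sonar_distance=0.0):
    """Chain the transforms and return the XYZ coordinates of each joint."""
    th_base, th_lower, th_elbow, th_end = np.radians(joint_angles_deg)
    params = [                                 # (a, alpha, d, theta) per link
        (15.0,  np.pi / 2, 102.0, th_base),    # Base
        (150.0, 0.0,         0.0, th_lower),   # Lower
        (161.0, 0.0,         0.0, th_elbow),   # Elbow
        (38.0,  0.0,         0.0, th_end),     # End effector
        (sonar_distance, 0.0, 0.0, 0.0),       # Sonar 'link' (prismatic)
    ]
    T = np.eye(4)
    points = [T[:3, 3].copy()]                 # start at the base origin
    for a, alpha, d, theta in params:
        T = T @ dh_matrix(a, alpha, d, theta)
        points.append(T[:3, 3].copy())
    return np.array(points)
```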

I was able to calculate these coordinates for the arm, initially using joint angles set by sliders in the GUI, and plot them on a 3D matplotlib plot. This plot was embedded in the GUI, and the robot arm model moved as the sliders were altered. It was then possible to read live joint angles from the robot, so that the model reflected the actual position of the robot arm at any time.
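
A stripped-down version of the plotting step might look like the following, reusing the joint_positions helper sketched above. The real version is embedded in the GUI via a matplotlib canvas rather than shown in a standalone window, and the example angles here are arbitrary.

```python
import matplotlib.pyplot as plt

# Illustrative joint angles; in the GUI these come from the slider widgets
# or from the live joint angles reported by the robot.
pts = joint_positions([30.0, 45.0, -60.0, 10.0])

fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.plot(pts[:, 0], pts[:, 1], pts[:, 2], '-o')  # arm drawn as a poly-line
ax.set_xlabel('X (mm)')
ax.set_ylabel('Y (mm)')
ax.set_zlabel('Z (mm)')
plt.show()
```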

The additional sonar ‘joint’ in the table above was used to calculate where in 3D space the arm-mounted sonar sensor was measuring to. I treated the sonar sensor as an additional prismatic joint on the robot: its variable is the distance measured by the sensor, while the angle of the joint remains constant. I was then able to plot a line representing the sonar sensor reading onto the robot model.
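
Extending the plotting sketch above, the sonar reading can be drawn as one extra segment from the end effector to the measured point; again, the values and names here are illustrative.

```python
# Passing the live sonar reading in as the prismatic 'joint' extends the
# chain by one point: the final entry is where the sonar beam ends.
pts = joint_positions([30.0, 45.0, -60.0, 10.0], sonar_distance=250.0)
end_effector, sonar_hit = pts[-2], pts[-1]

# Draw the beam as a dashed segment on the same 3D axes as before.
ax.plot([end_effector[0], sonar_hit[0]],
        [end_effector[1], sonar_hit[1]],
        [end_effector[2], sonar_hit[2]], 'r--', label='sonar reading')
ax.legend()
```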

I plan to do the same piece of work for the robot head and have a model of that on the screen as well; this is likely to be the content of the next video in the series. I also want to explore a way to use the sonar readings to map the environment as the robot arm moves around. At the moment I am thinking of either a point-cloud data structure or a 3D occupancy-grid approach, but it is very early days, so the approach may change.
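
To make the point-cloud idea concrete, a first pass could be as simple as accumulating the sonar hit points while the arm sweeps. This is only a sketch of one possible approach, with made-up sweep data, not a settled design.

```python
import numpy as np

# Accumulate sonar hit points into a simple point cloud as the arm moves.
# sweep_poses is an illustrative stand-in for whatever stream of joint
# angles and sonar distances the real robot produces.
sweep_poses = [([a, 45.0, -60.0, 10.0], 250.0) for a in range(-45, 50, 5)]

cloud = []
for angles, distance in sweep_poses:
    pts = joint_positions(angles, sonar_distance=distance)
    cloud.append(pts[-1])        # final point = sonar hit in the base frame

cloud = np.array(cloud)          # N x 3 array of measured XYZ points
```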

For now, please enjoy the most recent video, and subscribe to my YouTube channel for notifications of future videos. Any feedback or recommendations are welcome.
