About this Robot Arm Tutorial
The robot arm is probably the most mathematically complex robot you could ever build.
As such, this tutorial can't tell you everything you need to know. Instead, I will
cut to the chase and talk about the bare minimum you need to know to build an
effective robot arm. Enjoy!
To get you started, here is a video of a robot arm assignment I had when I took
Robotic Manipulation back in college. My group programmed it to type
the current time into the keyboard . . . (lesson learned, don't crash robot arms into your
keyboard at full speed while testing in front of your professor)
Denavit-Hartenberg (DH) Convention
The Robot Arm Free Body Diagram (FBD)
The Denavit-Hartenberg (DH) Convention is the accepted method of drawing robot arms in
FBDs. There are only two motions a joint can make: translate and rotate.
There are only three axes this can happen on: x, y, and z (out of plane).
Below I will show a few robot arms, and then draw an FBD next to each,
to demonstrate the DOF relationships and symbols. Note that I did not count the DOF
on the gripper (otherwise known as the end effector). The gripper is often complex
with multiple DOF, so for simplicity it is treated separately in basic robot arm design.
4 DOF Robot Arm, three are out of plane:
3 DOF Robot Arm, with a translation joint:
5 DOF Robot Arm:
Notice that between each DOF there is a linkage of some particular length.
Sometimes a joint can have multiple DOF in the same location. An example is
the human shoulder, which actually has three coincident DOF. If you were
to mathematically represent this, you would just set that link length to 0.
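Since the DH convention reduces each joint to a rotation or translation about an axis plus a link offset, you can stack one homogeneous transform per joint to find where the end effector ends up. Here is a minimal sketch in Python; the link lengths and joint angles are made-up examples, not taken from any arm above:

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard DH homogeneous transform for one joint/link pair:
    theta = joint rotation, d = offset along z, a = link length, alpha = twist."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Example: a 2-link planar arm, both links length 1, joints at 30 and 45 degrees
T = dh_transform(np.radians(30), 0, 1.0, 0) @ dh_transform(np.radians(45), 0, 1.0, 0)
print(T[:2, 3])  # end effector x, y position
```

Multiplying one transform per joint, base to gripper, gives the end effector pose; coincident DOF (like the shoulder example) just mean a = 0 and d = 0 for that row.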
Also note that each DOF has its limits; together, the joint limits define the configuration space.
Not all joints can swivel 360 degrees! Most joints
have some max angle restriction. For example, no human joint can rotate more than about 200 degrees.
Limitations can come from wire wrapping, actuator capabilities, servo max angle, etc.
It is a good idea to label each link length and joint max angle on the FBD.
Your robot arm can also be on a mobile base, adding additional DOF. If the wheeled robot can rotate,
that counts as a rotational joint; if it can drive forward, that counts as a translational joint.
This mobile manipulator robot is an example of a 1 DOF arm on a 2 DOF robot (3 DOF total).
Now rotating that by the base joint another 180 degrees to get 3D, we have this workspace image.
Remember that because it uses servos, all joints are limited to a max of 180 degrees. This creates a workspace
of a shelled semi-sphere (it's a shape because I said so).
If you change the link lengths you can get very different sizes of workspaces,
but this would be the general shape. Any location outside of this space is a location
the arm can't reach. If there are objects in the way of the arm, the workspace
can get even more complicated.
Here are a few more robot workspace examples:
Cartesian Gantry Robot Arm
Cylindrical Robot Arm
Spherical Robot Arm
SCARA Robot Arm
Articulated Robot Arm
Mobile Manipulators
A robot arm mounted on a moving robot is a sub-class of robotic arms.
It works just like any other robot arm, but the DOF of the vehicle adds to the DOF of the arm.
If, say, you have a differential drive
robot (2 DOF) with a 5 DOF robot arm attached (see the yellow robot below), the
combined system has 7 DOF. What do you think the workspace of this
type of robot would be?
Force Calculations of Joints
The point of doing force calculations is motor selection. You must make sure that
the motor you choose can support not just the weight of the robot arm itself, but also the
load the robot arm will carry (the blue ball in the image below).
The first step is to label your FBD, with the robot arm stretched out to its maximum length.
Choose these parameters:
weight of each linkage
weight of each joint
weight of object to lift
length of each linkage
Next you do a moment arm calculation, multiplying each downward force by its distance from the joint.
This calculation must be done for each lifting actuator. This particular design has just
two DOF that require lifting, and the center of mass of each linkage is assumed
to be at Length/2.
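As a sketch, here is that moment arm calculation in Python for a hypothetical 2-DOF lifting arm; every length and weight below is a made-up example number:

```python
# Hypothetical 2-DOF lifting arm, stretched straight out (worst case)
L1, L2 = 0.30, 0.25            # link lengths (m)
W_link1, W_link2 = 2.0, 1.5    # link weights (N), acting at each link's midpoint
W_joint = 3.0                  # weight of the elbow actuator (N), at the end of link 1
W_object = 5.0                 # weight of the object to lift (N), at the gripper

# Moment about the shoulder joint: sum of (force x distance from shoulder)
M_shoulder = (L1 / 2) * W_link1 + L1 * W_joint \
           + (L1 + L2 / 2) * W_link2 + (L1 + L2) * W_object

# Moment about the elbow joint: it only lifts link 2 and the object
M_elbow = (L2 / 2) * W_link2 + L2 * W_object

print(M_shoulder, M_elbow)     # torques (N*m) each motor must at least hold
```

Pick motors whose rated torque comfortably exceeds these moments (a safety factor of 2 or more is a common rule of thumb), since the motor must accelerate the arm, not just hold it still.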
As you can see, for each DOF you add the math gets more complicated, and the joint weights
get heavier. You will also see that shorter arm lengths allow for smaller torque requirements.
Too lazy to calculate forces and torques yourself?
Try my robot arm calculator to do the math for you.
But it gets harder . . . the above equation is for rotational motion, not for straight-line motion.
Look up something called a Jacobian if you enjoy mathematical pain =P
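For the curious, here is a small taste of that pain: the Jacobian of a 2-link planar arm maps joint velocities to end effector velocity, so inverting it tells you the joint rates needed for a straight-line move. The link lengths and angles below are made-up examples:

```python
import numpy as np

# Hypothetical 2-link planar arm
L1, L2 = 10.0, 8.0

def jacobian(t1, t2):
    """2x2 Jacobian relating (t1_dot, t2_dot) to end effector (x_dot, y_dot)."""
    s1, c1 = np.sin(t1), np.cos(t1)
    s12, c12 = np.sin(t1 + t2), np.cos(t1 + t2)
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

# Joint rates needed to move the gripper straight along +x at 1 unit/s
J = jacobian(np.radians(30), np.radians(60))
qdot = np.linalg.solve(J, np.array([1.0, 0.0]))
print(qdot)
```

Note that near singular configurations (arm fully stretched or folded) the Jacobian becomes non-invertible and the required joint rates blow up, which is one reason straight-line motion is genuinely harder.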
Another Video!
In order to better understand robot arm dynamics, we had a robot arm bowling competition using the same DENSO 6DOF robot arms
as in the clocks video.
Each team programs an arm to do two tasks:
Try to place all three of its pegs in the opponents' goal
Block opponent pegs from going in your own goal
Enjoy! (notice the different arm trajectories)
Arm Sagging
Arm sagging is a common affliction of badly designed robot arms. This is when an arm is too long and heavy,
bending under its own weight when stretched outward. When designing your arm, make sure it is reinforced and lightweight.
Do a finite element analysis to determine bending deflection/stress
such as I did on my ERP robot:
Keep the heaviest components, such as motors, as close to the robot arm base as possible.
It might be a good idea for the middle arm joint to be chain/belt driven by a motor located at the base
(to keep the heavy motor on the base and off the arm).
The sagging problem is even worse when the arm wobbles between stop-start motions.
To solve this, implement a PID controller so as to slow
the arm down before it comes to a full stop.
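Here is a minimal sketch of such a controller for a single joint commanded by torque; the gains, friction model, and units are illustrative only, not tuned for any real arm:

```python
class PID:
    """Minimal PID position controller (gains here are made-up examples)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_measured = None

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        # Derivative on measurement avoids a torque spike on the first step
        if self.prev_measured is None:
            derivative = 0.0
        else:
            derivative = -(measured - self.prev_measured) / self.dt
        self.prev_measured = measured
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Simulate one joint as a unit inertia with friction, driven toward 90 degrees
pid = PID(kp=8.0, ki=0.5, kd=2.0, dt=0.01)
angle, velocity = 0.0, 0.0
for _ in range(1000):              # 10 seconds of simulated time
    torque = pid.update(90.0, angle)
    velocity += torque * 0.01      # unit inertia: acceleration = torque
    velocity *= 0.98               # crude viscous friction
    angle += velocity * 0.01
print(round(angle, 2))
```

The derivative term is what tames the wobble: it brakes the joint as it approaches the target instead of letting it slam to a stop and oscillate.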
Visual Sensing
A robot arm without video sensing is like an artist painting with his eyes closed. Using basic
visual feedback algorithms, a robot arm can move from point to point on its own without a list of
preprogrammed positions. Given a red ball, it could actually reach for it (visual tracking and servoing). If the arm can
locate a position in the X-Y space of an image, it can then direct the end effector to go to that same X-Y location
(by using inverse kinematics). If you are interested in learning more about the vision aspect of visual servoing, please
read the Computer Vision Tutorials for more information.
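As a sketch of the inverse kinematics step, here is the standard law-of-cosines solution for a 2-link planar arm; the link lengths and the target point are assumptions for illustration:

```python
import math

# Hypothetical 2-link planar arm
L1, L2 = 10.0, 8.0

def ik_2link(x, y):
    """Return (shoulder, elbow) angles in radians that place the gripper at (x, y)."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)  # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)                        # elbow-down solution
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2

# Sanity check: run the result back through forward kinematics
t1, t2 = ik_2link(12.0, 6.0)
x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
print(round(x, 3), round(y, 3))  # → 12.0 6.0
```

Feed in the ball's X-Y location from the vision system, and the two returned angles are the joint commands. Note there is a second, elbow-up solution (negate theta2); a real controller picks whichever respects the joint limits.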
Haptic Sensing
Haptic sensing is a little different in that there is a human in the loop. The human controls the robot arm movements
remotely. This can be done by wearing a special glove, or by operating a miniature model with position sensors.
Robotic arms for amputees perform a form of haptic sensing. Also note that some robot arms have feedback sensors (such as touch)
whose signals get directed back to the human (vibrating the glove, locking model joints, etc.).
Tactile Sensing
Tactile sensing (sensing by touch) usually involves force feedback sensors and
current sensors.
These sensors watch for unexpected force or current spikes, which indicate that a collision has occurred. A robot
end effector can detect a successful grasp, and avoid grasping too tightly or too loosely, just by measuring force.
Another method is to use current limiters: a sudden large current draw generally means a collision or contact
has occurred. An arm
could also adjust its end effector velocity by knowing whether it is carrying a heavy object or a light one, and perhaps even
identify the object by its weight.
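A collision detector of this kind can be just a few lines: watch the motor current and flag any sample that jumps well above the normal draw. The baseline current and threshold factor below are made-up examples:

```python
def detect_collision(current_samples, baseline=0.8, spike_factor=2.5):
    """Return the index of the first current sample well above baseline,
    or None if no spike (i.e., no suspected collision) is found."""
    threshold = baseline * spike_factor
    for i, amps in enumerate(current_samples):
        if amps > threshold:
            return i
    return None

# Normal motion, then a sudden spike when the gripper hits something
samples = [0.7, 0.8, 0.9, 0.8, 3.1, 3.4]
print(detect_collision(samples))  # → 4
```

A real implementation would filter the readings first (motors draw brief inrush spikes on startup that look like collisions) and would tune the baseline per joint.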
Try this. Close your eyes, and put both of your hands in your lap. Now keeping your eyes closed, move your hand
slowly to reach for your computer mouse. Do it!!!! You will see why soon . . . Now what will happen is
that your hand will partially miss, but at least one of your fingers will touch the mouse. After that finger touches, your
hand will suddenly re-adjust its position because it now knows exactly where that mouse is. This is the benefit
of tactile sensing - no precision encoders required for perfect contact!