
Digital versus Analog Servo control


Okay, I think I did the 1000 µs wrong.

I figured out my servo gets a signal every 3.33 ms instead of the standard 20 ms, so the time that the pin is high is going to be somewhere between .1 and .001, so I'll test that. Thanks!

Hmmmm, it's worth a try, but I don't think that is the problem. On the robot I am working on now, the digital servos get a pulse every 5.6 ms.

I'm not sure about Atmel, but I do know that PICs don't allow time delays stored as variables to exceed 255.

Can you post the group of code that you use to command the servo?

Do you have an oscilloscope to measure the pulse width?

Before I answer your questions above, I have a quick question.

All this PWM research I've been doing has confused everything I once thought was right....

Let's say the PWM period is 20 ms and I set my pin high for 1.5 ms. Does that mean it has to be low for 18.5 ms, and cycle like that?

Servos don't really care how long the low signal is. If it's within, say, 50 ms low and 1.5 ms high, it will work fine. If it is low for too long, the servo will 'forget' it's supposed to do something. The only part that matters is how long the high signal lasts, as that directly determines the servo angle. A low signal that lasts only 5 ms will also work fine.

