1. How do you know exactly how long a cycle in delay_cycles and delay_us takes? (just curious)
I put it on an oscilloscope and measured the exact time. You can also use a multimeter's Hz function, flash an LED and time it, or just plain guess until your servo works.
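On a desktop you can apply the same "just time it" idea in software: time a long empty loop with the C library's clock() and divide by the iteration count. This is a hypothetical host-side sketch, not the actual $50 robot code:

```c
#include <time.h>

/* Time an empty busy loop of n iterations and return seconds per iteration.
   'volatile' keeps the compiler from deleting the loop when optimizing. */
static double time_busy_loop(unsigned long n)
{
    volatile unsigned long i;
    clock_t t0 = clock();
    for (i = 0; i < n; i++)
        ;                               /* empty loop body */
    return (double)(clock() - t0) / CLOCKS_PER_SEC / (double)n;
}
```

On a microcontroller you would do the same thing by toggling a pin around the loop and watching it on the scope.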
2. If the program is compiled with different optimization settings, isn't it possible for delay_cycles or delay_us to take a different number of CPU cycles per iteration?
The length of a clock cycle is fixed by the hardware, but the number of CPU cycles a loop iteration takes can indeed change with the optimization settings. In fact, my delay_cycles code will not work with optimization turned on, because the compiler decides the loop doesn't do anything but waste cycles and removes it entirely.
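If you do want a busy-wait delay that survives optimization, the usual trick is to make the loop counter volatile so the compiler must actually perform every decrement. A hypothetical sketch (not the actual $50 robot code):

```c
#include <stdint.h>

/* Hypothetical busy-wait sketch. Without 'volatile' the compiler sees a
   loop with no side effects and deletes it at -O1 and above; 'volatile'
   forces every decrement to happen. Returns the iterations executed so
   the effect is observable. */
static uint32_t delay_cycles_v(uint32_t n)
{
    volatile uint32_t i = n;
    uint32_t executed = 0;
    while (i > 0) {
        i--;
        executed++;
    }
    return executed;
}
```

For what it's worth, avr-libc's _delay_us in <util/delay.h> takes the opposite approach: it is calibrated against F_CPU and actually requires optimization to be turned on to give accurate timings.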
3. How come an iteration in delay_us takes 5 CPU cycles (according to the comment), while an iteration in delay_cycles takes about 0.992/23 = 0.043 ms (according to the comment), which, if I understand correctly, should be 43 CPU cycles at 1.0 MHz? Am I mistaken to assume that if the frequency is 1.0 MHz then a single CPU cycle takes 1e-6 seconds, or 0.001 ms?
I don't really understand the clock on the microcontroller well enough to give you a good answer . . . I determined all this by experimentation. I defined a cycle as the minimum amount of time it takes for the microcontroller to process an empty while loop, which is what really matters to a programmer anyway. If I remember correctly, PICs take about 4x more clock cycles than AVRs to process the same instruction.
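For what it's worth, the arithmetic in question 3 checks out: at 1.0 MHz one clock cycle is 1e-6 s (0.001 ms), so 0.992/23 ≈ 0.043 ms per iteration works out to about 43 clock cycles. A quick check, using only the numbers quoted above:

```c
/* Cycles per loop iteration, given the clock frequency (Hz) and the
   measured time per iteration (ms); numbers come from the comments
   discussed above. */
static double cycles_per_iteration(double f_cpu_hz, double iter_ms)
{
    double cycle_ms = 1000.0 / f_cpu_hz;   /* at 1 MHz: 0.001 ms per cycle */
    return iter_ms / cycle_ms;
}
```

With f_cpu_hz = 1.0e6 and iter_ms = 0.992/23, this gives about 43.1 cycles per iteration, matching the questioner's estimate. (The delay_cycles loop body simply does more work per iteration than the 5-cycle delay_us loop.)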
4. If the frequency on the ATmega8 is 1.0 MHz by default, according to one page on the site (http://www.societyofrobots.com/step_by_step_atmega168_swapout.shtml), then why is it defined by default as 3.69 MHz in global.h?
That number is only used by AVRlib stuff . . . for my $50 robot that number isn't used, so I just left it as the default. If you were to use the UART or a timer interrupt, then the number will be important to get right.
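To see why the number matters for the UART: the AVR datasheet's baud-rate formula is UBRR = F_CPU/(16*baud) - 1, and it only divides evenly for "UART-friendly" clocks. The 3.69 MHz in global.h is presumably 3.6864 MHz, which is exactly such a value. A sketch, assuming 4800 baud:

```c
/* UBRR value for the AVR USART in asynchronous normal mode, per the
   datasheet: UBRR = F_CPU / (16 * baud) - 1, with the same integer
   truncation the hardware register forces on you. */
static unsigned long ubrr_for(unsigned long f_cpu, unsigned long baud)
{
    return f_cpu / (16UL * baud) - 1UL;
}
```

ubrr_for(3686400UL, 4800UL) is exactly 47 with zero baud error, while ubrr_for(1000000UL, 4800UL) truncates to 12, giving roughly 0.2% error. That is why a wrong clock value in global.h can garble your serial output.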
I hope that helps . . . I'm no expert on clock cycles . . . but I'm learning!