Hi. I'm building an RC5 decoder driven by interrupts. It worked flawlessly with _delay_us(1778), but it behaves badly with a sampling timer.
Basically, I have a 16 MHz ATmega16.
RC5 has a sample period of 1778 us (if we sample only one phase of the Manchester signal). That corresponds to roughly 562 Hz; dividing 16 MHz by that gives exactly 28448 clock cycles (1778 us * 16 cycles/us).
Divided by the 256 prescaler, I get 111 (28448 / 256 = 111.125, so it doesn't even divide evenly). But it only works on timer0 if I drop the compare value down to 108 (note that in CTC mode the period is OCR+1 counts, so the nominal value would be 110, not 111).
It works on timer1 if it goes down to 27648 (basically, 108 * 256).
I'm using CTC mode, of course, and sampling my data inside the interrupt. It works, but sometimes it drifts: from some point on, all 0 bits come out as 1s, because the chip starts sampling the other half of the Manchester symbol. Any hints?
Of course, I could also resort to using an external interrupt pin with a free-running timer: every time the interrupt triggers, I read the timer, and with some hysteresis I decide whether I got a 0 or a 1. But I don't want to do that.