Author Topic: how to determine clock error rate of microcontroller?  (Read 4686 times)


Offline Aberg098Topic starter

  • Full Member
  • ***
  • Posts: 66
  • Helpful? 0
how to determine clock error rate of microcontroller?
« on: February 17, 2011, 12:05:22 PM »
Apologies for reviving an old thread...

How exactly did you accomplish this testing, Admin? What did you hook up your oscilloscope to? What did you use as a reference time?

I am trying to use the timing function to measure the time needed to execute a portion of code. In essence, it's similar to what you did when measuring the delay function.


Offline Admin

  • Administrator
  • Supreme Robot
  • *****
  • Posts: 11,703
  • Helpful? 173
    • Society of Robots
Re: how to determine clock error rate of microcontroller?
« Reply #1 on: February 23, 2011, 11:40:00 AM »
Quote from: Aberg098 on February 17, 2011, 12:05:22 PM
Apologies for reviving an old thread...

How exactly did you accomplish this testing, Admin? What did you hook up your oscilloscope to? What did you use as a reference time?

I am trying to use the timing function to measure the time needed to execute a portion of code. In essence, it's similar to what you did when measuring the delay function.
Method 1:
Have your MCU keep track of time for an hour. When the hour is done, check an actual clock to see how much real time has passed.
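For example, a minimal sketch of method 1 (this assumes an AVR-GCC toolchain and a 16 MHz part like the Axon's; the pin choice is a placeholder):
Code:
#define F_CPU 16000000UL          /* assumed clock frequency */
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB  |= _BV(PB0);            /* LED or scope pin - placeholder */
    PORTB |= _BV(PB0);            /* mark the start; note the wall-clock time */
    for (uint16_t s = 0; s < 3600; s++)
        _delay_ms(1000);          /* the MCU's idea of one hour */
    PORTB &= ~_BV(PB0);           /* mark the end; compare against a real clock */
    for (;;) ;
}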

Method 2:
Have your MCU create a square wave using delay_ms(10). Then measure it with an oscilloscope and see how much the timing deviates from what the scope considers 10 ms.
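In code, method 2 might look like this (again a sketch assuming AVR-GCC at 16 MHz; note the full output period is 20 ms, since each delay_ms(10) is half a cycle):
Code:
#define F_CPU 16000000UL          /* assumed clock frequency */
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= _BV(PB0);             /* square-wave output to the scope - placeholder pin */
    for (;;) {
        PORTB ^= _BV(PB0);        /* toggle every 10 ms -> 20 ms period (50 Hz) */
        _delay_ms(10);
    }
}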

Offline Aberg098Topic starter

  • Full Member
  • ***
  • Posts: 66
  • Helpful? 0
Re: how to determine clock error rate of microcontroller?
« Reply #2 on: February 23, 2011, 12:01:30 PM »
So, in essence, you were using your oscilloscope as the standard reference for time?

Also, you stated that your results were accurate to 0.58% of your reference time; this is unclear to me.

Offline Admin

  • Administrator
  • Supreme Robot
  • *****
  • Posts: 11,703
  • Helpful? 173
    • Society of Robots
Re: how to determine clock error rate of microcontroller?
« Reply #3 on: February 23, 2011, 12:10:24 PM »
Quote from: Aberg098 on February 23, 2011, 12:01:30 PM
So, in essence, you were using your oscilloscope as the standard reference for time?
yeap

Quote from: Aberg098 on February 23, 2011, 12:01:30 PM
Also, you stated that your results were accurate to 0.58% of your reference time; this is unclear to me.
It means the clock was off by 0.58%. So if I let it run for an hour, the cumulative error would be about 21 seconds (3600 s * 0.0058 ≈ 20.9 s).

Offline Webbot

  • Expert Roboticist
  • Supreme Robot
  • *****
  • Posts: 2,165
  • Helpful? 111
    • Webbot stuff
Re: how to determine clock error rate of microcontroller?
« Reply #4 on: February 23, 2011, 12:51:23 PM »
And don't forget that the oscillator supplying the clock to the MCU has a tolerance of its own - it may not be exactly 16 MHz under all conditions (e.g. temperature).
Webbot Home: http://webbot.org.uk/
WebbotLib online docs: http://webbot.org.uk/WebbotLibDocs
If your in the neighbourhood: http://www.hovinghamspa.co.uk

Offline Aberg098Topic starter

  • Full Member
  • ***
  • Posts: 66
  • Helpful? 0
Re: how to determine clock error rate of microcontroller?
« Reply #5 on: February 23, 2011, 04:41:33 PM »
Is the clock tolerance something that is easily known? As in, typical information you'd find in a datasheet?

Offline Admin

  • Administrator
  • Supreme Robot
  • *****
  • Posts: 11,703
  • Helpful? 173
    • Society of Robots
Re: how to determine clock error rate of microcontroller?
« Reply #6 on: February 23, 2011, 05:39:31 PM »
Quote from: Aberg098 on February 23, 2011, 04:41:33 PM
Is the clock tolerance something that is easily known? As in, typical information you'd find in a datasheet?
Nope, it can only be determined experimentally. Good design technique (and luck) can improve accuracy, but won't tell you the exact amount of error.

Quote from: Webbot on February 23, 2011, 12:51:23 PM
And don't forget that the oscillator supplying the clock to the MCU has a tolerance of its own - it may not be exactly 16 MHz under all conditions (e.g. temperature).
It's also a function of the load capacitors' tolerances, the mismatch between the two capacitors, and even the PCB traces.
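For what it's worth, the effective load the crystal sees is roughly

    C_L = (C1 * C2) / (C1 + C2) + C_stray

where C1 and C2 are the two load caps and C_stray is the parasitic capacitance of the pins and traces (often a few pF). If C_L differs from the load capacitance the crystal was specified for, the frequency gets pulled slightly off nominal.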

Aberg098, would a real-time clock (RTC) be what you are looking for? You can buy one and interface it with the Axon if you need very accurate timing over long periods of time.


ps - this has gone off topic, so I've split this from the original thread

Offline Aberg098Topic starter

  • Full Member
  • ***
  • Posts: 66
  • Helpful? 0
Re: how to determine clock error rate of microcontroller?
« Reply #7 on: February 24, 2011, 09:24:30 AM »
Well, to be honest, I'm trying to get away with using only the Axon clock for my application, but I'm unsure whether it's sufficient.

I am using the Axon to datalog inertial measurements and send them back through the UART to a computer. For this, I need pretty accurate time stamps to go with the measurements. I'm currently getting about a 70 Hz measurement rate, and I'm aiming to run tests for at least one hour.

As for interfacing an RTC to the Axon: I assume you read it much like you would read a sensor... Any suggestions? All the ones I can find don't give me anything past seconds. Since I'm sampling at a fairly high rate, I'd need something with millisecond resolution.
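For concreteness, a millisecond time stamp can be kept with a timer compare interrupt - a rough sketch (assumes ATmega640-class registers at 16 MHz; the sensor read and UART send are placeholders):
Code:
#define F_CPU 16000000UL
#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/atomic.h>

static volatile uint32_t ms_ticks = 0;

ISR(TIMER1_COMPA_vect) { ms_ticks++; }        /* fires every 1 ms */

static uint32_t millis(void)
{
    uint32_t t;
    ATOMIC_BLOCK(ATOMIC_RESTORESTATE) { t = ms_ticks; }  /* safe 32-bit read */
    return t;
}

int main(void)
{
    TCCR1A = 0;
    TCCR1B = _BV(WGM12) | _BV(CS11) | _BV(CS10);  /* CTC mode, clk/64 */
    OCR1A  = 249;                                 /* 16 MHz / 64 / 250 = 1 kHz */
    TIMSK1 = _BV(OCIE1A);                         /* enable compare-A interrupt */
    sei();

    for (;;) {
        uint32_t stamp = millis();
        /* read the IMU and send "stamp, data" over the UART here (placeholders) */
        (void)stamp;
    }
}
The stamps are only as good as the crystal, of course, which is where the tolerance discussion above comes in.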

Offline Webbot

  • Expert Roboticist
  • Supreme Robot
  • *****
  • Posts: 2,165
  • Helpful? 111
    • Webbot stuff
Re: how to determine clock error rate of microcontroller?
« Reply #8 on: February 24, 2011, 01:14:15 PM »
I know that madsci has used the scheduler in WebbotLib to sample a sensor on a regular basis - can't remember if it was at 10 Hz or every 10 ms. I think he was using it for a Kalman filter.

Madsci may want to explain.
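The general pattern is something like this (not the WebbotLib scheduler API - just a plain-AVR sketch of sampling at a fixed 10 ms period, using the same 1 kHz tick as the sketch above; sample_sensor is a hypothetical stand-in):
Code:
#define F_CPU 16000000UL
#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/atomic.h>

static volatile uint32_t ms = 0;
ISR(TIMER1_COMPA_vect) { ms++; }                  /* 1 kHz tick */

static void sample_sensor(void) { /* read the sensor here (placeholder) */ }

int main(void)
{
    TCCR1B = _BV(WGM12) | _BV(CS11) | _BV(CS10);  /* CTC, clk/64 */
    OCR1A  = 249;                                 /* 1 ms per compare match */
    TIMSK1 = _BV(OCIE1A);
    sei();

    uint32_t next_due = 0;
    for (;;) {
        uint32_t now;
        ATOMIC_BLOCK(ATOMIC_RESTORESTATE) { now = ms; }
        if ((int32_t)(now - next_due) >= 0) {     /* wrap-safe "is it due?" test */
            next_due += 10;                       /* 10 ms period = 100 Hz */
            sample_sensor();
        }
    }
}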
Webbot Home: http://webbot.org.uk/
WebbotLib online docs: http://webbot.org.uk/WebbotLibDocs
If your in the neighbourhood: http://www.hovinghamspa.co.uk

 

