Date: Tue, 9 Feb 1999 13:01:36 +0200
Content-Type: text/plain
Earl Truss wrote:
>
> I've been asked to explain why the current time shown in many applications
> is often so far off from a wall clock - often losing or gaining several
> seconds a day. I was wondering if someone could point me in the direction
> for a fairly technical reference.
>
> I once read a magazine article about this but can no longer find it. It
> was written in DOS days and I believe their primary reason was because the
> PC reads the current time from the real-time, battery-backed-up clock at
> startup but then the operating system maintains the time from the timer
> interrupt which is unreliable. Here's where my memory fails me. Could
> someone fill in the gaps ... Why is this interrupt unreliable? Is it a
> matter of the clock signal varying or is it just poorly synchronized with
> real-time? Where do the clock signals for the various components - CPU,
> bus, real-time clock, etc. - come from? I don't have an extensive
> background in basic electronics but do know quite a bit about the theory of
> how a computer works at this level. I think I could understand a pretty
> detailed discussion of this. I'd really appreciate it if someone could
> help me out in the next couple of days. Thanks.
One possible cause is this: some programs hook the timer interrupt
incorrectly. The timer interrupt was used very heavily in DOS days to
drive regular tasks (for example, switching the screen or playing a
sound sample), and a badly written handler either fails to chain the
interrupt on to the original BIOS timer handler, or passes it on too
often or too seldom. Either way the tick count the operating system
uses to keep time is disrupted, and the clock drifts.
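
To put rough numbers on how much mishandled ticks cost, here is a
small sketch. The PIT frequency and divisor are the standard PC
values; the 1-in-500 missed-tick rate is an assumed figure purely for
illustration, not a measurement of any particular program:

```python
# Sketch: how missed timer ticks translate into clock drift on a
# DOS-era PC. The 8253/8254 PIT is clocked at ~1.193182 MHz; with the
# BIOS default divisor of 65536 it fires INT 08h about 18.2 times/sec.

PIT_HZ = 1_193_182          # PIT input frequency (Hz), standard PC value
DIVISOR = 65_536            # BIOS default counter reload value

tick_period = DIVISOR / PIT_HZ          # ~0.0549 s per tick (~18.2 Hz)
ticks_per_day = 86_400 / tick_period    # ~1.57 million ticks/day

# Suppose a badly behaved handler swallows 1 tick in every 500
# (assumed figure for illustration):
missed_fraction = 1 / 500
drift_per_day = 86_400 * missed_fraction  # seconds the clock loses/day

print(f"tick period:   {tick_period * 1000:.4f} ms")
print(f"ticks per day: {ticks_per_day:,.0f}")
print(f"drift per day: {drift_per_day:.1f} s")
```

Even a small fraction of lost or duplicated ticks adds up to minutes
per day, which matches the kind of drift the original question
describes.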
>
> PCBUILD's List Owners:
> Bob Wright<[log in to unmask]>
> Drew Dunn<[log in to unmask]>
Curious about the people moderating your
messages? Visit our staff web site:
http://nospin.com/pc/staff.html