An application can read the time from the real-time clock, or from the OS
(such as DOS). If the OS supplies the information and, as in the case of
DOS, the current time is kept by counting timer tick interrupts (IRQ0,
about 18.2 ticks per second), then the time can be significantly off for
several reasons. One reason is that the timer's oscillator may not be
accurate. Another is that some interrupts and other system activity can
cause a clock update (tick) to be skipped. Each missed tick does not
amount to much, but over a period of time the error adds up.
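As a back-of-the-envelope sketch of why skipped ticks matter: the standard PC timer tick rate (about 18.2 Hz, derived from a 1.193182 MHz input clock divided by 65536) means each missed tick loses roughly 55 ms. The missed-ticks-per-hour figure below is purely hypothetical, chosen to show how quickly the error accumulates:

```python
# Standard PC timer values; the missed-tick rate is a hypothetical example.
TICK_HZ = 1193182 / 65536        # 8254 timer input clock / divisor, ~18.2065 Hz
SEC_PER_TICK = 1 / TICK_HZ       # ~0.0549 s of wall time lost per skipped tick

def daily_error_seconds(missed_ticks_per_hour: float) -> float:
    """Seconds of clock error per day if this many ticks are skipped each hour."""
    return missed_ticks_per_hour * SEC_PER_TICK * 24

# A program that masks the timer interrupt long enough to drop just
# 5 ticks an hour makes the clock lose about 6.6 seconds a day.
print(round(daily_error_seconds(5), 1))
```

This is why even a small, intermittent cause of missed ticks shows up as seconds per day on the wall clock.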
That is the way I understand it. Perhaps someone else has more input on
this, as well as on how the different OSs handle the clock.
===========================
Peter Shkabara, P.E.
Computer Science Instructor
Rogue Community College
[log in to unmask]
http://www2.rogue.cc/PShkabara
-----Original Message-----
I've been asked to explain why the current time shown in many applications
is often so far off from a wall clock - often losing or gaining several
seconds a day. I was wondering if someone could point me in the direction
for a fairly technical reference.
PCBUILD's List Owners:
Bob Wright<[log in to unmask]>
Drew Dunn<[log in to unmask]>