Please pardon my ignorance about this. I may have read about it many years ago in Charles Petzold's Programming Windows, 4th Ed., and forgotten it all. I remember the very first chapter said something about timer resolutions and CPU clock speeds, but I forget the details; I think he answers this question there. But I could be imagining things -- it's been a long while.
While there is a related question here, its answer is not satisfactory to me, as it does not address the finer nuances I am interested in, which are listed below.
It would be nice to have someone provide a systematic answer to these questions.
1. Is there some kind of clock built into the hardware that tells it the universal time? Does the machine then compute the local time for the user-selected locale by adding the time-zone offset?

2. What is the unit in which this clock stores time information? How does that translate to our human concept of date and time? Who does this translation -- the hardware or the CPU?

3. I am assuming there is an operating-system-agnostic way of getting the UTC time on all operating systems.

4. Under Windows, I recall there used to be a Win32 function that you would call, passing it a pointer to a struct, which it would populate. (A sketch of the call pattern I mean follows these questions.) How does that function get to know what the UTC time is? For that matter, how does any OS know?

5. Finally, is the computation CPU-bound, or device/hardware/I/O-bound?
Of course, the answer to (1) will answer (2) as well.
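To make it concrete, here is a minimal sketch of what I mean by "getting the UTC time". The portable part uses the standard C time()/gmtime()/localtime() calls; the Win32 part uses GetSystemTime(), which I believe is the struct-populating function I half-remember (I may be misremembering which function I originally read about):

```c
#include <stdio.h>
#include <time.h>

#ifdef _WIN32
#include <windows.h>
#endif

int main(void)
{
    /* Portable C: time() returns the count of seconds since the Unix
       epoch (1970-01-01 00:00:00 UTC); gmtime() breaks that count into
       a human-readable UTC date, and localtime() applies the configured
       time-zone offset on top of it. */
    time_t now = time(NULL);

    /* Copy the results, since gmtime()/localtime() may share a static buffer. */
    struct tm utc = *gmtime(&now);
    printf("UTC:   %04d-%02d-%02d %02d:%02d:%02d\n",
           utc.tm_year + 1900, utc.tm_mon + 1, utc.tm_mday,
           utc.tm_hour, utc.tm_min, utc.tm_sec);

    struct tm local = *localtime(&now);
    printf("Local: %04d-%02d-%02d %02d:%02d:%02d\n",
           local.tm_year + 1900, local.tm_mon + 1, local.tm_mday,
           local.tm_hour, local.tm_min, local.tm_sec);

#ifdef _WIN32
    /* Win32: GetSystemTime() fills a SYSTEMTIME struct with the current
       date and time in UTC. */
    SYSTEMTIME st;
    GetSystemTime(&st);
    printf("Win32 UTC: %04u-%02u-%02u %02u:%02u:%02u\n",
           st.wYear, st.wMonth, st.wDay, st.wHour, st.wMinute, st.wSecond);
#endif
    return 0;
}
```

What I want to understand is what happens underneath these calls, not how to make them.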