Re: java Date to c# ticks



Eric Sosman wrote:

> Intervals between contemporary events can (sometimes) be
> measured to nanosecond precision. In the laboratory, femtosecond
> precision may be attainable. But extending the scale to longer
> periods is pure fiction! Claim: You cannot measure the time
> between an event at lunchtime yesterday and one at lunchtime today
> with nanosecond precision.

With intervals of that size, nobody will try to anyway. The point is
that you don't want to change data structures depending on the size
of the interval. You also want to keep some kind of reserve for the
future, to avoid the problem that the runtime libraries of Borland
Turbo Pascal had, where a cycle-counter value grew larger than the
maximum value that could be represented by a Word.
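
To give a rough idea of the "reserve" a wider counter buys you (my
own illustration, not from the original discussion): a signed 64-bit
long holding nanoseconds doesn't overflow for roughly 292 years,
while a 16-bit Word tops out at 65535.

// Rough sketch of the "reserve" argument: a signed 64-bit nanosecond
// counter doesn't overflow for about 292 years, unlike the 16-bit
// Word counter mentioned above.
public class CounterReserve {
    public static void main(String[] args) {
        long nsPerYear = 365L * 24 * 60 * 60 * 1000000000L;
        System.out.println("years until long overflows: "
                + Long.MAX_VALUE / nsPerYear);       // ~292
        System.out.println("16-bit Word maximum: " + 0xFFFF); // 65535
    }
}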

> You probably can't measure it with
> millisecond precision, and even one-second precision would require
> a good deal of care.

As with all physical measurements, there is an error. Assuming it to
be a constant relative error (e.g. 0.01%), an interval of 10 µs can
be expected to lie between 9999 ns and 10001 ns, while for an
interval of a day the error alone is plus or minus about 9 seconds.
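
Just to spell out that arithmetic (a small sketch of my own, assuming
the constant 0.01% relative error from above):

// Bounds of an interval measured with a constant relative error of
// 0.01% (1e-4), matching the figures above.
public class RelativeError {
    static void bounds(double ns, double relErr) {
        double delta = ns * relErr;
        System.out.printf("%.0f ns -> between %.0f ns and %.0f ns%n",
                ns, ns - delta, ns + delta);
    }
    public static void main(String[] args) {
        bounds(10000, 1e-4);    // 10 us -> 9999 ns .. 10001 ns
        bounds(86400e9, 1e-4);  // 1 day -> +/- 8.64e9 ns, about 9 s
    }
}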

> Even in one single lunch hour, you cannot measure the time
> between the swallow and the belch with nanosecond precision.

Most measurements in IT that I'm aware of concern the duration of a
method call, the execution time of an SQL query, the round-trip time
of a network request, etc. Hopefully most of them are in the range of
micro- or milliseconds, so having a data structure with some kind of
"reserve" for the future isn't the worst thing to have.
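
For what it's worth, a minimal sketch of that kind of measurement
(my example, not part of the original discussion): time a call with
System.nanoTime() and keep the elapsed value in a long, which already
has that reserve built in.

// Timing a call with System.nanoTime(); the 64-bit long result
// leaves ample headroom even though typical values are in the
// micro- to millisecond range.
public class TimingExample {
    public static void main(String[] args) throws Exception {
        long start = System.nanoTime();
        Thread.sleep(25);   // stand-in for a method call or SQL query
        long elapsed = System.nanoTime() - start;
        System.out.printf("elapsed: %d ns (%.3f ms)%n",
                elapsed, elapsed / 1000000.0);
    }
}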


Regards, Lothar
--
Lothar Kimmeringer E-Mail: spamfang@xxxxxxxxxxxxxx
PGP-encrypted mails preferred (Key-ID: 0x8BC3CD81)

Always remember: The answer is forty-two, there can only be wrong
questions!