Please only answer if you fully understand the question, and please don't close it as a duplicate; I have not been able to find a similar question.
I am aware that System.nanoTime() gives nanoseconds measured from some arbitrary, "random" origin fixed when the JVM starts, and I am aware that System.currentTimeMillis() only gives millisecond precision.
What I am looking for, and please keep an open mind here, is proof of the hypothesis that the ms transitions are not exact once we try to define what "exact" means.
"Exact" would, in my world, mean that every time a new ms value is registered, say going from 97 ms to 98 ms to 99 ms and so forth, the switch happens at a precisely defined instant. Whatever mechanism delivers those updates, we cannot expect Java, at least as I have observed it, to give us nanosecond precision at the switches.
I know, I know, it sounds weird to expect that. But then the question becomes: how accurate are the ms switches?
It appears that when you call System.nanoTime() repeatedly, you get a linear graph with nanosecond resolution.
If at the same time you call System.currentTimeMillis() right after System.nanoTime(), and disregard the variance in the cost of the calls themselves, the result does not appear to be a linear graph at the same resolution: the ms graph jitters by roughly ±250 ns.
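Here is a minimal sketch of the kind of measurement I mean, assuming we poll both clocks in a tight loop and record the nanoTime() reading at each instant currentTimeMillis() ticks over (the class name, loop count, and output format are just illustrative choices):

    // Sample both clocks in a tight loop, capture the nanoTime() value at every
    // currentTimeMillis() transition, and compare consecutive transitions against
    // the ideal spacing of exactly 1_000_000 ns.
    public class MillisTickJitter {
        public static void main(String[] args) {
            final int ticksToCapture = 50;
            long[] tickNanos = new long[ticksToCapture];
            int captured = 0;

            long lastMillis = System.currentTimeMillis();
            while (captured < ticksToCapture) {
                long nanos = System.nanoTime();           // read the ns clock first
                long millis = System.currentTimeMillis(); // then the ms clock
                if (millis != lastMillis) {               // ms value just switched
                    tickNanos[captured++] = nanos;
                    lastMillis = millis;
                }
            }

            // Distance between consecutive ms switches, measured on the ns clock.
            // If the switches were "exact", every delta would be 1_000_000 ns.
            for (int i = 1; i < captured; i++) {
                long delta = tickNanos[i] - tickNanos[i - 1];
                System.out.printf("tick %2d: delta = %d ns (deviation %+d ns)%n",
                        i, delta, delta - 1_000_000L);
            }
        }
    }

The deviations printed by this kind of loop are where the ±250 ns figure above comes from on my machine.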
This is to be expected, yet I cannot find any information on the error margin, i.e. the accuracy of the ms value.
The same issue exists for second precision, and for hour, day, and year precision as well. When the year ticks over, how big is the error?
When the ms ticks over, how big is the error in terms of ns?
System.currentTimeMillis() cannot be trusted to stay linear against System.nanoTime(), and we cannot expect System.currentTimeMillis() to keep up with ns precision.
But how big is the error? In computing in general? In Java, on Unix systems?
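To illustrate the "stay linear" part of the question, here is a second sketch, again just an assumption about how one might measure it: it tracks the offset between currentTimeMillis() (scaled to ns) and nanoTime() over a few seconds. If the two clocks advanced at exactly the same rate the offset would be constant; any slope or sudden jump (for example a wall-clock adjustment) shows up as a change in the printed value. The class name and sleep interval are arbitrary:

    // Track the offset between the wall clock (ms, scaled to ns) and the
    // monotonic clock (ns) over time. A constant offset means the two clocks
    // are linear relative to each other; drift or jumps show up as changes.
    public class ClockOffsetDrift {
        public static void main(String[] args) throws InterruptedException {
            for (int i = 0; i < 20; i++) {
                long wallNanos = System.currentTimeMillis() * 1_000_000L;
                long monoNanos = System.nanoTime();
                System.out.printf("sample %2d: offset = %d ns%n", i, wallNanos - monoNanos);
                Thread.sleep(500);
            }
        }
    }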