Once a multiplayer server starts filling up, calls to time and sleep become unreliable: sleep calls take longer than the duration you specified, and time reports a slower passage of time than the players actually experience.
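The drift is easy to demonstrate. The sketch below (in Python, purely for illustration, since the server's scripting environment is not specified here) measures how far a sleep call overshoots the requested duration; under heavy load that overshoot grows, and a tick counter built on top of such sleeps falls further and further behind real time.

```python
import time

def sleep_overshoot(requested):
    """Return how many seconds a sleep call ran past its request.

    On a loaded machine the scheduler wakes the process late, so the
    overshoot grows with load; accumulated over thousands of ticks this
    is the drift described above.
    """
    start = time.perf_counter()
    time.sleep(requested)
    return (time.perf_counter() - start) - requested

print(f"overshoot: {sleep_overshoot(0.05) * 1000:.2f} ms")
```

Each individual overshoot may be tiny, but a timer that counts completed sleeps inherits every one of them, which is why only a wall-clock reference stays honest.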
For accurate timing information, we really need a method to retrieve system time from the underlying Windows OS.
I understand there are internal precision restrictions, so I propose a systemServerTime method that would behave like the current serverTime method with one important difference: its time figure would be anchored to the system clock, eliminating the drift experienced under high server load.
Hundredths-of-a-second precision would be ideal for accurate timing, but one-second resolution would work as a minimum implementation.
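To make the proposal concrete, here is a minimal sketch of the idea, again in Python for illustration only. The class, its method names, and the 10 ms tick interval are all hypothetical; the point is the contrast between a tick-counting server_time, which lags when ticks run late, and a system_server_time derived from the OS wall clock, which cannot be slowed by load.

```python
import time

class ServerClock:
    """Sketch of the proposed behaviour (names are hypothetical)."""

    TICK_SECONDS = 0.01  # assumed 10 ms tick interval

    def __init__(self):
        self._epoch = time.time()  # wall-clock reference at startup
        self._ticks = 0            # advanced by the (possibly laggy) main loop

    def tick(self):
        # Called once per intended tick; under load it is called late,
        # so the tick count falls behind real time.
        self._ticks += 1

    def server_time(self):
        # Tick-based time: drifts when the server is overloaded.
        return self._ticks * self.TICK_SECONDS

    def system_server_time(self):
        # Wall-clock time since startup, at hundredths-of-a-second
        # precision, immune to server load.
        return round(time.time() - self._epoch, 2)
```

With this split, existing scripts keep their current serverTime semantics, while anything that needs real-world accuracy (race timers, cooldowns, logging) can switch to the system-clock variant.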