I'm developing an application in C# that requires many function calls (100-1000 per second) to happen at very specific times. However, there are extremely tight specs on the latency of this application, and so due to the latency increase associated with garbage collection, it's not feasible for me to use DateTime or Timer objects. Is there some way I can access the system time as a primitive type, without having to create DateTime objects?
TL;DR: Is there an analogue of Java's System.currentTimeMillis() in C#?
What makes you think DateTime allocates objects? It's a value type. No need for a heap allocation, and thus no need for garbage collection. (As TomTom says, if you have hard latency requirements, you'll need a real-time operating system etc. If you just have "low" latency requirements, that's a different matter.)
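If you want to convince yourself of that, here's a minimal sketch (assuming a modern .NET runtime where GC.GetAllocatedBytesForCurrentThread is available) that samples the thread's allocation counter around a tight loop of DateTime.UtcNow calls:

    using System;

    class NoAllocationDemo
    {
        static void Main()
        {
            // Bytes allocated on the managed heap by this thread so far.
            // (Available on .NET Core 2.0+ / modern .NET.)
            long before = GC.GetAllocatedBytesForCurrentThread();

            DateTime last = default;
            for (int i = 0; i < 1_000_000; i++)
            {
                // DateTime is a struct: this lives on the stack,
                // no heap allocation occurs.
                last = DateTime.UtcNow;
            }

            long after = GC.GetAllocatedBytesForCurrentThread();
            Console.WriteLine($"Last sample: {last:O}");
            // Expected to be 0 (or very close, barring JIT/runtime noise).
            Console.WriteLine($"Bytes allocated in loop: {after - before}");
        }
    }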
You should be able to use DateTime.Now or DateTime.UtcNow without any issues - UtcNow is faster, as it doesn't perform any time zone conversions.
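And if what you're after is literally a currentTimeMillis-style primitive, a sketch like the following gets you milliseconds since the Unix epoch as a plain long. DateTimeOffset.ToUnixTimeMilliseconds is available from .NET Framework 4.6 / .NET Core onwards; the Ticks arithmetic works everywhere:

    using System;

    class UnixMillisDemo
    {
        static readonly long UnixEpochTicks =
            new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc).Ticks;

        static void Main()
        {
            // Closest direct analogue of Java's System.currentTimeMillis():
            // milliseconds since the Unix epoch, in UTC.
            long unixMillis = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();

            // The same value computed from raw ticks (1 tick = 100 ns),
            // staying entirely in primitive longs.
            long unixMillisFromTicks =
                (DateTime.UtcNow.Ticks - UnixEpochTicks) / TimeSpan.TicksPerMillisecond;

            Console.WriteLine(unixMillis);
            Console.WriteLine(unixMillisFromTicks);
        }
    }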
As an example, I just timed 100 million calls to DateTime.UtcNow followed by using the Hour property, and on my laptop that takes about 3.5 seconds. Using the Ticks property (which doesn't involve as much computation) takes about 1.2 seconds. Without using any property, it takes only 1 second.
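For reference, here's a rough sketch of that measurement using Stopwatch - the class name and iteration count are just illustrative, and the absolute numbers will of course vary by machine:

    using System;
    using System.Diagnostics;

    class UtcNowBenchmark
    {
        const int Iterations = 100_000_000;

        static void Main()
        {
            long dummy = 0;

            var sw = Stopwatch.StartNew();
            for (int i = 0; i < Iterations; i++)
            {
                // Cheap: Ticks is essentially the raw counter.
                dummy += DateTime.UtcNow.Ticks;
            }
            sw.Stop();
            Console.WriteLine($"Ticks: {sw.Elapsed.TotalSeconds:F2}s");

            sw.Restart();
            for (int i = 0; i < Iterations; i++)
            {
                // Dearer: Hour has to derive the time of day from Ticks.
                dummy += DateTime.UtcNow.Hour;
            }
            sw.Stop();
            Console.WriteLine($"Hour:  {sw.Elapsed.TotalSeconds:F2}s");

            // Consume the accumulator so the loops aren't optimized away.
            Console.WriteLine(dummy);
        }
    }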
So basically, if you're only performing 1000 calls per second, the overhead is going to be irrelevant.
See more on this question at Stack Overflow