Is .NET DateTime truncating my seconds?

I am trying to format some precise dates, converting them from Unix timestamps to DateTime objects. I noticed that the AddSeconds method has an overload that accepts a floating-point number.

My expectation is that I can pass in a number such as 1413459415.93417 and get back a DateTime with tick-level precision. Is that a reasonable assumption, or does AddSeconds provide no better than millisecond precision? If so, do I have to add the ticks myself in the conversion?

My conversion code is below:

// UnixEpoch - assumed to be the standard Unix epoch, 1 Jan 1970 00:00:00 UTC
private static readonly DateTime UnixEpoch =
    new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

public static DateTime CalendarDateFromUnix(double unixTime)
{
    DateTime calendarTime = UnixEpoch.AddSeconds(unixTime);
    return calendarTime;
}

I expect the ToString of this date to come out like 16 Oct 2014 11:36:55.93417, using the format string below:

dd MMM yyyy HH:mm:ss.fffff

Instead of giving me 16 Oct 2014 11:36:55.93417, it is giving me 16 Oct 2014 11:36:55.93400.
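
Here is a minimal call that reproduces this, assuming UnixEpoch is the standard Unix epoch as above:

DateTime result = CalendarDateFromUnix(1413459415.93417);
Console.WriteLine(result.ToString("dd MMM yyyy HH:mm:ss.fffff"));
// Prints 16 Oct 2014 11:36:55.93400 - the sub-millisecond digits are lost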

Am I doing something wrong or is .NET truncating my floating-point seconds representation? I am new to .NET, so the former is quite possible.

Thanks

Jon Skeet

From the documentation of DateTime.AddSeconds:

The value parameter is rounded to the nearest millisecond.

An alternative would be to multiply by TimeSpan.TicksPerSecond and then add that to the ticks of UnixEpoch:

return new DateTime(
    UnixEpoch.Ticks + (long) (unixTime * TimeSpan.TicksPerSecond),
    DateTimeKind.Utc);
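
For example, a quick sketch of using that tick-based conversion (assuming UnixEpoch is 1 Jan 1970 UTC; the double arithmetic can still be off by a tick or so):

DateTime unixEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
double unixTime = 1413459415.93417;

// Convert the whole value to ticks, bypassing AddSeconds' rounding
// to the nearest millisecond.
DateTime converted = new DateTime(
    unixEpoch.Ticks + (long) (unixTime * TimeSpan.TicksPerSecond),
    DateTimeKind.Utc);

Console.WriteLine(converted.ToString("dd MMM yyyy HH:mm:ss.fffff"));
// Prints 16 Oct 2014 11:36:55.93417 rather than 16 Oct 2014 11:36:55.93400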


See more on this question at Stack Overflow