Java Vs C# Long to DateTime Conversion

In Java I have the following test, which passes fine:

// 42 bits of time is good enough for the next 100 years.
// An IEEE double has 52 bits of mantissa, so our dates easily fit.
@Test
public void testMaxBits() throws ParseException {
    // Maximum 42 bit integer
    long millis = (1L << 42) - 1;
    Date date = new Date(millis);
    //DateTime maxDate = new DateTime(2109, 5, 15, 8, 35, 11, 103);
    Date maxDate = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS").parse("2109-05-15T08:35:11.103");
    Assert.assertEquals(maxDate, date);
}

Now I want to do the same sort of thing in C#, so I have a test in LINQPad that checks the C# implementation for correctness:

DateTime maxDate = new DateTime(2109, 5, 15, 8, 35, 11, 103);
long beginTicks = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc).Ticks;
long l = (1L << 42) - 1;
DateTime date = new DateTime(beginTicks + l, DateTimeKind.Utc);
maxDate.Dump();
date.Dump();

The outputs don't match; the ToString() values are

maxDate = 15/05/2109 08:35:11
date    = 06/01/1970 02:10:04

What am I missing here?


Edit: I have seen a great answer below from @zmitrok, so I have changed

DateTime date = new DateTime(beginTicks + l, DateTimeKind.Utc);

to

DateTime date = new DateTime(beginTicks + 
    l * TimeSpan.TicksPerMillisecond, DateTimeKind.Utc);

but now I get

date = 15/05/2109 07:35:11

Where has the hour gone?

Jon Skeet

Your test is basically confusing ticks with milliseconds. If you only need to store a number of milliseconds since the unix epoch, then do so - but I'd recommend using something like this to perform the conversion:

// e.g. as static members of a DateTimeHelper class, as used in the test below
public static readonly DateTime UnixEpoch
    = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

public static DateTime FromMillisecondsSinceUnixEpoch(long milliseconds)
{
    return UnixEpoch.AddMilliseconds(milliseconds);
}

(As a side-note, that method already exists in my Noda Time project... hint hint :)
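
For reference, a rough Noda Time sketch, assuming a current release where Instant.FromUnixTimeMilliseconds exists (the 1.x line named it Instant.FromMillisecondsSinceUnixEpoch):

// LINQPad-style snippet using Noda Time instead of the helper above
using NodaTime;

long maxMillis = (1L << 42) - 1;
Instant instant = Instant.FromUnixTimeMilliseconds(maxMillis);
DateTime utc = instant.ToDateTimeUtc();   // 2109-05-15 07:35:11.103 (UTC)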

Your test would then be:

[Test]
public void TestMaxBits()
{
    long maxMillis = (1L << 42) - 1;
    DateTime maxDate = DateTimeHelper.FromMillisecondsSinceUnixEpoch(maxMillis);
    Assert.Greater(maxDate, new DateTime(2100, 1, 1, 0, 0, 0));
}

Note that:

  • This code doesn't mention ticks at all, because you're not interested in ticks
  • This code doesn't assert that the maximum date is some very specific value, because that's not what you care about; you care that 42 bits of time will carry you to the end of the century. (The "next 100 years" comment in the Java test is somewhat specious, as 2109 is less than 100 years away from now, so I'll assume it really means "until the end of the 21st century"; see the quick sanity check after this list.)
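
As a quick sanity check on that bound (rough arithmetic, not part of the original test), in LINQPad:

long maxMillis = (1L << 42) - 1;                        // 4,398,046,511,103 ms
double years = maxMillis / (1000.0 * 60 * 60 * 24 * 365.25);
years.Dump();                                           // ≈ 139.4 years after 1970-01-01, i.e. mid-2109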

That of course makes your question of "Where has the hour gone?" irrelevant - but the answer to that is simply that SimpleDateFormat defaults to using the system time zone, so you're actually relying on the time zone of the system you're running the test on, which is a really bad idea. If you set the time zone of the SimpleDateFormat to UTC, you'll find that it's 07:35:11 in Java as well.
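
To see the same one-hour shift from the C# side, here's a small LINQPad-style sketch; the fixed UTC+1 zone is only a stand-in for whatever daylight-saving offset the Java test's system zone applied on that date:

DateTime unixEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
DateTime utc = unixEpoch.AddMilliseconds((1L << 42) - 1);
utc.Dump();                                             // 15/05/2109 07:35:11 (UTC)

// Rendering the same instant in a fixed UTC+1 zone reproduces the 08:35:11 that
// SimpleDateFormat produced when it parsed the literal in a UTC+1 system zone.
TimeZoneInfo plusOne = TimeZoneInfo.CreateCustomTimeZone("UTC+01", TimeSpan.FromHours(1), "UTC+01", "UTC+01");
TimeZoneInfo.ConvertTimeFromUtc(utc, plusOne).Dump();   // 15/05/2109 08:35:11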


See more on this question at Stack Overflow