Timestamps calculated in JavaScript and C# are different

I am calculating a timestamp in JavaScript using the following code:

var timestamp = new Date().getTime();

That gives me the result 1436504446858, which contains 13 digits (depending on the current time).

The same thing is calculated in C# using the following code:

DateTime centuryBegin = new DateTime(1970, 1, 1);
DateTime currentDate = DateTime.Now;
long elapsedTicks = currentDate.Ticks - centuryBegin.Ticks;

Here I got the result 14365252465308044, which contains 17 digits.

Why do the two methods give such different results?

Jon Skeet

Why do the two methods give such different results?

The JavaScript code is giving you the number of milliseconds since the Unix epoch.

The .NET code (if you'd got it right - more on that in a second) is giving you the number of ticks since the Unix epoch. There are 10 million ticks per second, so 10,000 ticks per millisecond. That's why you're getting 4 more digits.
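For instance, here is a minimal sketch reusing the question's own variables (it still has the local-time problem discussed next, but it shows where the extra digits come from): dividing the tick count by TimeSpan.TicksPerMillisecond, which is 10,000, brings the value down to the same 13-digit millisecond scale as the JavaScript result.

DateTime centuryBegin = new DateTime(1970, 1, 1);
DateTime currentDate = DateTime.Now;
long elapsedTicks = currentDate.Ticks - centuryBegin.Ticks;          // 17 digits (ticks)
long elapsedMillis = elapsedTicks / TimeSpan.TicksPerMillisecond;    // 13 digits (milliseconds)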

Now, the other problem is that your .NET code is using the local time - whereas it should reflect the UTC time, given that you're finding the number of ticks (or milliseconds) since the Unix epoch, which is midnight January 1st in 1970 UTC. So you want:

DateTime unixEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
DateTime utcNow = DateTime.UtcNow;
TimeSpan elapsedTime = utcNow - unixEpoch;
double millis = elapsedTime.TotalMilliseconds;

Note how if you represent the difference in time as a TimeSpan, you can convert it to the units you're interested in. (Although a TimeSpan is just "a length of time" which doesn't know a start/end, so you can't use it to find a difference in variable-length units such as years and months.)
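As a rough sketch of that conversion (the last line is an optional shortcut that assumes .NET 4.6 or later, where DateTimeOffset gained built-in Unix-time conversions):

TimeSpan elapsedTime = DateTime.UtcNow - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
double millis  = elapsedTime.TotalMilliseconds; // same scale as the JavaScript value
double seconds = elapsedTime.TotalSeconds;      // classic Unix timestamp in seconds
double days    = elapsedTime.TotalDays;

// On .NET 4.6+, DateTimeOffset can produce the millisecond value directly:
long unixMillis = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();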


See more on this question at Stack Overflow