C# DateTime to SQL DateTime losing precision

Have a small problem: if I save a DateTime field as an SQL command parameter, it loses precision, often by less than a millisecond.

e.g. The parameter's Value is: TimeOfDay {16:59:35.4002017}

But its SqlValue is: TimeOfDay {16:59:35.4000000}, and that's the time that's saved in the database.
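For what it's worth, here's a stripped-down sketch of the kind of code involved (the parameter name and tick arithmetic are just illustrative, using System.Data.SqlClient); the rounding shows up as soon as you look at the parameter's SqlValue:

```csharp
using System;
using System.Data.SqlClient;
using System.Data.SqlTypes;

class Repro
{
    static void Main()
    {
        // Build 16:59:35.4002017 today: 61175 whole seconds plus 4002017 ticks (100ns each).
        var dt = DateTime.Today.AddTicks(61175L * TimeSpan.TicksPerSecond + 4002017);

        // A CLR DateTime is mapped to SqlDbType.DateTime by default.
        var parameter = new SqlParameter("@When", dt);

        Console.WriteLine(((DateTime)parameter.Value).TimeOfDay);             // 16:59:35.4002017
        Console.WriteLine(((SqlDateTime)parameter.SqlValue).Value.TimeOfDay); // 16:59:35.4000000
    }
}
```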

Now I'm not particularly bothered about a couple of microseconds, but it causes problems later on when I'm comparing values: they show up as not equal. (Also, in some comparisons the type of the field isn't known until run-time, so at dev-time I don't even know whether I'll need special DateTime "rounding" logic.)

Is there any easy fix for this when adding the parameter?

Jon Skeet

You're using SQL Server's datetime type, which is documented with:

Accuracy: Rounded to increments of .000, .003, or .007 seconds

It sounds like you want datetime2:

Precision, scale: 0 to 7 digits, with an accuracy of 100ns. The default precision is 7 digits.

That 100ns precision is the same as .NET's DateTime (1 tick = 100ns), so a DateTime round-trips through datetime2 without loss.
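As a sketch of what that looks like when adding the parameter (assuming System.Data.SqlClient and an already-open connection; the table and column names are made up), specifying SqlDbType.DateTime2 explicitly overrides the default datetime mapping:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

static class TimestampWriter
{
    // Forces datetime2 so all seven fractional digits are sent to the server.
    public static void Save(SqlConnection connection, DateTime when)
    {
        using (var command = new SqlCommand(
            "UPDATE Events SET OccurredAt = @When WHERE Id = @Id", connection))
        {
            command.Parameters.Add("@When", SqlDbType.DateTime2).Value = when;
            command.Parameters.Add("@Id", SqlDbType.Int).Value = 1;
            command.ExecuteNonQuery();
        }
    }
}
```

The column itself needs to be datetime2 as well, of course; sending full precision to a datetime column just moves the rounding to the server.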

Or just live with the difference and write methods to round DateTime before comparing - that may end up being simpler.
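If you go the rounding route, one low-effort option (my suggestion, not something from the question) is to round-trip through SqlDateTime, which applies the same 1/300-of-a-second granularity that datetime uses:

```csharp
using System;
using System.Data.SqlTypes;

static class SqlDateTimeRounding
{
    // Returns the value SQL Server's datetime type would actually store
    // (increments of 1/300 of a second). Note that SqlDateTime only
    // supports dates from 1753-01-01 onward; anything earlier throws.
    public static DateTime RoundToSqlDateTime(this DateTime value)
    {
        return new SqlDateTime(value).Value;
    }
}
```

Then `a.RoundToSqlDateTime() == b.RoundToSqlDateTime()` compares on equal footing whether a value came from the database or from memory.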

