I have a small issue: I believe the code is doing exactly what it's supposed to be doing. I have a function whereby I pass in an amount and add a flat fee of 40 cents to it as a surcharge.
Below is how my current code is constructed:
Double surcharge;
surcharge = 0.4 * moneyIn / 100;
If I pass 999.00m in as moneyIn, it returns 0.3996 when in fact it should return 0.4. I'm unsure what I need to do to make it 0.4. I'm learning percentages within C#, so please bear with me.
You're not using decimal - you're using double. Use decimal everywhere (so moneyIn should be a decimal too). If you're actually using 999.00m for moneyIn, that would make it a decimal, and your current code wouldn't even compile (as there are no implicit conversions between decimal and double).
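To see the difference concretely, here is a minimal sketch (the 999.00 value is taken from the question; the exact printed output of the double version depends on the runtime's formatting):

using System;

double moneyInDouble = 999.00;                     // binary floating point - 0.4 has no exact representation
decimal moneyInDecimal = 999.00m;                  // 128-bit decimal - exact for typical money values

double viaDouble = 0.4 * moneyInDouble / 100;      // may show rounding noise around 3.996
decimal viaDecimal = 0.4m * moneyInDecimal / 100;  // exactly 3.996

// decimal broken = 0.4 * moneyInDecimal / 100;    // won't compile: no implicit conversion
                                                   // between double (0.4) and decimal

Console.WriteLine(viaDouble);
Console.WriteLine(viaDecimal);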
Now your code doesn't actually talk about a flat fee of 40 cents - it's taking 0.4% of the original value. You should have something like:
decimal surcharge = 0.40m; // 40 cents
decimal total = moneyIn + surcharge;
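For completeness, a minimal sketch of how the whole thing could fit together (the method name AddSurcharge is just an illustrative name, not something from the question):

using System;

// Hypothetical helper - name and signature are assumptions for illustration
static decimal AddSurcharge(decimal moneyIn)
{
    decimal surcharge = 0.40m;   // flat 40-cent fee
    return moneyIn + surcharge;
}

Console.WriteLine(AddSurcharge(999.00m));   // prints 999.40 - no floating-point drift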