I recently encountered an error in my code where I was attempting to "dismantle" an enum flag into an array of its constituent values, but unexpected results were sometimes returned.
Long story short, certain integer values assigned to my enum members appear to cause behaviour that is unexpected (well, to me at least). I boiled the issue down to the following unit test:
public enum TestEnum
{
    ITEM1 = 30104,
    ITEM2 = 30201,
}
[TestClass]
public class EnumFlagTest
{
    [TestMethod]
    public void SanityCheck()
    {
        var flag2 = TestEnum.ITEM2;
        Assert.IsFalse(flag2.HasFlag(TestEnum.ITEM1));
    }
}
I did not expect flag2 to report that it "has the flag" of ITEM1, as I do not believe that it does. I guess it has something to do with the Int32 values I've assigned to the items, but could someone please explain what's going on? Why does this test fail?
Basically, you shouldn't use HasFlag for a non-flags-based enum... and a flags-based enum should use a separate bit for each flag.
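For reference, here is a sketch of what a well-formed flags enum looks like, with each member occupying its own bit (the Permissions enum and FlagsDemo class are my own illustration, not from the question):

using System;

[Flags]
public enum Permissions
{
    None    = 0,
    Read    = 1 << 0,  // 0b001
    Write   = 1 << 1,  // 0b010
    Execute = 1 << 2,  // 0b100
}

public class FlagsDemo
{
    public static void Main()
    {
        var p = Permissions.Read | Permissions.Write;
        Console.WriteLine(p.HasFlag(Permissions.Read));    // True: Read's bit is set in p
        Console.WriteLine(p.HasFlag(Permissions.Execute)); // False: Execute's bit is not set in p
    }
}

Because each member is a distinct power of two, HasFlag's "are all those bits set?" check means exactly "is this flag present?".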
The problem is indeed because of the values. Writing this in binary, you've got:
public enum TestEnum
{
    ITEM1 = 0b111010110011000,
    ITEM2 = 0b111010111111001
}
Note how every bit that's set in ITEM1 is also set in ITEM2 - and that's what's checked by HasFlag. From the documentation:
Returns: true if the bit field or bit fields that are set in flag are also set in the current instance; otherwise, false.
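In other words, for values like these HasFlag amounts to a bitwise subset check, something like the sketch below (this illustrates the equivalence; it is not the actual framework source, and the HasFlagDemo wrapper is mine):

using System;

public class HasFlagDemo
{
    public static void Main()
    {
        const int item1 = 30104; // 0b111010110011000
        const int item2 = 30201; // 0b111010111111001

        // HasFlag(flag) effectively tests whether (value & flag) == flag.
        // Every bit of ITEM1 survives the AND, so the comparison holds.
        Console.WriteLine((item2 & item1) == item1); // True
    }
}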
That is true for TestEnum.ITEM2.HasFlag(TestEnum.ITEM1), hence it returns true.
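If the intent of the test was simply "is this value ITEM1?", a plain equality comparison is the appropriate check for a non-flags enum (my suggestion, following on from the advice above):

Assert.IsFalse(flag2 == TestEnum.ITEM1); // passes: ITEM2 is not ITEM1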
See more on this question at Stack Overflow