Why does (boolean ^ int > 0) work?

When you try to do something like this:

if (true ^ 1) {
  //do something
}

the compiler reasonably says that operator ^ is not defined for argument types boolean and int. But if you use it like this:

if (true ^ 1 > 0) {
  //do something
}

the code compiles (for Java 8, at least) and works flawlessly. Basically, these expressions:

false ^ -1 > 0 
false ^ 1 > 0
true ^ -1 > 0
true ^ 1 > 0

act like a valid logical XOR:

 A B | A ^ B
-----+-------
 F F |   F
 F T |   T
 T F |   T
 T T |   F

Could anybody please explain what happens under the hood?

Jon Skeet

It's simple: > has higher precedence than ^, so

if (true ^ 1 > 0) {

is equivalent to

if (true ^ (1 > 0)) {

which is equivalent to

if (true ^ true)

... which is just logical XOR.
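
To see the precedence in action, here is a minimal sketch (the class name is just for illustration) that prints the implicit and explicitly parenthesized forms side by side:

public class XorPrecedenceDemo {
  public static void main(String[] args) {
    // 1 > 0 is evaluated first, producing the boolean true,
    // so ^ is applied to two booleans: a logical XOR.
    System.out.println(true ^ 1 > 0);    // false
    System.out.println(true ^ (1 > 0));  // false - same expression, parentheses made explicit
    System.out.println(false ^ -1 > 0);  // false (-1 > 0 is false, false ^ false is false)
    System.out.println(true ^ -1 > 0);   // true  (true ^ false is true)
  }
}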

I would never write code like this, mind you. I would be surprised to see an example which couldn't be written more clearly in a different way.


See more on this question at Stack Overflow