%X format specifier to represent an integer

I've been taking a look at Zed A. Shaw's "Learn C the Hard Way" tutorials. One of his extra credit questions (in exercise 9) asked me to represent a char[4] as an int.

After scouting around I came across an SO question about the same thing, and with its help I managed to come up with this:

char name[4] = "Zed";   /* 'Z', 'e', 'd', '\0' */
int i;
int intName = 0;

for (i = 0; i < 4; i++) {
    intName = intName + name[i];
}

printf("intName: %d\n", intName);
printf("intName: %X\n", intName);

Output:

intName: 291
intName: 123

The name at hand was "Zed". I calculated the ASCII decimal values ('Z' = 90, 'e' = 101, 'd' = 100) and they did indeed add up to 291, which is fine. However, after searching I couldn't find any explanation as to why %X would return 123.

In this scenario, what does this format specifier do? It would be awesome to get a somewhat detailed answer as to what's going on. Learning C seems far more in-depth than other languages, and I feel I should completely understand basics such as these before moving on to anything else.

Jon Skeet

However after searching I couldn't find any explanation as to why %X would return 123.

It's just printing the hex representation of the number. Hex 123 = decimal 291. From a printf man page (the first I happened to find):

o, u, x, X
The unsigned int argument is converted to unsigned octal (o), unsigned decimal (u), or unsigned hexadecimal (x and X) notation. The letters abcdef are used for x conversions; the letters ABCDEF are used for X conversions. The precision, if any, gives the minimum number of digits that must appear; if the converted value requires fewer digits, it is padded on the left with zeros. The default precision is 1. When 0 is printed with an explicit precision 0, the output is empty.


See more on this question at Stackoverflow