
Here is a simple C statement:

uint32_t x = 0x04000000;

On my little-endian machine I assumed x would equal 4, but instead it's 67108864.

So there must be something very basic that I don't understand. Could you please explain?

– Adham Zahran
Comments:

  • Can you imagine how confusing it would be if `uint32_t x = 0x00000004` sometimes represented 4 and sometimes 67108864, depending on what kind of machine you were on? – Nate Eldredge Sep 26 '20 at 22:50
  • A slightly different question, for C++ instead of C, but which explains the issue perfectly: https://stackoverflow.com/questions/27551167/what-is-the-endianness-of-binary-literals-in-c14 – Nate Eldredge Sep 26 '20 at 23:00

3 Answers


When you do an assignment to a variable, you're setting the value, not the representation.

The hexadecimal number 0x04000000 is the same as the decimal number 67108864, so that is what gets assigned. The fact that the number is written in hex doesn't change how the assignment works.

If you did something like this:

unsigned char n[4] = "\x4\x0\x0\x0";  /* the bytes 04 00 00 00, in memory order */
memcpy(&x, n, sizeof(x));             /* overwrite x's representation with those bytes */

Then x would contain the value 4 on a little-endian machine.
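
For reference, here's that idea as a complete program (a minimal sketch; the expected output in the comments assumes the usual little-endian and big-endian layouts):

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    uint32_t x;
    unsigned char n[4] = "\x4\x0\x0\x0";   /* the bytes 04 00 00 00, in memory order */

    memcpy(&x, n, sizeof(x));              /* sets x's representation, not its value */
    printf("%" PRIu32 "\n", x);            /* prints 4 on little-endian, 67108864 on big-endian */
    return 0;
}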

– dbush

The literal 0x04000000 means the number 67108864. It will be stored differently in memory depending on whether little-endian or big-endian byte order is used, but that doesn't change its value.
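
To see that difference directly, you can dump the object representation byte by byte (a minimal sketch; reading an object through an unsigned char pointer is well-defined in C):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t x = 0x04000000;                  /* the value 67108864 on any machine */
    const unsigned char *p = (const unsigned char *)&x;

    for (size_t i = 0; i < sizeof x; i++)     /* little-endian prints: 00 00 00 04 */
        printf("%02x ", p[i]);                /* big-endian prints:    04 00 00 00 */
    putchar('\n');
    return 0;
}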

– mkrieger1
uint32_t x = 0x04000000;

This operation is endianness-agnostic: 0x04000000 is a hex literal with the value 67,108,864 (decimal), and that is true regardless of the machine's byte order.

If you want to see the effects of endianness, you need to fiddle with the underlying memory. E.g.:

uint32_t x;
memcpy(&x, (const unsigned char[4]) {0x04u, 0, 0, 0}, 4);  /* write the bytes 04 00 00 00 directly into x */
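
Put into a complete program that contrasts the two (a minimal sketch; the names a and b are illustrative):

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    uint32_t a = 0x04000000;   /* value semantics: 67108864 on every machine */
    uint32_t b;
    memcpy(&b, (const unsigned char[4]) {0x04u, 0, 0, 0}, 4);   /* representation: bytes 04 00 00 00 */

    printf("a = %" PRIu32 "\n", a);   /* always 67108864 */
    printf("b = %" PRIu32 "\n", b);   /* 4 on little-endian, 67108864 on big-endian */
    return 0;
}

Note that the (const unsigned char[4]) {...} cast is a C99 compound literal, so this requires a C99 or later compiler.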
– bolov