The code has undefined behavior: it may print `!` twice on your system, but it may do something entirely different on a different system, and I certainly would not want to fly in a plane that runs it in its navigation system.
Reformatting the code makes it a little more explicit:
#include <stdio.h>

int main() {
    int b = 10;
    char ch = 33 ^ b & 1;
    for (; "what"[b++ + 21];)
        printf("%c", ch);
}
Here is what is happening:
- `b` is initialized with value `10`.
- `ch` is initialized with value `(33 ^ (b & 1))`. Since `10` is even, `b & 1` is `0`, so `ch` has value `33`, in hex `0x21`, which is the character `!` in ASCII.
- the `for` loop checks the value of an element from the string literal `"what"`, which is an array of 5 `char` with values `{ 'w', 'h', 'a', 't', 0 }`. The index is computed as `b++ + 21`: the first value is `31`, and `b` is incremented to `11`. Here you have undefined behavior because you are referencing the 32nd element of a 5 byte array (the sketch after this list prints these values).
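To make the first two points concrete, here is a small standalone sketch (mine, not part of the question) that prints the value of `ch` under both possible groupings of `33 ^ b & 1`, and the first few indices the loop condition actually uses:

#include <stdio.h>

int main(void) {
    int b = 10;

    /* & binds tighter than ^, so 33 ^ b & 1 is 33 ^ (b & 1) == 33 == '!' */
    printf("33 ^ b & 1   = %d\n", 33 ^ b & 1);      /* 33 */
    printf("(33 ^ b) & 1 = %d\n", (33 ^ b) & 1);    /* 1, the other grouping */

    /* The loop condition indexes "what" (valid indices 0..4) at b++ + 21,
       so the indices it reads are 31, 32, 33, ... - all out of bounds. */
    for (int i = 0; i < 3; i++)
        printf("iteration %d reads index %d\n", i, b++ + 21);

    return 0;
}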
Anything can happen: reading this byte from memory can cause a crash or just return some random value. Further iterations of the loop dig deeper into the unknown. On your computer it happens to take 2 iterations to find a null byte, so the `printf` is run twice, but on some other machine, or just some other time, anything could happen.
This test is bogus; a more reliable alternative would be:
#include <stdio.h>

int main() {
    int b = 10, ch = 33 ^ b & 1;
    for (; "what"[b+++-8];)
        printf("%c", ch);
}
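In case the `b+++-8` looks like a typo: C's lexer is greedy ("maximal munch"), so it tokenizes as `b++ + -8`, giving indices 2, 3 and 4, which all stay inside the literal. Here is the same loop with the spacing spelled out, just my reformatting for readability:

#include <stdio.h>

int main(void) {
    int b = 10, ch = 33 ^ (b & 1);   /* ch == 33 == '!' as before */

    /* b+++-8 tokenizes as (b++) + (-8), so the indices are
       10-8=2 ('a'), 11-8=3 ('t'), 12-8=4 ('\0'): the body runs twice. */
    for (; "what"[b++ + -8];)
        printf("%c", ch);            /* prints "!!" */

    printf("\n");
    return 0;
}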