I changed your program to:
#include <stdio.h>
int main(void) {
    int n;              /* deliberately left uninitialized */
    printf("%d\n", n);  /* print n before scanf ever touches it */
    scanf("%d", &n);    /* fails to match on non-numeric input such as "a" */
    printf("%d\n", n);  /* print n again afterwards */
}
When I run it with the input “a”, it prints
32767
32767
So whatever is causing n to start out with the bit pattern for 32767, it is not anything that scanf is doing: on a matching failure, scanf simply leaves n alone.
At one level this is interesting, perhaps even fascinating: despite the fact that I’m using a very different system (clang-600.0.57 under MacOS 10.9.5), I got a remarkably similar result. But no matter how intriguing this is, I’m not going to waste time trying to figure out why, because it’d be hard, and it wouldn’t teach me anything useful about writing proper programs. It’s just a random, meaningless coincidence.