In the vast majority of cases, I wouldn't call using int instead of short a huge mistake (or even necessarily a mistake at all). As others have pointed out, the main difference is size: an int is 4 bytes and a short is just 2. In most cases, choosing short over int is a micro-optimization; there are, of course, a few exceptions, e.g. you're working in a very memory-constrained environment, or you're storing so many values that switching from int to short yields a substantial memory saving.
A somewhat more subjective point: occasionally, using short instead of int can improve code clarity by signaling that a value is only expected to fall within a small range, though that situation is relatively uncommon.