[Solved] Unicode conversion issues

I’m guessing the problem is that in your compiler char is signed (the standard allows it to be either signed or unsigned; it’s implementation-defined). As such, whenever you convert a char that has bit 7 set (0x80 through 0xFF) into any larger integer type, it’s treated as a negative value and gets sign-extended, so the upper bits are filled with 1s instead of 0s.
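
A minimal sketch of what that looks like, assuming a platform where char is signed (the byte value 0xC3 here is just an example of a UTF-8 lead byte with bit 7 set):

```cpp
#include <cstdio>

int main() {
    // A byte like 0xC3 has bit 7 set.
    char c = static_cast<char>(0xC3);

    // If char is signed, converting directly to a wider type sign-extends:
    int bad  = c;                                // likely -61, i.e. 0xFFFFFFC3
    int good = static_cast<unsigned char>(c);    // always 195, i.e. 0x000000C3

    std::printf("bad  = %d (0x%X)\n", bad,  static_cast<unsigned>(bad));
    std::printf("good = %d (0x%X)\n", good, static_cast<unsigned>(good));
    return 0;
}
```

Casting through unsigned char before widening is the usual way to keep the raw byte value intact during the conversion.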