Does integer overflow cause undefined behavior because of memory corruption?

You misunderstand the reason for undefined behavior. The reason is not memory corruption around the integer – an overflowing int still occupies exactly the storage an int occupies – but the underlying arithmetic.

Since the language standard does not require signed integers to be encoded in two's complement, it cannot give specific guidance on what happens when they overflow. Different encodings or different CPU behavior can produce different outcomes of overflow, including, for example, the program being killed by a trap.
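
To make the trap outcome concrete, here is a minimal sketch (assuming GCC or Clang, both of which support the -ftrapv flag that turns signed overflow into a run-time abort):

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        int x = INT_MAX;
        /* Signed overflow: undefined behavior. Built with
           `gcc -ftrapv overflow.c`, this addition aborts the
           program at run time instead of silently wrapping. */
        x = x + 1;
        printf("%d\n", x);
        return 0;
    }

Without -ftrapv, the same program will typically print -2147483648 on two's-complement hardware – exactly the kind of implementation-dependent divergence the standard refuses to pin down.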

And as with all undefined behavior, even if your hardware uses two's complement arithmetic and has well-defined overflow rules, compilers are not bound by them. For example, GCC has long optimized away checks that could only succeed in a two's-complement environment where overflow wraps. A check like if (x > x + 1) f() will be removed from optimized code: signed overflow is undefined behavior, so from the compiler's view it never happens (programs are assumed never to execute code that produces undefined behavior), and therefore x can never compare greater than x + 1.
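
Here is a sketch of that effect, together with a well-defined alternative (the exact optimization behavior depends on the compiler version and flags, e.g. gcc -O2):

    #include <limits.h>
    #include <stdio.h>

    /* Broken overflow check: relies on signed overflow wrapping.
       An optimizing compiler may assume x + 1 never overflows,
       fold the condition to false, and delete the branch. */
    void check_broken(int x) {
        if (x > x + 1)
            puts("overflow");   /* typically never reached at -O2 */
    }

    /* Well-defined check: compare against INT_MAX *before* adding,
       so the overflow never actually occurs. */
    void check_ok(int x) {
        if (x == INT_MAX)
            puts("x + 1 would overflow");
    }

    int main(void) {
        check_broken(INT_MAX);
        check_ok(INT_MAX);
        return 0;
    }

You can usually see the difference by inspecting the generated assembly: with GCC at -O2, check_broken compiles down to an empty function body, while check_ok keeps its comparison.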
