They’re both a 0, but (char) 0 has type char, while '\0' (unintuitively) has type int — in C, character constants are ints. (In C++, '\0' is a char.) Because the value is 0, this type difference rarely affects a program; the main place it shows up is sizeof.
I prefer '\0', since that is the constant intended for exactly this purpose.