In C, a literal has its own type, which need not match the type of the variable you would store it in. 6.5 is a double literal by default (so sizeof(6.5) is sizeof(double), typically 8), while 90000 is an int literal on platforms where int is at least 32 bits wide (so sizeof(90000) is typically 4). A character constant like 'A' has type int in C, not char, so sizeof('A') is sizeof(int), typically 4; but once you store it in char a = 'A';, sizeof(a) is 1 by definition, since sizeof(char) is always 1. That is why what look like the "same" values can report different sizes.
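
A minimal sketch that demonstrates this; on a typical platform with 32-bit int and 64-bit double it prints 8, 4, 4, 1, though apart from sizeof(char) == 1 the exact values are implementation-defined:

    #include <stdio.h>

    int main(void)
    {
        char a = 'A';

        /* sizeof yields a size_t; %zu is the matching printf format (C99). */
        printf("sizeof(6.5)   = %zu\n", sizeof(6.5));   /* double: typically 8 */
        printf("sizeof(90000) = %zu\n", sizeof(90000)); /* int: typically 4 */
        printf("sizeof('A')   = %zu\n", sizeof('A'));   /* int in C: typically 4 */
        printf("sizeof(a)     = %zu\n", sizeof(a));     /* char: always 1 */
        return 0;
    }

Note that the same program compiled as C++ would print 1 for sizeof('A'), because character constants have type char there.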