I'm trying to understand how this works.

    #include <stdio.h>

    int main() {
        int a = 110;
        double d = 10.21;

        printf("sum d: %d \t\t size d: %d \n", a+d, sizeof(a+d));
        printf("sum lf: %lf \t size lf: %lf \n", a+d, sizeof(a+d));
        printf("sum lf: %lf\t size d: %d \n", a+d, sizeof(a+d));
        printf("sum d: %d \t\t size lf: %lf \n", a+d, sizeof(a+d));
        return 0;
    }

The output is:

    sum d: 8 		 size d: 1343288280
    sum lf: 120.210000 	 size lf: 0.000000
    sum lf: 120.210000	 size d: 8
    sum d: 8 		 size lf: 120.210000
You are passing arguments to printf() that are mismatched with their corresponding formatting directives, so the behavior is undefined. There is nothing to understand here, beyond that C makes no guarantees about what the program will do. a+d is a double, so it needs %f (or the equivalent %lf), and sizeof yields a size_t, which needs a %zu format specifier.
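
For reference, here is a minimal sketch of the program with matching format specifiers. It assumes a C99-conforming printf (for %zu), and the expected output shown in the comment assumes a platform where double is 8 bytes:

    #include <stdio.h>

    int main(void) {
        int a = 110;
        double d = 10.21;

        /* a+d is a double, so print it with %f;
           sizeof(a+d) has type size_t, so print it with %zu */
        printf("sum: %f \t size: %zu \n", a + d, sizeof(a + d));
        return 0;
    }

    /* Typical output:
       sum: 120.210000 	 size: 8 */

With the specifiers matched to the argument types, the output is well defined and no longer varies between the four printf calls.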