I know this is a simple question, but it came up while I was coding and I am wondering how it works. My first question: when printf is given an int like the one below, but the format string expects a %f float value, why does it always output 0.000000? I am compiling with GCC and running it in a Linux terminal.
#include <stdio.h>

int main(void) {
    int a = 2, b = 5, result = 0;
    result = b / a * a;      /* integer arithmetic: 5/2 is 2, times 2 is 4 */
    printf("%f\n", result);  /* result is an int, but %f expects a double */
    return 0;
}
// The above printf statement outputs 0.000000 every time.
Then, when I use the code below and pass printf a double where it expects an %i integer value, the output is random garbage every time.
#include <stdio.h>

int main(void) {
    double a = 2, b = 5, result = 0;
    result = b / a * a;      /* floating-point arithmetic: result is 5.0 */
    printf("%i\n", result);  /* result is a double, but %i expects an int */
    return 0;
}
// The above printf statement outputs a different garbage number every time.
I completely understand that the code above is incorrect, since the format specifier does not match the type of the argument I am passing, but I expected both kinds of mismatch to fail the same way rather than behaving so differently. It just caught my curiosity, so I thought I would ask.
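For reference, here is a sketch of what I believe the matching-specifier versions look like; the arithmetic is the same, only the format strings are changed to match the argument types:

#include <stdio.h>

int main(void) {
    int    i_result = 5 / 2 * 2;        /* integer division: 5/2 is 2, so i_result is 4 */
    double d_result = 5.0 / 2.0 * 2.0;  /* floating-point division: d_result is 5.0 */

    printf("%d\n", i_result);  /* %d matches an int argument */
    printf("%f\n", d_result);  /* %f matches a double argument */
    return 0;
}

With matching specifiers, the first line should print 4 and the second 5.000000.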