I would guess it's because the alternative was worse. Suppose the prototype were changed to add `const`:

```c
long int strtol(const char *nptr, const char **endptr, int base);
```
Now, suppose we want to parse a non-constant string:
```c
char str[] = "12345xyz"; // non-const
char *endptr;
long result = strtol(str, &endptr, 10);
*endptr = '_';
printf("%s\n", str); // expected output: 12345_yz
```
But what happens when we try to compile this code? A compiler error! It's rather non-intuitive, but you can't implicitly convert a `char **` to a `const char **`. See the C++ FAQ Lite for a detailed explanation of why. It's technically talking about C++ there, but the arguments are equally valid for C. In C/C++, you're only allowed to implicitly convert from "pointer to type" to "pointer to `const` type" at the highest level: the conversion you can perform is from `char **` to `char * const *`, or equivalently from "pointer to (pointer to `char`)" to "pointer to (`const` pointer to `char`)".
Since I would guess that parsing a non-constant string is far more likely than parsing a constant string, I would go on to postulate that `const`-incorrectness for the unlikely case is preferable to making the common case a compiler error.
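For completeness, here is a sketch (again my own, not from the original answer) of what that `const`-incorrectness looks like with the real prototype, `long int strtol(const char *nptr, char **endptr, int base);`:

```c
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const char buf[] = "12345xyz"; /* the "unlikely" case: a const string */
    char *endptr;                  /* must be char *, not const char *, to match strtol */

    long value = strtol(buf, &endptr, 10); /* compiles without complaint */

    /* endptr now points into buf, but as a plain char *: the const qualifier
       has been silently dropped. Writing through it would be undefined
       behavior, yet the compiler no longer stops us. */
    printf("parsed %ld, remainder \"%s\"\n", value, endptr);
    return 0;
}
```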