Re: Why is a C compiler ANSI compatible?

"slebetman@xxxxxxxxx" <slebetman@xxxxxxxxx> writes:
moja...@xxxxxxxxxxxxxxxxxxxxxxxxxxxxx (Everett M. Greene) wrote:
"slebet...@xxxxxxxxx" <slebet...@xxxxxxxxx> writes:
On Sep 12, 4:16 am, "Bobby" <bob2...@xxxxxxxxx> wrote:
"Tim Wescott" <t...@xxxxxxxxxxxxxxxx> wrote in message
Bobby wrote:
Why is it important for a C compiler to be "ANSI compatible"? Does that
have any special advantage with my code?

If you want your code to be portable, and if you want to hire
programmers to help you out without having to train them up on your
particular version of C, you want ANSI compatibility.

I thought that ANSI compatibility referred to the _output_ of the C compiler?

No, ANSI compatibility here refers to being able to compile ANSI C
code. ANSI C is one of the standards that everybody (mostly)
recognises. So ANSI compatibility refers to the _input_ into the C
compiler, not its output.

Standardization of a computer programming language should get you to a
state where, given a standard-conforming source program, you can
compile it with any standard-conforming compiler and get code
which produces the same output for given inputs as another
standard-conforming installation would. It doesn't do you much
good to have a compiler that accepts standard-conforming source
and generates code which produces non-conforming output when
executed. [We will stay away from a discussion as to how close to
identical things have to be for floating-point arithmetic.]

Two things.

First, the OP said "_output_ of the C compiler". I read that as
literally the output of the C compiler: the executable program plus
any status, warning, and error messages.

I didn't interpret it that way since none of the standards say
anything about the generated output of the compiler.

The output of the compiler is
therefore different depending on the target architecture. It's even
different between C compilers for the same target architecture. Heck,
it's even different using the same compiler but with different
optimisations enabled. The C standard in this case does not dictate
WHAT the output should be.

Correct -- as it is with all programming language standards.

It just says that the source code has
specific meanings which must be implemented by the generated code.

Secondly, even if the OP did misstate the sentence and really did mean
it as how you read it, the ANSI C standard is one of those standards
that don't specify the output of the generated program in much detail. C is
very lenient in this regard (to the frustration of people wanting to
write robust portable code).

You can expect to get a result of 2 if you say 1 + 1.

For example, when writing a char, most
programs will write 8 bits of data. But there are compilers out there
that generate programs that treat char as 16 or even 32 bits. And
guess what? All of them conform to the standard! When it comes to C,
"standard conforming output" does not necessarily mean "the output
you'd expect" (of course, with experience, you'll learn to expect the
unexpected when using C).

The size of data types is a well-known area of implementation-defined
behaviour.