Re: interesting use of NEXT SENTENCE vs. CONTINUE

"Pete Dashwood" <dashwood@xxxxxxxxxxxxxx> wrote in message

> That is absolutely correct. I DID think any old compiler would be able to
> compile it WITH MINOR MODIFICATIONS for a specific platform.
> And I mean MINOR.

Pete, that's not how it came across to me; it was more like "here's code
that I've written to be compilable with any compiler". Having to spend a
*whole lot* of time dealing with folks who assume that a platform-specific
extension to COBOL (of any sort, from any vendor) should be presumed to be
portable to and applicable on any other vendor's platform, I'm *very*
sensitive to what is portable to/from/on our platforms and the platforms of
others, and what is not. I have to be; it's part of my job.

> I am not optimistic enough to expect 100% compatibility and I'm certainly
> not going to go to the Standard to see what SHOULD work.

If you write your COBOL code avoiding the things that are marked as
extensions in the language reference materials for the compiler you're
using, you should be able to attain 100% compatibility. You don't have to
shell out the princely sum of $18 for a copy of the standard to determine
whether the code you write is standard or not; you could make use of your
vendor's tools to do the same thing. And in this case, the standard to
which your vendor's compiler conforms is most likely the '85 standard,
which, not being the current one, is rather harder to find.

> In the time I've
> done that (always assuming I'm prepared to buy a copy of the standard from
> ANSI (and I am most definitely NOT)), I could've compiled the code, read
> diagnostics, corrected whatever is wrong, and got it working.

Yes, you could. Might've been nice if you had!

> Unlike you Chuck, I am uninterested in the standard. My only concern is
> what works on the target platform.

If the target platform is Unisys MCP COBOL85, then one way to make sure that
it works on that target platform is to start with *standards-compliant*
code.
> The fact is that code, even when it IS
> completely portable, would need to be modified for the target platform in
> order to optimize it anyway.

Yes, that was *exactly* the point of my comments on 77-vs.-01/05, USAGE and
numeric capacity issues.

> You mentioned the arithmetic formats and lengths preferred on your
> platform. I ran tests on mine with lengths from S9(4) up to S9(18) using
> COMP and COMP-5. Obviously the very short tests don't mean much.

I ran tests changing J from COMP to DISPLAY to BINARY to BINARY EXTENDED,
and came up with DRAMATICALLY different results. That's why I brought it
up. The differences obfuscate the timings for the things we're actually
trying to measure, and may even drive the timing differences into the level
of noise.
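To make the variation concrete, all that changes between runs is the USAGE
clause on the counter; this is a hypothetical sketch (the names are mine,
not from the posted program):

```cobol
      * The same counter declared four ways.  Only USAGE differs,
      * yet the generated code -- and thus the timings -- can
      * differ dramatically from one declaration to the next.
      * BINARY EXTENDED is a Unisys MCP extension, not ANSI '85.
       01  J-DISP   PIC S9(9)  USAGE DISPLAY.
       01  J-COMP   PIC S9(9)  USAGE COMP.
       01  J-BIN    PIC S9(9)  USAGE BINARY.
       01  J-BINX   PIC S9(9)  USAGE BINARY EXTENDED.
```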

One of the things I've had to learn how to spot in compiler support work is
how to minimize the cost of measurement and how to ensure that the
measurements are accurate and appropriately representative.

I've also had to work out explanations for bizarre results. For instance, I
have a strong sense that an unconditional branch to the
immediately-following machine instruction confuses the heck out of the
particular machine I'm running on, causing a "restart" of the instruction.
I don't know that for sure, but the timings seem to support this. I may
research this further at some point, but I have other priorities. And the
anomaly almost certainly wouldn't occur on a sizable percentage of our
product line.
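The pattern in question is roughly this (a sketch, not our actual test
code):

```cobol
       PROCEDURE DIVISION.
       PARA-1.
      *    Semantically a no-op: the branch target is the very next
      *    paragraph.  On at least one machine this unconditional
      *    branch to the immediately-following instruction appears
      *    to force a "restart", costing far more than nothing.
           GO TO PARA-2.
       PARA-2.
           ADD 1 TO WS-COUNTER.
```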

> You should have your discussion about 01 levels and 77 levels with a
> programmer who uses a machine where memory synchronization is important.
> Chastising me for defining a subscript that wasn't used is so petty it's
> unbelievable.

By the same token I've read some sentiment on CLC that declaring items at
Level 77 serves only to provide compatibility with ancient editions of
COBOL; that there's never a difference in access time among items declared
at 77, 01 or 05, and thus there's absolutely no practical reason ever to
use 77's. My response is "uh... no... that's not correct, not as a
universal truism."
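For instance, consider the following sketch; whether these three items
behave identically depends entirely on the architecture and on the
compiler's allocation strategy:

```cobol
      * Three elementary items declared at three levels.  On a
      * byte-addressed machine, access time is likely identical;
      * on a word-oriented machine, alignment -- and therefore
      * access cost -- may differ between them.
       77  SUB-1          PIC S9(4) USAGE COMP.
       01  SUB-2          PIC S9(4) USAGE COMP.
       01  SUBSCRIPT-BLOCK.
           05  SUB-3      PIC S9(4) USAGE COMP.
```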

> It WAS used in another version of the code that wasn't
> published. I simply paste a block of subscripts in the format shown, into
> every program I write. The block ends at P (with O being skipped to avoid
> confusion with zero). I usually remove the ones that are not used after
> code is debugged and working (exactly as I did this time, but not for the
> version I posted).

That indeed is a reasonable style choice. It raises the question, however,
of whether that particular style choice is *universally* appropriate.

> And yes, I do use INDEXED BY and indexes also.


> I never went to any pains to make this code suitable for any purpose OTHER
> THAN the one I stated so I feel that much of your commentary was really
> cheap shots.

Not my intent. The code wasn't suitable for the purpose for which I was led
(correctly or not) to believe you intended it, that is, for it to provide
meaningful results on the system I work on.

You've been asking for performance numbers from *our* system on *your*
programs for some days now, and since you posted this particular code in a
message to me, I presumed (correctly or otherwise) that it was your intent
that I provide you my results from that program.

In order to provide you any performance numbers at all I had to make certain
changes to the program -- which brought it into conformance with the '85
standard -- and certain other changes to the program to render those numbers
meaningful, some of which (Binary Extended) made use of extensions to the
standard, some of which did not.

(I think I failed to point out that inline comments -- *> -- are an
implementor extension to ANSI '85 COBOL, not specified therein. 2002 COBOL
has them. Personally, I really like inline comments, I think their absence
from COBOL was a serious shortcoming in comparison to the likes of ALGOL and
Pascal, and our COBOL85 implementation accepts them. If I could get away
with this extension I'd put it in our COBOL74 as well, but that counts as a
New Feature and we're not doing New Features in COBOL74. One problem,
though: not all implementations support an Area B that extends beyond
column 72 of the source line. Ours currently does not; I deleted the inline
comments because I knew them irrelevant to *compilation* and *execution*.)

I felt it appropriate to present what changes I needed to make to get past
the compilation, and what changes I needed to make to minimize the impact of
the measurements on the results. Once I was done, the only "nonstandard"
parts of my example were the use of Usage Binary Extended (which differs
from Binary only in that code isn't generated to apply the "odometer
effect" when an item's value overflows its declared capacity) and the
implementor-name specification for Console in Special-Names (I could have
changed the DISPLAY ... UPON CONSOLE to simply DISPLAY ... and actually
remained within "Nucleus Level 1" of the '85 standard! UPON is Level 2.).
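For illustration, the shape of the construct is roughly this (treat the
exact implementor-name spelling as an assumption; each vendor defines its
own):

```cobol
       ENVIRONMENT DIVISION.
       CONFIGURATION SECTION.
       SPECIAL-NAMES.
      *    Form: implementor-name IS mnemonic-name.  CONSOLE here
      *    stands in for the vendor-defined implementor-name.
           CONSOLE IS THE-CONSOLE.
       ...
      *    Nucleus Level 2 of ANSI '85:
           DISPLAY "DONE" UPON THE-CONSOLE.
      *    Nucleus Level 1 -- no SPECIAL-NAMES entry needed:
           DISPLAY "DONE".
```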

> If your point was that the code, as written, was not portable, then I have
> no issue with that. If your point was it needed to comply with the
> standard to be portable, then my response is "Bollocks!".

I think the *best* way to write code that can be compiled on any platform
whose COBOL implementation conforms to a given COBOL standard is to avoid
the vendor's extensions beyond the standard.

> The only thing that would
> make it run on the target platform is if it fits the implementation on
> that platform, as defined by the compiler vendor.

I agree. If the compiler vendor has agreed to conform the implementation to
the standard, then a subset of "fits the implementation on that platform" is
"conforms to the standard", and it is *very* likely that that subset "fits
the implementation on any platform" presuming the implementation conforms to
the standard.

> I am well aware of the overheads in using DISPLAY UPON and in using
> The original code had a comment saying something to the effect of 'do this
> any way you like'.

Unless I missed something somewhere, I didn't see that.

> I NEVER use ACCEPT and DISPLAY in production environments
> and would write the results to a log file and use FUNCTION CURRENT-DATE
> rather than ACCEPT from TIME. (I think the last time I used that was 20
> years ago in an IBM environment where there were no intrinsic functions.)

> But, at least as far as I'm concerned, all of that is beside the point.

Which point? The universal truisms that a GO TO that's been ALTERed runs
faster on limited-memory systems than a comparison and conditional branch,
or for that matter that a comparison with a conditional branch testing the
result is always slower than an unconditional branch? I think I've
demonstrated that both of *those* points are not universal truisms!
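For anyone who hasn't met ALTER (classified obsolete in the '85 standard
and removed in 2002), the construct under test looks roughly like this
sketch (paragraph names are mine):

```cobol
       SWITCH-PARA.
      *    A paragraph consisting of a single GO TO sentence is
      *    eligible for ALTER; the statement below rewrites its
      *    target at run time.
           GO TO FIRST-TIME.
       FIRST-TIME.
           ALTER SWITCH-PARA TO PROCEED TO EVERY-OTHER-TIME.
      *    ... one-time setup goes here ...
       EVERY-OTHER-TIME.
           CONTINUE.
```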

> You can sidetrack the real issue all you like with a display of pedantry
> based on the standard. It cuts no ice with me whatsoever.

> I have written working COBOL code for at least seven platforms and never
> once referred to the X3.23 or any other standard in order to do so. I have
> maintained my own and other people's code without this 'Bible' as well. So
> exactly what use is it to the coal-face programmer? Absolutely none. They
> do their jobs perfectly well without it.

I work on the COBOL *compilers* and I refer to the standards *very* often.
I could not provide a COBOL compiler that provides support for
standards-compliant -- and thus portable -- code without doing so.

> I really do appreciate the time you put into explaining the assembler and
> architecture for your platform and I found it interesting. But I'm simply
> appalled at your response to a trivial code snippet, intended for a stated
> purpose (which it achieved), and the fact that you would then take it
> apart based on compliance with 'Holy Writ'.

I took it apart because I needed to take it apart to make it run on our
platform. I detailed the changes I needed to make because I thought your
expectation was that it would compile and run *with no changes* on *any*
platform.
I'm a little testy on *that* particular topic because many times in this
forum (though not from you, to the best of my memory), the response to my
comment "That doesn't work on a Unisys MCP platform" was something along the
lines of "it works on any computer that really matters to anyone", and that
ends up being a not-so-veiled slap at the company for which I work, and the
relevance of its products in the marketplace.

> I don't care. The exercise was about the relative speed of branch and
> compare instructions, on most platforms. I changed some of my ideas after
> running tests.

Your premise wasn't originally "most", it was "universal truism".

> (When is the last time you changed one of yours?)

Yesterday. Why?

> I am sorry for the inconvenience you incurred. You didn't HAVE to spend
> time doing anything. If you had posted to me privately or publicly stating
> that there was difficulty in getting it to run on a Unisys platform, I
> might have taken another look at it or got a friend who has access to such
> a platform to run it for me. I said I believed the code was portable
> because I did believe that. (Still do...). However, I would expect a
> reasonable person to make minor modifications for their platform where
> necessary. My point is that SOME amendment would be needed to optimize it
> for a given platform anyway...

I needed to change it to get it to compile. If the source code is truly
portable it will compile in any implementation. I also changed it to reduce
the "metering costs".

> The fact is that you PREFERRED to make thunder about standard compliance
> rather than to address the issue. That's OK.

I thought I *did* address the issues you raised. I also detailed the
changes I needed to make to your "portable" program in order to address
those issues.

> It simply diminishes my opinion of you.

A pity. How many language compilers do you personally maintain and enhance?
Are you required by the marketplace to guarantee personally that any of them
are compliant with some national or international "de jure" rather than "de
facto" standard? In my case the answer to the first is "seven by myself, a
couple of others with help" and to the second "yes, at least three of them".
I work hard on the standards committees to *enhance* and *encourage*
universal portability. I would not have written the proposal for the '08
standard for 128-bit draft-IEEE-754 floating-point formats in decimal and in
binary, and for arithmetic performed directly in those formats (basically
taking the arithmetic rules *out* of the COBOL standard!) if I did not
believe in the usefulness of complete portability. I just don't define
"portable" as "compiles and runs as is on machines whose extensions
slavishly follow the IBM implementation", nor do I define it as "requires
major surgery before it will run in a different operating environment".

> Michael said it all when he said: "Some things are obvious
> (unnecessary buffer copies, for example), but empirical data is
> really the way to go."

> Amen to that.

I certainly agree.

-Chuck Stevens