2 C or ¬ 2 C

I've done worse, far worse, especially in some of the machine languages I use. I've gradually pared down the *tricks* in my coding to just a couple, and even those I have a reason for using, and I encapsulate them in macros. Most of the valid reasons for coding tricks, namely constraints on memory and speed, have gone away.
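A sketch of the sort of trick I mean, assuming a power-of-two table size (the macro and its name are illustrative, not lifted from any real program):

    /* Illustrative only: a bit-masking trick that replaces an
       expensive modulus.  Valid only when size is a power of two;
       hiding it in a macro documents that intent in one place. */
    #define WRAP_INDEX(i, size)  ((i) & ((size) - 1))

    /* e.g., WRAP_INDEX(hash, 256) gives the same result as
       hash % 256, without the division */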

The uncoolest of reasons I returned to more traditional coding techniques, the kind your Pascal example illustrates, is Clarity, Readability, and Maintainability. I like to write in C, but I detest having to work on programs that other people have written in C.

Part of the problem is that C was designed to be a systems programming language. I write operating systems code, low-level teleprocessing routines, and system utilities. The "freedom" C provides is convenient for these types of routines.

During the 1970s, language designers realized that loose language design invited mistakes: the looser the language, the more bugs it allowed, and programmers often focused on tricks instead of on the algorithms of their routines. Work followed on structured programming and object systems. C was diametrically opposed to these developments.

Unfortunately, application programmers usurped C for themselves. Rexx, Pascal, Ada, PL/I, and Fortran tend to restrict programmers... as you pointed out. But 99% of the time that restriction assists the programmer, because the language is preventing him from doing something he didn't want to do in the first place.

When a critical application crashes, I'm often the one called in to read the machine language code, and that's when I find a programmer has (mis)used one of the C tricks he found at the bottom of a page in a magazine. If he had stayed with Cobol (or Ada or whatever), he would have had to work harder to crash the system.
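A typical specimen, sketched here from memory rather than from any one program, is the one-line string copy:

    /* The classic magazine one-liner: the entire copy loop is
       folded into the while condition.  Nothing checks that either
       pointer is valid or that src is NUL-terminated, so one bad
       argument scribbles across memory until the system crashes. */
    void copy(char *dst, const char *src)
    {
        while ((*dst++ = *src++) != '\0')
            ;   /* empty body: all the work happens in the test */
    }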

To be fair, I was recently taken to task by an applications programmer who didn't understand a line of Rexx code I thought was obvious:

   Parse Value '1 2 3' With a b c .
which I used instead of the traditional
   a = 1; b = 2; c = 3;
The manuals had not intimated that the Parse instruction could be used for multiple simple assignments, so it was not in the "instruction set" of that programmer.

You say, "The C notation may confuse you but this is because you still think in a human language. Try to think like a computer!". Memory and speed now allow us the luxury to think like humans. As we used to say, Don't confuse the proletariat. Let the computer think like a computer and let them think like a human.

Leigh Lundin