Berm Lee wonders:
>One thing I'm curious about, though, is why programmers back then
>underestimated the longevity of the mainframe computers (and/or their
>software)?
If I recall correctly, in 1970 RAM cost about $0.50 per byte. So we
did not have much of it, and memory was not the only resource in
short supply.
Thus anything you could do to save a byte, or a whole instruction, or
a machine cycle here or there seemed like it was worth doing.
Parsimony was a virtue. Sometimes it was essential to achieving any
function at all. Anyone who might have started worrying about 4-digit
year representations would have been regarded as an impractical
idealist, wasting the company's resources and getting in the way of
delivering the function required right then. Yes, when memory got cheaper,
those programs began to look pretty stupid. But, in many cases, that
they worked at all with the limited resources available at the time is
quite amazing.
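To see what a byte bought you, consider the classic case: storing the
year as two digits. Here is a minimal C sketch (my own illustration;
real code of that era would more likely have been COBOL or assembler)
of how the saving turns into the familiar rollover bug:

    #include <stdio.h>

    /* Two-digit year: one byte instead of the two or more that a
       full year would take -- exactly the kind of saving that
       mattered at $0.50 a byte. */
    struct record {
        unsigned char yy;   /* year mod 100, e.g. 69 for 1969 */
    };

    int is_older(struct record a, struct record b)
    {
        /* Correct as long as both dates fall in 19xx... */
        return a.yy < b.yy;
    }

    int main(void)
    {
        struct record r1999 = { 99 };  /* 1999 */
        struct record r2000 = { 0 };   /* 2000 wraps to 00 */

        /* 2000 now sorts as "older" than 1999: prints 1. */
        printf("%d\n", is_older(r2000, r1999));
        return 0;
    }

Multiply that one byte by every date field in every record on every
tape, and dropping the century was real money; the comparison only
goes wrong once 19xx stops being a safe assumption.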
I think there may be a sense in which the machines' longevity was indeed
short. When I put in ten times as much memory, is it really the same
computer anymore? Not really; but the old code, though constrained by
the old resource limitations, still functioned - so it was not
discarded. I do not see this as a fault of the original programmers, who
would probably not have done it the same way without the old
constraints.
Regards,
David V.
(geocities.com/heartland/oaks/5346)