
DEFUSING THE Y2K BOMB
  
  The Millennium Bug. Year 2000. Y2K.
  
  Names are supposed to clarify meaning, but in this case
  more has not been better. So many terms have been flung
  around regarding Y2K that sometimes it's hard to know
  exactly what the problem is.
  
  Explained as broadly as possible, Y2K means this: After
  December 31, 1999, two-digit date storage and computer
  logic will fail to operate as intended. Some particularly
  nasty consequences (sketched in code after this list)
  include:
  
  * incorrect day-of-week calculations, because systems will
    reference the wrong century calendar (1900-1999, instead
    of 2000-2099).
  * the year 2000 not registering as a leap year, throwing
    off March-December dates by a whole day.
  * the loss of data tagged with an expiration date of "9999,"
    or September 9, 1999 in "People-ese." (This easy-to-type
    code was used to fill the expiration field for data
    not meant to expire.)
  * improper indexing by functions that sort by date.
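
  To make the failure concrete, here is a minimal sketch --
  not from the original article, with invented field names
  and values -- of how two-digit year arithmetic goes wrong
  once "00" follows "99":

      /* Hypothetical example: a loan written in 1999 ("99")
         that matures in 2000 ("00"). With only two digits
         stored, both years are assumed to fall in the 1900s. */
      #include <stdio.h>

      int main(void)
      {
          int loan_year   = 99;  /* stored "99", meaning 1999 */
          int payoff_year =  0;  /* stored "00", meaning 2000 */

          /* Intended answer: 1 year. Actual answer: -99,
             because "00" is read as 1900, not 2000. */
          int term = payoff_year - loan_year;

          if (term < 0)
              printf("Error: loan matures %d years before it begins\n",
                     -term);
          else
              printf("Loan term: %d years\n", term);
          return 0;
      }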
  
  Yeah. So what? For the average human who considers computer
  science the alchemy of our day, these things mean very little.
  
  Well, as the saying goes, there's some good news and some
  bad news. The bad news is that many of the personal horror
  stories floating around are plausible. Uncut payroll checks.
  Shrinking or skyrocketing investment/loan interest.
  Stagnant or destroyed inventory. Malfunctioning weapons
  and medical systems. Misprocessed records. Unpaid Social
  Security benefits. Misguided aircraft. Unexpected power
  incidents. Inaccurate mortgage computations.
  
  Worse yet, the assumed deadline could be a cruel hoax:
  We might not have until January 1, 2000, to make
  necessary changes. The Social Security Administration ran
  into problems in 1989, due to 10-year projections that
  crossed into year 2000. Since most billing software looks
  ahead at least one year to determine interest payments,
  future assets, and possible expenses, the crisis might be
  upon us sooner than we think.
  
  The good news is that, with our current awareness of
  the problem, Y2K won't spell the End of Western Civilization
  As We Know It. In reality, there will be a number of
  snafus, but not nearly enough to signal the technological
  Armageddon predicted by some of the more outspoken
  systems prophets.
  
  The biggest surprise is how dependent the United States
  has become on computer technology.
  
  
The Million-Dollar Question: Why?
  
  Considering its potential threat, Y2K must surely have
  been the master plot of some computer-obsessed hackers or
  a global madman such as the fictional Dr. Strangelove.
  
  At least, that's what we'd want to think, because
  (believe it or not) Y2K originated not from malice but
  from overly optimistic problem-solving.
  
  In the Dark Ages of Computing, when programs read data
  from punchcards and single machines inhabited entire
  basements, storage concerns were paramount. Since each
  standard card could hold only around 80 characters --
  barely enough for one person's name, birthday, contact
  information, and Social Security number -- the number
  of cards required by an agency such as the SSA was
  staggering. (For comparison, we can now store 20
  million punchcards of information on a $200 hard
  drive no bigger than a Walkman cassette player.)
  To keep budgets in the black, agencies had no alternative
  but to compact data as much as possible.
  
  These first computers were also extremely slow by
  today's standards: The now-plodding 8086 processor
  was a virtual powerhouse compared to the computers
  we used to send men to the moon in 1969. Every processed
  digit put significant strain on the old systems.
  
  In order to save space, money, and time, programmers made
  the very sane choice to strip out any non-essential data.
  The two-digit century indicator in dates was deemed
  non-essential. After all, people knew what time period
  they lived in, and computers had no reason to look past
  the 20th century. Cutting the year field in half was an
  ideal temporary shortcut.
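
  A hypothetical card layout -- field widths invented for
  illustration, not drawn from any real agency record --
  shows where those two digits went:

      /* One 80-column punchcard record, packed to the limit.
         Each date is stored as YYMMDD; restoring four-digit
         years would cost two more columns per date field --
         columns this layout simply does not have. */
      struct card_record {
          char name[30];     /* 30 cols: employee name          */
          char ssn[9];       /*  9 cols: Social Security number */
          char birth[6];     /*  6 cols: birth date, YYMMDD     */
          char hired[6];     /*  6 cols: hire date, YYMMDD      */
          char address[29];  /* 29 cols: mailing address        */
      };                     /* 80 columns total                */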
  
  The key word was "temporary." Everyone admitted that, in
  a few decades, the missing two digits would become quite
  important. However, computers would then be cheaper and
  faster. Old systems using the two-digit years would have
  been retired. Four-digit years could be easily reinstated.
  
  Or so went the thinking.
  
  Unfortunately, conventional business practices and human
  nature prevented Y2K from being fixed until it was almost
  too late.
  
  The costs of storage space and computing power did eventually
  drop, but agencies never found it convenient to implement
  the four-digit year standard. Programs were a pain to
  revise, two digits were two keystrokes too many, and the
  year 2000 was still far in the future. As long as systems
  worked, no real problem existed. Meanwhile, because new
  applications had to run on the old systems, they were
  written to accommodate the "temporary" two-digit year
  processing.
  
  Along with lack of concern came lack of funding. Because
  preventive maintenance merely preserves the status quo
  rather than delivering visible new benefits, it usually
  receives little attention in the annual budget. Funds
  that could have solved Y2K problems were instead
  channeled into agency growth, to directly impact
  critical missions.
  
  Another assumption was that Y2K wouldn't be a big deal to fix
  once the time came. (After all, it was only two digits,
  right?) Unfortunately, procrastination spawned problems at
  an almost geometric rate.
  
  While Y2K hibernated, system programmers continued to tinker
  with agency computers. In order to fix glitches or add
  software to the base-level code running their mainframes,
  they often slapped "patches" (code snippets) into the
  gaps instead of rewriting software. Because patches are
  small separate files that can call other programs stored
  all over the mainframe, it's difficult to track down
  and correct each one during a global system change such as Y2K.
  
  Another programming problem was that many date fields
  weren't conveniently called "date" but had unrecognizable
  names. To complicate matters, programmers were notorious
  for not documenting changes (which took time away from
  active programming), so when a programmer left the agency,
  all knowledge of the small changes he made went with him.
  Locating and isolating problem areas has been one of the
  worst Y2K headaches.
  
  Probably the worst contribution to the Y2K situation,
  however, has been indifference. More than one commercial
  CIO has been overheard to say that Y2K wasn't his problem,
  since he'd be retired by the time it became a concern.
  Similar to the policy of borrowing tomorrow's money
  to fund today's work, the "My-Replacement-Will-Deal-With-This"
  attitude dumps a truckload of hot potatoes on the last guy
  in line. Problems are passed down until they are too large
  to be ignored -- and often too large to be fixed.
  
  So what began as a noble attempt to make do with limited
  computer resources eventually became a Sword of Damocles
  hanging over the government's head. The old proverb,
  "For want of a nail... the kingdom was lost," never
  seemed believable until now: For want of two digits of
  storage space, computer systems worldwide could finally
  suffer a complete breakdown.
  
  
How much will it cost us?
  
  For a long time, agencies had trouble deciding whether
  or not Y2K was actually a problem. (Most of them finally
  decided that it was.) With that out of the way, the IT
  world's new debate became what the price tag should read.
  (Most of them still aren't sure.)
  
  Gartner Group estimates that a medium-size organization
  will spend almost $4 million to make Y2K changes, with
  single lines of code costing 80-100 cents apiece to fix.
  Y2K vendor Viasoft estimates the cost of correcting one
  program to run around $600-1200 -- which is small change
  compared to the US deficit, but a real drain on the
  pocketbook when you realize that most agencies have
  thousands of programs.
  
  OMB estimates a contested $2.5 billion to fix federal
  computer systems that were not already going to be fixed
  or replaced for other reasons. The Navy started a brouhaha
  when it claimed Y2K would cost it only $90 million,
  compared to the Air Force's $371 million. And so the
  debates went, leaving time to decide who was right.
  
  Most people agree that fixing the United States' Y2K problems
  will run $50-75 billion, and that to fix those of the entire
  world will cost $300-600 billion.
  
  Now that sounds like a lot of money, especially in light of
  OMB's announcement that Y2K work funds must come from existing
  agency budgets. But to throw a nice spin on things, Leon
  Kappelman (University of North Texas) and James Cappel
  (Western Michigan University) claimed last year in the
  Journal of Systems Management that the two-digit year
  format saved a typical organization over $1 million
  per gigabyte of total storage from 1963-1992. Invested
  properly, these savings would have returned $15 million per
  gigabyte over the same time period. Considering how many
  gigabytes the average organization uses (tens or hundreds),
  today's headache of fixing Y2K should have still left
  them rolling in dough.
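
  As a rough back-of-the-envelope sketch (the 50-gigabyte
  figure below is an assumed mid-range value, not a number
  from the article), the arithmetic works out like this:

      /* Kappelman/Cappel figures from the paragraph above,
         applied to an assumed 50 GB of organizational storage. */
      #include <stdio.h>

      int main(void)
      {
          double saved_per_gb    =  1.0;  /* $M saved, 1963-1992     */
          double invested_per_gb = 15.0;  /* $M if savings invested  */
          double gigabytes       = 50.0;  /* assumed storage, in GB  */
          double fix_cost        =  4.0;  /* Gartner's ~$4M Y2K bill */

          printf("Saved outright:    $%.0f million\n",
                 saved_per_gb * gigabytes);
          printf("Saved + invested:  $%.0f million\n",
                 invested_per_gb * gigabytes);
          printf("Cost of Y2K fixes: $%.0f million\n", fix_cost);
          return 0;
      }

  Even with that conservative assumption, the savings dwarf
  the estimated repair bill -- which is exactly the authors'
  point.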
  
  Apparently we just forgot to save up for a rainy day --
  such as the one most likely to occur on January 1, 2000.


----------------------------------------------------------
(c) 1997 by Fed Services, Inc.
Electronic Government, Vol. 2, No. 2, pp.???

Material to be used solely in connection with examining
my credentials for employment.
  
