The year 2000 ("Y2K" in computer-speak) has gotten a lot of attention lately, and not just because of the new millennium. Many people are worried that computer programs will not be able to handle dates properly when the century ends. A few have even predicted massive breakdowns in society as a result. Companies are spending large amounts of money to patch up those programs so they will accept and process dates in the year 2000 and beyond.
The origin of these fears can be traced back to early computer history. Those of us who programmed computers in the 1970s faced limits on both memory and disk storage; saving a few bytes in database records was a constant concern. So we routinely used two-digit numbers to represent the year in dates. I remember when a 20-megabyte hard drive was pushing the limit of technology; now, most PCs have more capacity in main memory alone! With so many changes in computers over the last 20 years, it's surprising that those old programs are still around. I think most of us expected them to become obsolete and be replaced, along with the hardware and development environments. But somehow many of them survived -- until now.
Two-digit years are not a problem in themselves: that's a convenient shorthand that people will continue to use well into the next millennium. The problem comes from date comparisons. Suppose my car payment is due in early January 2000, and I send it in late December 1999. A program that uses only the last two digits of the year may decide my payment was 99 years late and take "appropriate" measures (like canceling my loan and putting a black mark on my credit history). The good thing about all the Y2K publicity is that most companies will be aware of the potential for problems, and should be more willing than usual to correct mistakes like the one above. If that's the case, then Y2K incidents will be annoying but hardly earth-shaking.
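To make the arithmetic concrete, here is a minimal sketch in C of how such a comparison goes wrong, and how the "windowing" trick many Y2K remediation projects relied on can repair it. The record layout, function names, and pivot value here are my own illustrative assumptions, not any particular company's code.

    #include <stdio.h>

    /* Hypothetical sketch: years stored as two digits, a common
       space-saving layout in old records. */
    static int years_late(int due_yy, int paid_yy)
    {
        return paid_yy - due_yy;  /* naive two-digit subtraction */
    }

    int main(void)
    {
        int due_yy  = 0;   /* payment due January 2000, stored as "00" */
        int paid_yy = 99;  /* payment sent December 1999, stored as "99" */

        /* Prints "99 years late" -- the payment was actually early. */
        printf("Naive comparison: %d years late\n",
               years_late(due_yy, paid_yy));

        /* A windowing fix: treat years below a pivot as 20xx,
           the rest as 19xx. */
        int pivot     = 50;  /* arbitrary illustrative pivot */
        int due_full  = (due_yy  < pivot ? 2000 : 1900) + due_yy;
        int paid_full = (paid_yy < pivot ? 2000 : 1900) + paid_yy;

        /* Prints "-1 years late", i.e. the payment arrived early. */
        printf("With windowing: %d years late\n", paid_full - due_full);
        return 0;
    }

The pivot of 50 is arbitrary; real projects chose a window that fit their own data, which is also why windowing was only a stopgap rather than a cure.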
However, there is also a downside to the attention Y2K is getting: if people start behaving abnormally, abnormal consequences may result. The banking industry became aware of the problem before other industries (due to the need for long-term loan calculations) and will almost certainly have any serious problems corrected well before year end. But if enough people start withdrawing their savings for fear the banks will go under, then the banks may in fact go under -- not because of Y2K, but because they normally do not have a lot of cash on hand; it's tied up in real estate and corporate loans. So the real danger of Y2K is not how computers will behave; it's how people will behave based on (probably incorrect) assumptions about those computers.
By the way, I have discovered (thanks to Charles Osgood of CBS News) that the Y2K problem is not confined to computer programs. Gravestones for married couples often contain information for both persons, with the year of death of the surviving spouse left blank (to be filled in when that person dies). It seems that some engravers, to reduce the amount of on-site work, have been putting "19__" in that area. In slightly less than a year, those markers will become inaccurate. Fortunately, there is a solution (and it doesn't involve making a 1999 appointment with Dr. Kevorkian!). The engravers can fill in the digits with a stone-like paste, let it harden, then carve the digits "20" instead. That, to my mind, is the ultimate Y2K "patch".