The 10 Worst Programming Mistakes in History You Have To Know

Code is almost everywhere. Modern computers first appeared in the 1940s, and in the decades since, programming has enabled better communication and driven advances across a myriad of industries. Everything from space travel to telecommunications and healthcare has been transformed by code.

Programming can even teach valuable life lessons. But its storied past has a darker side as well: time and again, a small amount of bad code has caused disaster on a massive scale. The following are 10 of the worst programming mistakes in history.

1. Y2K Bug

The Year 2000 bug, also known as the Y2K Bug or Millennium Bug, was a coding problem predicted to cause computer pandemonium. Through the 1990s, most computer programs stored years in abbreviated, two-digit form: 1990 became 90, 1991 became 91, and so on. Shortening four-digit years to two digits saved valuable memory, but it left programs unable to tell the year 2000, stored as 00, apart from 1900. Further exacerbating the problem, 2000 was a leap year, and certain software applications didn't account for the extra day.
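The sketch below illustrates both failure modes under the assumption of a hypothetical legacy program that stores only the last two digits of the year; the function names and values are illustrative, not taken from any real Y2K-era system.

```python
def years_since(start_yy: int, current_yy: int) -> int:
    """Naive interval calculation on two-digit years, as many
    memory-starved legacy programs used."""
    return current_yy - start_yy

# An account opened in 1995 and checked in 1999 works as expected:
print(years_since(95, 99))   # 4

# Checked again in 2000, stored as "00", the result goes negative,
# because the program cannot tell 2000 apart from 1900:
print(years_since(95, 0))    # -95


def is_leap_incomplete(year: int) -> bool:
    # The shortcut rule some software relied on: divisible by 4 but not by 100.
    # It wrongly classifies 2000 as a common year.
    return year % 4 == 0 and year % 100 != 0

def is_leap_gregorian(year: int) -> bool:
    # Full Gregorian rule: century years are leap years only when divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_incomplete(2000), is_leap_gregorian(2000))   # False True
```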

Many feared that Y2K would bring down computers and electronics across the world. I remember my first DVD player bearing a shiny “Y2K Compliant” sticker. In the end, the year 2000 rang in rather uneventfully on the software side, but updating computers and applications across every industry cost roughly $300 billion. Computers did not crash, and life proceeded as normal, though only after an enormous outlay of money and work that, according to Slate, may have been a waste.

Why it’s one of the worst programming mistakes: The Y2K panic was extremely costly, to the tune of roughly $300 billion, and it diverted enormous resources toward fixing a problem that largely never materialized.
