Computer programming is one of the most fascinating professions there is, but it is also one of the riskiest. A programmer knows how much caution it takes to write code that is free of bugs. In a matter of seconds, a small error can turn someone into a multimillionaire. Software bugs are common because even carefully developed software will inevitably contain errors. At times these bugs are merely minor annoyances, but sometimes they cause disasters that affect our daily lives, a country’s economy, and even the functioning of society as a whole in the digital age. They can also lead to business losses and even the loss of human life.1 As the Internet of Things slowly permeates every aspect of our environment, the importance of identifying and preventing computer bugs only grows.
Here are a few of history’s most well-known, destructive, or fascinating bugs:
- The Original Bug!
Grace Murray Hopper recorded the first computer bug in her logbook after a moth was discovered inside the Harvard Mark II computer. She noted the date along with the phrase, “First actual case of bug being found.” Naturally, the term “bug” in computer science today is not used in a literal sense; it describes an error or failure in a computer program that produces an unexpected result or crashes the program. You may already be familiar with this story; if not, you can read more about the first bug in the history of computer science.2
- Failure of Ariane 5 rocket
Just 40 seconds after liftoff from Kourou in French Guiana, an Ariane 5 rocket operated by the European Space Agency (ESA) exploded. The rocket had taken a decade and $8 billion to develop, and the rocket and its cargo were valued at $370 million.
The cause of the failure was a common programming bug known as an integer overflow: a 64-bit floating-point value was converted into a 16-bit signed integer that was too small to hold it.1
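Ariane 5’s flight software was written in Ada, but the failure mode is easy to sketch in Python. The `to_int16` helper below is illustrative, not the rocket’s actual code; it keeps only the bits a 16-bit slot can store, which is what makes large values wrap around:

```python
import struct

def to_int16(value: float) -> int:
    """Truncate a float and store it in a signed 16-bit slot,
    silently discarding any bits that do not fit (no range check)."""
    raw = int(value) & 0xFFFF  # keep only the low 16 bits
    # Reinterpret those 16 bits as a signed integer.
    return struct.unpack("<h", struct.pack("<H", raw))[0]

print(to_int16(300.0))    # 300 — fits comfortably
print(to_int16(32767.0))  # 32767 — the largest signed 16-bit value
print(to_int16(32768.0))  # -32768 — one more, and the value wraps negative
```

A range check before the conversion, or a wider integer type, would have caught the out-of-range value instead of letting it wrap.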
- The Y2K Bug
The Millennium bug, also known as Y2K, refers to incidents involving the formatting and storage of calendar dates from the year 2000 onward. The letter K stands for kilo, the prefix for 1,000, so Y2K is shorthand for the year 2000. In the 1960s, computer engineers saved memory by storing years as two digits, leaving off the leading “19”; most of them considered storing the “19” before the year a waste of memory. For instance, the year 1970 was stored simply as “70.” Everything worked until December 31, 1999. On January 1, 2000, however, many computers read the two stored digits “00” as the year 1900. As a result, millions of dollars were spent worldwide to upgrade computer systems, affecting many different kinds of work.1
- Paypal Error
Chris Reynolds, a Pennsylvania-based PR executive, opened his PayPal e-mail statement to find a balance of $92,233,720,368,547,800. The sum had been credited to his account in error, briefly making him roughly 1,000 times wealthier than the entire global GDP. The precise amount is telling: it sits at the upper edge of what a signed 64-bit number can represent, which points to a programming error. The mistake was quickly noticed and corrected, and by the time Reynolds logged in, his account had been reset to zero.3
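One common reading of the figure is that it is the largest signed 64-bit integer interpreted as a number of cents. A quick check in Python (an illustration of the arithmetic, not PayPal’s actual code):

```python
# The largest value a signed 64-bit integer can hold.
INT64_MAX = 2**63 - 1  # 9,223,372,036,854,775,807

# Interpreted as cents, that raw value lands just shy of
# the balance reported in Reynolds's statement.
dollars, cents = divmod(INT64_MAX, 100)
print(f"${dollars:,}.{cents:02d}")  # $92,233,720,368,547,758.07
```

The reported $92,233,720,368,547,800 matches this to the first sixteen digits, consistent with the value having been rounded somewhere in reporting.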
- Gangnam Style Breaking YouTube
An earlier version of YouTube’s view counter used a 32-bit signed integer, a common data representation in computer architecture, whose maximum value is 2,147,483,647. When YouTube was first developed, nobody anticipated that a single video would receive billions of views and exceed the limit of a 32-bit signed integer. The bug emerged when the hit song Gangnam Style by the Korean pop star Psy received more views than that maximum value. To fix it, Google later converted the view count to a 64-bit signed integer.1
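Python’s integers never overflow, so the sketch below emulates a signed 32-bit counter explicitly to show where the old counter hit its ceiling (illustrative, not YouTube’s actual code):

```python
def add_view_int32(count: int) -> int:
    """Increment a view counter stored as a signed 32-bit integer,
    emulating the wraparound a fixed-width type would produce."""
    count = (count + 1) & 0xFFFFFFFF  # keep only the low 32 bits
    if count >= 2**31:                # reinterpret the top bit as the sign
        count -= 2**32
    return count

print(add_view_int32(2_147_483_646))  # 2147483647 — the 32-bit maximum
print(add_view_int32(2_147_483_647))  # -2147483648 — one more view wraps negative
```

Widening the counter to a signed 64-bit integer, as Google did, raises the ceiling to 9,223,372,036,854,775,807 views.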
One thing has not changed despite the dramatic advances in computing over the past 75 years: although there is no room for a moth inside a modern computer, bugs have plagued engineers since the days of Thomas Edison. This state of affairs may not last indefinitely, though, as intelligent systems are already being developed that can recognize their own bugs and work out how to fix them.4
If you enjoyed this article, consider checking out BYJU’s FutureSchool Blog to read more interesting articles.
1. What are some of the famous bugs in the world of computer science? – Hashnode. (n.d.). Retrieved June 27, 2022, from https://hashnode.com/post/what-are-some-of-the-famous-bugs-in-the-world-of-computer-science-ciwbq7h8h000h2e5302dt4y71
2. 10 Famous Bugs in The Computer Science World – GeeksforGeeks. (n.d.). Retrieved June 27, 2022, from https://www.geeksforgeeks.org/10-famous-bugs-in-the-computer-science-world/
3. Top 5 Famous Bugs in Computer Science World – The Crazy Programmer. (n.d.). Retrieved June 27, 2022, from https://www.thecrazyprogrammer.com/2016/07/famous-bugs-computer-science-world.html
4. Automatic bug repair | MIT News | Massachusetts Institute of Technology. (n.d.). Retrieved June 27, 2022, from https://news.mit.edu/2015/automatic-code-bug-repair-0629