The terms “computer bug” and “software bug” are commonly used in information technology (IT), but where did they come from? Despite the fact that modern “bugs” have only existed since the invention of computers and software, the term “bug” has a long history. 

What exactly does the term “computer bug” mean?

In IT, a bug refers to an error, fault, or flaw in a computer program or hardware system. A bug produces unexpected results or causes a system to behave unexpectedly. In short, it is any behavior or result that a program or system produces that it was not designed to do.1
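As a simple illustration (not from the sources cited here), consider a classic "off-by-one" error. The function below is intended to add up the numbers from 1 to n, but a small mistake in its loop bounds makes it return the wrong answer — the program runs without crashing, yet does something it was not designed to do:

```python
def sum_first_n(n):
    """Intended to return 1 + 2 + ... + n."""
    total = 0
    for i in range(1, n):  # bug: range(1, n) stops at n - 1,
        total += i         # so it should be range(1, n + 1)
    return total

print(sum_first_n(5))  # prints 10, but the intended result is 15
```

Bugs like this are easy to introduce and can go unnoticed until the wrong output surfaces somewhere downstream.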

A bug is usually caused by an error or omission made by developers while writing the source code or designing the overall program, or by a flaw in the components and operating systems used to run the program. Sometimes a compiler itself is at fault: an error in translating source code into machine code can produce incorrect or unusable output.1

Bugs, much to the chagrin of programmers and users alike, can have far-reaching consequences. They range from minor annoyances to severe defects that cause a program to crash or freeze.

Other bugs cause security problems, such as allowing a malicious user to get around firewalls and gain unauthorized access or privileges.1

Although the term “computer bug” is new, the term “bug” has long been used in informal engineering terminology. Ada Lovelace mentioned the possibility of problematic program “cards” in Charles Babbage’s analytical engine in 1843, so the concept, if not the term, may date back to that time.2

The Very First Computer Bug in the World

Thomas Edison is credited with first using the term "bug" to describe an error or malfunction in a machine, but his "bug" was not a computer bug.3 The world's first computer bug was recorded in September 1947. And it wasn't just any ordinary bug — it was a real-life moth that was causing a computer's hardware problems.

Engineers working on Harvard University's Mark II computer discovered a bug clogging the system — a moth had gotten into one of the machine's components. Grace Hopper, a computer scientist on the team, recorded the "first actual case of a bug being found." When her Harvard colleagues opened up the hardware to see what was causing the computer's errors, they were surprised to find the insect trapped in a relay. To fix the problem, they had to literally "de-bug" the machine, and the now-famous moth was taped into the logbook.4

However, despite popular belief, Hopper did not coin the term "bug" in 1947. As mentioned above, the term had been around since Thomas Edison used it to describe flaws in his designs in the late 1800s. According to Graham Cluley, "While it is certain that the Harvard Mark II operators did not coin the term 'bug', it has been suggested that the incident contributed to the widespread use and acceptance of the term within the computer software lexicon." Since then, "bug" has been used broadly to describe errors or glitches in a program.5

And that’s the story of the world’s very first computer bug! We hope you found the article interesting and enjoyable to read. If you want to read more fascinating and informative stories on coding, explore BYJU’S FutureSchool Blog.


  1. What is a Bug? – Definition from Techopedia. (n.d.). Retrieved June 17, 2022.
  2. A Brief History of the Web: From 17th Century Computers to Today’s Digital Empires. (n.d.). Retrieved June 18, 2022.
  3. The World’s First Computer Bug. (n.d.). Retrieved June 18, 2022.
  4. September 9: First Instance of Actual Computer Bug Being Found | This Day in History | Computer History Museum. (n.d.). Retrieved June 18, 2022.
  5. The very first recorded computer bug. (n.d.). Retrieved June 18, 2022.