Over the past 125 years, computer science has advanced enormously. There is no denying that computers have had a profound impact on the world and on how we live our lives. The industry has made extraordinary progress by rapidly increasing the capacity, speed, and interactivity of computing equipment. Although the personal computer has only been around for a few decades, computer coding as we know it has existed much, much longer.1 Here are some of the moments in the history of computer science coding that helped shape today's digital world.

System of Binary Numbers (1703)

Key moments

Binary code is the primary language computers use to communicate. Everything you see on your screen and every internal operation of your devices runs on binary, whether it is a video of a cat playing the piano or the intricate algorithms of the most powerful supercomputer. Yet the binary system was created long before the first computer was even conceived: in 1703, German mathematician and philosopher Gottfried Wilhelm Leibniz formalized the binary number system.1
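To make the idea concrete, here is a minimal Python sketch (not part of the original history, just an illustration) showing how ordinary text is ultimately stored as the 1s and 0s Leibniz described:

```python
def to_binary(text):
    """Return the 8-bit binary form of each character in `text`."""
    # ord() gives each character's numeric code; format(..., "08b")
    # renders that number as eight binary digits.
    return " ".join(format(ord(ch), "08b") for ch in text)

print(to_binary("Hi"))  # → 01001000 01101001
```

Here 'H' is the number 72 and 'i' is 105; the binary strings are simply those numbers written in Leibniz's base-two notation.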

Binary Punch Cards (1725)

In 1725, Basile Bouchon invented a loom that wove patterns based on instructions encoded in the paper it was fed, and he is believed to be the first person to punch holes into paper to control a machine. A punched hole represents a "one," and the absence of a hole represents a "zero." Although much has changed since then, this fundamental unit of code has not. Bouchon's loom used a perforated roll of paper tape; his assistant Jean-Baptiste Falcon upgraded it to punched cards in 1728, but the loom was still not fully automated.2
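The hole/no-hole encoding is exactly the binary principle in physical form. As a hypothetical sketch (the 'O' and '.' notation is our own, not Bouchon's), reading one row of punched tape is just converting marks to bits:

```python
def read_row(row):
    """Convert a punched-tape row like 'O..O' into a list of bits.

    'O' marks a punched hole (a one); '.' marks unpunched paper (a zero).
    """
    return [1 if mark == "O" else 0 for mark in row]

print(read_row("O..O"))  # → [1, 0, 0, 1]
```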

Jacquard Loom (1801)

The Jacquard Loom, a powered attachment for fabric looms, was created by Joseph Marie Jacquard and first demonstrated in 1801. It instructs the loom how to weave complex textiles using a chain of punch cards: to produce a brocade, for instance, a loom might use hundreds of cards whose holes determine which hooks are raised or lowered. The Jacquard Loom is significant in the history of computers because it was the first device to use interchangeable punch cards to program a machine to carry out automated tasks. It also inspired Charles Babbage, who planned to use perforated cards in his Analytical Engine.3

The Analytical Engine (1834)

The Analytical Engine, designed by Charles Babbage, was the first design for a general-purpose programmable computer, and it was a major breakthrough in the history of computing.

Although Charles Babbage and Ada Lovelace never saw their ideas come to fruition during their lifetimes, they laid the groundwork for many of the advances in computer programming that followed. It was for this machine that Ada Lovelace devised what is considered the first computer algorithm, a procedure for computing Bernoulli numbers.4

Integrated Circuit (1949)

In 1949, German engineer Werner Jacobi created a new semiconductor device that used small transistors in place of the large, clunky vacuum tubes then common in electronics. Despite the advantage of shrinking devices without the warm-up time vacuum tubes required, there was little interest in the idea at first. In 1957, American engineer Jack Kilby put forth a concept for tiny ceramic wafers that could serve as integrated circuits, a variation on Jacobi's original idea. He built his first prototype at Texas Instruments in 1958 and filed for a patent in 1959. The new circuits outperformed vacuum tubes in size, speed, and dependability: computers that once required an entire room could now shrink to fit inside a small box. Kilby's integrated circuits were soon adopted by everyone from the Air Force to NASA, and even flew on the Apollo 11 mission to the moon.1

The Compiler by Grace Hopper (1952)

When programming an early computer, Grace Hopper decided to simplify the process by grounding it in human language. Hopper, who had joined the U.S. Naval Reserve during World War II, knew that people in the military, including her superiors, had difficulty understanding binary code. An English-based programming language would make the work less error-prone and more approachable for people without extensive mathematical training. Some laughed at the concept, but by the early 1950s she had created a compiler: a program that translates higher-level code into the lower-level code the machine processes directly. Using the tool, she and her lab went on to create FLOW-MATIC, a programming language built on English words.2
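Hopper's idea of translating human-readable statements into machine-level instructions is still how programming languages work today. As a modern illustration (using Python's built-in compiler, not Hopper's), we can compile one readable line and inspect the lower-level instructions it produces:

```python
import dis

# compile() turns the readable source line into a code object;
# dis shows the stack-machine instructions the interpreter runs.
code = compile("total = price + tax", "<example>", "exec")
dis.dis(code)  # prints instructions such as LOAD_NAME and STORE_NAME
```

The printed output is the "lower-level code" side of the translation: names are loaded, the addition is performed, and the result is stored, all as individual machine-style instructions.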

COBOL (1959)

Before 1959, every computer manufacturer used a different programming language, so nothing was compatible. Frustrated by this, a group of programmers came together to create the first programming language that could run on computers from different manufacturers. The result was COBOL, the COmmon Business-Oriented Language. Many new programming languages have since been developed based on ideas pioneered by COBOL.1

World Wide Web (1989)

In the late 1980s, Tim Berners-Lee, a computer scientist at CERN (the European Organization for Nuclear Research), created a program that could store data in files containing links to and from other files. He called the method hypertext, and he eventually used it to link computers into a network. Berners-Lee then put forward a bigger idea: a global hypertext document system connecting computers around the world. Soon after, the World Wide Web was launched, revolutionizing computer science and the rest of the world with it.1

At first glance, it's not always obvious when a piece of code will become epoch-defining. It frequently begins as an odd experiment or a weird, seemingly impossible idea. As Clive Thompson, author of "Coders: The Making of a New Tribe and the Remaking of the World," puts it, "Even the coders themselves can be surprised by the results of their work."2

If you liked reading this article and want to find more intriguing articles like this, visit BYJU’S FutureSchool Blog.


  1. Key moments in the history of computer science coding. (n.d.). Retrieved July 1, 2022, from
  2. The lines of code that changed everything. (n.d.). Retrieved July 1, 2022, from
  3. What is the Jacquard Loom? (n.d.). Retrieved July 1, 2022, from