About the History of the Computer | Visualize a World Without Computers

Visualize a world without computers: a world where human information is no longer at your fingertips, a world where a tool that you use every day simply no longer exists. Computers have penetrated nearly every facet of our lives. But how did they become so ubiquitous?


This is the history of the computer

Today, the word computer refers to the devices that we interact with to work, connect, and play. Historically, however, it described machines that were used to perform calculations with numbers.

Here we trace the evolution of the earliest devices used for computation and how they became the computers that we depend on today.

Abacus


The abacus was a computational tool used for thousands of years and is generally considered to be the first calculator. The exact origin of the device is still unknown, but the Sumerian abacus appeared as early as 2700 to 2300 BCE in Mesopotamia.

The abacus is mentioned in records from numerous civilizations throughout history, including Ancient Egypt, Persia, Greece, China, Rome, and India.

Astrolabe


Another famous calculator from the past was the astrolabe, which was used to measure the elevation of celestial bodies in the sky. The earliest known reference to one dates from around the 2nd century BCE in the Hellenistic world.

In addition to its value to astronomers, the astrolabe became indispensable for sailors, since it allowed them to determine their local latitude on long voyages.

One defining feature of modern computers that separates them from simple calculators is the fact that they can be programmed. This allows them to automatically perform certain tasks without continual human input.

Programmable mechanical computer

In the 19th century, Charles Babbage conceptualized the first programmable mechanical computer. His design used punch cards to input the instructions that the machine would carry out.

Unfortunately, it proved too difficult to produce economically, and the project was abandoned after the British government stopped funding it.

Analog computers

The early 20th century saw analog computers develop further as they were put to work solving complex mathematical problems. The most famous example is the differential analyzer, built at MIT by Vannevar Bush in the 1920s.


Bush later became involved in the Manhattan Project to produce nuclear weapons and even inspired the invention of the World Wide Web nearly 50 years before its creation.

World War II led to a great leap in computer technology as nations tried to gain the upper hand over their adversaries. Computers were primarily built to calculate firing tables to improve artillery accuracy and to break enemy codes to gain important intelligence.

The first large-scale digital computer was built by Howard Aiken in 1944 at Harvard University. It was one of the first machines to use electrical switches to store numbers: if a switch was off, it stored zero; when on, it stored one. Modern computers follow this same binary principle.
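As an illustration, the short sketch below (a minimal example, not tied to any historical machine) shows how a row of on/off switches encodes a decimal number under this binary principle.

```python
# A row of switches, each either off (0) or on (1), read from the
# most significant position first. Here: on, off, on, on -> binary 1011.
switches = [1, 0, 1, 1]

value = 0
for state in switches:
    value = value * 2 + state  # shift the value left and add the new bit

print(value)  # 11, because 1*8 + 0*4 + 1*2 + 1*1 = 11
```

Every number, letter, and image in a modern computer is ultimately stored as long rows of such on/off states.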

This period also saw the rise of vacuum tubes, which offered much faster performance than traditional relay switches. The most famous vacuum tube computer, often considered the predecessor of modern machines, was the ENIAC, invented by John Mauchly and J. Presper Eckert. It was the first fully electronic, general-purpose digital computer.

Despite vacuum tubes offering advantages over electromechanical switches, they had their drawbacks. They consumed enormous quantities of power, were unstable, and required large amounts of space.

In 1947, three scientists at Bell Labs discovered that semiconductors could be used to amplify electrical signals more effectively. This led to the creation of the transistor, which paved the way for modern computing. Transistors were much smaller than vacuum tubes, used no power unless in operation, and were extremely reliable.

William Shockley, one of the inventors of the transistor, continued refining it and founded a company in Palo Alto, California. This foreshadowed Silicon Valley's development into the global hub of computing over the following decades.

In the late 1950s, two teams independently built the integrated circuit, a collection of transistors and other components that could be manufactured at scale. This breakthrough led to computers shrinking throughout the 1960s.

In 1971, the general-purpose microprocessor was invented, the first example of a computer existing on a single chip. The miniaturization of microchips allowed Intel to release a processor known as the 8080 in 1974, which hobbyists used to build home computers.


One such hobbyist was Steve Wozniak, who partnered with his friend Steve Jobs to found a company named Apple and begin selling home computers. Although the first iteration did not sell well, their second machine, sold as the Apple II, gained popularity among home users, schools, and small businesses due to its ease of use.

In 1980, IBM was the market leader in computing, and they responded with their first personal computer, also built around an Intel processor. The main problem with early computers was that they all used different hardware, so programs written for one machine would not work on others.

In 1976, Gary Kildall created an intermediary layer between a machine's hardware and its software; this became the first operating system for personal computers. IBM was keen to license it for their PCs, but after Kildall refused to sell to them, they turned to a young programmer named Bill Gates at a company named Microsoft.

After convincing IBM to let Microsoft retain the rights to its operating system, Gates developed MS-DOS, which he licensed to IBM and eventually to other PC manufacturers.

This set Microsoft on the path to becoming the titan it is today. At Apple, Steve Jobs was determined to make computers easier to use. He was inspired by research that Xerox had conducted in the 1970s, which included computers with a desktop-like screen, a mouse, and a graphical user interface. Jobs borrowed these ideas and eventually launched the Macintosh, which hurt IBM's position in the industry.

These features were eventually implemented by Bill Gates in Windows, which led to a copyright lawsuit in the late 1980s. Microsoft ultimately prevailed, and Windows became the dominant operating system for home personal computers, a position it retains to this day.

The 1980s and beyond saw computers find numerous new applications. They appeared in watches, cars, cell phones, and airplanes. They became portable and ever-present. Today, computers are everywhere. And yet the future looks even more promising.

Quantum computers could signal a paradigm shift, allowing humanity to tackle complex problems that today's machines cannot solve. A move away from silicon may reignite the pace of transistor development.

Computers will be crucial as we reach out into space and explore the stars. They may have humble beginnings, but no matter what challenges humanity faces, the descendants of that abacus from Mesopotamia will always be alongside us.
