The History Of Computing Development
Computing has profoundly shaped advances in science, engineering, business, and many other areas of human endeavor. Today, nearly everyone needs to use computers. Computing will continue to offer challenging career opportunities, and those who study it will play a vital role in shaping the future. Computing, of course, cannot exist without computers. Before 1935, a "computer" was a person who performed arithmetical calculations. Between 1935 and 1945 the definition came to refer to a machine rather than a person. The modern definition rests on von Neumann's concept: a machine that accepts input, processes data, stores information, and produces output.
Before the true power of computing could be realized, the naive view of calculation had to be overcome. The people who worked to bring the computer to fruition had to learn that what they were building was not simply a calculator, but a machine that could solve many problems, including problems not yet envisioned when the computer was built. Likewise, they had to figure out how to tell such a problem-solving machine what problem to solve. In other words, they had to invent programming. They also had to solve all the daunting problems of designing such a device, of implementing the design, and of actually building the thing. The history of solving these problems is the history of computing.
Computing has reached a number of milestones and evolved in many ways over the years. The earliest form of the computer is often traced to the abacus: counting boards date to antiquity, and the familiar bead-and-rod form was in wide use by the 14th century. It is an instrument used for calculation by sliding counters along rods or in grooves. In its traditional form it consists of a rectangular frame with thin parallel rods strung with beads. It represents values discretely: each bead occupies one of a fixed set of positions, unambiguously representing a digit.
In the 17th century, calculating devices took a new turn with the invention of logarithms by John Napier, a Scottish mathematician; building on Napier's work, William Oughtred created the slide rule, a manual device for estimation that consists, in its simplest form, of a ruler and a movable centerpiece graduated with matching logarithmic scales. There have been many more evolutionary changes over the centuries, including the Pascaline by Blaise Pascal; the Jacquard loom by Joseph Marie Jacquard; the Difference Engine and Analytical Engine by Charles Babbage, with Ada Byron writing programs for the latter; the Colossus, which broke German High Command ciphers in WWII; ENIAC; UNIVAC; IBM; the ARPAnet; the integrated circuit (IC); Apple; large-scale integration (LSI); the Internet; the World Wide Web; ISPs; and personal digital assistants (such as the Palm Pilot).
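The slide rule works because of the logarithm identity log(ab) = log a + log b: adding two lengths on logarithmic scales multiplies the numbers they represent. A minimal sketch of that principle in Python (the function name is illustrative, not from any historical source):

```python
import math

def slide_rule_multiply(a, b):
    """Multiply two positive numbers the way a slide rule does:
    add their logarithms (lengths along the log scale), then map back."""
    length_a = math.log10(a)   # position of a on the fixed scale
    length_b = math.log10(b)   # offset of b on the sliding scale
    return 10 ** (length_a + length_b)

# A physical slide rule reads only about 3 significant figures;
# floating point gives far more precision.
print(slide_rule_multiply(2, 3))
```

The entire mechanical trick is that physical distance stands in for a logarithm, turning multiplication into the much easier operation of addition.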
None of these inventions would have been possible without pioneering computer scientists. Charles Babbage is often considered the father of computing for his design of the Analytical Engine, conceived in 1837. Alan Turing, the British codebreaker, worked on Colossus (a code-breaking machine and precursor to the computer) and the ACE (Automatic Computing Engine); noted for many brilliant ideas, Turing is perhaps best remembered for the Turing Test for artificial intelligence and the Turing Machine, an abstract model of computer operation. John von Neumann, a child prodigy in mathematics, authored the landmark paper explaining how programs could be stored as data. John V. Atanasoff is one of the contenders for inventor of the first electronic computer. Konrad Zuse, a German engineer, designed mechanical and electromechanical computers during WWII. Lastly, H. Edward Roberts developed the MITS Altair 8800, the first microcomputer, in 1975.
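A Turing Machine is an abstract device: a tape of symbols, a read/write head, and a finite table of state-transition rules. A minimal simulator sketch in Python (the rule table below, a unary incrementer, is an illustrative assumption, not one of Turing's own examples):

```python
def run_turing_machine(tape, rules, state="start"):
    """Simulate a one-tape Turing machine.
    rules maps (state, symbol) -> (new_symbol, move, new_state);
    move is +1 (right) or -1 (left); state 'halt' stops the machine."""
    cells = dict(enumerate(tape))  # sparse tape, blank cells read as '_'
    head = 0
    while state != "halt":
        symbol = cells.get(head, "_")
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Unary incrementer: scan right past the 1s, then write one more 1.
rules = {
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("1", +1, "halt"),
}
print(run_turing_machine("111", rules))  # 1111
```

Despite its simplicity, this model captures everything a modern computer can compute, which is why it remains the standard theoretical yardstick for computability.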
We cannot talk about computers without mentioning the birth of the internet. The internet, originally called the ARPAnet, started as a military computer network in 1969, an experimental project of the U.S. Department of Defense's Advanced Research Projects Agency (ARPA, later DARPA). Later, rather than running physical links from every institution to a single hub, the National Science Foundation (NSF) built a chain of connections in which institutions linked to their neighboring regional computing centers, which in turn connected to central supercomputing centers. This arrangement grew into a global system of computer networks that allows computers all over the world to communicate with each other and share information.
In 1992, the internet was still used primarily by scientists and academics. In 1995, large commercial Internet Service Providers (ISPs), such as MCI, Sprint, AOL, and UUNET, began offering service to large numbers of customers. The internet now connects millions of networks, reaching people everywhere in the world. In a sense, the history of computing has its origins at the outset of civilization: as towns and communities evolved, there was a need for increasingly sophisticated calculation. Much more recently, Xerox PARC's vision and research in the 1970s attained commercial success in the form of the mouse-driven graphical user interface, networked computers, laser printers, and notebook-style machines.
Today, the vision of ubiquitous computing anticipates a day when microchips will be found everywhere people go. The technology will be invisible and natural, responding to normal patterns of behavior. Computers will disappear, or rather become a transparent part of our physical environment, truly beginning an era of "one person, many computers."