The History of Computing

Babbage's Difference Engine and the Analytical Engine. Convinced his machine would benefit England, Babbage applied for and received one of the first government grants to build the Difference Engine. Hampered by nineteenth-century machine technology, cost overruns, and the possibility that his chief engineer was padding the bills, Babbage completed only a portion of the Difference Engine before the government withdrew its support in 1842, deeming the project "worthless to science". Meanwhile, Babbage had conceived the idea of a more advanced "analytical engine". In essence, this was a general-purpose computer that could add, subtract, multiply, and divide in automatic sequence at a rate of 60 additions per second.
His 1833 design, which called for thousands of gears and drives, would have covered the area of a football field and been powered by a locomotive engine. Babbage worked on this project until his death. In 1991, London's Science Museum spent $600,000 to build a working model of the Difference Engine using Babbage's original plans. The result stands 6 feet high and 10 feet long, contains 4,000 parts, and weighs 3 tons.

The Honeywell 400 and the Second Generation of Computers. The invention of the transistor signaled the start of the second generation of computers (1959-1964). Transistorized computers were more powerful, more reliable, less expensive, and cooler to operate than their vacuum-tubed predecessors. Honeywell established itself as a major player in the second generation of computers. Burroughs, Univac, NCR, CDC, and Honeywell, IBM's biggest competitors during the 1960s and early 1970s, became known as the BUNCH.

The IBM System 360 and the Third Generation of Computers. The third generation was characterized by computers built around integrated circuits. Of these, some historians consider IBM's System 360 line of computers, introduced in 1964, the single most important innovation in the history of computers. System 360 was conceived as a family of computers with upward compatibility; when a company outgrew one model, it could move up to the next model without worrying about converting its data. System 360 and the other lines built around integrated circuits made all previous computers obsolete, but the advantages were so great that most users wrote off the costs of conversion as the price of progress.

In the early 1960s, Dr. Thomas Kurtz and Dr. John Kemeny of Dartmouth College began developing a programming language that a beginner could learn and use quickly. Their work culminated in 1964 with BASIC. Over the years, BASIC gained widespread popularity and evolved from a teaching language into a versatile and powerful language for both business and scientific applications. From micros to mainframes, BASIC is supported on more computers than any other language.

Although most computer vendors would classify their computers as fourth generation, most people pinpoint 1971 as that generation's beginning. That was the year large-scale integration of circuitry (more circuits per unit of space) was introduced. The base technology, though, is still the integrated circuit. This is not to say that two decades have passed without significant innovations. In truth, the computer industry has experienced a mind-boggling succession of advances in the further miniaturization of circuitry, data communications, and the design of computer hardware and software.

In 1968, seventh-grader Bill Gates and ninth-grader Paul Allen were teaching a computer to play Monopoly and commanding it to play millions of games to discover gaming strategies. Seven years later, in 1975, they set a course that would revolutionize the computer industry. While at Harvard, Gates and Allen developed a BASIC programming language for the first commercially available microcomputer, the MITS Altair. After successful completion of the project, the two formed Microsoft Corporation, now the largest and most influential software company in the world. Microsoft was given an enormous boost when its operating system software, MS-DOS, was selected for use by the IBM PC. Gates, now the richest man in the world, provides the company's vision on new product ideas and technologies.

Not until 1975 and the introduction of the Altair 8800 personal computer was computing made available to individuals and very small companies. This event has forever changed how society perceives computers. One prominent entrepreneurial venture during the early years of personal computers was the Apple II computer. Two young computer enthusiasts, Steven Jobs and Steve Wozniak (then 21 and 26 years of age, respectively), collaborated to create and build their Apple II computer on a makeshift production line in Jobs's garage. Seven years later, Apple Computer earned a spot on the Fortune 500, a list of the 500 largest corporations in the United States.

In 1981, IBM tossed its hat into the personal computer ring with its announcement of the IBM Personal Computer, or IBM PC. By the end of 1982, 835,000 had been sold. When software vendors began to orient their products to the IBM PC, many companies began offering IBM PC compatibles, or clones. Today the IBM PC and its clones have become a powerful standard for the microcomputer industry.

Mitchell Kapor is one of the major forces behind the microcomputer boom of the 1980s. In 1982, Kapor founded Lotus Development Corporation, now one of the largest applications software companies in the world. Kapor and the company introduced an electronic spreadsheet product that gave IBM's recently introduced IBM PC (1981) credibility in the business marketplace. Sales of the IBM PC and the electronic spreadsheet, Lotus 1-2-3, soared.

Microsoft introduced Windows, a GUI for IBM PC-compatible computers, in 1985; however, Windows did not enjoy widespread acceptance until 1990, with the release of Windows 3.0. Windows 3.0 gave a huge boost to the software industry because larger, more complex programs could now be run on IBM PC compatibles. Subsequent releases, including Windows 95, Windows NT, and Windows 98, made personal computers even easier to use, fueling the PC explosion of the 1990s.

Biography of Tim Berners-Lee

Tim Berners-Lee graduated from the Queen's College at Oxford University, England, in 1976. Whilst there, he built his first computer with a soldering iron, TTL gates, an M6800 processor, and an old television.

He spent two years with Plessey Telecommunications Ltd (Poole, Dorset, UK), a major UK telecom equipment manufacturer, working on distributed transaction systems, message relays, and bar code technology.

In 1978, Tim left Plessey to join D.G. Nash Ltd (Ferndown, Dorset, UK), where he wrote, among other things, typesetting software for intelligent printers and a multitasking operating system.

A year and a half spent as an independent consultant included a six-month stint (June-December 1980) as a consultant software engineer at CERN, the European Particle Physics Laboratory in Geneva, Switzerland. Whilst there, he wrote, for his own private use, his first program for storing information using random associations. Named "Enquire" and never published, this program formed the conceptual basis for the future development of the World Wide Web.

From 1981 until 1984, Tim worked at John Poole's Image Computer Systems Ltd, with technical design responsibility. Work here included real time control firmware, graphics and communications software, and a generic macro language. In 1984, he took up a fellowship at CERN, to work on distributed real-time systems for scientific data acquisition and system control. Among other things, he worked on FASTBUS system software and designed a heterogeneous remote procedure call system.

In 1989, he proposed a global hypertext project, to be known as the World Wide Web. Based on the earlier "Enquire" work, it was designed to allow people to work together by combining their knowledge in a web of hypertext documents. He wrote the first World Wide Web server, "httpd", and the first client, "WorldWideWeb", a what-you-see-is-what-you-get hypertext browser/editor which ran in the NeXTStep environment. This work was started in October 1990, and the program "WorldWideWeb" was first made available within CERN in December and on the Internet at large in the summer of 1991.

Through 1991 and 1993, Tim continued working on the design of the Web, coordinating feedback from users across the Internet. His initial specifications of URIs, HTTP and HTML were refined and discussed in larger circles as the Web technology spread.
In 1994, Tim joined the Laboratory for Computer Science (LCS) at the Massachusetts Institute of Technology (MIT) as Director of the W3 Consortium, which coordinates W3 development worldwide, with teams at MIT, at INRIA in France, and at Keio University in Japan. The Consortium takes as its goal to lead the Web to its full potential, ensuring its stability through rapid evolution and revolutionary transformations of its usage.
