It’s amazing how technology has advanced in just a few short decades. The differences between new and old computers are drastic, and the evolution from the first computers to modern ones has come a long way. In fact, who invented the computer is not a simple question to answer. Many inventors contributed to the history of computers, and a computer is a complex piece of machinery made up of many parts, each of which can be considered a separate invention. Konrad Zuse earned the semiofficial label of ‘inventor of the modern computer’ for his series of automatic calculators.
He invented calculators to help him with his lengthy engineering calculations. One of the most difficult aspects of doing a large calculation with either a slide rule or a mechanical adding machine was keeping track of all the running results. Konrad Zuse wanted to overcome this difficulty. He realized that an automatic calculator would require three basic elements that all computers must have: a control, a memory, and a calculator for the arithmetic.
Later, Professor John Atanasoff and Clifford Berry built the world’s first electronic-digital computer at Iowa State University between 1939 and 1942. The Atanasoff-Berry Computer represented several innovations in computing, including a binary system of arithmetic, parallel processing, regenerative memory, and a separation of memory and computing functions. Presper Eckert and John Mauchly were the first to patent a digital computing device, the ENIAC computer, although a patent infringement case (Sperry Rand v. Honeywell, 1973) later voided the ENIAC patent, ruling it derivative of John Atanasoff’s invention. Atanasoff was quite generous, telling reporters, “there is enough credit for everyone in the invention and development of the electronic computer.” It was widely believed that Eckert and Mauchly originally invented the electronic-digital computer, and they received most of the credit for it. Historians now say that the Atanasoff-Berry Computer was the first.
The evidence comes from a cocktail napkin on the back of which John Atanasoff wrote the concepts of the first modern computer. “It was at an evening of scotch and 100 mph car rides,” John Atanasoff told reporters, “when the concept came, for an electronically operated machine, that would use base-two (binary) numbers instead of the traditional base-10 numbers, condensers for memory, and a regenerative process to preclude loss of memory from electrical failure.” (Campbell-Kelly, Martin, and William Aspray, Computer: A History of the Information Machine (The Sloan Technology Series), Basic Books, New York, NY, 1996) These primitive computers are a far cry from a modern iMac.
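The base-two idea Atanasoff described is the same one modern machines still use. A small illustrative sketch in Python (the language and function name are our own, just for illustration) shows how a base-10 number maps to base-2 by repeated division:

```python
# Convert a base-10 number to its base-2 (binary) representation
# by repeatedly dividing by 2 and collecting the remainders.
def to_binary(n):
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2
    return "".join(reversed(bits))  # digits come out lowest-first

print(to_binary(13))  # prints 1101, i.e. 8 + 4 + 0 + 1
```

The machine stores only these two digit values, which is why condensers (capacitors) that are either charged or discharged could serve as memory.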
On the Apple Computer web site, they boast that the iMac G5 removed the extraneous, miniaturized the necessary, souped up the performance, and concealed the result in immaculate perfection. The G5 processor makes everything “zippier.” It connects to the web and e-mail, creates movies, songs, and DVDs, arranges photos, and plays music. It comes with a 1.6 or 1.8 GHz G5 processor that’s ready to run modern 64-bit applications under the secure and stable Mac OS X operating system. The iMac G5 tucks away all the modern amenities in its two-inch-thin body, such as a slot-loading SuperDrive or Combo drive.
It can burn DVD slideshows of vacation photos or send a friend a DVD with a special movie for the holidays. The iMac also comes with the music software iTunes, with which a person can make a customized mix CD of their favorite songs. Much has happened in computer technology since the early stages of computers. Two of the greatest advances in computer technology happened in the ’60s and early ’70s. In 1958 the “chip” was invented. Two separate inventors, unaware of each other’s activities, invented almost identical integrated circuits, or ICs, at nearly the same time.
Jack Kilby of Texas Instruments and Robert Noyce, co-founder of the Fairchild Semiconductor Corporation, both saw that a complex electronic machine, like a computer, needed to increase the number of components involved in order to make technical advances. The monolithic integrated circuit (formed from a single crystal) combined the previously separate transistors, resistors, capacitors, and all the connecting wiring into a single crystal (or “chip”) made of semiconductor material. Kilby used germanium, while Noyce used silicon for the semiconductor material.
In 1961 the first commercially available integrated circuits came from the Fairchild Semiconductor Corporation. All computers then started to be made using chips instead of individual transistors and their accompanying parts. Texas Instruments first used the chips in Air Force computers and the Minuteman missile in 1962, and later used them to produce the first electronic portable calculators. The original IC had only one transistor, three resistors, and one capacitor, and was the size of an adult’s pinkie finger. Today an IC smaller than a penny can hold 125 million transistors.
Douglas Engelbart changed the way computers worked, from specialized machinery that only a trained scientist could use to a user-friendly tool that almost anyone can use. He invented or contributed to several interactive, user-friendly technologies: the computer mouse, windows, computer video teleconferencing, hypermedia, groupware, email, the Internet, and more. In 1964, the first prototype computer mouse was made to use with a graphical user interface (GUI), “windows.” Engelbart received a patent for the wooden shell with two metal wheels in 1970, describing it in the patent application as an “X-Y position indicator for a display system.” “It was nicknamed the mouse because the tail came out the end,” Engelbart revealed about his invention. (Computer History Museum) His version of windows was not considered patentable (no software patents were issued at that time), but Douglas Engelbart holds over 45 other patents in other computer technologies. Throughout the ’60s and ’70s, while working in his own lab, Engelbart dedicated himself to creating a hypermedia groupware system called NLS (for oN-Line System). Most of his accomplishments, including the computer mouse and windows, were part of NLS.
Douglas Engelbart’s work inspired the windowing system later developed at Xerox, the original “windows”-style program. Before that, people used a “command-line interface,” such as DOS, which required typed commands or series of commands to make the computer process information. Since Engelbart never received a patent for this system, during the late ’70s and ’80s Apple and Microsoft, early computer companies looking for the “next big thing,” were free to build on it. Apple believed it had found that next big thing in Engelbart’s system, while at the same time Microsoft, led by Bill Gates, sought to “steal” the idea and market it. The two companies developed separate systems to compete with each other.
The first popular computer to ship with a mouse and offer a graphical user interface (where you click on little pictures called “icons,” as opposed to typing archaic commands) was the Macintosh in 1984. Since then Apple and Microsoft have been bitter rivals, with Microsoft establishing dominance in the market and becoming one of the most profitable companies in the world. The modern advancement in computer hardware has been amazing. Intel has developed an innovative transistor structure and new materials that represent a dramatic improvement in transistor speed, power efficiency, and heat reduction.
“The technology development is an important milestone in the effort to maintain the pace of Moore’s Law and remove the technical barriers that Intel and the semiconductor industry have only recently begun to identify.” (Georges Ifrah) This technological breakthrough, coupled with recent announcements from Intel on faster and smaller transistors, will enable powerful new applications such as real-time voice and face recognition, computing without keyboards, and smaller computing devices with higher performance and improved battery life.
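Moore’s Law, mentioned in the quote above, is the observation that the number of transistors on a chip doubles roughly every two years. A small Python sketch (the figures are the essay’s own: one transistor in the original 1958 IC, 125 million today; the function name is ours) shows how quickly such doubling compounds:

```python
# Moore's Law sketch: transistor counts double roughly every two years.
# Count how many doublings it takes to grow from a starting count
# to at least a target count.
def doublings_to_reach(start, target):
    count, doublings = start, 0
    while count < target:
        count *= 2
        doublings += 1
    return doublings

d = doublings_to_reach(1, 125_000_000)
print(d, "doublings, i.e. roughly", 2 * d, "years")  # 27 doublings, roughly 54 years
```

Twenty-seven doublings take one transistor past 125 million (2^27 is about 134 million), which at two years per doubling is on the order of five decades, matching the span from the first IC to the chips the essay describes.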