Computer Science Terms


Abacus

A manual computing device consisting of a frame holding parallel rods strung with movable counters.

The von Neumann Architecture (also called the von Neumann Model or the Princeton Architecture) (1945)

A computer architecture based on that described in 1945 by the mathematician and physicist John von Neumann and others in the First Draft of a Report on the EDVAC.
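
The defining idea of the von Neumann architecture is that program and data live in one shared memory, and a control unit repeatedly fetches, decodes, and executes instructions from it. A minimal sketch in Python can make this concrete; the tiny instruction set (LOAD/ADD/STORE/HALT) and the memory layout here are invented for illustration, not taken from any real machine.

```python
# Toy stored-program machine: instructions and data share one memory list,
# and the loop below is the classic fetch-decode-execute cycle.

def run(memory):
    """Execute a toy stored program; instructions are (opcode, operand) pairs."""
    acc = 0   # single accumulator, in the spirit of early machines
    pc = 0    # program counter
    while True:
        op, arg = memory[pc]          # fetch
        pc += 1
        if op == "LOAD":              # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program occupies cells 0-3; its data lives in cells 4-6 of the same memory.
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None), 2, 3, 0]
print(run(mem)[6])  # → 5
```

Because the program is itself data in memory, it can be loaded, replaced, or even modified like any other data, which is exactly what distinguished stored-program machines from earlier plugboard-programmed computers.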

Timeline

500 B.C. - The Babylonians begin using the abacus as an aid to simple arithmetic around this time.
1623 - Wilhelm Schickard (1592-1635), of Tuebingen, Wuerttemberg (now in Germany), makes a "Calculating Clock." This mechanical machine could add and subtract six-digit numbers and warned of an overflow by ringing a bell. Operations were carried out by wheels; a complete revolution of the units wheel incremented the tens wheel.
1642 - Blaise Pascal, a French mathematician, builds a mechanical adding machine (the "Pascaline"). Although it was more limited than the "Calculating Clock," Pascal's machine became far better known. He sold about a dozen of his machines in various forms; they could deal with up to eight digits.
1804 - Joseph-Marie Jacquard develops an automatic loom controlled by punched cards.
1822 - Charles Babbage (1792-1871) designs his first mechanical computer, the original prototype for the Difference Engine. Babbage designed two machines, the Difference Engine and the Analytical Engine (a general-purpose mathematical device). Although both worked in theory, they were much too complex to be built properly at the time.
1834 - Babbage conceives, and begins to design, his "Analytical Engine." The program was to be stored on read-only memory, specifically in the form of punched cards. He continued to work on the design for years, though there were only minor changes after 1840. The machine would operate on 40-digit numbers; the "mill" (CPU) would have two main accumulators and some auxiliary ones for specific purposes, while the "store" (memory) would hold 100 more numbers. There would be several punched-card readers, for both programs and data; the cards would be chained, and the motion of each chain could be reversed.
1890 - The 1880 census had taken seven years to complete, since all processing was done by hand from journal sheets. The growing population suggested that processing the 1890 census would take longer than the ten years before the next census, so a competition was held to find a better method. It was won by a Census Department employee, Herman Hollerith, who went on to form the Tabulating Machine Company, later to become IBM. Hollerith borrowed Babbage's idea of using punched cards from the textile industry for data storage. The method was used in the 1890 census, and the result (62,622,250 people) was released in just six weeks! The new storage allowed much more in-depth analysis of the data, yet despite being more efficient, the 1890 census cost about double (actually 198%) that of the 1880 census.
1899 - "Everything that can be invented has already been invented." - Charles H. Duell, director of the U.S. Patent Office.
1906 - The electronic tube (or electronic valve) is developed by Lee De Forest in America. Without this invention, digital electronic computers would have been impossible.
1924, February - International Business Machines (IBM) is formed.
1935 - IBM introduces the "IBM 601," a punch-card machine with an arithmetic unit based on relays, capable of doing a multiplication in one second. The machine becomes important in both scientific and commercial computation, and about 1500 of them are eventually made.
1937 - Alan M. Turing (1912-1954), of Cambridge University, England, publishes a paper on "computable numbers" - the mathematical theory of computation. The paper solves a mathematical problem, but the solution is achieved by reasoning (as a mathematical device) about the theoretical simplified computer known today as a Turing machine.
1939, January 1 - Hewlett-Packard is formed by William Hewlett and David Packard in a garage in California. They tossed a coin to determine the order of names in the company.
1939 - The Second World War begins and spurs many improvements in technology, eventually leading to machines such as the Colossus, which helped win the war by breaking intercepted ciphers.
1943 - Computers built between 1943 and 1959 (though some argue the first date should be 1951, with the UNIVAC I) are usually regarded as "first generation," based on valves and wire circuits. They are characterized by the use of punched cards and vacuum tubes, and all programming was done in machine code. A typical machine of the era was the UNIVAC.
1943 - "I think there is a world market for maybe five computers." - Thomas Watson, chairman of IBM.
1943, January - The Harvard Mark I is built at Harvard University by Howard H. Aiken and his team, partially financed by IBM. It became the first program-controlled calculator. The whole machine is 51 feet long, weighs 5 tons, and incorporates 750,000 parts. It was used to create ballistics tables for the US Navy.
1943, April - Max Newman, Wynn-Williams, and their team (including Alan Turing) complete the "Heath Robinson," a specialized cipher-breaking machine - not a general-purpose calculator or computer, but a logic device using a combination of electronics and relay logic. (Heath Robinson was a British cartoonist known for drawings of comical machines, like the American Rube Goldberg. Two later machines in the series were named after London stores with "Robinson" in their names.)
1943, December - The earliest programmable electronic computer, the Colossus, first runs in Britain. It contained 2400 vacuum tubes for logic and was built by Dr Thomas Flowers at the Post Office Research Laboratories in London to crack the German Lorenz teleprinter cipher (not, as is often stated, the Enigma code). Colossus was used at Bletchley Park during WWII as a successor to the Heath Robinson. It processed an amazing 5000 characters a second and used punched tape for input. Although 10 were eventually built, they were destroyed immediately after they had finished their work - the technology was so advanced that there was to be no possibility of its design falling into the wrong hands (presumably the Russians).
1944 - By this time George Stibitz (at Bell Labs) and John Atanasoff had built early calculators using binary arithmetic; Atanasoff's machine used capacitors for memory.
mid 1940s - John von Neumann introduces the concept of the stored-program computer: a machine containing memory, an arithmetic logic unit, a control unit, and input and output equipment.
1946 - The ENIAC (Electronic Numerical Integrator and Computer) is one of the first totally electronic, valve-driven, digital computers. Development was started in 1943 and finished in 1946 by John W. Mauchly and J. Presper Eckert.
1947, December - William B. Shockley, John Bardeen, and Walter H. Brattain invent the transistor at Bell Laboratories.
1949 - "Computers in the future may weigh no more than 1.5 tons." - Popular Mechanics, forecasting the relentless march of science.
1950 - The floppy disk is invented at the Imperial University in Tokyo by Doctor Yoshiro Nakamats.
1951 - The first commercially successful electronic computer, the UNIVAC I, is also the first general-purpose computer, designed to handle both numeric and textual information. It was delivered to the U.S. Bureau of the Census in 1951 and used magnetic tape for input.
1953 - Estimates put the number of computers in the world at around 100.
1954 - Development of FORTRAN (FORmula TRANslation) is started by John Backus and his team at IBM (continuing until 1957). FORTRAN is a programming language used for scientific programming.
1957 - "I have traveled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year." - The editor in charge of business books for Prentice Hall.
1958, September 12 - The integrated circuit is invented by Jack St Clair Kilby at Texas Instruments. Robert Noyce, who later set up Intel, also worked separately on the invention; Intel was the first to perfect the microprocessor.
1959 - DEC introduces the PDP-1, its first computer; the later PDP-8 series popularized the "omnibus" design. A bus is a collection of parallel wires used to connect the components of a computer.
1959 - Computers built between 1959 and 1964 are often regarded as "second generation," based on transistors and printed circuits - resulting in much smaller computers. More powerful, these machines could run high-level languages such as FORTRAN (for science) or COBOL (for business), which accepted English-like commands, and so were much more flexible in their applications.
1959 - COBOL (COmmon Business-Oriented Language) is developed by Grace Murray Hopper and finished in 1961.
1961, April - The IBM 7030 ("Stretch") is delivered to Los Alamos. Data was organized as bytes, and the machine used magnetic disks. It was slower than anticipated, far more expensive, and completed well past its deadline. Although Stretch was perceived as a failure, it contributed to the development of later IBM machines.
1964 - IBM takes a revolutionary step, introducing a single product line, the IBM System/360 series - revolutionary because it was the first computer family designed for both scientific and commercial use. Its use of the integrated circuit ends the second generation of computing and starts the third.
1964 - Computers built between 1964 and 1972 are often regarded as "third generation," based on the first integrated circuits - creating even smaller machines. Typical of such machines was the IBM 360 series mainframe, while smaller minicomputers began to open up computing to smaller businesses.
1965 - BASIC (Beginners All-purpose Symbolic Instruction Code) is developed at Dartmouth College by Thomas E. Kurtz and John Kemeny.
1965 - The mouse is conceived by Douglas Engelbart, but did not become popular until 1983 with the Apple computers. Amazingly, the mouse was not adopted by IBM until 1987.
1968 - Intel is founded by Robert Noyce and a few friends.
1968 - "But what ... is it good for?" - Engineer at the Advanced Computing Systems Division of IBM, commenting on the microchip.
1969 - ARPANET is started by the US Dept. of Defense for research into networking. It is the original basis for what now forms the Internet. It was opened to non-military users later in the 1970s, and many universities and large businesses went on-line. It was eventually dubbed the "Information Superhighway" by US vice-president Al Gore.
1970 - The first RAM chip is introduced by Intel. Called the 1103, it had a capacity of 1 K-bit: 1024 bits.
1970 - Development of the UNIX operating system begins. It was later released as C source code to aid portability, and versions subsequently became available for many different computers, including the IBM PC. It and its clones (such as Linux) are still widely used on network and Internet servers.
1971, November 15 - The first microprocessor, the 4004, developed by Marcian E. Hoff for Intel, is released. It contains the equivalent of 2300 transistors, is a 4-bit processor, and is capable of around 60,000 operations per second (0.06 MIPS), running at a clock rate of 108 kHz.
1972 - Atari is founded by Nolan Bushnell, who also created Pong. Pong is released the same year and is widely recognized as the first popular arcade video game.
1972 - Computers built after 1972 are often called "fourth generation" computers, based on LSI (Large Scale Integration) of circuits (such as microprocessors), typically with 500 or more components on a chip.
1972 - The C programming language is developed at Bell Laboratories by Dennis Ritchie (one of the inventors of the UNIX operating system); its predecessor was the B programming language, also from Bell Laboratories. C is a very popular language, especially for systems programming, since it is flexible and fast. C++, which added support for object-oriented programming, was introduced in the early 1980s.
1972 - The first handheld scientific calculator is released by Hewlett-Packard. The engineer's slide rule is at last obsolete.
1972 - The first international connections to ARPANET are established. ARPANET later became the basis for what we now call the Internet.
1975 - Microsoft is formed by Bill Gates and Paul Allen. It is now one of the most powerful and successful computing companies - a distinct improvement on the pair's original company, Traf-O-Data, which made car counters for highway departments. In just 3 years Microsoft achieved revenues of $500,000 and employed 15 people; by 1992 this had grown to revenues of $2.8 billion (50% from exports) and over 10,000 employees - a fantastic feat for a company less than 20 years old. Microsoft's big break occurred when it was asked to write the operating system for the IBM PC, released in 1981. Although financially not as large as IBM, Microsoft has a huge amount of influence in the computing industry.
1976 - The Cray 1 is the first commercially developed supercomputer. It contained 200,000 integrated circuits and was freon-cooled. It could perform 150 million floating-point operations per second, and it is now the basis of an informal measurement of supercomputer power; by the mid-1990s machines had reached the 1000-"cray" mark! In 1992 the fastest computer was the Cray-2, which could do around 250 million floating-point operations per second. Cray has continued to develop even more powerful computers, such as the Cray Y-MP/832. Such supercomputers are used for weather forecasting, complex math and physics problems, and animation in modern films.
1977 - "There is no reason anyone would want a computer in their home." - Ken Olson, president, chairman, and founder of Digital Equipment Corp.
1977, May - The Apple II computer is introduced, selling for $1295. It becomes the first successful computer for home use.
1978, June 8 - Intel introduces the 8086, the first commercially successful 16-bit processor. It was too expensive to implement in early computers, so an 8-bit-bus version was developed (the 8088), which IBM chose for the first IBM PC. This ensured the success of the x86 family of processors that succeeded the 8086, since they and their clones are used in every IBM PC-compatible computer.
1979 - The Apple II+ is released and sells for $1195. The compact disc (CD) is invented.
1979 - The 68000 microprocessor is launched by Motorola. It is used by Apple for the Macintosh and by Atari for the ST series. Later versions of the processor include the 68020, used in the Macintosh II.
1979 - IBM sees its computer-market dominance being eaten into by the new personal computers, such as the Apple and the Commodore PET, and starts work on its own PC. The computer had to be a state-of-the-art machine in order to compete, but it also had to be produced very quickly given the amazing growth of competitors, so IBM decided to use many third-party vendors to reduce development time. Microsoft was commissioned to write the operating system. The IBM PC was released August 12, 1981.
1980 - "DOS addresses only 1 Megabyte of RAM because we cannot imagine any applications needing more." - Microsoft, on the development of DOS.
1981 - "640K ought to be enough for anybody." - Bill Gates.
1982 - The TCP/IP protocol is established. This is the protocol that carries most of the information across the Internet.
1983, Spring - The IBM XT is released. It was fitted with the 8088 and had room for an 8087 math co-processor to be installed. It also had a 10MB hard disk, 128K of RAM, one floppy drive, a mono monitor, and a printer - all for $5000!
1984, January - The Apple Macintosh is released. It is powered by a revolutionary "event-driven" operating system, making it a milestone in the computer industry. Many of the concepts popularized by the Mac were actually conceptualized earlier by researchers at the Xerox Palo Alto Research Center but were not considered useful (some were considered "absurd" by computer manufacturers).
1984 - DNS (the Domain Name System) is introduced to the Internet, which then consisted of about 1000 hosts.
1985 - Tetris is written by the Russian Alexey Pazhitnov. It was later released for various Western game machines, the jewel in its crown being its inclusion with Nintendo's Game Boy in 1989. Alexey made nothing from the game, since under the Communist regime it was owned by the people. After the collapse of Communism he was able to move to the USA, where he now works for Microsoft.
1985 - The CD-ROM, invented by Philips, is produced in collaboration with Sony.
1985, November - Microsoft Windows is launched. It was not widely used until version 3, released in 1990. Windows required DOS to run, and so was not a complete operating system; it merely provided a GUI (graphical user interface) similar to that of the Macintosh - so similar, in fact, that Apple tried to sue Microsoft for copying the "look and feel" of its operating system. The court case was not dropped until August 1997.
1989 - The World Wide Web (WWW) is invented by Tim Berners-Lee, who saw the need for a global information exchange that would let physicists collaborate on research (he was working at CERN, the European particle physics laboratory in Switzerland, at the time). The Web was the result of integrating hypertext with the Internet. Its hyperlinked pages not only provided information but gave transparent access to older Internet facilities such as ftp, telnet, Gopher, WAIS, and USENET.
1990, May 22 - Microsoft introduces Windows 3.0. It is a true multitasking system on 80386-class machines (on lesser processors it runs in "Real" mode and only simulates multitasking) and maintains compatibility with MS-DOS; on the 80386 it even allows programs to multitask that were not designed to do so. This created a real threat to the Macintosh, and despite a similar product, IBM's OS/2, Windows 3.0 was very successful. Various improvements came in versions 3.1 and 3.11, but the next major step did not arrive until Windows 95, which relied much more heavily on the features of the 80386 and provided support for 32-bit applications.
1991, August - Linux is born with the following post to the Usenet newsgroup comp.os.minix: "Hello everybody out there using minix - I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones." The poster was a Finnish college student, Linus Torvalds, and this hobby grew from these humble beginnings into one of the most widely used UNIX-like operating systems in the world today. It now runs on many different types of computers, including the Sun SPARC and the Compaq Alpha, as well as many ARM, MIPS, PowerPC, and Motorola 68000 based machines.
1992, April - Windows 3.1 is introduced.
1992, May - Wolfenstein 3D is released by Id Software Inc.
1993, March 22 - Intel's Pentium is released. At the time it was available only in 60 and 66 MHz versions, which achieved up to 100 MIPS with over 3.1 million transistors.
1994 - Netscape 1.0 is written as an alternative browser to NCSA Mosaic. NCSA Mosaic's graphical interface made the Web accessible to the novice user, and Web popularity exploded; the explosion started in earnest during 1993, a year in which web traffic over the Internet increased by 300,000%. Berners-Lee was later awarded the Institute of Physics' 1997 Duddell Medal for his contribution to the advancement of knowledge.
1995 - "I predict the Internet ... will go supernova and in 1996 will catastrophically collapse!" - Bob Metcalfe, 3Com founder.
1995, August 24 - Windows 95 is launched by Bill Gates & Microsoft. Unlike previous versions of Windows, Windows 95 is an entire operating system and does not rely on MS-DOS (although some remnants of the old operating system still exist). It was written specially for the 80386 and compatible processors to make "full" use of their 32-bit processing and multitasking capabilities.
1995, November 1 - The Pentium Pro is released. At introduction it reached a clock speed of up to 200 MHz (150, 166, and 180 MHz variants were released on the same date), but it is basically the same as the Pentium in terms of instruction set and capabilities. It achieves 440 MIPS and contains 5.5 million transistors - nearly 2400 times as many as the first microprocessor, the 4004, and capable of 70,000 times as many instructions per second.
1995, December 28 - CompuServe blocks access to over 200 sexually explicit sites, partly to avoid confrontation with the German government. Access to all but 5 sites was restored on February 13, 1996.
1995, December - JavaScript development is announced by Netscape.
1997, May 11 - IBM's Deep Blue becomes the first computer to beat a reigning world chess champion, Garry Kasparov, in a full chess match. The computer had played him previously, in February 1996, and lost the match.
1997 - "Folks, the Mac platform is through - totally." - John Dvorak, PC Magazine.
1997, May 7 - Intel releases the Pentium II processor (233, 266, and 300 MHz versions). It featured a much larger on-chip cache as well as an extended instruction set.
1997, August 6 - After 18 months of losses, Apple is in serious financial trouble. Microsoft invests in Apple, buying 100,000 non-voting shares worth $150 million - a decision not approved of by many Apple owners! One of the conditions was that Apple drop its long-running court case, the attempt to sue Microsoft for copying the "look and feel" of its operating system when designing Windows. It must be pointed out that Apple itself copied the Xerox Star system when designing its WIMP interface.
1998, April - A U.S. court finally bans the long-running game of buying domain names relating to trademarks and then selling them at extortionate prices to the companies that own the trademarks. The case centered on a man from Illinois who bought www.panavision.com in 1995 and tried to sell it for $13,000. The going commercial rate for domain-name registration at the time was around $100.
1998, June 25 - Microsoft releases Windows 98. Some U.S. attorneys tried to block its release, since the new OS integrates tightly with other programs such as Microsoft Internet Explorer, effectively closing the market for such software to other companies. Microsoft fought back with a letter to the White House, backed by 26 of its industry allies, suggesting that a delay in the release of the new OS could damage the U.S. economy. The main selling points of Windows 98 were its support for USB and for disk partitions greater than 2.1GB.
1999, August 31 - Apple releases the PowerMac G4, powered by the PowerPC G4 chip from Apple, Motorola, and IBM. Available in 400 MHz, 450 MHz, and 500 MHz versions, it was claimed to be the first personal computer capable of over one billion floating-point operations per second.
2000, February 17 - Official launch of Windows 2000, Microsoft's replacement for Windows 95/98 and Windows NT. It claimed to be faster and more reliable than previous versions of Windows. It is actually a descendant of the NT series, so the trade-off for increased reliability is that it won't run some old DOS-based games. To keep the home market happy, Microsoft also released Windows ME, the newest member of the 95/98 series.
2000, March 8 - Intel releases a very limited supply of the 1 GHz Pentium III chip.
2000, June 20 - British Telecom (BT) claims the rights to hyperlinks on the basis of a US patent granted in 1989 (similar patents in the rest of the world have expired). The claim is widely regarded as absurd, since Ted Nelson wrote about hyperlinks in 1965 - which is where Tim Berners-Lee says he got the ideas for the World Wide Web. This is just another in a line of similarly dubious cases - for example, amazon.com's claim to have patented "1-click ordering."
2001, March 24 - Mac OS X 10.0, a UNIX-based system and a radical departure from Apple's previous operating systems, is officially released. It is a quiet release, since Apple wants to hold a major launch event in July at Macworld Expo, when Mac OS X 10.1 ships.
2001, April 18 - Apple announces a quarterly profit of $43 million, with Mac OS X generating $19 million in sales. Apple also announces that it has shipped its 5 millionth iMac, making it the most successful personal computer ever.
2001, October - Apple announces the iPod.
2001, October 25 - Microsoft releases Windows XP, the latest version of its Windows operating systems. Based on the NT-series kernel, it is intended to bring the NT/2000 series and the Windows 95/98/ME series together into one product. Of course, it was originally hoped that this would happen with Windows 2000... only time will tell whether Microsoft succeeds with Windows XP.
2001, November 15 - Microsoft releases its game console, the Xbox. It costs $299 and can connect to the Internet for multiplayer gaming.
2002, May - Hewlett-Packard merges with Compaq Computer, forming the second-largest IT company on earth.
2002, August - Intel releases the Itanium 2 processor, the second member of the Itanium processor family, bringing the volume economics of the Intel Architecture to high-end computing. It targets databases, computer-aided engineering, secure online transactions, and more.
2003, May - Apple announces the G5 desktop, billed as the first 64-bit machine commercially available to the public. The G5 processor was developed in cooperation with IBM.
2004 - Apple announces the iPod Mini.
2005, January - Apple announces the iPod Shuffle and the color iPod.
2005, July - Microsoft announces that its next operating system, codenamed "Longhorn," will be named Windows Vista.

Howard H. Aiken (1900-1973)

An American physicist and a pioneer in computing, being the original conceptual designer behind IBM's Harvard Mark I computer.

John W. Mauchly

An American physicist who, along with J. Presper Eckert, designed ENIAC, the first general purpose electronic digital computer, as well as EDVAC, BINAC and UNIVAC I, the first commercial computer made in the United States.

Pascaline

An early mechanical calculator developed by Blaise Pascal in 1642. It was able to add and subtract two decimal numbers.

Blaise Pascal (19 June 1623 - 19 August 1662)

A French mathematician who built a mechanical adding machine (the "Pascaline"). Although it was more limited than the "Calculating Clock," Pascal's machine became far better known. He sold about a dozen of his machines in various forms; they could deal with up to eight digits.

Joseph Marie Charles dit (called or nicknamed) Jacquard (7 July 1752 - 7 August 1834)

A French weaver and merchant. He played an important role in the development of the earliest programmable loom (the "Jacquard loom"), which in turn played an important role in the development of other programmable machines: its punched cards inspired the card-based data processing that IBM later built on in the development of the modern computer.

COBOL (COmmon Business-Oriented Language)

Developed by Grace Murray Hopper and finished in 1961, it was the first widely used high-level programming language for business applications. The language is now considered obsolete, but many of the programs written in it are still in use today. Its convention of storing a year with only the last two digits (e.g., 99) caused the Y2K panic.
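
The two-digit-year problem described above is easy to demonstrate. This sketch (in Python rather than COBOL, purely for illustration) shows how date arithmetic breaks when the century is dropped, and one common remediation known as "windowing":

```python
# Y2K in miniature: with only two digits stored, the year 2000 reads as "00",
# so subtraction across the century boundary goes wrong.

def age_naive(birth_yy, current_yy):
    """Compute an age from two-digit years, as many legacy programs did."""
    return current_yy - birth_yy

# Someone born in 1965, as computed in 1999 and then in 2000:
print(age_naive(65, 99))  # → 34 (correct)
print(age_naive(65, 0))   # → -65 (the Y2K bug)

# A common fix: "windowing" - pivot two-digit years around a cutoff,
# so values at or above the pivot mean 19xx and values below mean 20xx.
def expand(yy, pivot=50):
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand(99), expand(0))  # → 1999 2000
```

Windowing only postpones the ambiguity (here, until 2050), which is why the permanent fix was to store four-digit years.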

John von Neumann (December 28, 1903 - February 8, 1957)

A Hungarian-American mathematician, physicist, inventor, computer scientist, and polymath. He made major contributions to a number of fields, including mathematics, physics, economics, computing, and statistics. He was a pioneer of the application of operator theory to quantum mechanics, in the development of functional analysis, and a key figure in the development of game theory and the concepts of cellular automata, the universal constructor, and the digital computer. His analysis of the structure of self-replication preceded the discovery of the structure of DNA. During World War II he worked on the Manhattan Project, developing the mathematical models behind the explosive lenses used in the implosion-type nuclear weapon. After the war, he served on the General Advisory Committee of the United States Atomic Energy Commission, and later as one of its commissioners. He was a consultant to a number of organizations, including the United States Air Force, the Army's Ballistic Research Laboratory, the Armed Forces Special Weapons Project, and the Lawrence Livermore National Laboratory. Along with theoretical physicist Edward Teller, mathematician Stanislaw Ulam, and others, he worked out key steps in the nuclear physics involved in thermonuclear reactions and the hydrogen bomb.

The Jacquard Loom (Machine)

A device fitted to a power loom that simplifies the process of manufacturing textiles with such complex patterns as brocade, damask and matelassé. It was invented by Joseph Marie Jacquard in 1804. (http://cs.gettysburg.edu/CompSci103/Alpha_Site/chapter_1/JacquardLoom.html)

The Difference Engine

A difference engine is an automatic mechanical calculator designed to tabulate polynomial functions. The name derives from the method of divided differences, a way to interpolate or tabulate functions using a small set of polynomial coefficients. It was designed by Charles Babbage.
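
The method the engine mechanized can be shown in a few lines: because a degree-n polynomial has a constant nth finite difference, every new table entry can be produced purely by addition, cascading the differences upward. This is an illustrative sketch of that scheme, not a model of Babbage's actual mechanism:

```python
# Tabulating a polynomial the Difference Engine's way: start from the initial
# value and its finite differences, then generate each new entry by additions
# alone - no multiplication needed.

def tabulate(initial_diffs, count):
    """Given [f(0), Δf(0), Δ²f(0), ...], return the first `count` values of f."""
    diffs = list(initial_diffs)
    values = []
    for _ in range(count):
        values.append(diffs[0])
        # Each difference absorbs the one below it, stepping the table forward.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# f(x) = x² has f(0) = 0, first difference 1 (= 1 - 0), second difference 2 (constant).
print(tabulate([0, 1, 2], 6))  # → [0, 1, 4, 9, 16, 25]
```

Reducing polynomial evaluation to repeated addition is precisely what made the engine feasible as a machine of gears and carry wheels.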

ARPANET (Advanced Research Projects Agency Network) (1969)

The first large-scale computer network, developed by the Advanced Research Projects Agency, a part of the U.S. Department of Defense. This network evolved into the Internet.

The Analytical Engine

A proposed mechanical general-purpose computer designed by English mathematician and computer pioneer Charles Babbage. It was first described in 1837 as the successor to Babbage's difference engine, a design for a mechanical computer.

URL (Uniform or Universal Resource Locator)

A reference to a web resource that specifies its location on a computer network and a mechanism for retrieving it. It is a specific type of Uniform Resource Identifier (URI), although many people use the two terms interchangeably. A URL implies the means to access the indicated resource, which is not true of every URI. URLs occur most commonly as references to web pages (http), but are also used for file transfer (ftp), email (mailto), database access (JDBC), and many other applications.
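
The parts named in the definition above can be pulled apart programmatically. This small example uses Python's standard-library `urllib.parse` module on a made-up URL:

```python
# Decompose a URL into its components: the scheme names the retrieval
# mechanism, while the remaining parts locate the resource.
from urllib.parse import urlparse

url = "https://www.example.com:8080/docs/page.html?lang=en#top"
parts = urlparse(url)

print(parts.scheme)    # → https  (how to retrieve the resource)
print(parts.netloc)    # → www.example.com:8080  (where on the network)
print(parts.path)      # → /docs/page.html  (which resource on that host)
print(parts.query)     # → lang=en
print(parts.fragment)  # → top
```

The same function handles the other schemes mentioned above (ftp, mailto, and so on), since the scheme is simply whatever precedes the first colon.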

Integrated Circuit (IC) (chip or microchip)

A semiconductor wafer on which thousands or millions of tiny resistors, capacitors, and transistors are fabricated. These devices are used in computers, computer networks, modems, and frequency counters. Logic gates are the fundamental building blocks of digital ICs that work with binary data, that is, signals that have only two different states, called low (logic 0) and high (logic 1). A single integrated circuit (IC), such as a microprocessor chip, can do the work of a set of vacuum tubes that would fill a large building and require its own electric generating plant.
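
The role of logic gates as building blocks can be made concrete in code. This sketch (invented for illustration) models binary signals as 0 and 1, builds the standard gates out of NAND alone - which is why NAND is called a "universal" gate - and combines them into a half adder, the first step toward the arithmetic circuits an IC contains:

```python
# Logic gates over binary signals (0 = low, 1 = high), all derived from NAND.

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Add two one-bit numbers, returning (sum, carry)."""
    return xor_(a, b), and_(a, b)

print(half_adder(1, 1))  # → (0, 1): 1 + 1 = binary 10
```

Chaining half adders (with an extra OR for the carries) yields a full adder, and a row of full adders yields the multi-bit adders fabricated by the thousands on the chips described above.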

The Colossus Computer (1943-1945)

A set of computers developed by British codebreakers to help in the cryptanalysis of the Lorenz cipher. They used thermionic valves (vacuum tubes) to perform Boolean and counting operations, and are thus regarded as the world's first programmable, electronic, digital computers - although they were programmed by switches and plugs, not by a stored program. They were designed by research telephone engineer Tommy Flowers to solve a problem posed by mathematician Max Newman at the Government Code and Cypher School (GC&CS) at Bletchley Park. Alan Turing's use of probability in cryptanalysis contributed to their design. It has sometimes been erroneously stated that Turing designed them to aid the cryptanalysis of the Enigma; Turing's machine that helped decode Enigma was the electromechanical Bombe. https://upload.wikimedia.org/wikipedia/commons/4/4b/Colossus.jpg

Vacuum Tube (1873 by Frederick Guthrie)

A vacuum tube is a device sometimes used to amplify electronic signals. In most applications, the vacuum tube is obsolete, having been replaced decades ago by the bipolar transistor and, more recently, by the field-effect transistor. However, tubes are still used in some high-power amplifiers, especially at microwave radio frequencies and in some hi-fi audio systems. They are also making a comeback among audiophiles who insist that tubes deliver better audio quality than transistors.

ENIAC (Electronic Numerical Integrator And Computer) (1946)

Amongst the earliest electronic general-purpose computers made. It was Turing-complete, digital, and could solve "a large class of numerical problems" through reprogramming. It weighed 30 tons and contained 18,000 electronic valves (vacuum tubes), consuming around 25 kW of electrical power. It is widely recognized as the first universal electronic computer and could do around 100,000 calculations a second. It was used for calculating ballistic trajectories and testing theories behind the hydrogen bomb. Formally dedicated at the University of Pennsylvania on February 15, 1946, it was heralded as a "Giant Brain" by the press. It had a speed on the order of one thousand (10³) times faster than that of electro-mechanical machines; this computational power, coupled with general-purpose programmability, excited scientists and industrialists alike. It calculated in 30 seconds a trajectory that took a human 20 hours (a 2,400× increase in speed). (John W. Mauchly and J. Presper Eckert)

Herman Hollerith (February 29, 1860 - November 17, 1929)

An American inventor who developed an electromechanical punched card tabulator to assist in summarizing information and, later, accounting. He was the founder of the Tabulating Machine Company, which was consolidated in 1911 with three other companies to form the Computing-Tabulating-Recording Company, later renamed IBM. He is regarded as one of the seminal figures in the development of data processing. He applied the punched-card concept of the Jacquard loom to computing: his method used cards to store data, which he fed into a machine that compiled the results mechanically. Hollerith is best remembered as the person who mechanized the U.S. census process by using punched cards to store the 1890 census data, which was then tabulated by machine. Punched-card data processing dominated the landscape for nearly a century.

Alan Mathison Turing (23 June 1912 - 7 June 1954)

An English computer scientist, mathematician, logician, cryptanalyst, and theoretical biologist. He was highly influential in the development of theoretical computer science, providing a formalisation of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general purpose computer. He is widely considered to be the father of theoretical computer science and artificial intelligence. He published a foundational paper on the mathematical theory of computation, and the Turing machine is named after him. Often called the "Father of Modern Computing".
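The Turing machine mentioned above is simple enough to simulate in a few lines: a tape of symbols, a read/write head, and a state-transition table. The sketch below is a toy illustration (the `run` function and rule format are invented for this example); this particular machine inverts a string of bits and halts on the blank symbol `_`:

```python
# A minimal Turing machine simulator: tape + head + transition table.
# Rules map (state, read_symbol) -> (write_symbol, move, next_state).
def run(tape, rules, state="start"):
    tape = list(tape)
    pos = 0
    while state != "halt":
        symbol = tape[pos] if pos < len(tape) else "_"  # blank past the end
        write, move, state = rules[(state, symbol)]
        if pos < len(tape):
            tape[pos] = write
        else:
            tape.append(write)
        pos += 1 if move == "R" else -1
    return "".join(tape).rstrip("_")

# A machine that flips every bit, moving right until it hits a blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("1011", invert))  # -> 0100
```

Despite this simplicity, Turing showed that such a machine (given a suitable rule table) can compute anything any algorithm can, which is why it serves as the theoretical model of a general purpose computer.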

Augusta Ada King-Noel, Countess of Lovelace (née Byron; 10 December 1815 - 27 November 1852)

An English mathematician and writer, chiefly known for her work on Charles Babbage's early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognised as the first algorithm intended to be carried out by a machine. As a result, she is often regarded as the first computer programmer.

FORTRAN (FORmula TRANslation)

An early programming language designed for use by engineers, mathematicians, and other users and creators of scientific algorithms. It has a very succinct and spartan syntax. Today, the C language has largely displaced FORTRAN.

The Bombe (1939; preceded by Marian Rejewski's 1938 Polish "bomba")

An electromechanical device used by British cryptologists to help decipher German Enigma-machine-encrypted secret messages during World War II. The US Navy and US Army later produced their own machines to the same functional specification, but engineered differently from each other. The initial design was produced in 1939 at the UK Government Code and Cypher School (GC&CS) at Bletchley Park by Alan Turing with an important refinement devised in 1940 by Gordon Welchman. https://upload.wikimedia.org/wikipedia/commons/thumb/4/49/Bletchley_Park_Bombe4.jpg/800px-Bletchley_Park_Bombe4.jpg

Parallel Processing

Computing data by using more than one CPU at once, or by having one CPU work on different pieces of data simultaneously. Data can be processed much faster by breaking the information into parts and sending each part to a different processor, instead of processing each piece of data with a single processor.

Charles Babbage (26 December 1791 - 18 October 1871)

Known today as the "Father of Computing" for his creation of the Analytical Engine.

Transistor (invented 1947, 1956)

Regulates current or voltage flow and acts as a switch or gate for electronic signals. It consists of three layers of a semiconductor material, each capable of carrying a current. A semiconductor is a material, such as germanium or silicon, that conducts electricity in a "semi-enthusiastic" way. It's somewhere between a real conductor such as copper and an insulator (like the plastic wrapped around wires).

PC

Personal computers sold worldwide in 1999 had a memory access time of 70 nanoseconds. By contrast, some of the early vacuum tube computers were monstrosities that took up entire rooms of space.

RAM (Random Access Memory) (late 1990s) (64 megabit)

Temporary storage that allows the processor to access data faster than by accessing it directly from the hard drive.

Memory Access Time

The amount of time, measured in nanoseconds, that it takes a processor to reference data stored in RAM.

The UNIVAC I (UNIVersal Automatic Computer I) (1951)

The first commercial computer produced in the United States. It was designed principally by J. Presper Eckert and John Mauchly, the inventors of the ENIAC. A total of 46 UNIVAC machines were sold. The UNIVAC had a memory access time of 500,000 nanoseconds.

Mark I (1943)

The first program-controlled calculator, built at Harvard University. It consisted of many calculators that worked on parts of the same problem. (first gen.)

LSI (Large-Scale Integration)

The process of creating an integrated circuit (IC) by combining thousands of transistors into a single chip. It began in the 1970s, when complex semiconductor and communication technologies were being developed. The microprocessor is an example of an LSI device.

John Adam Presper "Pres" Eckert, Jr. (April 9, 1919 - June 3, 1995)

An American electrical engineer and computer pioneer. With John Mauchly he invented the first general-purpose electronic digital computer (ENIAC), presented the first course in computing topics (the Moore School Lectures), founded the Eckert-Mauchly Computer Corporation, and designed the first commercial computer in the U.S., the UNIVAC, which incorporated Eckert's invention of the mercury delay line memory.
