Chapter One (Term & Concepts Review)


What was the significance of the punched card?

The punched card was the earliest widely used method of storing data and feeding programs into a computer. In those days, computer programs were written on punched cards; a card reader read the holes in each card and converted that information into machine code. Charles Babbage's Analytical Engine was designed to use a punched-card system to store and retrieve information.

What is it about the transistor that made it such a great improvement over the vacuum tube?

First-generation computers used vacuum tubes, which were very expensive, generated a great deal of heat, and were unreliable; these machines could solve only one problem at a time and were programmed in machine language. In the second generation, transistors replaced vacuum tubes. A transistor is a switching device made of semiconductor material, and transistorized computers were smaller, faster, cheaper, and more reliable than vacuum-tube machines. Programming also moved from machine language to assembly language in this generation.

What technology spawned the development of microcomputers? Why?

In the fourth generation, circuit complexity kept growing as more and more transistors were packed onto a chip. VLSI (very-large-scale integration) technology spawned the development of microcomputers, because it allowed an entire processor to fit on a single inexpensive chip. These systems were small, cheap, and portable, so the general public could afford them. The first microcomputer was the Altair 8800; IBM later introduced the IBM PC (personal computer). The term microcomputer is generally synonymous with personal computer (PC), that is, a computer built around a microprocessor and designed to be used by one person at a time, whether as a PC, workstation, or notebook computer. Microcomputers are designed to sit on a desk, take up little space, consume little power, and are widely used in education and in offices.

To what power of 10 does the prefix micro- refer? What is the (approximate) equivalent power of 2?

Micro refers to 10 to the power -6; for example, 1 microsecond = 10^-6 seconds. The equivalent power of 2 can be found with logarithms: the power of 2 that equals 10^-6 is log(10^-6)/log(2) ≈ -19.93, which rounds to -20. Hence 2^-20 ≈ 10^-6.
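
Written out in LaTeX, the conversion from the answer above is:

```latex
\[
2^{x} = 10^{-6}
\quad\Longrightarrow\quad
x = \frac{\log_{10}\!\left(10^{-6}\right)}{\log_{10} 2}
  = \frac{-6}{0.30103}
  \approx -19.93 \approx -20,
\qquad\text{so } 2^{-20} = \tfrac{1}{1\,048\,576} \approx 10^{-6}.
\]
```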

How does the term Abstraction apply to computer organization and architecture?

The term abstraction in computer organization and architecture refers to the way a computer is structured as a hierarchy of levels so that complexity can be controlled: each level hides the details of the levels below it. For example, in the abstract levels of a modern computing system, programs written in high-level languages that humans can understand are translated, level by level, into low-level languages that the machine can execute. Each level has its own function and acts as a virtual machine for the level above it. This layering shows how a machine built from simple components is capable of solving complex problems.

Name the characteristics present in von Neumann Architecture.

1. The computer consists of the following hardware systems:
•Memory - a main memory system (RAM) that holds the program and its data.
•Arithmetic Logic Unit (ALU) - as the name suggests, it performs arithmetic and logical operations such as add, subtract, divide, and compare.
•Control Unit - the part of the CPU that manages the flow of data and instructions; it carries out program execution through the fetch-decode-execute cycle.
•Input/output system - lets the user supply input and collect output after execution, and store information on media such as CDs or floppy disks.
2. The program and its data are loaded into the same main memory (the stored-program concept).
3. Instructions are processed sequentially, one at a time.

List five applications of personal computers. Is there a limit to the applications of computers? Do you envision any radically different and exciting applications in the near future? If so, what?

1. Internet browser: software used to access the Internet (Internet Explorer, Firefox, Opera, Safari, etc.).
2. Data compression software: used to reduce file sizes; ZIP is a widely used compression format on the PC.
3. Media player (e.g., Windows Media Player): software for managing music libraries and listening to music.
4. Image editing software: lets a person edit and enhance the pictures they take.
5. Audio editing software: lets a person edit audio files and add audio effects.
No, there is no practical limit to the applications of computers; new applications keep appearing as hardware and software improve.

What is an ISA?

ISA stands for Instruction Set Architecture. The ISA is the machine language that is visible to the programmer: it defines the commands the CPU understands and thereby tells the machine what to do. It acts as the interface between the hardware and the software of the computer system that executes a program. An ISA specifies the available instructions, the number of registers, exception handling, and the number of bits per instruction. In other words, the ISA determines exactly which instructions the machine can perform. Different CPU families generally have different ISAs, but processors that implement the same ISA can run the same machine code.
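
As a concrete sketch of how an ISA pins down registers, fields, and bits per instruction (MIPS is used here purely as a familiar example; the answer above does not name a specific ISA):

```python
# Sketch: encoding a MIPS R-type instruction. MIPS is only an illustrative
# example of an ISA; other architectures define different formats.

def encode_r_type(opcode: int, rs: int, rt: int, rd: int, shamt: int, funct: int) -> int:
    """Pack the six R-type fields into one 32-bit MIPS instruction word."""
    return (
        (opcode << 26)  # bits 31-26: operation code (0 for R-type)
        | (rs << 21)    # bits 25-21: first source register
        | (rt << 16)    # bits 20-16: second source register
        | (rd << 11)    # bits 15-11: destination register
        | (shamt << 6)  # bits 10-6 : shift amount
        | funct         # bits 5-0  : function code (0x20 = add)
    )

# add $t0, $t1, $t2   ->   $t0 = $t1 + $t2   (register numbers 8, 9, 10)
word = encode_r_type(opcode=0, rs=9, rt=10, rd=8, shamt=0, funct=0x20)
print(hex(word))  # 0x12a4020, i.e. the 32-bit word 0x012A4020
```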

What unit is typically used to measure the speed of a computer clock?

The speed of a computer clock is typically measured in gigahertz (GHz). If someone wants to buy a computer, they should check the processor speed, and there are two main factors that determine it. One is the number of cores in the CPU: the more cores, the more work can be done at once. The other is the clock speed, which is the number of clock cycles the CPU performs per second. One gigahertz equals one billion cycles per second, so the higher the clock speed, the more operations the CPU executes each second.
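
For example (the 3 GHz figure here is only an assumed illustration), the clock rate fixes the length of one clock cycle:

```latex
\[
f = 3\ \text{GHz} = 3 \times 10^{9}\ \text{cycles/second}
\quad\Longrightarrow\quad
t_{\text{cycle}} = \frac{1}{f} = \frac{1}{3 \times 10^{9}\ \text{s}^{-1}} \approx 0.33\ \text{ns}.
\]
```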

How does the fetch-decode-execute cycle work?

The fetch-decode-execute cycle describes how the processor obtains an instruction from memory, decodes it, and finally executes it; it is also known as the von Neumann execution cycle. First, the program and its data are loaded into main memory. The CPU then fetches the next instruction from the memory address held in the program counter and places it in a register; this is the 'fetch' step of the cycle. Next, the instruction is decoded into signals that the ALU and other units can act on; this is the 'decode' step. Finally, the instruction is executed by the ALU and the result is stored back into a register (or memory), after which the cycle repeats with the next instruction.
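
A minimal sketch of the cycle on a made-up toy machine (the instruction format, the three opcodes, and the program below are invented purely for illustration):

```python
# Toy von Neumann machine: program and data share one memory, and the CPU
# repeatedly fetches, decodes, and executes instructions.

memory = [
    ("LOAD", 6),    # 0: acc <- memory[6]
    ("ADD", 7),     # 1: acc <- acc + memory[7]
    ("STORE", 8),   # 2: memory[8] <- acc
    ("HALT", 0),    # 3: stop
    0, 0,           # 4-5: unused
    40, 2,          # 6-7: data operands
    0,              # 8: result goes here
]

pc = 0        # program counter: address of the next instruction
acc = 0       # accumulator register
running = True

while running:
    instruction = memory[pc]          # FETCH the instruction at the PC
    pc += 1                           # advance the PC to the next instruction
    opcode, address = instruction     # DECODE into an operation and an operand

    if opcode == "LOAD":              # EXECUTE the decoded operation
        acc = memory[address]
    elif opcode == "ADD":
        acc = acc + memory[address]
    elif opcode == "STORE":
        memory[address] = acc
    elif opcode == "HALT":
        running = False

print(memory[8])   # 42: the stored result of 40 + 2
```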

What are the main challenges of Cloud computing from a provider perspective as well as a consumer perspective?

Challenges of cloud computing from the provider's perspective:
•The main challenge is to maintain a correct balance between the resources being provided and the cost being charged for their utilization: the provider wants to profit from the services, while the consumer wants reliable service at a reasonable cost.
•The provider must ensure the privacy and security of the data held on its cloud servers; this is the most important factor.
•The provider has to manage economic issues and organizations' changing needs over time.
•The tools used to deliver services to consumers must be reliable for both small-scale and large-scale cloud providers, since they play a large role in maintaining cost efficiency.
•Most providers assume that scaling up to more powerful servers meets growing demand better than scaling out across a large number of servers, and must weigh the two approaches.
•Each consumer's utility consumption must be monitored closely, so that the customer is satisfied with both the service and the cost while the arrangement remains beneficial to the provider.
Challenges of cloud computing from the consumer's perspective:
•The consumer must judge whether the resources actually delivered justify the price being charged, since the consumer wants reliable services at a reasonable cost.
•The consumer hands its data over to the provider, so the privacy and security of that data on the provider's servers is a major concern.
•The consumer depends on the provider's tools and infrastructure being reliable, and on being able to monitor its own utilization so that the charges it pays match the services it actually receives.

Who is known as the father of computing, and why?

Charles Babbage is known as the father of computing. Babbage designed two calculating machines. The first was the Difference Engine, which was designed to compute the values of functions accurately; it was a calculator rather than a computer. The second was the Analytical Engine, which was capable of performing general mathematical operations. It had a memory (storage), a processing unit, and input/output devices, just like today's computers, and it used a punched-card system to supply instructions and data. Babbage is considered the father of computing because he was the first person to design a machine with the essential structure of a modern computer.

What are the key characteristics of Cloud Computing?

Cloud computing is a technology that lets organizations obtain computing services whenever they are needed, at a reasonable price, rather than maintaining everything individually. Cloud computing provides services in a hierarchy: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). With cloud computing, large pools of resources are shared among organizations over either public or private networks. The main goal of the technology is to provide dynamically scalable infrastructure for cloud-based applications and data storage. Organizations may choose a private, public, hybrid, or community cloud for their access and services. The key characteristics that cloud computing services should exhibit are:
•On-demand self-service: cloud services are provided under an agreement on a pay-for-what-you-use basis, and the client or organization can change vendors if not satisfied with the services provided, so the provider must supply whatever hardware, software, or network capacity the client needs, when it is needed.
•Resource pooling: employees of an organization gain access simply by entering their user name and password, so they can use the cloud services at any time and from anywhere. This is an attractive feature for organizations with business offices in multiple locations.
•Measured service: services are billed on a payment or monthly-rental basis, and the client pays only for what is actually used. Resource utilization can be monitored and controlled from both sides, by the organization and by the cloud service provider, so the client can manage bandwidth, message transmission, and similar costs.
•Flexibility and scalability: users can be added or removed easily, and resources and software services can be adjusted to match current and changing technologies and needs.
•Broad network access: services should be reachable easily over wide networks, from mobile phones, smartphones, laptops, or office computers, so that business users can obtain business-management services wherever they are needed.

What is the difference between Computer Organization and Computer Architecture?

Computer organization focuses on the internal workings of the computer system: how control signals are generated, how the bus system is designed, what types of memory are used, and so on. In short, computer organization tells us how a computer works. Computer architecture focuses on the design and behavior of the computer system as seen by the programmer; it covers attributes such as instruction sets, data types, and input/output mechanisms. In short, computer architecture tells us how to design a computer.

Name and explain the seven commonly accepted layers of the Computer Level Hierarchy. How does this arrangement help us to understand computer systems?

A computer understands only machine language, which is difficult for humans to work with. Because of this, computers are structured as a hierarchy of levels so that complexity can be controlled: each level has its own function and acts as a virtual machine for the level above it. The seven commonly accepted layers of the computer level hierarchy are:
Level 0: the Digital Logic Level, the lowest level. It consists of gates and wires that compute functions such as AND and OR.
Level 1: the Control Level. The control unit at this level makes sure instructions are decoded and executed properly and manages the movement of data; it may be microprogrammed or hardwired.
Level 2: the Machine Level, also known as the Instruction Set Architecture (ISA) level. Programs at this level are executed directly by the hardware, without any compiler or interpreter.
Level 3: the System Software (Operating System) Level. The operating system handles services such as multiprogramming and memory protection for the levels above.
Level 4: the Assembly Language Level. An assembler translates assembly language into the machine language of the levels below.
Level 5: the High-Level Language Level, consisting of languages such as C, BASIC, Java, and Fortran. A compiler or interpreter translates these languages into a language the machine can understand.
Level 6: the User Level, at which the user runs executable programs.
This arrangement helps us understand computer systems because each level hides the details of the levels beneath it, so we can study or design one level at a time.

Name two driving factors in the development of computers.

Computers have long been a major assistance to humankind. Problems once thought insurmountable can now be solved in a matter of seconds, and computers have enabled the automation of countless processes, making modern life as convenient as it is. The primary need for computers was therefore two-fold: problem-solving, and automation, that is, triggering a series of events with a single command. These two driving factors shaped how computers and their underlying technologies evolved over time. There are several stages (generations) in this evolution. The first (Generation 0) used mechanical machines with physical moving parts that performed the calculations, and output that was likewise physical, punched out on cards. These gave way to the first generation of electronic computers built from vacuum tubes, which were faster and more compact, a genuine breakthrough because it took a novel, lateral approach. Vacuum tubes, however, were still unwieldy and fragile because of the large amount of glassware involved, a safety hazard because of the high voltages required, and expensive. All of these factors eventually gave way to the next generation, the transistor (Generation 2). Transistors are semiconductor devices; they consumed significantly less power and were smaller, lighter, and more compact. Because of their manageable size they were also more reliable than vacuum tubes. This can be considered the second major breakthrough in computer technology, since it again involved a novel, lateral approach. The subsequent generations 3 and 4, integrated circuits and VLSI respectively, were mainly improvements over Generation 2 and hence not as revolutionary as the leaps from Generation 0 to 1 or from Generation 1 to 2.

What is the underlying premise of Amdahl's Law?

Computers have long been a major assistance to humankind, and the drive to solve problems faster has pushed their performance ever higher. Yet even with vast advances in technology, there is a limit to how much performance can be extracted from a system. Even if the processors in a system are arranged in parallel so that a task is divided equally among them, the system can still reach a point of saturation where adding more processors no longer helps. Amdahl's Law states that the performance enhancement possible with a given improvement is limited by the fraction of time the improved feature is actually used. The underlying premise is that every program contains some portion that must execute sequentially, and this sequential portion bounds the speedup obtainable through multiprocessing (or any other enhancement).
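
The usual statement of the law as a formula, for reference (here $f$ is the fraction of the work that benefits from the enhancement and $k$ is the speedup of that fraction; these symbol names are just a common convention):

```latex
\[
S \;=\; \frac{1}{(1 - f) + \dfrac{f}{k}}
\]
```

For example, if 50% of a program can be parallelized ($f = 0.5$) and that half runs 10 times faster ($k = 10$), the overall speedup is only $1 / (0.5 + 0.05) \approx 1.8$, no matter how powerful the parallel hardware is.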

What is meant by an "open architecture"?

In an open architecture, the specifications of the system and its add-on peripherals are published and made available to third parties. Because the architecture is public, anyone can build compatible or duplicate products for a system designed by the original manufacturer; the IBM PC is the classic example, and Linux is open in the same spirit because it is freely available to the public. Open architectures let users extend a system's capabilities according to their own requirements.

What was it about the Von Neumann architecture that distinguished it from its predecessors?

Early computers included the Difference Engine, the ENIAC (Electronic Numerical Integrator and Computer), the ABC (Atanasoff-Berry Computer), and other vacuum-tube machines. In those early machines the program and the data were not stored in the same memory; in the von Neumann architecture they are. The von Neumann architecture is therefore known as the "stored-program" architecture, because program instructions and data are held together in the same memory. With this architecture the computer can perform complex operations in less time, and it can both perform calculations and manage the sequence of those calculations itself. The basic structure of the von Neumann architecture consists of memory, a processing unit, and a control unit.

Explain the differences between SSI, MSI, LSI, and VLSI.

In the third generation, dozens of transistors were combined onto a single chip, and integration has kept increasing since. The levels of integration are SSI (small-scale integration), MSI (medium-scale integration), LSI (large-scale integration), and VLSI (very-large-scale integration). The differences are:
1) Component count per chip: SSI chips have roughly 10 to 100 components, MSI chips 100 to 1,000, LSI chips 1,000 to 10,000, and VLSI chips more than 10,000 components.
2) Typical use: SSI chips implement a few simple logic gates; MSI chips are used for parts such as decoders and registers; LSI chips can hold whole arithmetic and logic units or memories; VLSI chips are used for microprocessors.

What is Moore's Law?

Moore's Law is a computing term that originated around 1970, based on an observation made by Intel co-founder Gordon Moore. The law states that the number of transistors on a chip doubles roughly every two years and that this trend would likely continue into the future; the figure was later popularly amended to about 18 months. The pace of this doubling has slowed somewhat over the past few years, but the law has guided the industry for decades.
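
Stated as a formula (a standard way of writing any doubling trend, with $T$ the doubling period of roughly 18-24 months and $N_0$ the transistor count at time $t_0$):

```latex
\[
N(t) \;=\; N_{0}\cdot 2^{\,(t - t_{0})/T}
\]
```

With $T = 2$ years, a chip that holds $N_0$ transistors today would be expected to hold about $2^{3} = 8$ times as many six years from now.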

What is a multicore processor?

A multicore processor is a single chip that contains more than one processor (core), all of which can run simultaneously. Each core can execute its own stream of instructions independently, so work gets done faster; the workload is divided among the cores, which increases the overall performance of the processor. Because the extra performance comes from adding cores rather than from driving a single core at an ever-higher clock speed, a multicore chip can also do more work without generating as much heat.

Name the three basic components of every computer

Processor: the brain of the computer, also known as the central processing unit (CPU). It takes data as input, processes it with arithmetic and logical operations in the ALU, and transforms it into output.
Memory (storage): the data storage of the computer, either permanent or temporary. Memory stores everything as bits, 0s and 1s. RAM (random access memory) is temporary memory and ROM (read-only memory) is permanent memory.
Input/Output: the communication mechanism between the computer and the outside world. Input devices feed data into the computer from external sources, and output devices deliver the processed information back to the user.

What are three types of cloud computing platforms?

Public: cloud services on a public platform can be used by anyone and are kept for open use; the service is hosted and offered by the providing organization.
Private: cloud services on a private platform are operated for a single organization, and may be hosted internally or externally depending on its requirements.
Hybrid: a hybrid platform is a composition of two or more clouds (public, private, or community) that remain distinct but are bound together; it too may be hosted internally or externally.
Some well-known cloud platforms are Google App Engine, Microsoft Azure, and Amazon AWS.

Name Two Types of Computer Memory.

RAM: stands for random access memory, a very important part of the computer. It stores the data and programs currently in use so the CPU can access them quickly. RAM is also known as working memory, main memory, or primary storage. It is volatile: the information in it is lost when power to the computer is turned off.
ROM: read-only memory; the computer can read information from it, but no new information can be written to it. Part of the operating system's startup code is stored in ROM, and when the computer is turned on the CPU begins by executing instructions stored there.

How is Rock's Law related to Moore's Law?

Rock's Law is a computing term that originated in the mid-1990s; it is also called Moore's second law. It comes from an observation made by Arthur Rock, a venture capitalist and early Intel investor. The law states that the cost of the capital equipment needed to build semiconductors, that is, a chip fabrication plant, doubles every four years, and that this trend is likely to continue. A chip plant that cost around $12 million in the mid-1990s had grown to roughly $3 billion by 2005, and the cost keeps growing year by year. Rock's Law is the economic counterpart of Moore's Law: packing ever more transistors onto a chip (Moore's Law) requires ever more expensive fabrication plants (Rock's Law), so rising plant costs may ultimately be what brings Moore's Law to an end.
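
Written as a formula in the same style as Moore's Law above (with a four-year doubling period; $C_0$ is the plant cost at time $t_0$):

```latex
\[
C(t) \;=\; C_{0}\cdot 2^{\,(t - t_{0})/4\ \text{years}}
\]
```

So over a 12-year span the cost of a fabrication plant would be expected to grow by a factor of $2^{3} = 8$.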

What is the name of the Swiss organization that devotes itself to matters concerning telephony, telecommunications, and data communications?

Standards-setting organizations are generally formed on an as-needed basis in order to establish a common way to grade a product or service: if a set of common guidelines exists for a particular type of product, it is significantly easier to market it to a broad customer base. One such prominent standards organization is the International Telecommunication Union (ITU), based in Geneva, Switzerland. It was formerly known as the Comité Consultatif International Télégraphique et Téléphonique (CCITT), but the simpler English name was adopted for its more universal appeal. The interoperability of data communications with telecommunications is one of the major focus areas of this group, and a significant number of standards have been laid down by its telecommunications standardization division, the ITU-T.

What are the distinguishing features of tablet computers?

Tablet computers are distinguished by a flat, slate-style form factor built around a touch screen: input is given with a finger or stylus on an on-screen (virtual) keyboard rather than with a physical keyboard and mouse. They are smaller, lighter, and more portable than laptops, run for long periods on battery power, rely on wireless connectivity, and typically use mobile operating systems and processors designed for low power consumption.

What is the mission of the IEEE?

The Institute of Electrical and Electronics Engineers (IEEE) is an organization dedicated to the advancement of electrical, electronics, and computer engineering and related disciplines. Its mission is to foster technological innovation and excellence for the benefit of humanity and to support the professional development of its members, including students. IEEE publishes a large body of technical literature for the worldwide engineering community so that engineers and students can develop and apply knowledge of different technologies and stay aware of the latest developments. IEEE believes that members can advance both their knowledge and their profession by exchanging technological information within its communities.

What is the full name of the organization that uses the initials ISO? Is ISO an acronym?

The full name is the International Organization for Standardization (ISO), an organization dedicated to the worldwide development of standards, both commercial and industrial. ISO was established in 1947 and is a non-governmental organization. It develops and promotes international standards, setting benchmarks that industries use to develop and improve the quality of their products. ISO is not an acronym: the word is derived from the Greek word isos, meaning "equal" (as in the prefix of "isometric", equal dimensions). The organization chose the name ISO so that it would be the same and acceptable in every language.

What is meant by parallel computing?

Parallel computing is the ability to carry out multiple computations simultaneously, for example by using a group of processors (or computers) on the same problem at the same time in order to solve it faster. Parallel computing can increase a computer's performance because different operations run on different processors at the same time: the more processors available, the more the work can be divided, and the shorter the total time taken, which raises overall performance.
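
A minimal sketch of the idea in Python, dividing one task across several worker processes (the workload and the numbers are invented for illustration; the actual speedup depends on the hardware and the problem):

```python
# Sketch: splitting one job across several processes so the pieces run in
# parallel on different cores.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Sum the squares of the integers in [start, end)."""
    start, end = bounds
    return sum(i * i for i in range(start, end))

if __name__ == "__main__":
    n = 10_000_000
    chunks = [(i, min(i + 2_500_000, n)) for i in range(0, n, 2_500_000)]

    # Each chunk is handed to a separate worker process, so the four
    # partial sums can be computed simultaneously on a multicore machine.
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(partial_sum, chunks))

    print(total)  # same result as a sequential sum, computed in parallel
```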

To what power of 10 does the prefix giga refer? What is the (approximate) equivalent power of 2?

The prefix giga refers to 10^9 in the SI (International System of Units) decimal system. The approximately equivalent power of 2 is 2^30, since log(10^9)/log(2) ≈ 29.9, which rounds to 30. This is why, in binary terms, 1 gigabyte is taken to be 1 KB × 1 KB × 1 KB = 2^10 × 2^10 × 2^10 = 2^30 bytes = 1,073,741,824 bytes, which is approximately 10^9 bytes.
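
The same conversion written out as in the micro- question above:

```latex
\[
2^{x} = 10^{9}
\quad\Longrightarrow\quad
x = \frac{\log_{10}\!\left(10^{9}\right)}{\log_{10} 2}
  = \frac{9}{0.30103}
  \approx 29.9 \approx 30,
\qquad\text{so } 2^{30} = 1\,073\,741\,824 \approx 10^{9}.
\]
```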

What is the importance of the Principle of Equivalence of Hardware and Software?

The Principle of Equivalence of Hardware and Software states that any task that can be performed by software can also be performed by hardware, and any operation performed directly by hardware can also be carried out in software. Its importance is that it lets us make the best choice when matching software with hardware: knowing the principle, a designer can decide whether to implement a function in hardware or in software and thereby maximize the performance of the computer system. Sometimes a simple special-purpose hardware solution gives better performance than a complex computer program, and other times a good program is the better choice.

How does an integrated circuit differ from a transistor?

Transistor: second-generation computers used transistors. A transistor is a single switching device made of semiconductor material; with transistors, programming moved from machine language to assembly language.
Integrated circuit: third-generation computers used integrated circuits, which place many transistors together on a single silicon chip; with them, programming moved from assembly language to high-level languages.
Differences:
1. An integrated circuit combines dozens (and later many more) of transistors on one chip, whereas a transistor is a single device.
2. The integrated-circuit generation moved from assembly language to high-level languages, while the transistor generation moved from machine language to assembly language.
3. Transistors made computers faster and smaller than vacuum-tube machines, but individual transistors still had to be wired together, which was costly; integrated circuits made computers faster, cheaper, and smaller still.

What makes Watson so different from traditional computers?

•Watson is a computer that belongs to AI technology, basically developed as a question-answering machine: it can understand questions posed in natural language and retrieve answers accordingly.
•Ordinary search engines search databases or servers using keywords supplied by the user; Watson instead takes unstructured data as input.
•Unlike traditional computers, Watson understands data provided or spoken in natural language and retrieves the related (and usually accurate) answer from sources such as encyclopedias, thesauruses, and dictionaries.
•Watson also contains databases, taxonomies, and similar knowledge sources of its own.
•Watson is applied in many areas, such as health management, legal work, help desks, and more.
•The machine first appeared on the game show Jeopardy!, where it won against human contestants; during the game it was not connected to the Internet and operated completely offline.
•Watson was developed by IBM as a successor to its Deep Blue artificial-intelligence machine.
•It stores terabytes of data in its storage (the component where Watson keeps its knowledge) and accesses that data whenever a query is posed; it does not forget its knowledge.
•It is not a single computer but a huge collection of servers and systems, occupying a space as large as a floor of a building.

