CS 440 Exam 4

abandoned systems

--Lack of clear, well-thought-out goals and specifications
--Poor management and poor communication among customers, designers, programmers, and so on
--Institutional or political pressures that encourage unrealistically low bids, unrealistically low budget requests, and underestimates of time requirements
--Use of very new technology, with unknown reliability and problems, for which software developers may have insufficient experience and expertise

Ethical Guidelines for Computer Professionals

--A professional is an expert in a field
--Customers rely on the knowledge, expertise, and honesty of the professional
--The work of many professionals profoundly affects large numbers of people, some indirectly
--Professionals must maintain up-to-date skills and knowledge

Special Aspects of Professional Ethics

--A professional is an expert in a field
--Customers rely on the knowledge, expertise, and honesty of the professional
--The work of many professionals profoundly affects large numbers of people, some indirectly
--Professionals must maintain up-to-date skills and knowledge
--Responsibilities include potential risks and security of data, safety, reliability, and ease of use
--Courage in a professional setting could mean admitting to a customer that your program is faulty, declining a job for which you are not qualified, or speaking out when you see someone else doing something wrong

The Digital Divide: The Global Divide and the Next Billion Users

--Approximately two billion people worldwide have access to the Web, a fivefold increase over roughly a decade
--Approximately five billion do not use the Internet
--Bringing new technology to poor countries is not just a matter of money to buy equipment; PCs and laptops must work in extreme environments
--Some people actively working to shrink the digital divide emphasize the need to provide access in ways appropriate to the local culture

Problems for Individuals

--Billing errors
--Inaccurate and misinterpreted data in databases
--Large population where people may share names
--Automated processing may not be able to recognize special cases
--Overconfidence in the accuracy of data
--Errors in data entry
--Lack of accountability for errors (example: an error in your name putting you on a no-fly list)
--System failures that affect large numbers of people and/or cost large amounts of money, and problems in safety-critical applications that may injure or kill people
--At some point the expense of improving a system is not worth the gain, especially for applications where the impact of an error is small and errors can be detected

CTB Example

--CTB/McGraw-Hill develops standardized tests for schools. An error in its software caused it to report test results incorrectly.
--School principals and superintendents lost their jobs because their schools appeared to be doing a poor job of teaching students to read.
--CTB said nothing was wrong.
--The company did not take seriously enough the questions about the accuracy of the results and was reluctant to admit the possibility, and later the certainty, of errors.

Neo-Luddites: Criticisms of Computing Technologies

--Computers cause massive unemployment and de-skilling of jobs.
--Computers "manufacture needs"; we use them because they are there, not because they satisfy real needs.
--Computers cause social inequity.
--Computers cause social disintegration; they are dehumanizing. They weaken communities and lead to isolation of people from each other.
--Computers separate humans from nature and destroy the environment.
--Computers benefit big business and big government the most.
--Use of computers in schools thwarts development of social skills, human values, and intellectual skills in children.
--Computers do little or nothing to solve real problems.

Law, Regulation, and Markets

--Criminal and civil penalties
--Provide incentives to produce good systems, but shouldn't inhibit innovation
--Regulation for safety-critical applications
--Professional licensing: arguments for and against
--Taking responsibility
--Unfortunately, there are many flaws in liability law in the US. People often win multimillion-dollar suits when there is no scientific evidence or sensible reason to hold the manufacturer or seller of a product responsible for accidents or other negative impacts.

Wisdom of the crowd

--Daunting amount of information on the Web; much of it is not correct
--Search engines are replacing librarians, but websites are ranked by popularity, not by expert evaluation
--If millions participate, the results will be useful; the quality depends on the diversity and independence of the people who participate
--Problems of unreliable information are not new, but the Web magnifies them
--Rating systems are easy to manipulate
--How good is the wisdom of the crowd, and how can we distinguish good sources of information on the Web? The average, median, or most common answer is often a good one.
--There is no magic formula that tells us what is true and reliable, either on the Web or off; one step is to determine who sponsors the site

What Goes Wrong?

--Design and development problems:
--Inadequate attention to potential safety risks
--Interaction with physical devices that do not work as expected
--Incompatibility of software and hardware, or of application software and the operating system
--Not planning and designing for unexpected inputs or circumstances
--Confusing user interfaces
--Insufficient testing
--Reuse of software from another system without adequate checking (examples: the Ariane 5 rocket and "no-fly" lists)
--Overconfidence in software
--Carelessness
--Management and use problems:
--Data-entry errors
--Inadequate training of users
--Errors in interpreting results or output
--Failure to keep information in databases up to date
--Overconfidence in software by users
--Misrepresentation, hiding problems, and inadequate response to reported problems
--Insufficient market or legal incentives to do a better job
--When reusing software, it is essential to reexamine the specifications and design, consider implications and risks for the new environment, and retest the software for the new use.
--Computer systems fail for two general reasons: the job they are doing is inherently difficult, and sometimes the job is done poorly.
--Computer systems that interact with the real world include complex communication networks, have numerous features and interconnected subsystems, and are extremely large.
--Computer software is "nonlinear": whereas a small error in a mechanical system might cause a small degradation in performance, a single typo in a computer program can cause a dramatic difference in behavior.

The Difficulty of Prediction

--Each new technology finds new and unexpected uses
--The history of technology is full of wildly wrong predictions
--Weizenbaum argued against developing speech recognition technology
--Mistaken expectations of costs and benefits
--Should we decline a technology because of potential abuse and ignore the benefits?
--New technologies are often expensive, but costs drop as the technology advances and demand increases
--A brief look at the development of communications and computer technology suggests the difficulty of evaluating the consequences and future applications of a new technology.
--The failure to predict the computer revolution was a failure to understand how society would transform the original notion of a computational device into a useful tool for everyday activities.
--Solution: open the technology up to as many people as possible

Testing

--Even small changes need thorough testing
--Independent verification and validation (IV&V): an independent company tests and validates the software
--Beta testing
--Test-driven development: start with the end goal in mind
--It is difficult to overemphasize the importance of adequate, well-designed testing of software. Testing is not arbitrary.

Voting systems

--The Help America Vote Act was passed in 2002 to improve voting systems. By the 2006 elections, only a small percentage of Americans voted with paper ballots.
--Problems that occurred with the rush to machines:
--Some systems crashed and people were unable to vote
--Some machines failed because of a technical problem
--One county lost 4,000 votes because a machine's memory was full
--A programming error generated 100,000 extra votes in one Texas county
--A programming error caused one candidate to receive votes actually cast for other candidates
--Investigators found that voting-system developers lacked sufficient security training; programmers omitted basic procedures such as input validation and boundary checks
--Many of the failures resulted from causes we will see over and over: lack of sufficient planning and thought about security issues, insufficient testing, and insufficient training

Why So Many Incidents? (therac25)

--Hospitals had never seen such massive overdoses before and were unsure of the cause; the staff at the site of the first incident said that one reason they were not certain of the source of the patient's injuries was that they had never seen such a massive radiation overdose before
--The manufacturer said the machine could not have caused the overdoses and that no other incidents had been reported (which was untrue)
--The manufacturer made changes to the turntable and claimed they had improved safety after the second accident. The changes did not correct any of the causes identified later.
--Recommendations were made for further changes to enhance safety; the manufacturer did not implement them
--The FDA declared the machine defective after the fifth accident
--The sixth accident occurred while the FDA was negotiating with the manufacturer on what changes were needed

Scenarios: Introduction and Methodology Analysis phase

--Identify responsibilities of the decision maker
--Identify rights of stakeholders
--Consider the impact of the options on the stakeholders (consequences, risks, benefits, harms, costs)
--Categorize each potential action as ethically obligatory, prohibited, or acceptable
--When there are multiple options, select one, considering the ethical merits of each, courtesy to others, practicality, self-interest, personal preferences, etc.

Safety-critical applications

--Identify risks and protect against them
--Make a convincing case for safety
--Avoid complacency
--The goal should be showing that everything is safe, not merely that nothing bad has happened yet
--One lesson is that most accidents are not the result of unknown scientific principles but rather of a failure to apply well-known, standard engineering practices

Professional techniques

--Importance of good software engineering and professional responsibility
--User interfaces and human factors
--Redundancy and self-checking
--Testing, including real-world testing with real users
--Management and communication
--High-reliability-organization principles: preoccupation with failure; loose structure (who should you tell if something goes wrong?)
--Big difference from everyday ethics: you are not just making decisions for yourself; you have a duty to clients, customers, and affected demographics. The themes of the class still apply.

Luddites

--In early-1800s England, people burned factories and mills in efforts to stop the technologies and social changes that were eliminating their jobs; they strategically sabotaged equipment
--For 200 years, the memory of the violent Luddite uprising has endured as the most dramatic symbol of opposition to the Industrial Revolution. The term has long been a derisive description for people who oppose technological progress.

Accomplishments of technology

--Increased life expectancy --Elimination or reduction of many diseases --Increased standard of living --Assistive technologies for those with disabilities

Specifications

--Learn the needs of the client
--Understand how the client will use the system
--Good software developers help clients better understand their own goals and requirements, which the clients might not be good at articulating

Vulnerable viewers

--Less educated individuals
--Children
--What about people who have less education or ability? What can we do to improve the quality of information?
--Basic social and legal forces help: freedom of speech, and people who care and volunteer to write, review, and correct online information. What else can we do to reduce access to dangerously wrong information by vulnerable people?

Scenarios: Introduction and Methodology Brainstorming phase

--List all the people and organizations affected (the stakeholders)
--List risks, issues, problems, and consequences
--List benefits. Identify who gets each benefit.
--In cases where there is no simple yes or no decision, but rather one has to choose some action, list possible actions

Neo-Luddites: Nature and human life styles

--Luddites argue that technology has made no important improvements in life
--Many debates set up a humans-versus-nature dichotomy
--Conflicts about the environment are not conflicts between humans and nature; they are conflicts between people with different views about how to meet human needs
--Critics of modern technologies point out their weaknesses but often ignore the weaknesses of the alternatives

Observations and Perspective

--Minor design and implementation errors usually occur in complex systems; they are to be expected
--The problems in the Therac-25 case were not minor and suggest irresponsibility
--Accidents occurred on other radiation-treatment equipment without computer controls when technicians: left a patient after treatment started to attend a party; did not properly measure the radioactive drugs; confused microcuries and millicuries
--The underlying problems were carelessness, lack of appreciation for the risk involved, poor training, and lack of sufficient penalty to encourage better practices

Failures and Errors in Computer Systems

--Most computer applications are so complex that it is virtually impossible to produce programs with no errors
--The cause of a failure is often more than one factor
--Computer professionals must study failures to learn how to avoid them and to understand the impacts of poor work; studying failures and risks contributes to understanding their causes and helps prevent future failures
--Categories of failures: problems for individuals; system failures that affect large numbers of people or cost large amounts of money; safety-critical failures that injure or kill people
--We can also categorize computer errors and failures in several ways: by cause, by seriousness of the effects, or by application area
--Errors are the bane of CS (example from class: the Samsung Galaxy S7, which charges in about an hour)

Redundancy and self-checking

--Multiple computers capable of same task; if one fails, another can do the job. --Voting redundancy --Software modules can check their own results.
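The voting idea above can be sketched in a few lines. This is an illustrative sketch, not code from any real flight or medical system: several independent modules compute the same result, and a strict-majority vote decides which answer to trust, so a single faulty module is outvoted.

```python
# Voting redundancy sketch: trust the answer a strict majority of
# redundant modules agree on. All names here are illustrative.
from collections import Counter

def majority_vote(results):
    """Return the value a strict majority of modules agree on."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        # No majority: the redundant modules disagree too much to trust any.
        raise RuntimeError("no majority: redundant modules disagree")
    return value

# Two correct modules outvote one faulty one.
print(majority_vote([42, 42, 41]))  # 42
```

The same pattern underlies triple-modular redundancy in hardware: the vote masks one faulty unit, but it assumes the modules fail independently, which is why the modules should be developed and run separately.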

The Digital Divide: Trends in Computer Access

--New technologies are at first available only to the wealthy
--The time it takes for new technology to make its way into common use is decreasing
--Cost is not the only factor; ease of use plays a role
--Entrepreneurs provide low-cost options for people who cannot otherwise afford something
--Government funds technology in schools
--As technology becomes more prevalent, the issues shift from the haves versus have-nots to level of service
--"Digital divide" refers to the fact that some groups of people (the "haves") enjoy access to and regularly use the various forms of modern information technology, while others (the "have-nots") do not. The focus of the discussion about the digital divide has shifted over time.

Abdicating responsibility

--People willing to let computers do their thinking --Reliance on computer systems over human judgment may become institutionalized --Fear of having to defend your own judgment if something goes wrong

Professional Codes of Ethics

--Provide a general statement of ethical values
--Remind people in the profession that ethical behavior is an essential part of their job
--Provide guidance for new or young members

What is "Professional Ethics"?

--Professional ethics includes relationships with and responsibilities toward customers, clients, coworkers, employees, employers, others who use one's products and services, and others whom they affect
--A professional has a responsibility to act ethically. Many professions have a code of ethics that professionals are expected to abide by: medical doctors, lawyers and judges, accountants
--There are special aspects to making ethical decisions in a professional context
--Honesty is one of the most fundamental ethical values; however, many ethical problems are more subtle than the choice of being honest or dishonest. As Kant might say, a lie treats people merely as means to ends, not as ends in themselves.
--Some ethical issues are controversial: How much risk is acceptable in a system? What uses of another company's intellectual property are acceptable?
--How will you decide whether to accept a contract? The critical first step is recognizing that you face an ethical issue.

Therac-25 Software and Design problems

--Reused software from older systems, unaware of bugs in the previous software
--Weaknesses in the design of the operator interface
--Inadequate test plan
--Bugs in the software: it allowed the beam to deploy when the table was not in the proper position, and it ignored changes and corrections operators made at the console
--The machine was fully computer controlled and malfunctioned frequently. One facility reported as many as 40 dose-rate malfunctions in a day, generally underdoses. Operators thus became accustomed to error messages appearing often, with no indication that there might be safety hazards.
--The operator's manual did not include an explanation of the error messages
--Atomic Energy of Canada, Ltd. (AECL) manufactured it and claimed to have tested it extensively, but the testing appears to have been inadequate

Intelligent Machines and Superintelligent Humans - Or the End of the Human Race?

--Technological Singularity: the point at which artificial intelligence or some combined human-machine intelligence advances so far that we cannot comprehend what lies on the other side
--We cannot prepare for the aftermath, but we can prepare for more gradual developments
--Prominent technologists describe a not-very-distant future in which intelligence-enhancing devices, artificial intelligence, and intelligent robots change our society and ourselves in profound ways
--Rodney Brooks, for example, suggests that by 2020 we might have wireless Internet interfaces that doctors can implant in our heads
--Going farther into the future, will we download our brains to long-lasting robot bodies? If we do, will we still be human?

Therac-25 Radiation Overdoses

--The Therac-25 was a software-controlled radiation-therapy machine used to treat people with cancer. Machines at four medical centers gave massive overdoses of radiation to six patients.
--Massive overdoses of radiation were given while the machine reported that no dose had been administered at all
--Caused severe and painful injuries and the death of three patients
--The manufacturer, the computer programmer, and the hospitals/clinics all bear some responsibility
--Why is it important to study this case? To avoid repeating the errors.
--Studies of the incidents showed that many factors contributed to the injuries and deaths, including lapses in good safety design, insufficient testing, bugs in the software that controlled the machines, and an inadequate system of reporting and investigating accidents

Does technology create the need for itself?

--Those who emphasize the value of individual action and choices argue that needs are relative to goals, and goals are held by individuals. Thus, should we ask whether we, as a society, need portable computers? Or should this be an individual decision with different responses?
--The argument that capitalists or technologies manipulate people into buying things they do not really want, like the argument that use of computers has an insidiously corrupting effect on computer users, displays a low view of the judgment and autonomy of ordinary people. It is one thing to differ with another person's values and choices; it is another to conclude that, because of the difference, the other person is weak and incapable of making his or her own decisions.

Trust the Human or the Computer System?

--Traffic Collision Avoidance System (TCAS): detects a potential in-air collision between two airplanes and directs the pilots to avoid each other
--Computers in some airplanes prevent certain pilot actions
--How much control should computers have in a crisis?

Guidelines and Professional Responsibilities

--Understand what success means
--Include users (such as medical staff, technicians, pilots, office workers) in the design and testing stages to provide safe and useful systems
--Do a thorough, careful job when planning and scheduling a project and when writing bids or contracts
--Design for real users
--Don't assume existing software is safe or correct; review and test it
--Be open and honest about capabilities, safety, and limitations of software
--Require a convincing case for safety
--Pay attention to defaults
--Develop communication skills

System Failures

--Voting systems: technical failures; programmers or hackers rigging software to produce inaccurate results; vulnerability to viruses
--Examples: Denver airport; airports in Hong Kong and Kuala Lumpur, where comprehensive systems failed because designers did not adequately consider the potential for user input error
--Lack of clear, well-thought-out goals and specifications
--Poor management and poor communication among customers, designers, programmers, etc.
--Institutional and political pressures that encourage unrealistically low bids, low budget requests, and underestimates of time requirements
--Use of very new technology, with unknown reliability and problems
--Refusal to recognize or admit a project is in trouble
--One aim is to see the serious impacts of the failures, and to see what you will want to work hard to avoid

Why models may not be accurate

--We might not have complete knowledge of the system we are modeling.
--The data describing current conditions or characteristics may be incomplete or inaccurate.
--Computing power may be inadequate for the complexity of the model.
--It is difficult, if not impossible, to numerically quantify variables that represent human values and choices.
--Models don't always accurately predict results (e.g., weather, election results).

Wikipedia

--Written by volunteers; some posts are biased and not accurate
--Although anyone can write, most people do not; those who do typically are educated and experts
--There is a discussion board and there are editing standards; a subject needs to be notable or well known to get a Wikipedia page
--The biggest online encyclopedia, and immensely popular; an information-quality example
--Can we rely on its accuracy and objectivity when anyone can edit any article at any time?
--Open, volunteer, instant-publishing systems cannot prevent errors and vandalism as easily as publishers of printed books or closed, proprietary online information sources
--We as users must learn to deal appropriately with the side effects and weaknesses of new paradigms. Even though so much of Wikipedia is excellent and useful, someone might have wrecked the accuracy and objectivity of any individual article at any hour.

Several factors contribute to the frequency and severity of the problems people suffer because of errors in databases and misinterpretation of their contents:

--A large population (many people have identical or similar names, and most of our interactions are with strangers)
--Automated processing without human common sense or the power to recognize special cases
--Overconfidence in the accuracy of data stored on computers
--Errors (some due to carelessness) in data entry
--Failure to update information and correct errors
--Lack of accountability for errors

FBI watch list

--After 9/11, the FBI gave a watch list to police departments and businesses. Recipients emailed the list to others, and eventually thousands of police departments and thousands of companies had copies.
--Many incorporated the list into their databases and into systems that screened customers or job applicants. The list included people who weren't suspects but whom the FBI wanted to question, yet some companies labeled everyone on it a "suspected terrorist."
--Many entries didn't have a date of birth or any other identifying information. Some companies received the list by fax and typed misspelled names from blurred copies into their databases.
--The FBI stopped updating the list but did not tell the recipients; thus, many entries became obsolete.

Crash of American Airlines Flight 965 near Cali, Colombia

--Illustrates the importance of consistency. The pilot intended to lock the autopilot onto a beacon called Rozo. He typed "R" and selected the top beacon listed, Romeo; the plane crashed into a mountain, killing 159 people.
--Juries blamed pilot error (he chose the wrong beacon without checking), but some blamed the company that provided the computer system for not putting the closest beacon at the top of the list. Planes now carry a ground proximity warning system (GPWS).

Management and communications

--Management experts use the term "high reliability organization" (HRO) for any organization that operates in difficult environments, often with complex technology, where failures can have extreme consequences.
--One characteristic of an HRO is preoccupation with failure: always assuming something unexpected can go wrong; not just planning, designing, and programming for all the problems the team can foresee, but always being aware that they might miss something.
--Preoccupation with failure includes being alert to cues that might indicate an error. It includes fully analyzing near failures and looking for systemic reasons for an error or failure rather than focusing on the detail that was wrong.

Stalled airports: Denver

--The opening was rescheduled at least four times; the delay cost $30 million per month in bond interest and operating costs. The baggage-handling system caused most of the delay.
--Bags were supposed to be sent anywhere in 10 minutes on carts running on underground tracks
--Carts crashed into each other; the system misrouted, dumped, and even flung luggage; carts needed to move luggage mistakenly went to waiting pens
--Some of the specific problems:
--Real-world problems: some scanners got dirty or knocked out of alignment and could not detect carts going by; faulty latches caused carts to drop luggage onto the tracks
--Problems in other systems: the airport's electrical system could not handle the power surges associated with the baggage system; the first full-scale test blew many circuits
--Software errors: an error caused the routing of carts to waiting pens when they were actually needed
--Two main causes of the baggage-system delay: (1) the time allowed for development and testing of the system was insufficient, and (2) Denver made significant changes in specifications after the project began
--The bottom-line lesson is that system designers must build in plenty of test and debugging time when scaling up proven technology into a much more complicated environment

legacy systems

--Out-of-date systems still in use, with special interfaces, conversion software, and other adaptations to make them interact with more modern systems
--Reliable but inflexible
--Expensive to replace
--Little or no documentation
--Lessons: it is important to document your work and to design for flexibility, expansion, and upgrades

Overconfidence (ariane 5)

--Overconfidence, or an unrealistic or inadequate understanding of the risks in a complex system, is a core issue
--Unrealistic estimates of reliability or safety can come from a genuine lack of understanding, from carelessness, or from intentional misrepresentation

Neo-Luddites

--People who fear technology, which can lead to cognitive dissonance
--Their view: technology itself manufactures the need for technology; computers are fundamentally bad and cause problems; they don't acknowledge the benefits of technology (e.g., people who wrap their phones in foil)
--How to deal with these views? Confront them with science, reality, facts, and how technology improves our lives
--What do neo-Luddites find so reprehensible about computers? One of the differentiating characteristics of the neo-Luddites is that they focus on these problems, see no solutions or trade-offs, and conclude that computers are a terribly bad development for humankind.
--The conditions in computer factories hardly compare to the conditions in the sweatshop factories of the early Industrial Revolution.
--The neo-Luddite view is associated with a particular view of the appropriate way of life for human beings.

Technological Singularity

--The point at which artificial intelligence or some combined human-machine intelligence advances so far that we cannot comprehend what lies on the other side
--It is plausible that, in the fairly near future, we can create, or become, creatures who surpass humans in every intellectual and creative dimension. Events beyond such a singular event are as unimaginable to us as opera is to a worm.
--Some welcome it, others find it horrifying, and still others consider it unlikely
--Once machines can improve their own design and build better robots, will they outcompete humans?
--Two estimates support these scenarios: (1) estimates of the computing power of the human brain, and (2) Moore's Law, the observation that the computing power of new microprocessors doubles roughly every 18-24 months

How do we determine the quality of information being presented to us?

--Popularity, cross-references (PageRank), number of contributors
--Reliability of the information
--Information overload
--Reinforcement of the same opinions

Ariane 5

--The rocket veered off course and was destroyed as a safety precaution
--A software error caused the failure
--It reused software that worked for the Ariane 4, but the Ariane 5 is faster; a calculation produced numbers bigger than the program could handle (an overflow), causing the system to halt
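The failure mode can be sketched in a few lines. This is an illustrative sketch in Python, not the actual guidance code (which was written in Ada); the function name and values are made up. The point is that a value safe for one vehicle can exceed a fixed-size integer's range on a faster one.

```python
# Sketch of the Ariane 5 overflow: converting a floating-point velocity
# to a signed 16-bit integer fails once the value leaves the 16-bit range.
INT16_MIN, INT16_MAX = -32768, 32767

def to_int16(value: float) -> int:
    """Convert a float to a signed 16-bit integer, raising on overflow."""
    result = int(value)
    if not INT16_MIN <= result <= INT16_MAX:
        # On Ariane 5, the analogous unhandled overflow halted the
        # inertial reference system.
        raise OverflowError(f"{value} does not fit in 16 bits")
    return result

print(to_int16(300.0))    # within range: fine for the slower Ariane 4
try:
    to_int16(40000.0)     # the faster Ariane 5's value overflows
except OverflowError as exc:
    print("system halted:", exc)
```

The fix is exactly what the previous bullet about reuse implies: re-derive the range assumptions for the new environment instead of trusting that the old code's limits still hold.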

Bugs (Therac - 25)

--The set-up test performed a variety of checks to be sure the machine was in the correct position. A flag variable indicated whether a specific device on the machine was in the correct position. The flag variable was stored in one byte; after the 256th call to the routine, the flag overflowed and showed a value of zero, meaning the device was ready.
--The error was a simple one. The solution is to set the flag to a fixed value, say 1, rather than incrementing it, to indicate that the device needs checking.
--When the operator typed all the necessary information for a treatment, the software checked for editing of the input by the operator during this time and restarted the set-up if it detected editing.
--Because of bugs in this section of the program, some parts of the program learned of the edited information while others did not. This led to machine settings that were incorrect and inconsistent with safe treatment. According to a later investigation by the FDA, there were no consistency checks in the program. The error was most likely to occur with an experienced operator who was quick at editing input.
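The one-byte flag bug above can be reproduced in a few lines. This is a sketch with made-up names, not the Therac-25's actual assembly code: a flag that is *incremented* in a single byte wraps to 0 on the 256th call, and 0 happened to mean "device in position", so the safety check was skipped.

```python
# Sketch of the Therac-25 set-up-test flaw: incrementing a one-byte
# flag instead of assigning a constant lets it wrap around to 0.
class SetupTest:
    def __init__(self):
        self.flag = 0  # one byte; 0 means "device in correct position"

    def mark_needs_checking_buggy(self):
        # Bug: increment wraps at 256, so the 256th call yields 0.
        self.flag = (self.flag + 1) % 256

    def mark_needs_checking_fixed(self):
        # Fix: assign a fixed nonzero value; it can never wrap to 0.
        self.flag = 1

buggy = SetupTest()
for _ in range(256):
    buggy.mark_needs_checking_buggy()
print(buggy.flag)  # 0 -> check skipped, beam allowed to fire
```

The `% 256` stands in for the hardware's natural one-byte wraparound; the fixed version makes the flag's meaning independent of how many times the routine has run.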

Narrowing the information stream

--Some people get all their news and interpretation of events from a small number of sites that reflect a specific political point of view
--Some critics see the Web as significantly encouraging political narrowness and political extremes
--Facebook implemented algorithms to filter news-feed updates from friends based on how recently a member communicated with them. What lessons can we learn from Facebook's filtering?
--Microsoft said it eliminated words that may have offensive uses. Was this a dunderheaded decision that dulls the language and reduces literacy?
--Do the various aspects of the Internet narrow our information stream and significantly diminish access to different points of view on controversial social and political topics?

Discussion of Weizenbaum's objections is important for several reasons.

1. Although Weizenbaum was an expert in artificial intelligence, of which speech recognition is a subfield, he was mistaken in his expectations about the costs and benefits. 2. His objections about military and government use highlight a dilemma: Should we decline to develop technologies that people can misuse, or should we develop the tools because of their beneficial uses, and use other means, including our votes and our voices, to influence government and military policy? 3. A common objection to some new medical technologies is that they are so expensive that only the rich will be able to afford them. This shortsighted view can result in the denial of benefits to the whole population.

How to determine accuracy and usefulness of a model

1. How well do the modelers understand the underlying science or theory of the system they are studying? How well understood are the relevant properties of the materials involved? How accurate and complete are the data? 2. Models necessarily involve assumptions and simplifications of reality. What are the assumptions and simplifications in the model? 3. How closely do the results or predictions of the model correspond with results from physical experiments or real experience?

Why don't we have a singularity yet?

1. Moore's Law, the observation that the number of transistors per square inch doubles roughly every two years, may not continue to hold 2. we may have misjudged the processing power of the human brain. His reason: no one is working on artificial stupidity

Observations

1. Many of the issues related to reliability and safety for computer systems have been raised before with other technologies. 2. There is a learning curve for new technologies; by studying failures, we can reduce their occurrence. 3. Much is known about how to design, develop, and use complex systems well and safely. Ethical professionals learn and follow these methods. 4. Perfection is not an option. The complexity of computer systems makes errors, oversights, and failures likely. 5. Comparing the risks of using computer technologies with the risks of using other methods, and weighing the risks against the benefits, gives us important perspective.

Reasons why it might not happen

1. the hardware progress might slow down 2. we might not be able to develop the necessary software in the next few decades, or at all 3. the estimates of the hardware computing power of the human brain might be drastically too low 4. some philosophers argue that robots programmed with AI software cannot duplicate the full capability of the human mind

Dependence, Risk, and Progress

Are We Too Dependent on Computers? --Computers are tools --They are not our only dependence --Electricity --Risk and Progress --Many new technologies were not very safe when they were first developed --We develop and improve new technologies in response to accidents and disasters --We should compare the risks of using computers with the risks of other methods and the benefits to be gained --We use tools because we are better off with them than without them. They reduce the need for hard physical labor and tedious routine mental labor. They help us be more productive, or safer, or more comfortable performing a task. If the tool breaks down, we are stuck. --But the negative effects of a breakdown do not condemn the tool. --The inconveniences or dangers of a breakdown are a reminder of the convenience, productivity, or safety the tool provides when it is working.

Stalled airports: Hong Kong and Kuala Lumpur

China: --computers were to manage everything --cleaning crews and fuel trucks etc. went to the wrong gates --airplanes scheduled to take off were empty Malaysia: --employees had to write boarding passes by hand and carry luggage --flights were delayed --food cargo rotted in the heat --failures at both airports were blamed on people typing in incorrect information --any system that has a large number of users and a lot of user input must be designed and tested to handle input mistakes. --The "system" includes more than software and hardware. It includes the people who operate it.

Discussion Questions

Do you believe we are too dependent on computers? Why or why not? In what ways are we safer due to new technologies?

A Few Observations / solutions

Does this mean that no one should make decisions about whether it is good to develop a particular application of new technology? No. 1. Limit the scope of decisions about development of new technology 2. Decentralize the decision-making process and make it noncoercive, to reduce the impact of mistakes, avoid manipulation by entrenched companies who fear competition, and prevent violations of liberty The fundamental problem is not what decision to make about a particular technology. Rather it is to select a decision-making process that is most likely to produce what people want, to work well despite the difficulty of predicting consequences, to respect the diversity of personal opinions about what constitutes a desirable lifestyle, and to be relatively free of political manipulation.

Questions

How should we make decisions about the basic question of whether to use a whole technology, or major segments of it, at all? Who would make such decisions? Can a society choose to have certain specific desirable modern inventions while prohibiting other whole technologies?

Evaluating models

How well do the modelers understand the underlying science or theory? How closely do the results or predictions correspond with results from physical experiments or real experience? How well do people understand it? Are they trained? Have they made simplifications when programming it? Did the model predict what was being studied? --computer-generated predictions based on mathematical models of subjects with important social impact frequently appear in the news --mathematical models do not include equations for every factor that could influence the outcome --it is the professional and ethical responsibility of those who design and develop models for public issues to describe honestly and accurately the results, assumptions, and limitations of their models

Abdicating responsibility

In some institutions, when something goes wrong, "I did what the program recommended" is a stronger defense than "I did what my professional judgment and experience recommended." Such institutions are encouraging abdication of personal responsibility, with potentially harmful results.

Scenario 10: Release of Personal Information

You work for the IRS, the Social Security Administration, a movie-rental company, or an Internet service provider. Someone asks you to get a copy of records about a particular person. He will pay you $500. You know another employee sells records with people's personal information.

Scenario 11: Conflict of Interest

You have a small consulting business. The CyberStuff company plans to buy software to run a cloud data-storage business. CyberStuff wants to hire you to evaluate bids from vendors. Your spouse works for NetWorkx and did most of the work in writing the bid that NetWorkx plans to submit. You read the bid while your spouse was working on it and you think it is excellent. Do you tell CyberStuff about your spouse's connection with NetWorkx?

Scenario 12: Kickbacks and Disclosure

You are an administrator at a major university. Your department selects a few brands of security software to recommend to students for their desktop computers, laptops, tablets, and other devices. One of the companies whose software you will evaluate takes you out to dinner, gives you free software (in addition to the security software), offers to pay your expenses to attend a professional conference on computer security, and offers to give the university a percentage of the price for every student who buys its security package.

Scenario 13: A Test Plan

A team of programmers is developing a communications system for firefighters to use when fighting a fire. Firefighters will be able to communicate with each other, with supervisors near the scene, and with other emergency personnel. The programmers will test the system in a field near the company office.

Scenario 14: Artificial Intelligence and Sentencing

You are part of a team developing a sophisticated program using artificial intelligence techniques to help judges make sentencing decisions for convicted criminals. Suppose judges in your state use a sentencing decision system that displays similar cases for the judge to view. You are a programmer working for your state government. Your state has just made it a criminal offense to use a cellphone while taking a college exam. Your boss, a justice department administrator, tells you to modify the program to add this new category of crime and assign the same relevancy weights to cases as the program currently does for using a cellphone while driving a car (already illegal in your state).

Scenario 15: A Gracious Host

You are the computer system administrator for a mid-sized company. You can monitor the company network from home, and you frequently work from home. Your niece, a college student, is visiting for a week. She asks to use your computer to check her email. Sure, you say.

Scenario 1: Protecting Personal Data

Your customer is a community clinic that works with families with problems of family violence. It has three sites in the same city, including a shelter for battered women and children. The director wants a computerized record and appointment system, networked for the three sites. She wants a few laptop computers on which staffers can carry records when they visit clients at home and stay in touch with clients by email. She asked about an app for staffers' smartphones by which they could access records at social service agencies. At the shelter, staffers use only first names for clients, but the records contain last names and forwarding addresses of women who have recently left.

Scenario 2: Email System With Targeted Ads

Your company is developing a free email service that will include targeted advertising based on the content of the email messages (similar to Google's Gmail). You are part of the team designing the system. What are your ethical responsibilities? --protect the privacy of email users

Scenario 3: Webcams in School Laptops

As part of your responsibilities, you oversee the installation of software packages for large orders. A recent order of laptops for a local school district requires webcam software to be loaded. You know that this software allows for remote activation of the webcam.

Scenario 4: Publishing Security Vulnerabilities

Three MIT students planned to present a paper at a security conference describing security vulnerabilities in Boston's transit fare system. At the request of the transit authority, a judge ordered the students to cancel the presentation and not to distribute their research. The students are debating whether they should circulate their paper on the Web. Imagine that you are one of the students.

Scenario 5: Specifications

You are a relatively junior programmer working on modules that collect data from loan application forms and convert them to formats required by the parts of the program that evaluate the applications. You find that some demographic data are missing from some forms, particularly race and age. What should your program do? What should you do?

Scenario 6: Schedule Pressures - Safety-critical

Your team is working on a computer-controlled device for treating cancerous tumors. The computer controls direction, intensity, and timing of a beam that destroys the tumor. Various delays have put the project behind schedule, and the deadline is approaching. There will not be time to complete all the planned testing. The system has been functioning properly in the routine treatment scenarios tested so far. You are the project manager, and you are considering whether to deliver the system on time, while continuing testing and making patches if the team finds bugs.

Scenario 7: Schedule Pressures - Product to market

You are a programmer working for a very small start-up company. The company has a modest product line and is now developing a truly innovative new product. Everyone is working 60-hour weeks and the target release date is nine months away. The bulk of the programming and testing is done. You are about to begin the beta testing. (See Section 8.3.1 for an explanation of beta testing.) The owner of the company (who is not a programmer) has learned about an annual industry show that would be ideal for introducing the new product. The show is in two months. The owner talks with the project manager. They decide to skip the beta testing and start making plans for an early release.

Scenario 8: Software License Violation

Your company has 25 licenses for a computer program, but you discover that it has been copied onto 80 computers.

Scenario 9: Going Public

Suppose you are a member of a team working on a computer-controlled crash avoidance system for automobiles. You think the system has a flaw that could endanger people. The project manager does not seem concerned and expects to announce completion of the project soon. Do you have an ethical obligation to do something?

User interfaces and human factors

User interfaces should: --provide clear instructions and error messages --be consistent --include appropriate checking of input to reduce major system failures caused by typos or other errors a person will likely make --The user needs feedback to understand what the system is doing at any time. --The system should behave as an experienced user expects. --A workload that is too low can be dangerous. --These are simple and common examples of considering human factors in designing software.
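The input-checking point can be illustrated with a minimal sketch. The function name, the "dose" field, and the numeric range are all hypothetical, not from any real system; the point is that the interface rejects a typo instead of passing it on to the machine:

```python
def read_dose(raw):
    """Validate an operator-typed dose rather than trusting it blindly.

    The 0-200 range is an assumed example limit, not a real medical
    value. Non-numeric input and out-of-range values are rejected with
    a clear error message instead of causing a failure downstream.
    """
    try:
        dose = float(raw)
    except ValueError:
        raise ValueError(f"not a number: {raw!r}")
    if not 0 < dose <= 200:
        raise ValueError(f"dose {dose} is outside the allowed range")
    return dose

print(read_dose("25"))  # a plausible entry passes through
```

A call like `read_dose("25o")` (a common typo for "250") raises a clear error at entry time, which is exactly the kind of check whose absence contributed to several of the failures described in these notes.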

Criticisms of computing technology

We might urgently try to prevent implementation of some applications and urgently advocate increased protection from risks, yet not consider the threats and risks as reasons for condemning the technology as a whole.

Neo-Luddites: Views of Economics, Nature, and Human Need

What is the purpose of technology? --To Luddites, it is to eliminate jobs to reduce the cost of production --To non-Luddites, it is to reduce the effort needed to produce goods and services. While both statements say nearly the same thing, the first suggests massive unemployment, profits for capitalists, and a poorer life for most workers. The second suggests improvements in wealth and standard of living. --Luddites see the profit-seeking goals of businesses as in fundamental conflict with the well-being of workers and the natural environment.

responding to the threats of intelligent machines

What protections do people who fear for the human race recommend? --Space enthusiasts suggest creating colonies in space. Joy observes that it will not happen soon enough; if it does, it might save the human race but not the vast majority of humans on earth, and if colonists take the current technologies with them, the threat goes too. --A second solution is to develop protections that can stop the dangerous technologies from getting out of control: "a portfolio of resilient responses." Joy argues that we could not develop shields in time, and if we could, they would necessarily be at least as dangerous as the technologies they are supposed to protect us against. --Joy recommends "relinquishment": we must limit development of the technologies that are too dangerous, by limiting our pursuit of certain kinds of knowledge. --Relinquishment has the same kinds of weaknesses Joy attributes to the approaches he rejects: it is either undesirable or unachievable, or both. --Although we can find flaws with all proposals to protect against the dangers of powerful technologies, that does not mean we should ignore the risks.

NOT high reliability

ex: VW automotive company --turbo diesels marketed as "green" --could not pass smog tests without cheating software --not eco-friendly in real driving --deontological issues?

Overconfidence (Therac-25)

In the first overdose incident, when the patient told the machine operator that the machine had burned her, the operator told her that was impossible. This was one of many indications that the makers and some of the users of the Therac-25 were overconfident about the safety of the program.

Trends in computer access

The phenomenon that new technologies and inventions are first expensive luxuries, then become cheaper and spread throughout the population, has led some observers to conclude that it is more accurate to think of people as haves and have-laters.

Narrow info stream (questions)

Who's presenting the info? What stakes do they have? Can you trust reviews? How do you know which are real? Yelp: can you trust the people who post there?

