WWS 351 Midterm Review


Web Server Market Share

Apache: 50%; nginx (also open source): 20%; Microsoft: 13%

Linus Torvalds

Finnish grad student Linus Torvalds 1. Built a GPL-licensed, UNIX-like OS kernel. He emailed it to friends, saying they were welcome to work on it. 2. The kernel grew; Linus successfully built a working kernel. 3. Linux kernel (Torvalds) + GNU utilities (Stallman) = Linux. Linux grew organically from one guy to thousands of Linux programmers. The development team is as big as those at Windows or Mac OS.

Technology/policy re spectrum

Radio waves are analogous to sound waves. 1. Waves can interfere. 2. How to control/avoid interference? Many ways: A. Divide up time, allocate to different senders B. Divide up frequencies (assign bands of them), allocate C. Divide up space, allocate. Mobile phone technology uses all of these methods. NB: Spectrum and time are scarce.

Technology does not develop along a single path - there are many false starts (1)

Technologies are malleable and dead ends are common. 1. A case in point: Arthur Hotchkiss's bicycle railroad in Smithville 2. A monorail for bikes that would let blue-collar workers fly over farmland to their factory

The Politics of Net Neutrality

Technology, Regulation and the Architecture of the Information & Communications Market

Switch

1. Represented by a house with inward-slanted walls and an antenna-like structure on top 2. Electricity comes in from the left and, when the gate is open, flows out through the right

What was Google thinking?

•Mountain View: Google built WiFi here •Motorola: purchased Motorola Mobility •Wireless watch •2010 spectrum auction: almost purchased spectrum. Google is rattling its saber to show it can be an ISP like Verizon.

Selling Eyeballs: False Starts

1. Advertising rates based on visitors (unreliable, gameable) 2. Demographic information about visitors (no advantage over traditional media) 3. Pay per click (manipulation by persons or software)

The Rules Were Contested from the Start

1. Big ISPs (led by Verizon) challenged the proposed rules in court, but the case was rejected as moot. 2. A Free Press suit demanded stronger rules, opposing the lighter regulation of cellular companies (it went nowhere).

Universal Circuit, Software, and Programming

1. The big advantage of coupling a universal circuit with memory is that you don't need to design a circuit for every purpose; you just change the inputs 2. This is the idea of software and programming: you can decide later what you want the machine to do

Developments in code necessary for the takeoff of e-commerce? (browsers, javascript, cookies)

1. Browser: a software application for retrieving, presenting, and traversing information resources on the World Wide Web 2. JavaScript: the programming language of the Web 3. Cookies: information (a small text file) that a site saves to your computer using your web browser. Cookies make the personalization of your web experiences possible. •Commercial applications would require a user-friendly graphical interface, i.e. a "browser" -Tim Berners-Lee proposed the Web at CERN in 1989 and built the first browser -Marc Andreessen released Mosaic in 1993 at Univ. Illinois; commercialized as Netscape -Thomas Reardon built Internet Explorer (on Spyglass); IE, bundled with Windows, temporarily took over the world (95% of market) •JavaScript (Netscape 1995; IE 1996) and cookies (Netscape 1994) for individualization & transactions

Appliances solve the problem of protecting intellectual property.

1. But so did television in the 1970s. 2. Benkler: "The battle over the institutional ecology of the digitally networked environment is waged ... over how many individual users will continue to participate in making [that] environment, and how [many] will continue to sit on the couch..."

CPU

1. Central processing unit. 2. Sometimes referred to simply as the central processor, but more commonly called processor, the CPU is the brains of the computer where most calculations take place. In terms of computing power, the CPU is the most important element of a computer system.

Circuits

1. Chips drive computers 2. Silicon has distinctive electrical properties 3. You can build switches by putting different impurities in silicon wafers 4. These switches, connected by tiny wires, form circuits

Stakeholders: The Pros

1. Civil liberties groups - vibrant public square (ACLU, Common Cause, Public Knowledge, American Library Association) 2. Social movement groups - MoveOn, Christian Coalition (at first) 3. Content and service providers (Google, eBay, Amazon, Yahoo, Skype, Netflix) A. Want to avoid new fees and hold-ups by ISPs B. Recognize that ISPs have interest in downgrading services (VoIP, streaming movies) that compete with their own.

Business Relationships

1. Device branding A. The user has a contract with the carrier i. Individual or family/group plan: the family/group plan has roots in the days when a family would purchase a single landline for the home. In the early days of the iPhone, the device was closely associated with Apple and sold only at Apple stores. 2. Typically the device is sold roughly $200 below cost A. The consumer is locked into a 2-year monthly contract. NB: The phone discount is made up for with higher network charges. This contract structure disincentivizes switching, and carriers offer contracts that extend into the future. It is also difficult to switch out of this model: there is sticker shock when a phone is priced at $800-900. The FCC is constantly trying to decide when customers can get out of their contracts.

Eric Raymond and Richard Stallman

1. Eric Raymond: great programmer made famous by his book The Cathedral and the Bazaar 2. Stallman had a moral argument that many programmers didn't buy 3. Raymond provided the clever metaphor of the cathedral and the bazaar, in which open source (the bazaar) is commercial and entrepreneurial 4. Stallman had been top-down, rigid, and moral, like the cathedral. Raymond popularized the term "open source."

What made e-commerce work?

1. Establishing trust through reputation and collaborative filtering 2. Targeting eyeballs: the advertising model 3. Producing and exploiting network externalities (with an excursus on networks and network analysis) 4. Long tails: keeping costs low and selections high 5. The freemium model 6. Disintermediation (more on Wednesday)

Positivity Bias in Rating Systems

1. Herd behavior 2. Fear of retaliation 3. Self-selection 4. Strategic manipulation Yet 70 percent of consumers report that they trust online reviews.

Control through Code: A Non-Internet Example (from Karen Levy's dissertation)

1. For decades, truckers have kept track of their work hours - which are limited by federal regulations to prevent fatigue-related accidents - using paper logbooks, which are easily falsified by drivers eager to maximize their driving time (and thus pay). 2. New regulations would mandate that drivers' time be automatically monitored by electronic devices that integrate into trucks themselves and send information back to centralized online portals in real time, thus attempting to compel drivers' compliance with the timekeeping rules. 3. Karen Levy considers how these legal rules and the technological capacities of the devices themselves are co-evolving to shape enforcement practices, as well as the ways in which social relationships among employees, employers, and law enforcement officers are reconfigured when such systems are used. 4. Drawing together concepts from legal studies, organizational sociology, and the sociology of technology, the project aims to reframe our understanding of regulation and discretion for the age of ubiquitous computing, and to contribute to broad social debates about the role of technological surveillance in legal rulemaking and in social life.

Coursera

1. For-profit with $67 M in venture capital 2. April 2012 launch 3. 93 institutions 4. 500+ courses 5. 5+ million Courserians

AdWords

1. Google AdWords is an online advertising service that places advertising copy above, below, or beside the list of search results Google displays for a particular search query, or it displays it on their partner websites. 2. The choice and placement of the ads is based in part on a proprietary determination of the relevance of the search query to the advertising copy. 3. AdWords has evolved into Google's main source of revenue.

Angie's List: Producing Trust on a High-Stakes Recommendation Site

1. High stakes A. Pay for membership B. Big-ticket items, so risk is high 2. Primary challenge: creating trust in the recommendations that are the basis for reputations A. No anonymous reviews B. Investigation of reviewers to avoid shills 3. Secondary challenge: creating trust in claims about trustworthiness A. Personification: i) owner as face of company ii) heavy TV marketing B. Endorsement: i) investigation by major auditing firm ii) reference to media coverage C. Guarantees: i) helpline ii) complaint adjudication. Nonetheless, Angie's List has been accused of failing to post negative reviews of companies that advertise on its site.

Foresight is limited by getting things wrong

1. Honeywell 316 (1969): used to process ARPANET messages and other things a 16-bit computer could do. 2. Then Honeywell tried to find a new market: the first home computer, 1969. The only application Honeywell's marketers could think of was managing recipes, so they pitched it to women as a kitchen appliance. $10,600 at Neiman Marcus (price included a 2-week programming course). No units were sold.

Policy Overview

1. Issues 2. Arenas 3. Modalities of Control 4. Values 5. Stakeholders 6. Politics

How To Get Information From A to B

1. Make a "packet." Schematic representation of a packet: To: <net address> / From: <net address> / <contents>. Packet size is at most 1000 bytes (a byte is 8 bits). 2. Give the packet to the network 3. The network delivers the packet to its destination 4. Large messages get "packetized" into pieces and reassembled at the destination
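The steps above can be sketched in Python. The To/From/contents fields and the 1000-byte limit come from the schematic; the function names, field keys, and addresses are illustrative, not a real protocol implementation.

```python
MAX_PAYLOAD = 1000  # bytes per packet, per the schematic above

def packetize(to_addr, from_addr, message):
    """Split a message into packets of at most MAX_PAYLOAD bytes each."""
    data = message.encode("utf-8")
    return [
        {"to": to_addr, "from": from_addr, "seq": i, "contents": data[i:i + MAX_PAYLOAD]}
        for i in range(0, len(data), MAX_PAYLOAD)
    ]

def reassemble(packets):
    """Reorder packets by sequence number and rejoin the contents."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    return b"".join(p["contents"] for p in ordered).decode("utf-8")

# A 2500-byte message becomes three packets, delivered in any order
msg = "x" * 2500
pkts = packetize("10.0.0.2", "10.0.0.1", msg)
assert len(pkts) == 3
assert reassemble(list(reversed(pkts))) == msg
```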

Stallman's Position

1. Morally wrong to prevent someone from modifying or sharing software 2. Okay to get paid for your work, but you have to allow people to modify and share the software. Stallman sees software as similar to science: it needs to be advanced. 3. Stallman's economic argument: software has high fixed costs and zero marginal costs; if it's free to give it to people, you should. 4. Stallman embedded this idea in the term "free software." Free has two meanings: free speech and free beer; software should be free as in speech, not as in beer. Free is also known as "libre." One can also say FOSS (free and open-source software) or FLOSS (free/libre and open-source software). Stallman was the first to reach this position, and he still believes in it strongly.

Wires

1. Represented by a straight line 2. Wire has 2 states: on and off 3. States have to do with the number of electrons 4. It is digital, as there are only 2 levels we care about. The device tries to stay near one state or the other. As such, it is able to deal with errors or corruptions by rounding off. This is not the case with analog technology.
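The claim in item 4 that a digital wire "rounds off" errors can be illustrated with a toy sketch: each noisy voltage is snapped to the nearest of the two legal levels, so small corruptions disappear. The 0.5 threshold and the voltage values are hypothetical.

```python
def restore(noisy_levels, threshold=0.5):
    """Round each noisy analog reading to the nearest digital state (0 or 1)."""
    return [1 if v >= threshold else 0 for v in noisy_levels]

sent = [0, 1, 1, 0]
received = [0.1, 0.9, 0.7, 0.2]  # the same bits after noise on the wire
assert restore(received) == sent  # small corruptions are rounded away
```

An analog signal has no such legal levels to snap back to, which is why noise accumulates there but not here.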

Page rank

1. PageRank is what Google uses to determine the importance of a web page. It's one of many factors used to determine which pages appear in search results. 2. PageRank measures a web page's importance. Page and Brin's theory is that the most important pages on the Internet are the pages with the most links leading to them. PageRank treats links as votes, where a page linking to another page is casting a vote. 3. Now that people know the secrets to obtaining a higher PageRank, the data can be manipulated.
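The links-as-votes idea can be sketched as an iterative computation. The four-page web, the 0.85 damping factor, and the fixed iteration count below are illustrative; this is a toy version, not Google's actual implementation.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal importance
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            share = rank[p] / len(outs) if outs else 0.0
            for q in outs:
                new[q] += damping * share  # p casts a "vote" for each page it links to
        rank = new
    return rank

# A, B, and D all link to C, so C collects the most votes
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(web)
assert max(ranks, key=ranks.get) == "C"
```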

"The Long Tail: Why the Future of Business is Selling Less of More" by Chris Anderson

1. Retailers take advantage of size and scope to earn returns from the "long tail" (Anderson 2006).

Moore's Law

1. Rule of thumb 2. No theory as to why this should be the case 3. Has held true since roughly 1960 4. Cost of some unit of circuitry, memory, etc. is cut in half every 18 months 5. Equivalently, capacity (this can mean anything) available at fixed cost doubles every 18 months 6. This is incredibly fast growth 7. Cost of 50 GB of storage: 1981: $15 million; 1990: $500,000; 2000: $500; 2012: $5; 2015: free at Box.com 8. This is the power of Moore's Law: businesses now give storage away as a loss leader, at a loss of 40-50 cents 9. There are cities full of engineers making the technology better 10. People in the industry assume it will hold true 11. Moore's Law gives you the assumption that you'll get a new phone in 2 years
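The halving rule in items 4-5 is simple arithmetic. A sketch (the function name is ours; the 18-month period is only a rule of thumb, and the real storage prices above fell even faster in some decades):

```python
def cost_after(initial_cost, years, doubling_months=18):
    """Projected cost of a fixed capacity after `years`, if cost halves
    every `doubling_months` months (Moore's Law as a rule of thumb)."""
    halvings = years * 12 / doubling_months
    return initial_cost / (2 ** halvings)

# One 18-month period halves the cost; two periods quarter it
assert cost_after(100, 1.5) == 50.0
assert cost_after(100, 3.0) == 25.0
```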

Technical Background in Two Slides: TCP/IP Protocol

1. Step 1: The TCP Protocol breaks data into packets 2. Step 2: The packets travel from router to router over the Internet according to the IP protocol 3. Step 3: The TCP protocol reassembles the packets into the original whole

Examples of Network Goods

1. Telephone 2. Fax 3. Adobe PDF 4. eBay

"Lessons from the History of the Internet" by Manuel Castells

1. The Castells reading focuses on the origins of the Internet 2. Castells provides a historical overview of the development of the Internet, starting from its birth as ARPANET and its subsequent transformation by various social and political forces at the time. In 1958, during the Cold War, the US government set up ARPA in the Department of Defense as a way of enhancing its technological superiority by connecting universities and sharing resources. Since computers were expensive equipment at the time, networks helped share time between computers. A standard protocol (TCP) had to be developed to allow computers to communicate with one another. 3. Castells argues that this early history was the result of the convergence between military science and a counterculture movement. He dispels the myth that the Internet was developed for military purposes; although the initial vision of a decentralized network was presented to the military, the Pentagon initially rejected the idea, and later separated its own military network from the research network. Further, Castells indicates that innovation at this scale would not have been possible in the private sector. With ARPA and other public funding, the Internet developed squarely in an open, academic rather than military or business environment. Many of the early developers of the Internet were hackers who were interested in creating free and open software. Interestingly, Castells never clarifies that the word "hacker" has multiple connotations. Although it is commonly associated with a negative meaning that refers to those who pirate or sabotage networks, its original meaning referred to the open software movement that began in the early history of the Internet. These hackers were happy to create and share their innovations, leading to tools such as Linux, early browsers, messaging systems, and other Internet programs.
This open movement was key to the development of the Internet because it allowed early users, who were mostly graduate students, to freely discuss topics. Castells argues that this uncensored medium aligned with some of the counterculture principles of that generation. 4. In this telling of Internet history, Castells describes the impact of a relatively small number of key figures. For example, Tim Berners-Lee, working at CERN, helped build the world wide web by creating a browser/editor program. This program was later released to the public and expanded into Mosaic, the basis of what became Netscape. Linus Torvalds, a student at the University of Helsinki, created a UNIX-based operating system, called Linux, and freely released it to the public. Richard Stallman, working at MIT, established a "copyleft" clause that encouraged anyone who improved on open software to share their version with the public. These and other individuals helped shape this early history of the Internet and transform it into a more autonomous, distributed, and open network.

Application-agnostic network management

1. The first approach - which is the approach favored by van Schewick - bans application-specific discrimination, but allows application-agnostic discrimination 2. Under this rule, a network provider would not be allowed to treat Vonage differently from Skype, YouTube differently from Hulu, or the website of the New York Times differently from the website of the Wall Street Journal or Free Press. That would be discrimination based on application. Nor would it be allowed to treat online video differently from e-mail, treat applications that use the BitTorrent protocol differently from applications that do not use this protocol, or treat latency-sensitive applications differently from latency-insensitive applications. That would be discrimination based on class of application. 3. But it would be allowed to treat data packets differently based on criteria that have nothing to do with the application or class of application. For example, it could give one person a larger share of the available bandwidth if that person has paid for a higher tier of Internet service (e.g., if that person has paid for the "Up to 6 Mbps" Internet service plan instead of the "Up to 3 Mbps" Internet service plan).

Rating systems

1. There are currently very few practical methods for assessing the quality of resources or the reliability of other entities in the online environment. This makes it difficult to make decisions about which resources can be relied upon and which entities it is safe to interact with. 2. Trust and reputation systems are aimed at solving this problem by enabling service consumers to reliably assess the quality of services and the reliability of entities before they decide to use a particular service or to interact with or depend on a given entity.

Stallman's Evolution

1. What set Stallman off was when people at the AI Lab left to start companies. Stallman had modified his lab's printer software, but he was no longer able to spread such improvements around because of the commercialization of software

Raising the Stakes: By combining cyberspace and intimate space

1. Uber A. Risk: physical - assault, reckless driving, and so on B. Strategy: •Driver background checks •Symmetric anonymized rating systems •Insurance; cash-free transactions 2. Airbnb A. Risk: property damage, litigation B. Strategy: •Symmetric rating systems and reviews (simultaneous release; listing rank based on rating) •Profiles •"Host Guarantees" ($1M, but carry insurance) •Language: "community," "family"

Why do people contribute to collective rating or review systems?

1. Vengeance 2. Enthusiasm/altruism 3. Commitment to generalized reciprocity (Golden Rule, "paying forward") 4. Pleasure in writing 5. Interest in reputation or building human capital?

Freemium

1. Version 1: Give something away that will require many people to purchase something else. 2. Version 2: Give something away and hope that people will like it so much they will want to pay for something even better.

Long-Term Trend: WiFi Offloading

1. WiFi offloading A. Unlicensed spectrum B. Low-cost (free or cheap to users) C. Carries 30-70% of mobile data traffic 2. Multiple flavors: home or office, offered by a business (e.g. Starbucks), commercial service (e.g. Boingo) 3. Influencing the market structure A. More options for customers B. Cellular for coverage, and WiFi for capacity C. Seamless authentication and mobility support

eBay's Dilemma (Diekmann et al.)

1. Enable buyers to establish phenomenological trust - to view the site as legitimate and to default to trusting the competence and good will of the people who ran the site. 2. Enable buyers to "distinguish between trustworthy and non-trustworthy sellers" when they can't talk to the seller or see the goods. 3. Encourage sellers to be trustworthy (secured trust). 4. Drive out untrustworthy sellers.

"The long tail"

1. On-line retailing + 2. Cheap production + 3. Robotics in warehousing + 4. GPS in trucking fleets = 5. Robustness of the long tail strategy. When demand is highly dispersed, selling a few copies of lots and lots of different things may work. (But you need recommendation systems for consumers to find the products they want.)

Product niches

A good or service with features that appeal to a particular market subgroup. A typical niche product will be easily distinguished from other products, and it will also be produced and sold for specialized uses within its corresponding niche market.

Issue Domain

A set of policies that 1. are linked by common values, interests and stakeholders; and 2. choices about each have implications for choices about the others

An Excursus on Network Analysis

After many years, network analysis has gone from being the pursuit of a small cult of scientists scattered across many disciplines to being very popular...

Creative Destruction: Who is Next?

Cable TV and universities

Network neutrality

Corporations and governments may not interfere with users' right to access any part of the World Wide Web for any legal purpose. 1. They may not interfere with the transmission of particular content, particular sites, classes of websites, or classes of web traffic. 2. They may not interfere by: A. blocking B. discriminating with technology C. discriminating through pricing

North American Admissions

Down Slightly

The Zittrain Cycle

Generativity requires interoperability, which leads to vulnerability, which requires control, which threatens generativity

Newspapers

Hit by a tsunami

Law

In 1995, NSF surrendered control to private entities (nonprofit governance organizations and commercial backbone providers). (NSFNET had been restricted to educational and research use.)

And from privacy to monitoring.

Invasive technologies produced for one purpose find new uses •From establishing identity at the session level to using code (cookies etc.) to aggregate user identities across sessions. •Taking data aggregated for commercial purposes and using it for government surveillance: we're trapped in Foucault's Panopticon

There are no technological imperatives. Technologies provide affordances rather than dictate behavior.

Inventors rarely know how their inventions will be used 1. The phonograph was first commercialized in 1888 by Jesse Lippincott, who thought it would replace stenographers and notepads. 2. It didn't. 3. Later, people paid to hear phonograph recordings in public... 4. Take the case of Hedy Lamarr: No star was more beautiful than she. She outshone even the beauty factory of 1940 Hollywood. Hedy Lamarr came to America from Austria. She'd run away from a bad marriage to an arms maker who had helped arm Mussolini for his invasion of Ethiopia. Her flight to America had been a flight from the horrors of fascism as well as from her marriage. In 1940 Lamarr met composer George Antheil at a dinner party. They fell to talking. The next evening, she invited him to dinner at her place. A peculiar chemistry had arisen between two remarkable minds. They talked far into that night. Between them, they had an idea. Allied subs, it seems, were wasting torpedoes. Ocean currents and evasive action worked against them. Lamarr and Antheil meant to do something about that. Lamarr, just 26, had been only a girl when she'd listened to her husband talking about torpedoes. She might have looked like pretty wallpaper, but she'd been a quick pupil. And Antheil had done ingenious early work with the technology of modern music. The solution, they reasoned, was a radio-controlled torpedo. But it would be easy for the enemy to jam a radio-control signal. So they cooked up something called "frequency hopping." The trick was to set up a sequencer that would rapidly jump both the control signal and its receiver through 88 random frequencies. They patented the system and gave it to the Navy. The Navy did put the system to use, but not in WWII. Sylvania engineers reinvented it in 1957, and the Navy first used frequency hopping during the 1962 blockade of Cuba - three years after the Lamarr/Antheil patent had expired.
Today, frequency hopping is used in the wireless phones in our homes, in GPS, and in most military communication systems.

Understanding the "Net Neutrality Debate"

Lecture by Jennifer Rexford

Greater regulation has entailed a movement from anonymity and freedom to identity and accountability.

Lessig's "Identity Layer" - Protocols that Authenticate Users (1996-2013) 1. Permanent IP addresses and caching of temporary addresses to tie users to sessions. 2. IP Mapping software to enable states to locate activity in space (essential for identifying violation of state or national laws; useful for retail) 3. Site specific security requiring public/private key encryption for sensitive websites 4. Permanent identities for particular domains (e.g., what you do at www.princeton.edu) 5. Stable identity within most of what you do online (Google, Facebook)

Pay for priority

Paid prioritization is a financial arrangement in which a content owner pays a broadband provider to "cut to the front of the line" at congested nodes, or where a broadband provider engages in "vertical prioritization" by favoring its own content.

Newspaper Classified Ads in Millions 1990-2010

Peaked in 2000; declined between then and 2005, then declined dramatically between 2005 and 2010

All Print Newspaper Ad Revenues

Same pattern as Newspaper Classified Ads in Millions 1990-2010

Six degrees of separation

Six degrees of separation is the theory that anyone on the planet can be connected to any other person on the planet through a chain of acquaintances with no more than five intermediaries. E.g.: 1. DiMaggio to Homans to Mama Homans to H. Adams to J.Q. Adams to J. Adams to George Washington 2. And now you are all 7 degrees from George Washington, Benjamin Franklin & Louis XIV
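Degrees of separation are just shortest-path lengths in an acquaintance graph. A breadth-first-search sketch, using the DiMaggio-to-Washington chain above as a toy (one-directional) graph:

```python
from collections import deque

def degrees(graph, start, goal):
    """Breadth-first search: number of acquaintance hops from start to goal."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        person, dist = queue.popleft()
        if person == goal:
            return dist
        for friend in graph.get(person, []):
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None  # no chain of acquaintances connects them

chain = ["DiMaggio", "Homans", "Mama Homans", "H. Adams",
         "J.Q. Adams", "J. Adams", "Washington"]
graph = {a: [b] for a, b in zip(chain, chain[1:])}
assert degrees(graph, "DiMaggio", "Washington") == 6  # six degrees
```

In a real acquaintance network each person has many edges, which is why path lengths stay short even across billions of people.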

Recorded Music

The Internet: Terrible for big integrated record companies, maybe not so bad for music

WIPO

The World Intellectual Property Organization (WIPO) is the UN agency responsible for treaties involving copyright, patent, and trademark laws. WIPO can be a force for progressive change, helping the world take into account public interest and development needs. But all too often, governments are using international treaties negotiated through WIPO as well as other bilateral trade agreements to ratchet up IP rights at the behest of copyright holders.

Root servers

The authoritative name servers that serve the DNS root zone, commonly known as the "root servers," are a network of hundreds of servers in many countries around the world. They are configured in the DNS root zone as 13 named authorities.

Cloud computing

The practice of using a network of remote servers hosted on the Internet to store, manage, and process data, rather than a local server or a personal computer.

Affordance

The qualities or properties of an object that define its possible uses or make clear how it can or should be used

Structure of Industry

Types of businesses: 1. Carriers (mobile networking) 2. Device makers 3. Platform makers (Apple - iOS; Google - Android). NB: Apple bundles device and platform. 4. App makers

The Film Industry

Weathering the storm

Networks are everywhere and nowhere: Network analysis is a set of questions and tools.

Well, what do you know about that! ...These forty years now, I've been speaking in prose without knowing it. - M. Jourdain in Molière's The Bourgeois Gentleman

Control through Markets: Using incentives

Why has government not had to pass many laws to foster trustworthy behavior in eCommerce? 1. Online retailers and auction sites have incentives to protect users and create trust 2. Banks and credit-card companies have interest in ensuring security of transactions

Let the Network Produce Your Content

e.g. Facebook, Youtube

Let the Network Be Your Product

e.g. eBay, Meetup, eHarmony

Specialized services

• Specialized services: services offered by broadband providers that share capacity with broadband Internet access service over providers' last-mile facilities • Examples: facilities-based VoIP, IP video, e-reading services, heart rate monitoring, energy sensing

Freedom of Expression

•Are DMCA takedown-notice provisions consistent with freedom of expression? •Should government require libraries that serve kids to have filters on computers that block websites with "obscene" material - what if these filters also block non-obscene material?

In 2012

•21% e-mailed, posted or shared music •15% of U.S. adults e-mailed, posted or shared their own photography •13% edited photos (12.4% "created photographs for artistic purposes") •12% played an instrument •5.9% did "creative writing" •4.4% e-mailed, posted or shared their own music

Values II: A Capable Citizenry

•Citizens have to be competent to consume and interpret information - Mass literacy and numeracy essential •Public schooling, first primary then secondary - the "common school movement" (Horace Mann) •Land grant colleges (Morrill Act)

The Arenas of Net Neutrality

•Congress •FCC •Courts •Private companies •Code (router manufacturers; programmers in ISPs and middle-mile firms; Broadband Internet Technical Advisory Group)

Policy Options: Modalities of Control (From Lessig, Code, ch. 7)

•Law •Markets •Architecture ("code") •Norms

Security

•Should government be permitted to engage in routine monitoring of on-line communication? •Should government restrict use of encryption technologies? •Should government require "back doors" to all communication applications?

Takeaways

•To analyze policy, focus on: -Issues -Arenas -Modalities -Values -Stakeholders -Political Context •Concepts -Issue domain -Lessig's 4 modalities of control: law, markets, code, norms -Interactions among law and other modalities -Constitutive values in U.S. information policy (Starr) -"Diffuse" vs. "concentrated" interests in policy formation

Code, Law and Enthusiasm were not Enough

•Trust in reliability and safety •Business models that did more than burn venture capitalists' $$$

Intellectual Property (see duplicate)

•World Intellectual Property Organization: The World Intellectual Property Organization (WIPO) is the UN agency responsible for treaties involving copyright, patent, and trademark laws. WIPO can be a force for progressive change, helping the world take into account public interest and development needs. But all too often, governments are using international treaties negotiated through WIPO as well as other bilateral trade agreements to ratchet up IP rights at the behest of copyright holders. •Congress •Courts •Register of Copyrights: authorizes exemptions to DMCA anti-circumvention - e.g. for libraries or organizations of blind persons who alter DRM software to permit use of read-aloud software to enable sightless people to hear readings of digital works.

Gateway

1. A gateway is a network point that acts as an entrance to another network. 2. On the Internet, a node or stopping point can be either a gateway node or a host (end-point) node. Both the computers of Internet users and the computers that serve pages to users are host nodes. The computers that control traffic within your company's network or at your local Internet service provider (ISP) are gateway nodes.

"A Declaration of Independence for Cyberspace" by John Perry Barlow

1. Barlow's brief statement of principle captures the cyberutopianism of the Internet's early days 2. A statement of the core belief of many cyber-libertarians that governments should have no authority on the Internet 3. A call for sovereign nations to give up all claims of authority on the Internet

Interoperability

1. Interoperability: devices can "talk to each other." 2. Need agreement on how to say what you want and rules of passing information.

Transistor

1. The transistor, invented by three scientists at the Bell Laboratories in 1947, rapidly replaced the vacuum tube as an electronic signal regulator. A transistor regulates current or voltage flow and acts as a switch or gate for electronic signals. 2. Transistors are the basic elements in integrated circuits (ICs), which consist of very large numbers of transistors interconnected with circuitry and baked into a single silicon microchip or "chip."

Network externalities

A network has positive externalities if the value of that network increases as a function of the number of persons (or nodes of any kind) that it includes.
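A minimal sketch of why value can scale this way: under Metcalfe's-law-style reasoning (one common formalization, not the only one), a network of n nodes contains n(n-1)/2 possible pairwise links, so total potential value grows roughly quadratically while cost often grows only linearly. The numbers below are purely illustrative.

```python
def potential_links(n):
    """Distinct pairwise connections among n nodes (n choose 2)."""
    return n * (n - 1) // 2

# Each new member adds value for every existing member,
# so total potential value grows roughly as n^2.
for n in (2, 10, 100):
    print(n, potential_links(n))  # 1, 45, and 4950 links respectively
```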

The Evolution of Thought on Freedom on the Internet

"Information wants to be free." - Stewart Brand 1984 "Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather." John Perry Barlow, 1996 "Left to itself, cyberspace will become a perfect tool of control." Lawrence Lessig, 2006 How did we get from Barlow to Lessig in just 10 years?

Policy Making: The Legislative Process

"Laws, like sausages, cease to inspire respect in proportion as we know how they are made." John Godfrey Saxe, 1869 (sometimes attributed to Otto von Bismarck)

The public case for net neutrality

"Net neutrality may sound like a technical issue, but it's the key to preserving the internet as we know it -- and it's the most important First Amendment issue of our time." - Al Franken Exemplars: 1. 2005: Telus's (Canadian ISP) blockage of union website, Voices of Change 2. 2007: Verizon blocks NARAL (Pro-Choice America) access to its wireless broadband text-message network, claiming right to block "controversial or unsavory" content 3. 2005 Madison River Communications (ISP) blocks VoIP 4. 2009 Comcast slows BitTorrent traffic BUT: Exemplars are few, responses have been effective, and consensus has emerged in North America that content censorship is unacceptable. Sen. Al Franken (D-MN)

The Nuances of Net Neutrality

"Network neutrality" is not one thing - it is several. Each version of network neutrality raises different issues and the case for each version rests on different kinds of facts.

Variety.com: "Cable TV Tightens its Grip on Revenues"

"Subscriber declines don't matter much when those who stick around pay more than ever. But how much longer can these companies continue to defy gravity?" (Todd Spangler, Variety May 2013)

What is Net Neutrality? 1st cut: Partisan Rhetoric (2013)

"The Federal Communications Commission is writing new rules - called network neutrality rules - to dictate how Internet providers can manage the information that flows over their networks... [N]etwork neutrality supporters really just want the government - for the first time since the inception of the Internet - to have control over how private companies deliver broadband to your home and office." -- From the website of the Internet Freedom Coalition "The Internet will look a whole lot different if network operators get to favor one online business or speaker over another. We can't let the Verizons of the world turn the Web into their own private fiefdoms where they award express service to their corporate allies and shunt everyone else to the side. Verizon has put its cards on the table. Under its preferred scenario, the open Internet no longer exists."

The Public Case Against Net Neutrality

"The new [2013 network neutrality] rules represent an unprecedented power-grab by the unelected members of the FCC .. The `unreasonable discrimination' order would in effect establish that the FCC would have an approval portal that companies must pass through just to manage their day-to-day operations." --Kay Bailey Hutchison ISPs do have to manage their networks actively, including using deep packet inspection: 1. latency-dependent applications (e.g., VoIP, gaming, films) need more bandwidth & different buffering 2. periods of network congestion require management 3. need to protect users against Spam, malware, or DoS attacks 4. CALEA requirements (Communications Assistance to Law Enforcement Act) But... Bandwidth scarcity is socially constructed (little firm evidence; partly related to choices [e.g., how much of cable devoted to Internet v. TV]. And consensus has emerged that minimal-interference congestion management is acceptable.

"Creative Destruction" - Joseph Schumpeter

•Capitalism, Socialism & Democracy, 1942 •Capitalism is periodically revitalized (in whole and in particular industries) by game-changing innovations •These innovations are sometimes referred to as competence-destroying technologies because they make previous ways of doing things (and the knowledge on which these are based) obsolete

Values III: Privacy

-Democracy also requires privacy, so citizens can organize and communicate without fear -Privacy of U.S. mails -Prohibition on access to Census information (except in aggregate form) -In Europe, the opposite: government was not transparent, citizens were...

Control through Law

-Digital Millennium Copyright Act (DMCA): ISPs liable for repeat IP offenders -Communications Assistance for Law Enforcement Act (CALEA) etc.: Requiring ISPs or cell phone companies to use technologies that permit spying -DMCA: $10,000 penalty for each downloaded tune... Which of these laws has been least effective? DMCA penalty

Control through Norms

-Norms among cooperating professionals: Internet protocols; Internet Engineering Task Force; Linux programmers; Joint Photographic Experts Group -Private ordering: Peering among infrastructure companies -Norms governing discussion board postings •Respectful treatment of others by respondents •Not divulging information that is "too personal" •Not divulging secrets of third parties -Real-world norms enforced by digital means: it all started when a woman refused to clean up her dog's excrement on a subway in South Korea. Fellow travelers, obviously bothered by the new addition to the train, expressed their irritation. She did not yield. In the old days, that would have been the end of it. But today, when face-to-face persuasion fails, there's a fallback plan: anonymous Internet humiliation. The Post notes, "One of the train riders took pictures of the incident with a camera phone and posted them on a popular Web site. Net dwellers soon began to call her by the unflattering nickname [loosely translated to 'Dog Poop Girl'], and issued a call to arms for more information about her." And information they received. "According to one blog that has covered the story, 'within days, her identity and her past were revealed.'" But that wasn't enough: blogs and online discussion groups buzzed with dirt about Dog Poop Girl's parents and relatives, and cries for more invasions of her privacy. As George Washington University law professor Daniel J. Solove wrote on one blog, this was a demonstration of bloggers acting "as a cyber-posse, tracking down norm violators and branding them with digital scarlet letters."

Network neutralities (3rd cut)

1) The Impossible Dream: •Pure Classic End to End Neutrality: ISPs simply move packets from mid-mile operators to end users, without even the technical means to discriminate. Horses are out of the barn on this one. 2) The Main Event: A. Content Neutrality. Prohibits discrimination (or blocking) on the basis of content. B. Application Neutrality. Prohibits discrimination against particular applications within a class (e.g., YouTube vs. Hulu) (Open Internet Order includes "special services" provided by ISPs) C. Application-Class Neutrality. Prohibits discrimination against different types of applications (application agnostic) D. Content-provider Neutrality. Prohibits tiering: discrimination in favor of providers who pay more for superior service, and thus against providers who fail to pay. (Why does Yoo think this is a good thing?) 3) Side shows (for now) A. Consumer-based discrimination based on metering of usage (controversial), slowdowns for peak users during congestion (approved) or price/bandwidth tier (conventional) B. Search neutrality - Are search-engine algorithms stacking the deck against some kinds of content? Is Google a quasi-monopolist? (Speculation in law schools...) C. Middle-mile neutrality - Are the networks that link content servers to ISPs a level playing field? (No [Yoo 2010: some bits have to pay traffic charges; some bits slowed so carriers can stay below mandated traffic levels; providers that can afford multiple server farms get faster service as do those that use caching services like Akamai], but we know very little about them and it's a regulatory nightmare.) D. Common carriage - Requiring last-mile carriers to rent capacity to competitors (as had been case with telcos)

What is Net Neutrality?

1. "Pinning down a precise definition of network neutrality is difficult." -Christopher Yoo 2. "Network neutrality proponents agree that network neutrality rules should preserve the internet's ability to serve as an open, general-purpose infrastructure... There is, however, a lot of uncertainty on how to get from a high-level commitment to network neutrality to a specific set of rules."- Barbara van Schewick

High-courtesy equilibrium

1. (1) high rate of providing evaluations, and (2) extreme rarity of neutral or negative evaluations. 2. The first suggests that free riding is overcome, the second that buyers are grading generously, or saying nothing after bad experiences.

Application-class-specific discrimination

1. A "class of applications" is a group of individual applications that share some common characteristic. Thus, there are many different potential classes of applications based on which a network provider could discriminate, each defined by the criteria that are used to allocate the applications to the classes. 2. For example, a class of applications may be the group of all applications of the same application type (e.g., Internet telephony, e-mail), all applications that use the same application-layer protocol (e.g., all applications that use SIP, all applications that use HTTP) or transport-layer protocol (e.g., all applications that use TCP, all applications that use UDP), or all applications that have similar technical requirements (e.g., all latency-sensitive applications, all latency-insensitive applications). 3. A network provider discriminates "based on class of application" if it treats the application differently depending on whether it belongs to the class or not. Since classes are defined by a common characteristic that the applications in the class share, discrimination based on class of application is the same as discrimination based on a characteristic of an application.

Or Structure

1. A Gate at top 2. B Gate at bottom 3. Both connected to each other 4. Always on part is to the left, between A and B 5. Output registers A or B on right side, between A and B 6. Can build a "not gate" etc. 7. Any statement you can express in logic, you can build 8. Can build up any logical formula
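The claim in points 7-8 -- that any statement expressible in logic can be built from a handful of gates -- can be sketched in a few lines. Here each gate is modeled as a plain Python function (an illustration only; real gates are transistor circuits), and XOR is composed purely from OR, AND, and NOT.

```python
def OR(a, b):  return a or b
def AND(a, b): return a and b
def NOT(a):    return not a

# Composing the three primitives yields new formulas, e.g. XOR:
# true when either input is true, but not both.
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", XOR(a, b))  # True only when exactly one input is True
```

Any larger logical formula can be built the same way, by feeding gate outputs into further gates.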

"Regulation 3.0 for Telecom 3.0" by Eli Noam

1. A brief article by Eli Noam that highlights the implications for Internet policy of new technological developments 2. Telecommunications infrastructure goes through technology-induced phases, and the regulatory regime follows. Telecom 1.0, based on copper wires, was monopolistic in market structure and led to a Regulation 1.0 with government ownership or control. Wireless long-distance and then mobile technologies enabled the opening of that system to one of multi-carrier provision, with Regulation 2.0 stressing privatization, entry, liberalization, and competition. But now, fiber and high-capacity wireless are raising scale economies and network effects, leading to a more concentrated market. At the same time, the rapidly growing importance of infrastructure, coupled with periodic economic instabilities, increases the importance of upgrade investments. All this leads to a return to a larger role for the state in a Regulation 3.0 which incorporates many elements (though using a different terminology) of the traditional regulatory system—universal service, common carriage, cross-subsidies, structural restrictions, industrial policy, even price and profit controls. At the same time, the growing role of telecommunications networks as carriers of mass media and entertainment content will also lead to increasing obligations on network providers to police their networks and assure the maintenance of various societal objectives tied to mass media. These are predictions, not recommendations.

"Cloud TV: Toward the next generation of network policy debates" by Eli Noam

1. A brief article by Eli Noam that highlights the implications for Internet policy of new technological developments 2. We are entering the 4th generation of TV, based on the online transmission of video. This article explores the emerging media system, its policy issues, and a way to resolve them. It analyzes the beginning of a new version of the traditional telecom interconnection problem. The TV system will be diverse in the provision of technology, standards, devices, and content elements. For reasons of interoperation, financial settlements, etc., this diversity will be held together by intermediaries that are today called cloud providers, and through whom much of media content will flow. Based on their fundamental economic characteristics, the cloud operators will form a concentrated market structure. To protect pluralism and competition among clouds and of providers of specialized elements requires the protection of interoperation. This can be accomplished by a basic rule: by the principle of an a la carte offering of service elements.

Ad placement with and without cookies

1. A cookie is information (a small text file) that a site saves to your computer using your web browser. Cookies make the personalization of your web experiences possible. 2. The term "third-party" indicates that rather than having a direct relationship with a user, a company has a relationship with one or more of the websites that a user visits. For example, if a user visits sportsfan.com, that website is the "first party." If sportsfan.com partners with an advertising network, platform, or exchange to place ads, the network, platform, or exchange is the "third party." The advertising network uses cookies when the user visits sportsfan.com to help it select and serve the best ad. These cookies are considered "third-party cookies". First parties partner with third parties in this way because third parties have technology and expertise to enable more efficient ad placement across websites. NAI members, working with brands, publishers and websites, use third-party cookies to make advertising more engaging and relevant to users and more valuable to publishers and advertisers. 3. The days of dropping in small pieces of code or "cookies" on a user's computer will one day be a thing of the past. Yes, these cookies are still helpful with letting marketers know where or how a user has interacted with something on the web, but they can easily be deleted, therefore providing advertisers with incorrect information, especially from mobile devices. On average, cookies have a 59% tracking success rate, and according to executives on an Atlas launch panel, they overstate frequency by 41%. Using Atlas, Facebook has fixed this problem by linking users' ad interactions to their Facebook persistent ID rather than a cookie, which allows the social network to measure user activity on both desktop and mobile devices, including mobile conversion and desktop conversion tracking.

"GNU Manifesto" by Richard Stallman

1. A manifesto that heralded the open-source software movement 2. Richard Stallman's GNU Manifesto outlines his desire to create a new computer operating system and disseminate it for free. Both an intervention in creative software design and open access, once GNU is acquired, users are entitled to four freedoms: "The freedom to run the program as you wish; the freedom to copy the program and give it away to your friends and co-workers; the freedom to change the program as you wish, by having full access to source code; the freedom to distribute an improved version and thus help build the community."

Cookies

1. A message given to a Web browser by a Web server. The browser stores the message in a text file. The message is then sent back to the server each time the browser requests a page from the server. 2. The main purpose of cookies is to identify users and possibly prepare customized Web pages for them. When you enter a Web site using cookies, you may be asked to fill out a form providing such information as your name and interests. This information is packaged into a cookie and sent to your Web browser which stores it for later use. The next time you go to the same Web site, your browser will send the cookie to the Web server. The server can use this information to present you with custom Web pages. So, for example, instead of seeing just a generic welcome page you might see a welcome page with your name on it.
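A small sketch of the round trip described above, using Python's standard http.cookies module. The cookie name and value here are invented for illustration.

```python
from http.cookies import SimpleCookie

# Server side: package information into a Set-Cookie header
# that the browser will store as a small text file.
jar = SimpleCookie()
jar["username"] = "alice"
jar["username"]["path"] = "/"
header = jar.output(header="Set-Cookie:")
print(header)

# Browser side: on the next request, the stored cookie string is sent
# back, and the server parses it to recognize the returning visitor.
incoming = SimpleCookie()
incoming.load("username=alice")
print(incoming["username"].value)  # alice
```

This is how a site can greet you by name instead of showing a generic welcome page.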

The distinction between the web and apps

1. A mobile website is designed specifically for the smaller screens and touch-screen capabilities of smartphones and tablets. It can be accessed using any mobile device's Web browser, like Safari on iOS and Chrome on Android. Users simply type in the URL or click on a link to your website, and the website automatically detects the mobile device and redirects the viewer to the mobile version of your website. 2. A mobile app is a smartphone or tablet application. Unlike a mobile website, a mobile app must be downloaded and installed, typically from an app marketplace, such as the Apple App Store or Android's Google Play store.

Router

1. A networking device that forwards data packets between computer networks 2. A router is connected to two or more data lines from different networks 3. When a data packet comes in one of the lines, the router reads the address information in the packet to determine its ultimate destination. Then, using information in its routing table or routing policy, it directs the packet to the next network on its journey. This creates an overlay internetwork. Routers perform the "traffic directing" functions on the Internet. A data packet is typically forwarded from one router to another through the networks that constitute the internetwork until it reaches its destination node
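A toy version of the routing-table lookup in point 3, using Python's standard ipaddress module. Real routers use longest-prefix matching: among all table entries that contain the destination address, the most specific one wins. The prefixes and hop names below are invented.

```python
import ipaddress

# A toy routing table: destination prefix -> next hop.
routes = {
    ipaddress.ip_network("10.0.0.0/8"):  "hop-A",
    ipaddress.ip_network("10.1.0.0/16"): "hop-B",
    ipaddress.ip_network("0.0.0.0/0"):   "default",  # matches everything
}

def next_hop(addr):
    """Pick the most specific (longest) prefix containing addr."""
    addr = ipaddress.ip_address(addr)
    matches = [net for net in routes if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)
    return routes[best]

print(next_hop("10.1.2.3"))  # hop-B (more specific than 10.0.0.0/8)
print(next_hop("8.8.8.8"))   # default
```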

Packet

1. A packet is the unit of data that is routed between an origin and a destination on the Internet or any other packet-switched network. 2. When any file (e-mail message, HTML file, Graphics Interchange Format file, Uniform Resource Locator request, and so forth) is sent from one place to another on the Internet, the Transmission Control Protocol (TCP) layer of TCP/IP divides the file into "chunks" of an efficient size for routing. Each of these packets is separately numbered and includes the Internet address of the destination. The individual packets for a given file may travel different routes through the Internet. When they have all arrived, they are reassembled into the original file (by the TCP layer at the receiving end).
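The divide/number/reorder/reassemble cycle can be sketched directly. This is an illustration of the idea only, not real TCP: headers, addresses, and retransmission are all omitted.

```python
import random

def packetize(data, size):
    """Split bytes into numbered chunks, as TCP does for a file."""
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

message = b"When they have all arrived, they are reassembled."
packets = packetize(message, 8)

# Packets may take different routes and arrive out of order.
random.shuffle(packets)

# Receiver sorts by sequence number and reassembles the file.
rebuilt = b"".join(chunk for _, chunk in sorted(packets))
print(rebuilt == message)  # True
```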

Public goods

1. A product that one individual can consume without reducing its availability to another individual and from which no one is excluded. 2. Economists refer to public goods as "non-rivalrous" and "non-excludable". 3. National defense, sewer systems, public parks and basic television and radio broadcasts could all be considered public goods.

What are public goods? (nonrival, nonexcludable)

1. A product that one individual can consume without reducing its availability to another individual and from which no one is excluded. 2. Economists refer to public goods as "non-rivalrous" and "non-excludable". 3. National defense, sewer systems, public parks and basic television and radio broadcasts could all be considered public goods.

Rootkit

1. A rootkit is a type of malicious software that is activated each time your system boots up. 2. Rootkits are difficult to detect because they are activated before your system's Operating System has completely booted up. A rootkit often allows the installation of hidden files, processes, hidden user accounts, and more in the systems OS. Rootkits are able to intercept data from terminals, network connections, and the keyboard.

Anti-blocking vs. nondiscrimination rules

1. A rule against blocking: a rule that forbids network providers from blocking applications, content and services on their networks. A rule against blocking is part of all network neutrality proposals; it is the one rule on which all network neutrality proponents agree. 2. Nondiscrimination rules: ban differential treatment that falls short of blocking. The first approach - which is the approach favored by van Schewick - bans application-specific discrimination, but allows application-agnostic discrimination.

80/20 rule

1. A rule of thumb that states that 80% of outcomes can be attributed to 20% of the causes for a given event i.e. 20% of products responsible for 80% of sales 2. Anderson believes the long tail means the death of the so-called 80/20 rule
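A quick check of the rule on hypothetical sales figures (invented for illustration): a few hit products and many slow sellers, the shape Anderson's long-tail argument starts from.

```python
# Hypothetical unit sales for 10 products, hits first.
sales = [500, 300, 90, 40, 25, 15, 10, 10, 5, 5]

sales.sort(reverse=True)
top_20_pct = sales[:len(sales) // 5]   # top 2 of 10 products
share = sum(top_20_pct) / sum(sales)
print(f"Top 20% of products earn {share:.0%} of sales")  # 80%
```

Anderson's point is that online aggregators with near-zero shelf costs can profit from the other 80% of products, weakening this concentration.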

Spiders, crawlers, bots

1. A spider is a program that visits Web sites and reads their pages and other information in order to create entries for a search engine index. 2. The major search engines on the Web all have such a program, which is also known as a "crawler" or a "bot."

Creative destruction and its 3 mechanisms

1. A term coined by Joseph Schumpeter in his work entitled "Capitalism, Socialism and Democracy" (1942) to denote a "process of industrial mutation that incessantly revolutionizes the economic structure from within, incessantly destroying the old one, incessantly creating a new one." Mechanisms: •Aggregation: bringing together larger audiences as eBay did •Disintermediation: reduction in the use of intermediaries between producers and consumers •Hypersegmentation: dividing your customers into fine-grained segments - for example, a segment of retired customers, a segment of teenagers, and so on - and then, once you have identified them, marketing to each segment differently based on its distinct wants and needs.

Programming language

1. A vocabulary and set of grammatical rules for instructing a computer to perform specific tasks. The term programming language usually refers to high-level languages, such as BASIC, C, C++, COBOL, FORTRAN, Ada, and Pascal. 2. Each language has a unique set of keywords (words that it understands) and a special syntax for organizing program instructions. 3. High-level programming languages, while simple compared to human languages, are more complex than the languages the computer actually understands, called machine languages. Each different type of CPU has its own unique machine language.

Opinion #2: Service Providers

1. AT&T at a higher risk for focused overload A. Many customers have iPhones B. and unlimited data plans 2. Good to introduce FaceTime gradually A. Constrain the number of users B. Create incentives to limit use C. Reduce negative impact on others 3. Dynamic rate limiting was less attractive A. Complex, not supported by equipment B. May degrade performance for all

The Policy Issue: "Net Neutrality"

1. About the rights and obligations of the ISPs - Internet Service Providers - that bring the Internet over "the last mile" to our homes and offices. 2. FCC's 2013 Open Internet Order limited ISPs' rights to interfere with the bits that pass through their systems. 3. Large part of that order was declared null and void early in 2014 by the D.C. Circuit Court of Appeals - with significant implications for the architecture of media, information, and communications industries. 4. After appearing to capitulate, the FCC is now trying to do what it must do (reclassifying the Internet) legally to assert authority and promulgate "open internet" regulations.

Problems to Solve to Make Worldwide Web Work

1. Addressing (packets) 2. Routing: even if you know the address, the Internet's structure is complicated and intricate. How are gateways supposed to know what to do with inputs? Hard to do in a network grown organically 3. Reliability: how to get the network to work when many parts are down 4. Security: how to protect the Internet

Information appliance

1. An "information appliance" is one that will run only those programs designated by the entity that built or sold it. 2. In the taxonomy of generativity, an information appliance may have the leverage and adaptability of a PC, but its accessibility for further coding is strictly limited.

Client-server

1. An architecture in which the user's PC (the client) is the requesting machine and the server is the supplying machine, both of which are connected via a local area network (LAN) or a wide area network (WAN) such as the Internet. Throughout the late 1980s and early 1990s, client/server was the hot buzzword as applications were migrated from minicomputers and mainframes with input/output terminals to networks of desktop computers. 2. With ubiquitous access to company LANs and the Internet, almost everyone works in a client/server environment today. However, to be true client/server, both client and server must share in the business processing.
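A minimal client/server exchange over a local socket, sketched with Python's standard socket module: the client is the requesting machine, the server the supplying one. The uppercase-echo "service" is just an invented stand-in for real business processing.

```python
import socket
import threading

def serve(sock):
    """Server: accept one connection and supply a response."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data.upper())  # the "business processing"

server = socket.socket()
server.bind(("127.0.0.1", 0))   # port 0 = let the OS pick a free port
server.listen(1)
threading.Thread(target=serve, args=(server,), daemon=True).start()

# Client: open a connection to the server and make a request.
client = socket.create_connection(server.getsockname())
client.sendall(b"hello server")
reply = client.recv(1024)
client.close()
print(reply)  # b'HELLO SERVER'
```

The same request/response pattern underlies a browser fetching a page from a web server, just with HTTP as the agreed-upon language.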

"The Generative Internet" by Jonathan Zittrain

1. An influential essay by Jonathan Zittrain that emphasizes the tradeoff between security and freedom as the Internet develops 2. The generative capacity for unrelated and unaccredited audiences to build and distribute code and content through the Internet to its tens of millions of attached personal computers has ignited growth and innovation in information technology and has facilitated new creative endeavors. It has also given rise to regulatory and entrepreneurial backlashes. A further backlash among consumers is developing in response to security threats that exploit the openness of the Internet and of PCs to third-party contribution. A shift in consumer priorities from generativity to stability will compel undesirable responses from regulators and markets and, if unaddressed, could prove decisive in closing today's open computing environments. This Article explains why PC openness is as important as network openness, as well as why today's open network might give rise to unduly closed endpoints. It argues that the Internet is better conceptualized as a generative grid that includes both PCs and networks rather than as an open network indifferent to the configuration of its endpoints. Applying this framework, the Article explores ways - some of them bound to be unpopular among advocates of an open Internet represented by uncompromising end-to-end neutrality - in which the Internet can be made to satisfy genuine and pressing security concerns while retaining the most important generative aspects of today's networked technology.

"How the Internet Works, and Why it's Impossible to Know What Makes Your Netflix Slow" by Tim Fernholz and David Yanovsky

1. An informative article by two journalists on recent developments 2. You'll hear people say that debates over transit and peering have nothing to do with net neutrality, and in a sense, they are right: Net neutrality is a last-mile issue. But at the same time, these middle-mile deals affect the consumer internet experience, which is why there is a good argument that the back room deals make net neutrality regulations obsolete—and why people like Netflix's CEO are trying to define "strong net neutrality" to include peering decisions. 3. What we're seeing is the growing power of ISPs. As long-haul networks get cheaper, access to users becomes more valuable, and creates more leverage over content providers, what you might call a "terminating access monopoly." While the largest companies are simply building their own networks or making direct deals in the face of this asymmetry, there is worry that new services will not have the power to make those kinds of deals or build their own networks, leaving them disadvantaged compared to their older competitors and the ISP. 4. The counter-argument is that the market works: If people want the services, they'll demand their ISP carry them. The problem there is transparency: If customers don't know where the conflict is before the last mile, they don't know whom to blame. Right now, it's largely impossible to tell whether your ISP, the content provider, or a third party out in the internet is slowing down a service. That's why much of the policy debate around peering is focused on understanding it, not proposing ideas. Open internet advocates are hopeful that the FCC will be able to use its authority to publicly map networks and identify the cause of disputes. 5. The other part of that challenge, of course, is that most people don't have much choice in their ISP, and if the proposed merger between the top two providers of wired broadband, Time Warner Cable and Comcast, goes through, they'll have even less.

Operating system

1. An operating system (sometimes abbreviated as "OS") is the program that, after being initially loaded into the computer by a boot program, manages all the other programs in a computer. 2. The other programs are called applications or application programs. 3. The application programs make use of the operating system by making requests for services through a defined application program interface (API). 4. In addition, users can interact directly with the operating system through a user interface such as a command language or a graphical user interface (GUI).

Operating systems

1. An operating system (sometimes abbreviated as "OS") is the program that, after being initially loaded into the computer by a boot program, manages all the other programs in a computer. The other programs are called applications or application programs. The application programs make use of the operating system by making requests for services through a defined application program interface (API). In addition, users can interact directly with the operating system through a user interface such as a command language or a graphical user interface (GUI). 2. Operating systems consists of a kernel as well as utilities. The kernel is the core of the OS, while utilities make the OS useful.

OS & Device: SDK/Handset Agreements

1. Android A. OS is free and open (unlike Apple iOS) B. But the OS isn't the whole story 2. Agreements with handset manufacturers A. Early access to new versions of Android B. Engineering and technical support C. Access to Google Play (app store and search) 3. Anti-fragmentation policy A. Reduces app portability problems B. Limits OS experimentation (e.g. search, navigation)

Case Studies

1. App stores 2. Carrier service agreements 3. Network-unfriendly applications 4. SDK and handset agreements 5. WiFi offloading

Some "Vertical" Players in Mobile Business

1. Apple: Devices (iPhone/iPad) and OS (iOS) 2. Google: OS (Android), Apps, and recently devices 3. Samsung: Top handset manufacturer; sells LTE equipment, handset equipment 4. Huawei: Mobile devices and network equipment

Authentication systems

1. Authentication: The process of identifying an individual, usually based on a username and password. 2. In security systems, authentication is distinct from authorization, which is the process of giving individuals access to system objects based on their identity. Authentication merely ensures that the individual is who he or she claims to be, but says nothing about the access rights of the individual. 3. Zittrain suggests that one way to reduce pressure on institutional and technological gatekeepers is to make direct responsibility more feasible. Forthcoming piecemeal solutions to problems such as spam take this approach. ISPs are working with makers of major PC e-mail applications to provide for forms of sender authentication.

"The Web is Dead. Long Live the Internet" by Chris Anderson and Michael Wolff

1. Authors Chris Anderson and Michael Wolff argued that the World Wide Web was "in decline" and "apps" were in ascendance. 2. They clearly forecast the rise of the mobile Web, but the debate they launched with the apps vs. Web formulation continues.

Bitnet

1. BITNET was an early world leader in network communications for the research and education communities, and helped lay the groundwork for the subsequent introduction of the Internet, especially outside the US. 2. BITNET was a "store-and-forward" network similar to the Usenet, and coincidentally invented at about the same time, in 1981, by Ira Fuchs and Greydon Freeman at the City University of New York (CUNY), and originally named for the phrase "Because It's There Net", later updated to "Because It's Time Net". 3. The network was designed to be inexpensive and efficient, and so was built as a tree structure with only one path from one computer to another, and like the early Usenet with low bandwidth telephone connections, typically at 9600 bps or about 960 characters a second. 4. The first BITNET connection was from CUNY to Yale University. By the early 1990s, BITNET was the most widely used research communications network in the world for email, mailing lists, file transfer, and real-time messaging. 5. One of the most popular elements of BITNET was its mailing lists on every subject under the sun, from butterfly biology to theoretical physics, usually filtered and approved by a human moderator, and supported by the LISTSERV software. 6. A second network called BITNET II was created in 1987, in an effort to provide a higher bandwidth network similar to the NSFNET. However, by 1996, it was clear that the Internet was providing a range of communication capabilities that fulfilled BITNET's roles, so CREN ended its support and the network slowly faded away.

Opinion #1: App Developers

1. Bad to single out one (popular) app A. May lead to blocking other lawful apps B. Requires upgrade to expensive plans C. Discourages investment in mobile apps 2. App-agnostic management is better A. Rate limit customers during peak hours B. Vary pricing based on congestion C. Treat traffic the same regardless of the application

Why is Net Neutrality Such a Prominent Issue?

1. Because of the stakes. 2. But also because of its centrality -- in terms of the issues and players -- to the information policy field. A. Cybersecurity: deep packet inspection is critical to CALEA rules & protection of networks and lies behind implementation of smart-pipe incentives B. Intellectual property: ISPs increasingly the focus of rights-holders (graduated response); deep packet inspection crucial, but may jeopardize safe-harbor status C. Privacy: Elimination of commercial neutrality would increase incentive for ISP information-gathering on subscriber behavior D. Digital divide: 2010 Berkman Report: best predictor of broadband penetration is whether country has common carriage requirements (fair access of competitors to last-mile networks).

GNU Project

1. Build a GPL-licensed "clone" of UNIX 2. NB: UNIX: Pronounced yoo-niks, a popular multi-user, multitasking operating system developed at Bell Labs in the early 1970s. Created by just a handful of programmers, UNIX was designed to be a small, flexible system used exclusively by programmers. 3. Operating systems consist of a kernel as well as utilities. The kernel is the core of the OS, while utilities make the OS useful. 4. Stallman was an amazing programmer, and the GNU utilities were good, but the GNU kernel never matured.

2 ways that content providers can speed delivery without running afoul of net neutrality rules

1. Build their own backbone 2. Build content delivery network (CDN): a large distributed system of servers deployed in multiple data centers across the Internet. The goal of a CDN is to serve content to end-users with high availability and high performance. Content providers such as media companies and e-commerce vendors pay CDN operators to deliver their content to their audience of end-users.

Implications of Mobile Technology

1. Cameras everywhere: changes journalism, civilian-police relations 2. Kids can talk to friends always 3. Breakdown of wall between business and personal computing

Implications of Being Able to Build Circuits

1. Can do arithmetic: because we take arithmetic and express it as logic 2. Binary (base 2) representation: in base 2, each digit is either 0 or 1. Thus 1011011 = 64 + 16 + 8 + 2 + 1 = 91. Addition and subtraction work the same way in binary as in decimal. 3. Can represent text: each letter represented by a number: 1 = a, 2 = b, 3 = c, etc. Want more for capitals, punctuation, other languages. In general, can represent many things in many formats. Almost anything that can be represented as a symbol can be represented as a number.
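
These claims are easy to check in Python (the a=1, b=2 letter scheme is the simplified one from this card; real systems use richer codes such as ASCII/Unicode):

```python
# Binary place values: 1011011 (base 2) = 64 + 16 + 8 + 2 + 1 = 91
assert int("1011011", 2) == 64 + 16 + 8 + 2 + 1 == 91

# Text as numbers, using the simple 1 = a, 2 = b, 3 = c scheme above
def encode(text):
    return [ord(c) - ord("a") + 1 for c in text]

assert encode("cab") == [3, 1, 2]

# Richer encodings add capitals, punctuation, other languages;
# e.g. in ASCII/Unicode "A" and "a" get distinct numbers:
assert ord("A") == 65 and ord("a") == 97
```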

Foresight is limited by insufficient vision

1. Castells: DOD offered to give AT&T Arpanet in 1972: They weren't interested. 2. Post Office had e-mail service and got rid of it in 1983

Circuit

1. Chips drive computers 2. Silicon has distinctive electrical properties 3. You can build switches by putting different impurities in silicon wafers 4. These wafers connected by small strings of wire are circuits

Collaborative filtering

1. Collaborative filtering (CF) is a technique used by some recommender systems. 2. Collaborative filtering has two senses, a narrow one and a more general one. In general, collaborative filtering is the process of filtering for information or patterns using techniques involving collaboration among multiple agents, viewpoints, data sources, etc. 3. In the newer, narrower sense, collaborative filtering is a method of making automatic predictions (filtering) about the interests of a user by collecting preferences or taste information from many users (collaborating).
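
The narrower sense can be sketched as user-based collaborative filtering in Python; the ratings data and function names below are invented for illustration:

```python
from math import sqrt

# Toy user-item ratings (hypothetical data)
ratings = {
    "ann": {"film1": 5, "film2": 4, "film3": 1},
    "bob": {"film1": 4, "film2": 5, "film3": 1, "film4": 5},
    "eve": {"film1": 1, "film2": 1, "film3": 5, "film4": 1},
}

def similarity(u, v):
    """Cosine similarity over the items both users rated."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    num = sum(ratings[u][i] * ratings[v][i] for i in common)
    den = (sqrt(sum(ratings[u][i] ** 2 for i in common))
           * sqrt(sum(ratings[v][i] ** 2 for i in common)))
    return num / den

def predict(user, item):
    """Predict a rating as the similarity-weighted average of others' ratings."""
    others = [(similarity(user, v), ratings[v][item])
              for v in ratings if v != user and item in ratings[v]]
    total = sum(s for s, _ in others)
    return sum(s * r for s, r in others) / total if total else None
```

Since ann's tastes resemble bob's far more than eve's, `predict("ann", "film4")` leans toward bob's rating of 5: preferences collected from many users drive the automatic prediction.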

New Online News Magazines Producing More Original Content

1. Combination of investors (Vice, .Mic, the Awl) and parent companies (Quartz, Verge) 2. Combination of paid staff (often former journalists at top papers) and "bloggers" or "volunteers" 3. Strong emphasis on click-metrics (page views, tweets, favorites) delivered to newsroom in real time 4. Tension over definition of "good journalism" as both serious and popular 5. Variation in resources allocated to reporting 6. Specialization (Vice: Hip, International; Quartz: Business; .Mic: Youth; The Awl: Culture)

Without Strong Net Neutrality Rules, What Future Could Look Like

1. Comcast Lowers Usage Caps: Netflix Shares Plummet 2. AG OKs Google/AT&T Merger: New firm promises to bring faster broadband to more Americans 3. Cablezon Offers Prime Customers Quick Access to Over 1000 Websites for $150 a Month: Offer Available with 3-year Cable, Voice, Kindle & Home Security Contract

"The Emerging Field of Internet Governance" by Laura DeNardis

1. DeNardis demonstrates how technical decisions about networks have important implications for the distribution of power and about social values 2. This paper has conveyed how Internet governance functions carry significant public interest implications and how these functions are diffusely distributed among new institutional forms, the private sector, and more traditional forms of governance. Network management via deep packet inspection raises privacy concerns; Internet protocol design makes decisions about accessibility, interoperability, economic competition, and individual freedoms; critical resource administration has implications for the future of the Internet's architecture as well as the pace of access and economic development in the global south; governments use technologies such as filtering and blocking for censorship and surveillance.

Deep packet inspection technologies

1. Deep packet inspection (DPI) is an advanced method of packet filtering that functions at the Application layer of the OSI (Open Systems Interconnection) reference model. The use of DPI makes it possible to find, identify, classify, reroute or block packets with specific data or code payloads that conventional packet filtering, which examines only packet headers, cannot detect. 2. Another example of the complex relationship between commercial and state interests •Backdoors in pipes mandated under powers given the Justice Department under the Communications Assistance to Law Enforcement Act (CALEA) of 1994 •But backdoors can be used by ISPs to manage traffic and gain commercial advantage •They also make ISPs vulnerable to demands that they serve as IP law enforcement agents •Deep Packet Inspection can be used for: analyzing data flows, responding to government information requests, network security, and facilitating services like VoIP that require special treatment (DPI and "packet sniffing") BUT! DPI also gives operators the opportunity to block content they don't like; block competing services; and gather data on customers.

Key Protocol: TCP/IP

1. Designed in the 1970s by Vint Cerf and Bob Kahn 2. IP = "Internet Protocol." A device that implements the Internet Protocol can connect to the Internet. IP provides a common language for networks to work together.

TCP/IP

1. Designed in the 1970s by Vint Cerf and Bob Kahn 2. IP = "Internet Protocol." A device that implements the Internet Protocol can connect to the Internet. IP provides a common language for networks to work together. 3. (Transmission Control Protocol/Internet Protocol) The most widely used communications protocol. Developed in the 1970s under contract from the U.S. Department of Defense, TCP/IP was invented by Vinton Cerf and Bob Kahn. This de facto Unix standard is the protocol of the Internet and the global standard for local area networks and wide area networks, the major exception being the traditional networks of the telephone companies. TCP/IP is commonly referred to as just "IP," which is the network layer of the protocol; thus, the terms "TCP/IP network" and "IP network" are synonymous. The TCP/IP suite provides two transport methods. TCP ensures that data arrive intact and complete, while UDP just transmits packets. TCP is used for data that must arrive in perfect form, and UDP is used for real-time applications such as voice over IP (VoIP) and video calling, where there is no time to retransmit erroneous or dropped packets. TCP/IP is a routable protocol, and the IP network layer in TCP/IP provides this capability. The header prefixed to an IP packet contains not only source and destination addresses of the host computers, but source and destination addresses of the networks they reside in. Data transmitted using TCP/IP can be sent to multiple networks within an organization or around the globe via the Internet, the world's largest TCP/IP network. Every node in a TCP/IP network requires an IP address (an "IP"), which is either permanently assigned or dynamically assigned (see IP address and DHCP).
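
TCP's reliable, connection-oriented transport can be sketched with Python's standard socket module; everything below runs over localhost (swapping SOCK_STREAM for SOCK_DGRAM would give UDP's fire-and-forget behavior instead):

```python
import socket
import threading

# Minimal TCP echo over localhost: TCP delivers the bytes intact and in
# order, which is why it suits e-mail and file transfer.
def echo_server(sock):
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)          # echo back exactly what arrived

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=echo_server, args=(server,))
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # SOCK_DGRAM = UDP
client.connect(("127.0.0.1", port))
client.sendall(b"hello, internet")
reply = client.recv(1024)
client.close()
t.join()
server.close()
```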

Things that have been Disintermediated

1. Destroyers (examples) A. eBay B. eTrade C. Google books/Amazon D. Cdbaby, Soundcloud E. Kayak.com F. Netflix G. Google, Wikipedia 2. Victims A. Antiques specialists B. Walk-in brokerages C. Libraries/ book stores D. Record companies E. Travel agencies F. Video stores G. Reference books

What is Different About Mobile?

1. Device is moving around 2. Device is small 3. Device is always on (a legacy of traditional phone and unlike laptop, which is often asleep) and usually networked 4. Battery-powered 5. Uses radio spectrum 6. Lots of sensors, especially the camera 7. In developed world, one per person 8. Also a phone

Six Degrees: An Example

1. DiMaggio to Homans to Mama Homans to H. Adams to J.Q. Adams to J. Adams to George Washington 2. And now you are all 7 degrees from George Washington, Benjamin Franklin & Louis XIV

The Dialectic of Freedom and Constraint

1. Dialectic: the tension between freedom and unfreedom is not just an opposition; you come out with a third thing that is better and transcends both 2. This is both the challenge and the hope

Solutions

1. Did it work? A. Diekmann et al.: >2/3 of buyers gave feedback B. Positive feedback increased chance of sale (for new phones & DVDs) and also increased the price commanded for the same item 2. Issues A. Reciprocal or one-way? B. Problems of vengeance, perfidy, and identity? C. How much information?

"Quality of Service"

1. Different applications have different needs. For example, Internet telephony is very sensitive to delay, but does not care about occasional packet loss. By contrast, e-mail is very sensitive to packet loss, but does not care about some delay. 2. So you could imagine a network that treats packets belonging to different applications differently, depending on their needs. 3. For example, a network could give low-delay service to Internet telephony packets, but best-efforts service to e-mail packets. 4. A network that offers different types of service to different data packets is a network that offers "Quality of Service."
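
The idea can be sketched as a priority queue in Python: packets tagged with an application class leave in class order rather than arrival order (the class assignments here are invented for illustration):

```python
import heapq

# Sketch of a QoS scheduler: VoIP packets get a higher priority class
# than e-mail packets, so they leave the queue first (lower number = first).
PRIORITY = {"voip": 0, "email": 1}   # illustrative class assignments

queue = []
arrivals = [("email", "msg1"), ("voip", "pkt1"),
            ("email", "msg2"), ("voip", "pkt2")]
for seq, (app, payload) in enumerate(arrivals):
    # seq breaks ties so same-class packets keep arrival (FIFO) order
    heapq.heappush(queue, (PRIORITY[app], seq, app, payload))

sent = [heapq.heappop(queue)[3] for _ in range(len(queue))]
# VoIP jumps ahead of e-mail: ['pkt1', 'pkt2', 'msg1', 'msg2']
```

A best-effort network is the degenerate case where every packet carries the same priority, so departure order just equals arrival order.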

Concentrated Interests Almost Always Trump Diffuse Interests

1. Diffuse vs. Diffuse: Inaction e.g. infotech vouchers 2. Diffuse vs. Concentrated: Concentrated wins e.g. most intellectual property issues 3. Concentrated vs. Diffuse: Concentrated wins e.g. Media Concentration (Comcast/NBC merger) 4. Concentrated vs. Concentrated: Gridlock or Compromise e.g. Net neutrality/Open Internet

Analog vs. digital

1. Digital describes electronic technology that generates, stores, and processes data in terms of two states: positive and non-positive. 2. Prior to digital technology, electronic transmission was limited to analog technology, which conveys data as electronic signals of varying frequency or amplitude that are added to carrier waves of a given frequency

Search engine functions (crawling, caching, indexing, ranking)

1. Distributed web crawling is a distributed computing technique whereby Internet search engines employ many computers to index the Internet via web crawling. 2. Caching: retaining a copy of a web page after indexing it 3. Indexing: The web is like an ever-growing public library with billions of books and no central filing system. Google essentially gathers the pages during the crawl process and then creates an index, so it knows exactly how to look things up. Much like the index in the back of a book, the Google index includes information about words and their locations. When you search, at the most basic level, Google's algorithms look up your search terms in the index to find the appropriate pages. 4. Ranking: Once a search engine has selected the relevant documents, it ranks the search results, that is, puts them in order. Ranking makes the search useful.
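
Indexing and ranking can be sketched with a toy inverted index in Python (the pages and the scoring rule are invented for illustration; real ranking uses far richer signals):

```python
from collections import defaultdict

# Toy corpus standing in for crawled/cached pages (contents hypothetical)
pages = {
    "page1": "net neutrality and the open internet",
    "page2": "the internet protocol suite",
    "page3": "net neutrality rules and the fcc",
}

# Indexing: map each word to the pages containing it, like a book's index
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

# Ranking (crude): order matching pages by how many query terms they contain
def search(query):
    terms = query.split()
    scores = {url: sum(url in index[t] for t in terms) for url in pages}
    hits = [(s, url) for url, s in scores.items() if s > 0]
    return [url for s, url in sorted(hits, key=lambda x: (-x[0], x[1]))]
```

Looking terms up in the precomputed index is what makes search fast: the engine never rescans the pages at query time.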

Richard Stallman

1. Dr. Richard Stallman launched the Free Software Movement in 1983 by announcing the plan to develop the GNU operating system, intended to be composed entirely of free software

Foresight is limited by both undue optimism and insufficient vision (1)

1. Early on, people overestimate the amount and rapidity of change. 2. Then, when it does not occur, they discount the possibility. 3. Then it happens very quickly when they least expect it.

"The Structure of the Web" by David Easley and Jon Kleinberg

1. Easley and Kleinberg analyze the web as a network 2. we consider a different type of network, in which the basic units being connected are pieces of information, and links join pieces of information that are related to each other in some fashion. We will call such a network an information network. As we will see, the World Wide Web is arguably the most prominent current example of such a network, and while the use of information networks has a long history, it was really the growth of the Web that brought such networks to wide public awareness. While there are basic differences between information networks and the kinds of social and economic networks that we've discussed earlier, many of the central ideas developed earlier in the book will turn out to be fundamental here as well: we'll be using the same basic ideas from graph theory, including short paths and giant components; formulating notions of power in terms of the underlying graph structure; and even drawing connections to matching markets when we consider some of the ways in which search companies on the Web have designed their businesses.

These trends are likely to be exacerbated by the rise of cloud computing and the substitution of app-centric wireless devices for web-centric PCs.

1. Eli Noam: More and more functions will move to a few powerful cloud providers, threatening interoperability and innovation. 2. A few companies will potentially have immense control and information over the Internet and how we use it, unless regulations protect competition. 3. Why? High fixed costs + low marginal costs + high network externalities = pressures toward industrial concentration

Control through Markets: Creating incentives

1. Eliminating sales tax for on-line retail activity. 2. Why did Amazon drop opposition to permitting states to tax online sales? Amazon can handle red tape more easily than the competition; thus the competition is hurt more by a sales tax than Amazon is.

"The Cathedral and the Bazaar" by Eric Raymond

1. Eric Raymond's shrewd mix of analysis and personal memoir 2. Anatomizes a successful open-source project, fetchmail, that was run as a deliberate test of some surprising theories about software engineering suggested by the history of Linux. Discusses these theories in terms of two fundamentally different development styles, the "cathedral" model of most of the commercial world versus the "bazaar" model of the Linux world. 3. Shows that these models derive from opposing assumptions about the nature of the software-debugging task. 4. Then makes a sustained argument from the Linux experience for the proposition that "Given enough eyeballs, all bugs are shallow", suggests productive analogies with other self-correcting systems of selfish agents, and concludes with some exploration of the implications of this insight for the future of software.

IP addresses

1. Every machine on a network has a unique identifier. Just as you would address a letter to send in the mail, computers use the unique identifier to send data to specific computers on a network. Most networks today, including all computers on the Internet, use the TCP/IP protocol as the standard for how to communicate on the network. 2. In the TCP/IP protocol, the unique identifier for a computer is called its IP address.

Honeywell video

1. Expectations of technology are socially embedded: you're going to be able to shop online, you can monitor your kids, the man will pay the bills for the woman of the house 2. Necessity of 3 screens unclear 3. The Honeywell video also gets its predictions about how families would be organized in the future wrong

Case-by-case regulation vs. bright-line standards

1. Felten: Declaring a vague standard rather than a bright-line rule can sometimes be good policy, especially where the facts on the ground are changing rapidly and it's hard to predict what kind of details might turn out to be important in a dispute. Still, by choosing a case-by-case approach, the FCC left us mostly in the dark about where it would draw the line between "reasonable" and "unreasonable". 2. van Schewick: A second set of proposals recognizes that some forms of differential treatment will be socially harmful, while others will be socially beneficial, but assumes that it is impossible to distinguish among them in advance. Therefore, these proposals suggest adopting standards that specify criteria that will be used to judge discrimination in the future. Whether certain discriminatory conduct meets these criteria would be determined by the regulatory agency in future case-by-case adjudications. 3. As the paper shows, both approaches are flawed. Banning all discrimination is overinclusive and restricts the evolution of the network more than necessary to protect the values that network neutrality rules are designed to protect. Allowing all discrimination is underinclusive and effectively makes the rule against blocking meaningless.

Incumbents challenged by competence-destroying technologies:

1. Fleeing from change: •Imagine the new technology through the lens of the old: AOL c. 1997 •Use monopoly power: Cable industry has tried to impose contracts that prevent providers from selling content to Internet providers •Use local monopoly power over home Internet service to degrade competition (illegal under net neutrality rules) •Use monopoly power to hike cable fees as high as possible 2. Adapting to change: •Use monopoly power to make ISP revenues compensate for lost cable revenues •Spin off content properties through Internet outlets •Collaborate with platforms like Facebook, Snapchat and YouTube to air content, with sharing of ad revenues

Data Mining

1. Generally, data mining (sometimes called data or knowledge discovery) is the process of analyzing data from different perspectives and summarizing it into useful information - information that can be used to increase revenue, cut costs, or both. 2. Data mining software is one of a number of analytical tools for analyzing data. It allows users to analyze data from many different dimensions or angles, categorize it, and summarize the relationships identified. Technically, data mining is the process of finding correlations or patterns among dozens of fields in large relational databases.

"Access to Broadband Networks: The Net Neutrality Debate" by Angele Gilroy

1. Gilroy provides a neutral overview of the issues as Congress faces them 2. As congressional policy makers continue to debate telecommunications reform, a major discussion point revolves around what approach should be taken to ensure unfettered access to the Internet. The move to place restrictions on the owners of the networks that compose and provide access to the Internet, to ensure equal access and non-discriminatory treatment, is referred to as "net neutrality." While there is no single accepted definition of "net neutrality," most agree that any such definition should include the general principles that owners of the networks that compose and provide access to the Internet should not control how consumers lawfully use that network, and they should not be able to discriminate against content provider access to that network.

The Big Intellectual Leap

1. Going from a special purpose to general-purpose computer 2. This leap is due to Alan Turing in 1935/1936. Turing is 22 in Cambridge, UK. He's teaching and he has the idea of a general-purpose computer. His advisers are excited and tell him to go work with Alonzo Church at Princeton. Turing at Princeton from 1936-1938. Turing's idea is to build a circuit that emulates circuits, a so-called "universal circuit."

Mobile Technology

1. Good example of path-dependence: way things are influenced by history 2. Mobile computing evolved out of telephone system 3. Smartphones evolved out of feature phones 4. There are implications from the fact that mobile computing developed from the phone industry 5. For one, development closely regulated by the FCC 6. Another thing to note: because of this path-dependency, country differences have to do with differing state regimes of regulating telephony e.g. regulated state-owned monopolies were a source of tax revenue that governments didn't want to lose 7. Another example of path-dependency is number portability. The FCC decreed that if you switch carriers, you can keep your number. This is still the case. 8. Implications: 1) mobile phone number is de facto identifier. It's the new social security number. 2) Disconnect between phone number and location. Before the FCC decree, one's phone number was closely tied to one's geographical location. When traveling, you would've had to let people know where you were going to be when and how to reach you.

Addressing

1. IP address is a 32-bit number, so roughly 4 billion possible addresses. There's a scarcity of IP addresses. IP addresses are allocated administratively and not always fairly. UCSD used to have more IP addresses than China. 2. Addresses allocated administratively e.g. Princeton allocated IP addresses. IP address can be independent of location. IP address is like a phone number: can be called on, no matter where you are.
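
The 32-bit structure is easy to see in Python: dotted notation is just four 8-bit bytes of one number (the example address below is arbitrary):

```python
# An IPv4 address is a 32-bit number; dotted notation groups it
# into four 8-bit bytes.
def ip_to_int(ip):
    a, b, c, d = (int(x) for x in ip.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

def int_to_ip(n):
    return ".".join(str((n >> shift) & 0xFF) for shift in (24, 16, 8, 0))

assert ip_to_int("128.112.0.1") == 0x80700001
assert int_to_ip(ip_to_int("128.112.0.1")) == "128.112.0.1"
assert 2 ** 32 == 4_294_967_296   # "roughly 4 billion possible addresses"
```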

Logic built from wires and switches

1. If Signal A enters from the left, and Signal B enters from the top, the gate is on if (A is on) and (B is on) 2. Output represents A AND B 3. This is an "and gate" 4. If you can build an AND gate, you can build an OR gate
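
The gates can be modeled as functions on 0/1 signals, and wiring them together already buys arithmetic; a sketch (the gate set and the half adder are chosen for illustration):

```python
# Gates as functions of 0/1 signals: the AND gate described above,
# plus OR and XOR built alongside it.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return (a | b) & ~(a & b) & 1

# Wiring gates together yields arithmetic: a half adder adds two bits.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)   # (sum bit, carry bit)

assert half_adder(1, 1) == (0, 1)   # 1 + 1 = 10 in binary
assert half_adder(1, 0) == (1, 0)
```

Chaining half adders (plus carry handling) gives multi-bit addition, which is the "arithmetic expressed as logic" idea from the Implications card.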

"Directed graph"

1. In a directed graph, the edges don't simply connect pairs of nodes in a symmetric way — they point from one node to another. 2. This is clearly true on the Web: just because you write a blog post and include a link to the Web page of a company or organization, there is no reason to believe that they will necessarily reciprocate and include a link back to the blog post.
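
A directed graph can be represented as adjacency sets in Python, making the asymmetry described above explicit (node names are invented):

```python
# A directed graph as adjacency sets: an edge points FROM a node TO another,
# and need not be reciprocated (a blog may link to a company, not vice versa).
links = {
    "blog": {"company"},
    "company": {"press"},
    "press": set(),
}

def has_edge(g, u, v):
    return v in g.get(u, set())

assert has_edge(links, "blog", "company")
assert not has_edge(links, "company", "blog")   # no link back: asymmetric
```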

Fair use

1. In its most general sense, a fair use is any copying of copyrighted material done for a limited and "transformative" purpose, such as to comment upon, criticize, or parody a copyrighted work. 2. Such uses can be done without permission from the copyright owner. In other words, fair use is a defense against a claim of copyright infringement. If your use qualifies as a fair use, then it would not be considered an illegal infringement.

CALEA (1994 Communications Assistance for Law Enforcement Act)

1. In response to concerns that emerging technologies such as digital and wireless communications were making it increasingly difficult for law enforcement agencies to execute authorized surveillance, Congress enacted CALEA on October 25, 1994. 2. CALEA requires a "telecommunications carrier," as defined by the CALEA statute, to ensure that equipment, facilities, or services that allow a customer or subscriber to "originate, terminate, or direct communications," enable law enforcement officials to conduct electronic surveillance pursuant to court order or other lawful authorization. 3. CALEA is intended to preserve the ability of law enforcement agencies to conduct electronic surveillance by requiring that telecommunications carriers and manufacturers of telecommunications equipment design and modify their equipment, facilities, and services to ensure that they have the necessary surveillance capabilities as communications network technologies evolve. 4. Deep-Packet Inspection: Another example of complex relationship between commercial and state interests •Backdoors in pipes mandated under powers given the Justice Department under the Communications Assistance to Law Enforcement Act (CALEA) of 1994 •But backdoors can be used by ISPs to manage traffic and gain commercial advantage •They also make ISPs vulnerable to demands that they serve as IP law enforcement agents

Middle mile

1. In the broadband Internet industry, the "middle mile" is the segment of a telecommunications network linking a network operator's core network to the local network plant, typically situated in the incumbent telco's central office that provides access to the local loop, or in the case of cable television operators, the local cable modem termination system. This includes both the backhaul network to the nearest aggregation point, and any other parts of the network needed to connect the aggregation point to the nearest point of presence on the operator's core network 2. You'll hear people say that debates over transit and peering have nothing to do with net neutrality, and in a sense, they are right: Net neutrality is a last-mile issue. But at the same time, these middle-mile deals affect the consumer internet experience, which is why there is a good argument that the back room deals make net neutrality regulations obsolete—and why people like Netflix's CEO are trying to define "strong net neutrality" to include peering decisions.

"Best efforts"

1. In the original Internet, the network provides a single best-effort service. 2. That is, the network does its best to deliver data packets, but does not provide any guarantees with respect to delay, bandwidth or losses. 3. Thus, the network operates like the default service offered by the postal service, which does not guarantee when a letter will arrive or whether it will arrive at all. Contrary to the postal service, which lets users choose services other than the default service like two-day shipping, the original Internet provides only best-effort service.

"Navigational links" vs. "transactional links"

1. In view of these considerations, it is useful to think of a coarse division of links on the Web into navigational and transactional, with the former serving the traditional hypertextual functions of the Web and the latter primarily existing to perform transactions on the computers hosting the content. 2. This is not a perfect or clear-cut distinction, since many links on the Web have both navigational and transactional functions, but it is a useful dichotomy to keep in mind when evaluating the function of the Web's pages and links.

The Universal Circuit

1. Inputs entering the circuit from the left: information specifying which circuit the universal circuit is to emulate and inputs to the circuit that the universal circuit is supposed to emulate 2. Output exiting from the right side of the universal circuit: what emulated circuit would do, given the provided input
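
Turing's idea can be sketched in Python: one fixed evaluator that takes a description of a circuit plus that circuit's inputs and produces what the described circuit would output (the description format here is invented for illustration):

```python
# ONE fixed "universal" evaluator: given (a description of a circuit,
# inputs to that circuit), it produces the described circuit's output.
def universal(circuit, inputs):
    wires = dict(inputs)                      # named input signals, 0/1
    ops = {"and": lambda a, b: a & b,
           "or":  lambda a, b: a | b,
           "not": lambda a, _: 1 - a}
    for out, gate, a, b in circuit:           # evaluate gates in listed order
        wires[out] = ops[gate](wires[a], wires.get(b, 0))
    return wires

# A description of an XOR circuit, fed to the same fixed evaluator:
xor = [("o1", "or", "x", "y"),
       ("a1", "and", "x", "y"),
       ("n1", "not", "a1", None),
       ("out", "and", "o1", "n1")]

assert universal(xor, {"x": 1, "y": 0})["out"] == 1
assert universal(xor, {"x": 1, "y": 1})["out"] == 0
```

The key point is that `universal` itself never changes: swapping in a different circuit description makes the same evaluator behave like a different machine, which is the general-purpose-computer leap.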

"Usage caps"

1. Internet data caps are monthly limits on the amount of data you can use over your Internet connection. 2. When an Internet user hits that limit, different network operators engage in different actions, including slowing down data speeds, charging overage fees, and even disconnecting a subscriber. These caps come into play when a user either uploads or downloads data. Caps are most restrictive for wireless Internet access, but wired Internet access providers are also imposing these caps.

Bandwidth caps

1. Internet data caps are monthly limits on the amount of data you can use over your Internet connection. 2. When an Internet user hits that limit, different network operators engage in different actions, including slowing down data speeds, charging overage fees, and even disconnecting a subscriber. These caps come into play when a user either uploads or downloads data. Caps are most restrictive for wireless Internet access, but wired Internet access providers are also imposing these caps.

Internet governance

1. Internet governance scholars, rather than studying Internet usage at the content level, examine what is at stake in the design, administration, and manipulation of the Internet's actual protocological and material architecture. 2. This architecture is not external to politics and culture but, rather, deeply embeds the values and policy decisions that ultimately structure how we access information, how innovation will proceed, and how we exercise individual freedom online. 3. "Governance" in the Internet governance context requires qualification because relevant actors are not only governments. Governance is usually understood as the efforts of nation states and traditional political structures to govern. Sovereign governments do perform certain Internet governance functions such as regulating computer fraud and abuse, performing antitrust oversight, and responding to Internet security threats. Sovereign governments also unfortunately use content filtering and blocking techniques for surveillance and censorship of citizens. Many other areas of Internet governance, such as Internet protocol design and coordination of critical Internet resources, have historically not been the exclusive purview of governments but of new transnational institutional forms and of private ordering.

Key Internet Concepts

1. Interoperability: devices can "talk to each other." Need agreement on how to say what you want and rules of passing information. 2. Protocol: a technical standard that allows interoperability. This is like the voicemail menu. Design of protocols allows different networks to work together

Tiering (re: open internet)

1. It was not until fairly recently, after the merger of two major telecommunications companies, AT&T and SBC Communications, that talk of "tiering" the internet came to light. 2. These companies proposed that there should be a high-speed "tier" to their networks where some services would be favored over others. Sites that choose to pay for this service will then get faster and more reliable service. The desire to stop tiering from taking place is the main reason that interest in net neutrality legislation has ballooned in the last few years.

Classic Network Effects: The Telephone

1. John F. Parkinson, the first person to get telephone service in Palo Alto, California 2. Parkinson was a builder, real estate speculator, and politician 3. Next: The newspaper editor and the doctors. 4. Then: The pharmacist - with a "pay" phone in a special room. 5. Every new subscriber added a little to the service's value.

Last mile

1. Last-mile technology is any telecommunications technology that carries signals from the broad telecommunication backbone along the relatively short distance (hence, the "last mile") to and from the home or business. 2. Or to put it another way: the infrastructure at the neighborhood level.

Latency sensitivity

1. Latency is the delay from input into a system to the desired outcome. 2. A latency-sensitive application, also called a "real-time" application, is one in which there is very little tolerance for any kind of network latency in the millisecond range; more typically, microseconds are the unit of measure. A common application in this arena is VoIP.

International Marketplace

1. Leadership in Cellular Development A. Europe for 2G (GSM) B. Asia for 3G (WCDMA) C. US for 4G (LTE) 2. Many leading companies based in US: some e.g. Huawei bigger outside US 3. Manufacturing mostly outside US: handsets and components 4. International agreement on standards 5. Business trends often start outside US: lower role of device subsidies, two-sided pricing

"Code 2.0" by Lawrence Lessig (Chapter 7)

1. Lessig presents a useful way to think about policy, to which we will return throughout the semester. 2. He argues that code, technology, law, and informal norms are alternative mechanisms for regulating on-line behavior and achieving policy objectives. 3. Much information technology policy, he argues, is a matter of deciding which mix of these mechanisms we wish to employ 4. That regulator could be a significant threat to a wide range of liberties, and we don't yet understand how best to control it. This regulator is what I call "code"—the instructions embedded in the software or hardware that makes cyberspace what it is. This code is the "built environment" of social life in cyberspace. It is its "architecture." And to see this new, salient threat, I believe we need a more general understanding of how regulation works—one that focuses on more than the single influence of any one force such as government, norms, or the market, and instead integrates these factors into a single account.

Cathedral vs. bazaar

1. Linux overturned much of what I thought I knew. I had been preaching the Unix gospel of small tools, rapid prototyping and evolutionary programming for years. But I also believed there was a certain critical complexity above which a more centralized, a priori approach was required. I believed that the most important software (operating systems and really large tools like the Emacs programming editor) needed to be built like cathedrals, carefully crafted by individual wizards or small bands of mages working in splendid isolation, with no beta to be released before its time. 2. Linus Torvalds's style of development - release early and often, delegate everything you can, be open to the point of promiscuity - came as a surprise. No quiet, reverent cathedral-building here - rather, the Linux community seemed to resemble a great babbling bazaar of differing agendas and approaches (aptly symbolized by the Linux archive sites, who'd take submissions from anyone) out of which a coherent and stable system could seemingly emerge only by a succession of miracles. The fact that this bazaar style seemed to work, and work well, came as a distinct shock.

Lossless vs. lossy

1. Lossless and lossy compression are terms that describe whether or not, in the compression of a file, all original data can be recovered when the file is uncompressed. With lossless compression, every single bit of data that was originally in the file remains after the file is uncompressed. All of the information is completely restored. This is generally the technique of choice for text or spreadsheet files, where losing words or financial data could pose a problem. The Graphics Interchange File (GIF) is an image format used on the Web that provides lossless compression. 2. On the other hand, lossy compression reduces a file by permanently eliminating certain information, especially redundant information. When the file is uncompressed, only a part of the original information is still there (although the user may not notice it). Lossy compression is generally used for video and sound, where a certain amount of information loss will not be detected by most users. The JPEG image file, commonly used for photographs and other complex still images on the Web, is an image that has lossy compression. Using JPEG compression, the creator can decide how much loss to introduce and make a trade-off between file size and image quality.
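The defining property of lossless compression — every original bit is recovered on decompression — can be demonstrated in a few lines. This sketch uses Python's standard zlib module as an example lossless codec (the card's examples are GIF for lossless and JPEG for lossy; zlib is substituted here only because it is readily scriptable):

```python
import zlib

# Lossless compression: every original byte is recovered exactly.
original = b"net neutrality " * 100          # highly redundant text
compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

assert restored == original                  # bit-for-bit identical
print(len(original), "->", len(compressed))  # redundancy shrinks the file
```

A lossy codec like JPEG offers no such round-trip guarantee: decompressing a JPEG yields an approximation of the original pixels, with the quality setting controlling the size/fidelity trade-off.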

Influences on Richard Stallman and the GNU Manifesto

1. MIT Artificial Intelligence Lab: This is the place where the early work on AI was done. The first programs that could take instructions were created there. In those days, computers were super expensive. The MIT lab was the first place where a critical mass of programmers existed and developed a culture. In the 1980s, the hacker community in which Stallman lived began to fragment. To prevent software from being used on their competitors' computers, most manufacturers stopped distributing source code and began using copyright and restrictive software licenses to limit or prohibit copying and redistribution. Such proprietary software had existed before, and it became apparent that it would become the norm. In 1980, Stallman and some other hackers at the AI lab were not given the source code of the software for the Xerox 9700 laser printer (code-named Dover), the industry's first. The hackers had modified the software on the other printers so that it electronically messaged a user when his job was printed, and messaged all logged-in users when a printer was jammed. Not being able to add this feature to the Dover printer was a major inconvenience, as the printer was on a different floor than all the users. This experience convinced Stallman of the ethical need for free software. At that time, it became clear that he wanted people to discard proprietary software. 2. UNIX/Bell Labs: If the MIT AI lab supplied the culture, Bell Labs, where basic research and applied computer science were done, supplied the technology (UNIX).

Malware

1. Malware is a category of malicious code that includes viruses, worms, and Trojan horses. 2. Destructive malware will utilize popular communication tools to spread, including worms sent through email and instant messages, Trojan horses dropped from web sites, and virus-infected files downloaded from peer-to-peer connections. Malware will also seek to exploit existing vulnerabilities on systems making their entry quiet and easy.

App & Carrier: Network-Unfriendly Apps

1. Misbehaving apps overload the network A. Chatty: wasting signaling resources B. Unfair: consuming excessive bandwidth C. Inefficient: poor caching wastes bandwidth 2. Challenging to address A. Large number of developers B. Naivete about app impact on the network 3. Aligned incentives A. Educate developers (e.g. AT&T ARO tool) B. Benefit users (e.g. less bandwidth and battery)

Apps & OS: App Stores

1. Mobile app distribution A. Balancing trust, functionality, convenience B. App review by platform provider C. Semi-sandboxed execution environment 2. Policies affecting openness A. Installation mechanisms (app store required) B. Screening policies (performance, security) C. Revenue-sharing agreements (e.g. 20-30%) D. App-store navigation (promotion, categories) 3. Longer-term: HTML5

"Reasonable network management"

1. More formally, to qualify as reasonable network management, the practice would have to further a legitimate network management purpose and be narrowly tailored to address that purpose. In the context of network neutrality rules, the term "network management" refers to technical measures whose purpose is "to maintain, protect, and ensure the efficient operation of a network." 2. Network management includes, e.g., managing congestion or protecting the security of a network.

MPEG

1. Moving Picture Experts Group (MPEG): A working group of ISO/IEC with the mission to develop standards for coded representation of digital audio and video and related data. 2. Since 1988 when it has been established, the group has produced standards that help the industry offer end users an ever more enjoyable digital media experience.

"Cell Internet Use 2013" by Maeve Duggan and Aaron Smith

1. Nearly two-thirds (63%) of cell phone owners now use their phone to go online, according to a new survey by the Pew Research Center's Internet & American Life Project. We call them "cell internet users" and define them as anyone who uses their cell phone to access the internet or use email. Because 91% of all Americans now own a cell phone, this means that 57% of all American adults are cell internet users. The proportion of cell owners who use their phone to go online has doubled since 2009. 2. Additionally, one third of these cell internet users (34%) mostly use their phone to access the internet, as opposed to other devices like a desktop, laptop, or tablet computer. We call these individuals "cell-mostly internet users," and they account for 21% of the total cell owner population. Young adults, non-whites, and those with relatively low income and education levels are particularly likely to be cell-mostly internet users.

Summation

1. Net neutrality is about neither free expression nor network management, but about a struggle between two sectors of the corporate community to define the architecture of an emerging industrial field. 2. The complexity of the debate reflects both its technical complexity and the interest of participants in masking the interests that underlie it. 3. The timing of the debate reflects the development and widespread implementation of deep-packet inspection technologies, themselves produced in response to the interest of the world's governments in controlling populations and defending their territories. 4. As a result, "net neutrality" has moved to the center of the Internet policy space, representing a unique point of intersection among such issues as intellectual property, digital inequality, privacy, and cybersecurity.

Network externalities

1. Network externality has been defined as a change in the benefit, or surplus, that an agent derives from a good when the number of other agents consuming the same kind of good changes. 2. A network has positive externalities if the value of that network increases as a function of the number of persons (or nodes of any kind) that it includes.
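One common formalization of this idea — not named in the card, so treat it as an illustrative assumption — is Metcalfe's law: a network's potential value scales with the number of possible pairwise connections among its members. A minimal sketch:

```python
def pairwise_links(n: int) -> int:
    """Number of distinct two-way connections among n nodes: n*(n-1)/2."""
    return n * (n - 1) // 2

# Each new subscriber adds value for every existing subscriber,
# so possible connections grow much faster than the subscriber count:
for n in (2, 10, 100):
    print(n, "nodes ->", pairwise_links(n), "possible connections")
```

This is why, in the telephone example above, every new subscriber "added a little to the service's value": the hundredth subscriber creates 99 new possible connections, while the second created only one.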

Conclusions (end of Rexford)

1. Network neutrality is a complex issue A. What is "openness"? B. What best enables "competition"? C. What is the best way to foster openness? 2. Issue goes far beyond service providers A. Applications, operating systems, devices B. Beyond the purview of the FCC 3. Going forward, need ways to encourage: transparency, education, and competition

Spectrum is very scarce

1. Officially, spectrum belongs to the public 2. Spectrum is licensed by the FCC 3. In recent decades, a move toward "ownership" of spectrum, as economists, starting in the 1950s, argued that it was silly to allocate spectrum administratively. 4. Exception: unlicensed spectrum, which was to be used for mundane tasks like opening garage doors. But then WiFi was invented; WiFi uses unlicensed spectrum and is now insanely profitable. Why spectrum is scarce: it is very difficult to reassign spectrum, and most spectrum was allocated in the 1950s to fit 1950s technology. For instance, TV was given far more spectrum than it needs today. Also, to be a mobile carrier, you need to buy spectrum.

Old v. New Equilibrium

1. Old Equilibrium: •High advances •High promotion costs •Performing (in stadiums) promoted records •Distribution expensive - unsold records returned •CD in black at 500,000 units •Artists care a lot about IP •Winner take all reward structure •Artists burn out from touring, sue labels 2. New Equilibrium: •Low or no advances •Promotion online, often viral •Performing (in clubs) is major revenue source for band; merch important too •Distribution virtually costless •Matador etc.: Profit at 25,000 units; CD sales at concerts; CD purchase c. donation •Artists don't care much about IP •More equal returns •Can artists afford to stay in the game?

Differing Perspectives

1. One person's mash-up is another person's (or corporation's) cybercrime. 2. One person's democracy is another person's (or state's) insurrection. 3. From many perspectives, the Internet is wracked by crime. E.g., the biggest piracy case in U.S. history got its first conviction: the Justice Department announced on Friday that it has won its first conviction in its long-running criminal complaint against Kim Dotcom and his Megaupload.com for providing pirated access to TV shows and movies. The MPAA has been trying to get state AGs to sue Google; Mississippi filed a lawsuit. And however hard people try to improve the security of the internet and internet devices, the iPhone 5s fingerprint sensor was hacked within 3 days of launch; hackers are always a step ahead in the arms race. Hackers have infiltrated over 100 banks in several countries, stealing millions of dollars in possibly the largest bank theft the world has seen. The Sony hack is the poster child for a new era of cyber attacks: what made the Sony breach unique is the combination of four common tactics into a single orchestrated campaign designed to bend a victim to the will of the attackers. Obama held a big cybersecurity summit on February 13 for both geopolitical and economic security. Security issues are getting worse.

PageRank

1. PageRank is what Google uses to determine the importance of a web page. It's one of many factors used to determine which pages appear in search results. 2. PageRank measures a web page's importance. Page and Brin's theory is that the most important pages on the Internet are the pages with the most links leading to them. PageRank thinks of links as votes, where a page linking to another page is casting a vote. 3. Now that people know the secrets to obtaining a higher PageRank, the data can be manipulated.
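The "links as votes" idea can be sketched as power iteration over a toy four-page web. The damping factor of 0.85 and the tiny link graph below are illustrative assumptions, not Google's actual parameters or data:

```python
# Toy PageRank by power iteration on a 4-page web.
# links[p] lists the pages that p links to.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
pages = list(links)
d, n = 0.85, len(pages)                      # damping factor, page count
rank = {p: 1.0 / n for p in pages}           # start with equal ranks

for _ in range(50):                          # iterate until ranks settle
    new = {}
    for p in pages:
        # A page's rank is the damped sum of "votes" from pages linking
        # to it, each vote weighted by the voter's own rank and diluted
        # across all of the voter's outgoing links.
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new[p] = (1 - d) / n + d * incoming
    rank = new

print(max(rank, key=rank.get))  # "C" gathers the most votes (3 inbound links)
```

Point 3 of the card follows directly from this mechanics: anyone who can manufacture inbound links (link farms, paid links) can inflate a page's vote count, which is why link-based rankings are gameable.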

Application-blindness

1. Part of van Schewick's Framework for Evaluating Network Neutrality Rules 2. The network is application-blind. An application-blind network is unable to distinguish among the applications on the network, and, as a result, is unable to make distinctions among data packets based on this information.

P2P networks

1. Peer-to-peer (P2P) is a decentralized communications model in which each party has the same capabilities and either party can initiate a communication session. 2. Unlike the client/server model, in which the client makes a service request and the server fulfills the request, the P2P network model allows each node to function as both a client and server.

Peering (incl. multihoming & secondary peering)

1. Peering: Where one internet operation connects directly to another, so that they can trade traffic. This could be a connection between an ISP such as Comcast and an internet backbone provider such as Level 3. But it could also be a direct connection between an ISP and a content provider such as Google. 2. Multihoming: regional ISPs have begun to connect to more than one backbone, a practice known as multihoming 3. Secondary peering: Regional ISPs that did not have sufficient volume to peer with the tier-1 backbones also began to find that they did have sufficient volume to peer with other regional ISPs, a practice known as secondary peering.

Private ordering

1. Private ordering: the coming together of non-governmental parties in voluntary arrangements - as a central institutional form of law making and law applying. 2. The goal of this report is to describe the way in which the self-organized and self-regulating structures that govern today's global Internet—including the arrangements that enable ISPs to connect their networks to each other—have evolved naturally, over a period of roughly 35 years, according to principles that are deeply embedded in the Internet architecture. These structures are self-organized and self-regulating not because the Internet is an anachronistic "untamed and lawless wild west" environment, but because years of experience have shown that self-management is the most effective and efficient way to preserve and extend the uniquely valuable properties of the Internet.

Diffuse and Concentrated Interests in (one part of) the Intellectual Property Domain

1. Pro: Intellectual Property. Diffuse: General interest in encouragement of creativity and fair return; public and artists Concentrated: Media conglomerates, Recording Industry, Book publishers, Music publishers, Software manufacturers 2. Pro: Information Commons Diffuse: Public interest in fair use, access to information and ability to use information in creative works. Concentrated: Electronics industry, File-sharing services, Google (mostly), Libraries, schools, Universities (to a point)

UNIX

1. Pronounced yoo-niks, a popular multi-user, multitasking operating system developed at Bell Labs in the early 1970s. 2. Created by just a handful of programmers, UNIX was designed to be a small, flexible system used exclusively by programmers. 3. UNIX was one of the first operating systems to be written in a high-level programming language, namely C. This meant that it could be installed on virtually any computer for which a C compiler existed. 4. This natural portability combined with its low price made it a popular choice among universities. (It was inexpensive because antitrust regulations prohibited Bell Labs from marketing it as a full-scale product.) 5. Bell Labs distributed the operating system in its source language form, so anyone who obtained a copy could modify and customize it for his own purposes. By the end of the 1970s, dozens of different versions of UNIX were running at various sites.

Product Rating Systems

1. Rating systems: Products, services and sellers A. Why would someone contribute? B. Can you believe them? e.g. Amazon: Reviews; and Reviews of Reviews (human collaborative filtering)

The Shift from Stallman to Raymond

1. Raymond rejects Stallman's moral argument 2. "Free software" becomes "open source" software 3. Open source is good business: this is Raymond's argument. Many economists scoffed at this, but there's been a reconsideration in light of the success of Linux and others.

Paradox: In real life & in lab people free ride less than economists expect

1. Real life: A. Voting B. Not embezzling C. Open source D. Volunteering E. Giving to charities 2. Laboratory A. Less free riding than expected B. In "trust game" A usually gives about half of $$ to B; and B usually gives a little more back. And it often doesn't take many altruists to create a public good, even if most people free ride.

Walled garden

1. Refers to a network or service that either restricts or makes it difficult for users to obtain applications or content from external sources. Cable TV and satellite TV are walled gardens, offering a finite number of channels and programs to their subscribers. When AOL was king of the Internet providers, it did an excellent job of keeping users on AOL-affiliated sites. 2. In 2007, Apple's iPhone was a walled garden with a basic set of applications. Soon after, Apple opened the iPhone, encouraging third-party developers to write apps as long as they were approved by Apple. Thus, the wall was broken, but not entirely, because Apple can disapprove any app that is submitted to its online store (see jailbreaking). Google Play (formerly Android Market) accepts all apps as long as they are not xxx rated.

Congress and Open Internet Rules

1. Republicans in Congress tried to obstruct the rules: A. House defunded FCC rule-making in this area (Senate killed bill). B. Republicans demanded delay for cost-benefit analysis. C. Rep. Issa investigated improper White House influence. (Didn't find it.) D. House held FCC broadband spectrum auction hostage. (The House blinked.) E. Efforts to limit or amend the rules will likely continue, with new legislation introduced in every session. 2. Some Democrats (Markey, Cantwell, Wyden) favor legislative enactment of Open Internet rules, also without success.

The Wikipedia Experiment: Does peer approval motivate contributions?

1. Restivo & Van de Rijt, 2012 A. Wikipedia contributors can award "barnstars" to other contributors they value B. But lots of strong contributors don't get them Working with Wikipedia, the researchers give barnstars to 100 people randomly selected from the most active 1 percent of Wikipedia contributors (out of 140,000+ who had made one edit in previous month). C. Compared them to randomly selected 100 who did not get the stars. 2. What did they find? A. Treatment group members contributed 60 percent more than control group in the ninety days after the experiment B. They were also significantly more likely to get additional stars from other users (to an extent that cannot be explained by greater productivity)

Technology and Industrial Transformation (see Paul Starr, Tim Wu, Russ Neuman, Eli Noam)

1. Roads, canals and railroads: The Postal Service and local newspapers 2. Telegraph and Radio: The emergence of separate, national, private information and communications networks 3. Teletext and digital switches: Communications + Computing = Compunications (and end of regulated monopoly/oligopoly) 4. The Internet + Moore's Law= Convergence of Communications, Information, Computing, & Media= ?? The net neutrality debate is about the structure, mode of governance, regulation, and distribution of rents in this new industrial field.

Moore's Law (explained)

1. Rule of thumb 2. No theory as to why this should be the case 3. Has held true since roughly 1960 4. Cost of some unit of circuitry, memory, etc. is cut in half every 18 months 5. Equivalently, capacity (this can mean anything) available at fixed cost doubles every 18 months 6. This is incredibly fast growth 7. Cost of 50 GB of storage: 1981: $15 million; 1990: $500,000; 2000: $500; 2012: $5; 2015: free at Box.com 8. This is the power of Moore's Law: businesses now give storage away as a loss-leader at a loss of 40-50 cents 9. There are cities full of engineers making the technology better 10. People in the industry assume it will hold true 11. Moore's Law gives you the assumption that you'll get a new phone in 2 years
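Taking the card's 18-month doubling rule at face value, the growth is simple arithmetic — capacity at fixed cost multiplies by 2 raised to the number of elapsed doubling periods:

```python
def capacity_multiple(years: float, doubling_months: float = 18.0) -> float:
    """Capacity available at fixed cost after `years` of Moore's-Law
    growth, doubling once every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

# One doubling period (1.5 years) exactly doubles capacity:
print(capacity_multiple(1.5))        # 2.0
# Ten years of 18-month doublings is roughly a 100x improvement:
print(round(capacity_multiple(10)))  # 102
```

This is consistent with the storage figures above: roughly a 30,000-fold price drop from 1981 ($15 million) to 2000 ($500) over 19 years works out to a doubling time in the neighborhood of 15-18 months.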

How the 2015 Open Internet Policy differs from the 2010 Policy

1. Rules apply to mobile carriers 2. Broadband Internet access service reclassified as telecommunications service under Title II of the 1934 Communications Act. 3. Rules now apply to traffic coming into ISPs: interfering at the edge is also illegal.

User & Carrier: Service Agreements

1. Service agreements and pricing plans A. Customers: clarity and flexibility B. Carriers: recoup costs and limit risk C. Unlimited, usage cap, usage-based pricing 2. Policies affecting openness A. Billing models (from unlimited to usage-based) B. Device locking (and the role of device subsidies) C. Restrictions on tethering D. Application restrictions (e.g. FaceTime) E. Zero-rating ("toll-free") trend outside the US

ISP

1. Short for Internet Service Provider, it refers to a company that provides Internet services, including personal and business access to the Internet. 2. For a monthly fee, the service provider usually provides a software package, username, password and access phone number. 3. Equipped with a modem, you can then log on to the Internet and browse the World Wide Web and USENET, and send and receive e-mail. For broadband access you typically receive the broadband modem hardware or pay a monthly fee for this equipment that is added to your ISP account billing. 4. In addition to serving individuals, ISPs also serve large companies, providing a direct connection from the company's networks to the Internet. ISPs themselves are connected to one another through Network Access Points (NAPs). ISPs may also be called IAPs (Internet Access Providers).

SSO technologies ["wallets" and "identity layer" of Internet])

1. Single sign-on (SSO) is a session/user authentication process that permits a user to enter one name and password in order to access multiple applications. 2. An online wallet is a program or web service that allows users to store and control their online shopping information, like logins, passwords, shipping address and credit card details, in one central place. It also provides a convenient and technologically quick method for consumers to purchase products from any person or store across the globe. Greater regulation has entailed a movement from anonymity and freedom to identity and accountability. e.g. Lessig's "Identity Layer" - Protocols that Authenticate Users (1996-2013) 1. Permanent IP addresses and caching of temporary addresses to tie users to sessions. 2. IP mapping software to enable states to locate activity in space (essential for identifying violation of state or national laws; useful for retail) 3. Site-specific security requiring public/private key encryption for sensitive websites 4. Permanent identities for particular domains (e.g., what you do at www.princeton.edu) 5. Stable identity within most of what you do online (Google, Facebook)

Small Number of Big Players

1. Smartphone vendor shipments: Apple (38%), Samsung (29%), LG (10%) 2. Smartphone OS market share: Google Android (56%), Apple iOS (38%) 3. Mobile provider market share: Verizon (34%), AT&T (30%), Sprint (16%), T-Mobile (12%) 4. Radio access equipment vendors: Ericsson (50%), Alcatel-Lucent (36%), Nokia-Siemens (10%) 5. Applications developers: Many, diverse, most make <$500/month, but a small fraction are very successful

How Software Is Made

1. Source code: textual, written by people (professionals). This is akin to the blueprint for a building. 2. After "compiling" or "building," the source code becomes machine code, which is binary and can run on the machine. 3. Open Source is a way of working with code. Open Source is contested, and it's contested whether we should even call it Open Source.

Stored program computer

1. Storage of instructions in computer memory to enable it to perform a variety of tasks in sequence or intermittently. 2. The idea was introduced in the late 1940s by John von Neumann, who proposed that a program be electronically stored in binary-number format in a memory device so that instructions could be modified by the computer as determined by intermediate computational results. 3. Other engineers, notably John W. Mauchly and J. Presper Eckert, contributed to this idea, which enabled digital computers to become much more flexible and powerful. 4. Nevertheless, engineers in England built the first stored-program computer, the Manchester Mark I, shortly before the Americans built EDVAC, both operational in 1949.

"Constitutive decisions" and "path dependence"

1. Technologies are "path dependent" 2. I.e. Early "constitutive decisions" (Starr) may shape later development in unanticipated ways

Technology does not develop along a single path - there are many false starts (2)

1. Technologies are "path dependent" 2. I.e. Early "constitutive decisions" (Starr) may shape later development in unanticipated ways

"Fact Sheet: Chairman Wheeler Proposes New Rules for Protecting the Open Internet" by Federal Communications Commission

1. The 2015 Fact Sheet summarizes the as-yet-unpublished proposed rules that the FCC will soon release. 2. Chairman Wheeler is proposing clear, sustainable, enforceable rules to preserve and protect the open Internet as a place for innovation and free expression. His common-sense proposal would replace, strengthen and supplement FCC rules struck down by the U.S. Court of Appeals for the District of Columbia Circuit more than one year ago. The draft Order supports these new rules with a firm legal foundation built to withstand future challenges. The Chairman's comprehensive proposal will be voted on at the FCC's February 26 open meeting.

General-purpose computer

1. The Big Intellectual Leap: going from a special-purpose to a general-purpose computer 2. This leap is due to Alan Turing in 1935/1936. Turing is 22, at Cambridge, UK. He's teaching, and he has the idea of a general-purpose computer. His advisers are excited and tell him to go work with Alonzo Church at Princeton. Turing is at Princeton from 1936-1938. Turing's idea is to build a circuit that emulates circuits, a so-called "universal circuit."

The Universal Circuit: 1. Inputs entering the circuit from the left: information specifying which circuit the universal circuit is to emulate, and inputs to the circuit that the universal circuit is supposed to emulate 2. Output exiting from the right side of the universal circuit: what the emulated circuit would do, given the provided input

Turing's 1936 paper showed: 1. A universal circuit is possible 2. How to build it 3. The universal circuit is much simpler than you expect: if you couple the universal circuit with memory, it can be simpler than the emulated circuit

Universal Circuit, Software, and Programming: 1. The big advantage of coupling the universal circuit with memory is that you don't need to design a circuit for every purpose; you just change the inputs 2. This is the idea of software and programming: you can decide later what you want it to do

Schema of Universal Circuit with Memory: 1. The universal circuit produces outputs and data. The data enters the memory. The memory produces data, which, along with inputs and code, enters the universal circuit. The code and data come from the memory; the inputs do not. The universal circuit is the computer. 2. Turing proved the universality of this: any logical computation that can be done can be done by this device. 3. Over decades, people figured out how to build this at low cost. The light switch in Robertson 002 probably has a general-purpose computer in it. Universal circuits are all over the place, e.g. a car has 20-30 computers in it. 4. Real computers, e.g. laptop, mobile phone (the computer has code that allows it to make calls) 5. Circuitry has many billions of switches, usually highly miniaturized; memory stores many billions of bits. 6. Recently, batteries have improved. Sensors like microphone, camera, accelerometer (the phone knows which way is up), and barometer are all embedded in phones.
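The universal-circuit idea — one fixed machine that, fed a description of another circuit plus that circuit's inputs, reproduces the emulated circuit's output — can be sketched as a tiny interpreter. The gate set and encoding here are an illustrative assumption, not Turing's actual construction:

```python
# A minimal "universal circuit": one fixed evaluator that emulates any
# circuit described as data. Each gate is (output_wire, op, input_wires).
def evaluate(circuit, inputs):
    wires = dict(inputs)                      # wire name -> 0/1
    ops = {"AND": lambda a, b: a & b,
           "OR":  lambda a, b: a | b,
           "NOT": lambda a, _: 1 - a}
    for name, op, args in circuit:            # gates listed in dependency order
        a = wires[args[0]]
        b = wires[args[1]] if len(args) > 1 else 0
        wires[name] = ops[op](a, b)
    return wires

# Describe an XOR circuit as data, then let the fixed evaluator emulate it.
# The circuit description plays the role of "code" held in memory.
xor = [("n1", "NOT", ["x"]), ("n2", "NOT", ["y"]),
       ("a1", "AND", ["x", "n2"]), ("a2", "AND", ["n1", "y"]),
       ("out", "OR", ["a1", "a2"])]
for x in (0, 1):
    for y in (0, 1):
        print(x, y, "->", evaluate(xor, {"x": x, "y": y})["out"])
```

Note that `evaluate` itself never changes: to compute something different, you change only the circuit description and the inputs. That is precisely the software-and-programming point above.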

The Clipper Chip and the stakes of encryption

1. The Clipper Chip is a cryptographic device purportedly intended to protect private communications while at the same time permitting government agents to obtain the "keys" upon presentation of what has been vaguely characterized as "legal authorization." The "keys" are held by two government "escrow agents" and would enable the government to access the encrypted private communication. While Clipper would be used to encrypt voice transmissions, a similar chip known as Capstone would be used to encrypt data. 2. The underlying cryptographic algorithm, known as Skipjack, was developed by the National Security Agency (NSA), a super-secret military intelligence agency responsible for intercepting foreign government communications and breaking the codes that protect such transmissions. The agency has played a leading role in the Clipper initiative and other civilian security proposals, such as the Digital Signature Standard. NSA has classified the Skipjack algorithm on national security grounds, thus precluding independent evaluation of the system's strength. 3. Cryptography contributes to commercial, political, and personal life in a surprising number of ways. Now that modern cryptographic techniques have put strong, perhaps uncrackable, cryptography within the reach of anyone with a computer or even a telephone, the use of strong cryptography is likely to increase further. 4. As a result, worried law enforcement and intelligence agencies have developed the Clipper Chip in order to retain their capability to eavesdrop on private electronic communications.

TLD

1. The Internet's domain-name system (DNS) allows users to refer to web sites and other resources using easier-to-remember domain names (such as "www.icann.org") rather than the all-numeric IP addresses (such as "192.0.34.65") assigned to each computer on the Internet. Each domain name is made up of a series of character strings (called "labels") separated by dots. The right-most label in a domain name is referred to as its "top-level domain" (TLD). 2. The DNS forms a tree-like hierarchy. Each TLD includes many second-level domains (such as "icann" in "www.icann.org"); each second-level domain can include a number of third-level domains ("www" in "www.icann.org"), and so on. 3. The responsibility for operating each TLD (including maintaining a registry of the second-level domains within the TLD) is delegated to a particular organization. These organizations are referred to as "registry operators", "sponsors", or simply "delegees."
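The label structure described above is easy to see in code: split a domain name on dots, and the right-most label is the TLD. A minimal sketch:

```python
# Split a domain name into its DNS labels; per the DNS hierarchy,
# the right-most label is the top-level domain (TLD).
def labels(domain):
    return domain.strip(".").split(".")

parts = labels("www.icann.org")
print("TLD:", parts[-1])            # -> org
print("second-level:", parts[-2])   # -> icann
print("third-level:", parts[-3])    # -> www
```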

JPEG

1. The Joint Photographic Experts Group (JPEG) committee has a long tradition in the creation of still-image coding standards. 2. JPEG is a joint working group of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). 3. It is a group of experts that develops and maintains standards for a suite of compression algorithms for computer image files.

Backbone

1. The National Science Foundation (NSF) created the first high-speed backbone in 1987. Called NSFNET, it was a T1 line that connected 170 smaller networks together. IBM, MCI, and Merit worked with NSF to create the backbone, which was later upgraded to T3 (45 Mbps) speed. 2. Backbones are typically fiber-optic trunk lines. 3. Today many companies operate their own high-capacity backbones, and all of them interconnect at various NAPs (network access points) around the world. In this way, everyone on the Internet, no matter where they are and what company they use, is able to talk to everyone else on the planet. The entire Internet is a gigantic, sprawling agreement between companies to intercommunicate freely. 4. What happens before the last mile? Before Internet traffic gets to your house, it goes through your ISP, which might be a local or regional network (a tier 2 ISP) or an ISP with its own large-scale national or global network (a tier 1 ISP). There are also companies that are just large-scale networks, called backbones, which connect with other large businesses but don't serve retail customers.

"America's First Information Revolution" by Paul Starr

1. The Starr reading focuses on the communications policy traditions in the U.S. 2. Here were some of the innovations that made up the first American revolution in information and communications: The United States established free speech as a constitutional principle, and the Constitution itself was written and published so that ordinary citizens could read it. It created a comprehensive postal network and assured postal privacy. It introduced a periodic census, published the aggregate results, and assured individuals anonymity. Primarily through local efforts, it extended primary schooling earlier to more of its population, including women. Protestantism and the market economy, however, cannot explain why the United States moved ahead more quickly in communications than did the Protestant, commercial countries of Europe. The transformation of postal service, news, education, and the census followed in the wake of the American Revolution. Some of the key changes, such as the postal system, the census, and public education, critically involved law and policy and doubtless helped to make the United States more powerful. But the federal government promoted communications in part by desisting from the use of power: it conducted no surveillance of mail, refrained from using the census to maintain information about individuals, and helped to finance and stimulate the development of common schools at the local level, but did not control what the schools taught. The government promoted communications by making credible commitments not to control their content.

Application layer

1. The application layer is a layer in the Open Systems Interconnection (OSI) seven-layer model and in the TCP/IP protocol suite. It consists of protocols that focus on process-to-process communication across an IP network and provides a firm communication interface and end-user services. 2. The application layer is the seventh layer of the OSI model and the only one that directly interacts with the end user. Its major network device or component is the gateway.
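An application-layer protocol is, concretely, just an agreed message format between end processes. As an illustration, here is a hand-built HTTP/1.1 GET request (the host name is a placeholder), showing what an application-layer message actually looks like before lower layers carry it:

```python
# Build a minimal HTTP/1.1 request by hand -- HTTP is an
# application-layer protocol, so this text is the entire "message"
# that the lower layers treat as opaque payload.
def http_get(host, path="/"):
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

print(http_get("www.example.com"))
```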

Von Neumann architecture

1. The basic concept behind the von Neumann architecture is the ability to store program instructions in memory along with the data on which those instructions operate. 2. Until von Neumann proposed this possibility, each computing machine was designed and built for a single predetermined purpose. All programming of the machine required the manual rewiring of circuits, a tedious and error-prone process. If mistakes were made, they were difficult to detect and hard to correct. 3. Von Neumann architecture is composed of three distinct components (or sub-systems): a central processing unit (CPU), memory, and input/output (I/O) interfaces.
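The stored-program idea can be sketched as a toy machine whose instructions and data share one memory, interpreted by a fetch-decode-execute loop. The instruction set below is invented for illustration, not any real ISA:

```python
# Toy von Neumann machine: program and data live in the SAME memory,
# and the CPU is just a fetch-decode-execute loop over that memory.
def run(memory):
    acc, pc = 0, 0          # accumulator and program counter
    while True:
        op, *args = memory[pc]      # fetch + decode
        pc += 1
        if op == "LOAD":
            acc = memory[args[0]]   # read a data cell
        elif op == "ADD":
            acc += memory[args[0]]
        elif op == "STORE":
            memory[args[0]] = acc   # write a data cell
        elif op == "HALT":
            return memory

# Program occupies cells 0-3; its data lives in cells 4-6 of the
# same memory -- change the contents and the machine does a new job.
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT",), 2, 3, 0]
print(run(mem)[6])  # -> 5
```

Reprogramming means rewriting memory cells rather than rewiring circuits, which is exactly the advance the card credits to von Neumann.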

"The Mobile Revolution" by Lee Rainie and Barry Wellman

1. The chapter recounts the development of mobile phones into affordable, easily portable, multi-functional devices, from talking to texts to continuous Internet access. 2. One interesting survey finds that African Americans and Latinos are less likely than whites to be wired Internet users but more likely to access the Internet through their cell phones.

"Code 2.0" by Lawrence Lessig (Chapter 4-5)

1. The chapters in Lawrence Lessig's classic Code 2.0—first published in 1999 as Code and Other Laws of Cyberspace and then in a second edition in 2006 as Code 2.0—describe some paradigmatic online dilemmas. 2. Ch. 4: Lessig describes the changes that could—and are—pushing the Net from the unregulable space it was to the perfectly regulable space it could be. These changes are not being architected by government. They are instead being demanded by users and deployed by commerce. They are not the product of some 1984-inspired conspiracy; they are the consequence of changes made for purely pragmatic, commercial ends. 3. Ch. 5: Commerce has done its part—for commerce and, indirectly, for governments. Technologies that make commerce more efficient are also technologies that make regulation simpler. The one supports the other. There are a host of technologies now that make it easier to know who someone is on the Net, what they're doing, and where they're doing it. These technologies were built to make business work better. They make life on the Internet safer. But the by-product of these technologies is to make the Net more regulable—more regulable, though not perfectly regulable.

Verizon v FCC

1. The court upheld the FCC's authority to regulate broadband Internet access providers and upheld the disclosure requirements of the Open Internet Order, but struck down the Order's specific anti-blocking and nondiscrimination rules. 2. Verizon took the FCC before the DC Circuit Court of Appeals in Washington, DC, claiming that the FCC had no authority to regulate its Internet practices: under the First Amendment (free speech), and under the Fifth Amendment (a permanent easement on its system, i.e. the prohibition of pay-for-priority, represents an "illegal taking"). 3. Relevant precedents cut both ways: Comcast v. FCC (2010, DC Circuit Court of Appeals) voided an FCC judgment against Comcast's degradation of BitTorrent traffic (mooted by the terms of the Comcast/NBC merger agreement); City of Arlington v. FCC (2013, Supreme Court) affirmed permissive standards for FCC use of congressional authority in rule-making (in a case involving regulation of municipal licensing of wireless facilities). 4. The decision against the FCC in January 2014 will decisively shape the future of the FCC's role in regulating the Internet.

Comcast V. FCC

1. The court, in an April 6, 2010, decision, ruled (3-0) that the FCC did not have the authority to regulate an Internet service provider's (in this case Comcast's) network management practices and vacated the FCC's order. 2. Comcast v. FCC (2010, DC Circuit Court of Appeals) voided an FCC judgment against Comcast's degradation of BitTorrent traffic (mooted by the terms of the Comcast/NBC merger agreement).

Hypertext and why it matters that the web has a "hypertextual structure"

1. The decision to use this network metaphor also didn't arise out of thin air; it's an application of a computer-assisted style of authoring known as hypertext that had been explored and refined since the middle of the twentieth century. The motivating idea behind hypertext is to replace the traditional linear structure of text with a network structure, in which any portion of the text can link directly to any other part — in this way, logical relationships within the text that are traditionally implicit become first-class objects, foregrounded by the use of explicit links. In its early years, hypertext was a cause passionately advocated by a relatively small group of technologists; the Web subsequently brought hypertext to a global audience, at a scale that no one could have anticipated. 2. Why it matters that the web has a "hypertextual structure": the use of a network structure truly brings forth the globalizing power of the Web by allowing anyone authoring a Web page to highlight a relationship with any other existing page, anywhere in the world

DNS

1. The domain name system (DNS) is the way that Internet domain names are located and translated into Internet Protocol addresses. 2. The Internet Corporation for Assigned Names and Numbers (ICANN) coordinates the Internet Assigned Numbers Authority (IANA) functions, which are key technical services critical to the continued operations of the Internet's underlying address book, the Domain Name System (DNS).
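The name-to-address translation can be sketched as a walk down a toy zone tree (real resolvers query root, TLD, and authoritative servers in turn; the zone data here is made up for illustration):

```python
# Toy hierarchical DNS lookup: walk the name right-to-left
# (TLD first), descending one level of the tree per label.
zones = {
    "org": {"icann": {"www": "192.0.34.65"}},
}

def resolve(domain):
    node = zones
    for label in reversed(domain.split(".")):
        node = node[label]          # descend: TLD, then 2nd level, ...
    return node

print(resolve("www.icann.org"))  # -> 192.0.34.65
```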

DNS system

1. The domain name system (DNS) is the way that Internet domain names are located and translated into Internet Protocol addresses. 2. The Internet Corporation for Assigned Names and Numbers (ICANN) coordinates the Internet Assigned Numbers Authority (IANA) functions, which are key technical services critical to the continued operations of the Internet's underlying address book, the Domain Name System (DNS).

"Peering" vs. "transit pricing"

1. The early Internet was also characterized by relatively simple business relationships. End users typically purchased Internet access through some form of all-you-can-eat pricing, which allowed them to consume as much bandwidth as they would like for a single flat rate. Relationships between network providers typically fell into two categories. 2. Tier-1 ISPs entered into peering relationships with one another, in which they exchanged traffic on a settlement-free basis and no money changed hands. The primary justification for foregoing payment is transaction costs. Although the backbones could meter and bill each other for the traffic they exchanged, they could avoid the cost of doing so without suffering any economic harm so long as the traffic they exchanged was roughly symmetrical. Such arrangements would not be economical when the traffic being exchanged by the two networks was severely imbalanced. Thus tier-1 ISPs will not peer with other networks that are unable to maintain a minimum level of traffic volume. In addition, peering partners typically require that inbound and outbound traffic not exceed a certain ratio. 3. Networks that cannot meet these requirements must enter into transit arrangements in which they pay the backbone to provide connectivity to the rest of the Internet.
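The peering test described above can be sketched as a simple check: settlement-free peering only when traffic volume is high enough and roughly symmetric. The ratio and volume thresholds below are hypothetical; real peering policies vary by network.

```python
# Sketch of a peering-eligibility check.  The 2.0 ratio cap and the
# 10 Gbps minimum are illustrative placeholders, not real policy.
def peering_eligible(inbound_gbps, outbound_gbps,
                     max_ratio=2.0, min_volume=10.0):
    if inbound_gbps + outbound_gbps < min_volume:
        return False                # too small: must buy transit
    hi = max(inbound_gbps, outbound_gbps)
    lo = min(inbound_gbps, outbound_gbps)
    return hi / lo <= max_ratio     # roughly symmetric: peer for free

print(peering_eligible(40, 35))  # True  -> settlement-free peering
print(peering_eligible(40, 5))   # False -> pay for transit
```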

End-to-end principle

1. The end-to-end principle is one of the underlying system principles of the Internet, which states that network features should be implemented as close to the end points of the network -- the applications -- as possible. 2. This is commonly expressed by describing the system as a "dumb" network with "smart" terminals. Lawrence Lessig and Robert W. McChesney argue that the end-to-end principle is what has made the Internet such a success, since "All of the intelligence is held by producers and users, not the networks that connect them." 3. One of the most basic arguments in favor of Net Neutrality is that it is needed in order to preserve what is known as the "end-to-end principle."

Generativity vs. security (Zittrain)

1. The generative capacity for unrelated and unaccredited audiences to build and distribute code and content through the Internet to its tens of millions of attached personal computers has ignited growth and innovation in information technology and has facilitated new creative endeavors. 2. It has also given rise to regulatory and entrepreneurial backlashes. A further backlash among consumers is developing in response to security threats that exploit the openness of the Internet and of PCs to third-party contribution. 3. A shift in consumer priorities from generativity to stability will compel undesirable responses from regulators and markets and, if unaddressed, could prove decisive in closing today's open computing environments. 4. Zittrain explains why PC openness is as important as network openness, as well as why today's open network might give rise to unduly closed endpoints. 5. He argues that the Internet is better conceptualized as a generative grid that includes both PCs and networks rather than as an open network indifferent to the configuration of its endpoints. 6. Applying this framework, he explores ways - some of them bound to be unpopular among advocates of an open Internet represented by uncompromising end-to-end neutrality - in which the Internet can be made to satisfy genuine and pressing security concerns while retaining the most important generative aspects of today's networked technology.

State regulation of municipal wireless efforts

1. The lack of widely available, affordable broadband Internet access has spurred a movement in which municipalities are rolling out wireless broadband networks. 2. As cities use wireless broadband technology to enhance services to citizens, the growth of municipal wireless deployments has transitioned from linear to exponential. 3. In response, many states have passed laws to regulate and restrict cities' ability to own, operate, deploy, or profit from either telecommunications or information services.

Layering principle

1. The layering principle, as applied to networking, prescribes that a lower-layer protocol may not make any assumptions about the content or meaning of the message (or, more technically, protocol data unit) passed to it by a higher-layer protocol for delivery to its higher-layer protocol peer 2. The original architecture of the Internet was based on the layering principle and on the broad version of the end-to-end arguments.
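The principle can be shown in miniature: the "lower layer" below wraps the higher layer's message in a header and unwraps it on receipt, without ever inspecting or parsing the payload. The header format is invented for illustration:

```python
# Layering in miniature: the lower layer adds/removes its own header
# and treats the higher layer's message as opaque bytes throughout.
def lower_layer_send(payload: bytes, dest: int) -> bytes:
    # 2-byte destination + 2-byte length, then the untouched payload.
    header = dest.to_bytes(2, "big") + len(payload).to_bytes(2, "big")
    return header + payload         # no assumptions about the payload

def lower_layer_recv(frame: bytes) -> bytes:
    length = int.from_bytes(frame[2:4], "big")
    return frame[4:4 + length]      # hand the payload up unchanged

msg = b"any higher-layer message at all"
assert lower_layer_recv(lower_layer_send(msg, dest=7)) == msg
```

Because the payload is never interpreted, the higher-layer protocol can change without any change to the layer below, which is the point of the principle.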

John Perry Barlow Protocol

1. The leap to speaking about the decentralized routing protocols clearly represents the shared moral and technical order of geeks, derived in this case from the specific details of the Internet. 2. In the early 1990s, this version of the technical order of the Internet was part of a vibrant libertarian dogma asserting that the Internet simply could not be governed by any land-based sovereign and that it was fundamentally a place of liberty and freedom. This was the central message of John Perry Barlow.

Yoo: How Internet's "middle mile" has changed

1. The network no longer adheres to the rigid and uniform hierarchy that characterized the early Internet and its predecessor, the NSFNET. 2. Packets can now travel along radically different paths based on the topology of the portion of the network through which they travel. This is the inevitable result of reducing costs and experimenting with new structures. At the same time that network providers are experimenting with new topologies, they are also experimenting with new business relationships. 3. Gone are the days when networks interconnected through peering and transit and imposed all-you-can-eat pricing on all end users. 4. That fairly simple and uniform set of contractual arrangements has been replaced by a much more complex set of business relationships that reflect creative solutions to an increasingly complex set of economic problems. 5. Again, these differences mean that the service that any particular packet receives and the amount that it pays will vary with the business relationships between the networks through which it travels. 6. Although many observers reflexively view such deviations from the status quo with suspicion, in many (if not most) cases they represent nothing more than the natural evolution of a network trying to respond to an ever-growing diversity of customer demands. Imposing regulation that would thwart such developments threatens to increase costs and discourage investment in ways that ultimately work to the detriment of the consumers such regulation is ostensibly designed to protect.

Internet layers: Physical, logical, content

1. The physical layer refers to the material things used to connect human beings to each other. These include the computers, phones, handhelds, wires, wireless links, and the like. 2. The content layer is the set of humanly meaningful statements that human beings utter to and with one another. It includes both the actual utterances and the mechanisms, to the extent that they are based on human communication rather than mechanical processing, for filtering, accreditation, and interpretation. 3. The logical layer represents the algorithms, standards, ways of translating human meaning into something that machines can transmit, store, or compute, and something that machines process into communications meaningful to human beings. These include standards, protocols, and software—both general enabling platforms like operating systems, and more specific applications. 4. A mediated human communication must use all three layers, and each layer therefore represents a resource or a pathway that the communication must use or traverse in order to reach its intended destination. In each and every one of these layers, we have seen the emergence of technical and practical capabilities for using that layer on a nonproprietary model that would make access cheaper, less susceptible to control by any single party or class of parties, or both.

Physical, logical, and content layers

1. The physical layer refers to the material things used to connect human beings to each other. These include the computers, phones, handhelds, wires, wireless links, and the like. 2. The content layer is the set of humanly meaningful statements that human beings utter to and with one another. It includes both the actual utterances and the mechanisms, to the extent that they are based on human communication rather than mechanical processing, for filtering, accreditation, and interpretation. 3. The logical layer represents the algorithms, standards, ways of translating human meaning into something that machines can transmit, store, or compute, and something that machines process into communications meaningful to human beings. These include standards, protocols, and software—both general enabling platforms like operating systems, and more specific applications. 4. A mediated human communication must use all three layers, and each layer therefore represents a resource or a pathway that the communication must use or traverse in order to reach its intended destination. In each and every one of these layers, we have seen the emergence of technical and practical capabilities for using that layer on a nonproprietary model that would make access cheaper, less susceptible to control by any single party or class of parties, or both.

"Mobile is eating the world" by Benedict Evans

1. The smartphone and tablet business is now nearly 50% of the global consumer electronics industry. 2. Smartphones will continue to be the online on-ramp of choice regardless of global region. As of 2014, there are 3B people online and 2B people using smartphones. Andreessen Horowitz's Benedict Evans predicts that by 2020 there will be 4B people online, all using smartphones. Global smartphone adoption will attain a global compound annual growth rate of 10.41% over the next seven years; by 2020, 80% of the adults on earth will own a smartphone. 3. Every smartphone sensor creates a new business. Sensors in smartphones are accelerating the next generation of analytics, applications, and APIs, and fundamentally changing the smartphone user experience. Evans predicts that there will be 2-3X more smartphones than PCs by 2020, and when they are multiplied by their many uses, including sensor-based applications, the total opportunity grows 10 times in this period. All of these factors are leading to a mobile leverage effect that reduces the cost of developing new mobile applications, which have the potential to attract exponentially larger customer bases. The new iPhone CPU has 625 times more transistors than Intel's Pentium processor produced in 1995. During the iPhone launch weekend earlier this year, Apple sold 25X more CPU transistors than were in all the PCs on Earth in 1995. 4. The utility of mobile increases as income falls. 70% of Sub-Saharan Africa is under cellular coverage, with 20% having 3G coverage today, increasing to 65% by 2019 (a CAGR of 21.71%). The most significant constraints to greater adoption are data pricing and the ability to charge a phone. Using this region's cellular adoption rates and economics as an example, Evans shows how rapid reductions in smartphone costs are fundamentally changing the Internet. 5. Mobile apps now dominate the proportion of time spent online. The time spent on mobile apps is greater than all the time spent on the Web in the U.S. today.

Takedown notices

1. The so-called "DMCA take down notice" is a creature of Title II of the Digital Millennium Copyright Act ("DMCA"). 2. Compared to the normal legal process for getting an injunction to remove an infringing copy from the network, which takes a long time and an enormous amount of resources, a DMCA takedown notice is fast, simple, and can be drawn up by a copyright holder without the help of a lawyer. It really is very powerful. Moreover, you can issue takedown notices not only for the infringing material itself, but also "information location tools" pointing to the material—including "directory, index, reference, pointer, or hypertext link." 17 USC §512(d). 3. A DMCA takedown notice can be a cost-effective, quick, and powerful tool to remove material that infringes your copyright. In an age where electronic publication has made piracy an often-discussed topic, it gives individual authors more power to protect their rights. At the same time, the DMCA takedown mechanism has certain safeguards in place to protect the rights of those who have a right to publish material that is not infringing

NSFnet

1. The term "NSFNET" refers to a program of coordinated, evolving projects sponsored by the National Science Foundation that was initiated in 1985 to support and promote advanced networking among U.S. research and education institutions. Participants in NSFNET projects began with the national supercomputer centers and the National Center for Atmospheric Research (NCAR) and continued over time with a partnership team including Merit Network, Inc., IBM, MCI, Advanced Network & Services, Inc., and the State of Michigan; regional networks; and many institutions in research and education. Projects included the construction of data networks as well as the outreach required to spur adoption of networking technologies by researchers and educators. 2. NSFNET is also the name given to a nationwide physical network that was constructed to support the collective network-promotion effort. That network was initiated as a 56 kbps backbone in 1985. The network was significantly expanded from 1987 to 1995, when the early version of NSFNET was upgraded to T1 and then T3 speeds and expanded to reach thousands of institutions. Throughout this period, many projects were associated with the NSFNET program, even as the backbone itself became widely known as "the NSFNET." 3. From its inception as a part of NSF's overall inventory of high speed computing and communications infrastructure development, the NSFNET program was a pioneering force in academic computing infrastructure development and in the enhancement of research efforts through advanced network services. The NSFNET backbone, in its support of the broader set of NSFNET programs, linked scientists and educators on university campuses nationwide to each other and to their counterparts in universities, laboratories, government agencies, and research centers throughout the world. 4. 
By design, the NSFNET backbone made high-speed networking available to national supercomputer centers and to inter-linked regional networks, which in turn worked to extend network availability to other research and educational organizations. Previously, only specific communities in computer science had limited access to networks such as CSNET, BITNET, and ARPANET, so the introduction of the NSFNET backbone represented a significant development in creating a unified and more comprehensive network infrastructure. By combining high-speed networking and connection between the supercomputing centers and regional networks, NSF created a "network of networks" that served as the focal point of nationwide networking during a critical period of pivotal development and that laid the foundation for today's Internet. 5. The most fateful policy choice in the Internet's history was the Clinton administration's decision to place the network in private hands and open it to commercial enterprise.

Long tail, and its three mechanisms

1. The theory of the long tail can be boiled down to this: our culture and economy are increasingly shifting away from a focus on a relatively small number of hits (mainstream products and markets) at the head of the demand curve, and moving toward a huge number of niches in the tail. In an era without the constraints of physical shelf space and other bottlenecks of distribution, narrowly targeted goods and services can be as economically attractive as mainstream fare. Mechanisms to reduce the cost of reaching niches: 1. Democratize the tools of production, e.g. the PC. 2. Democratize the tools of distribution, e.g. the Internet, to cut the cost of consumption. 3. Connect supply and demand, e.g. by introducing consumers to these newly available goods and services and driving demand down the tail.

Application-specific discrimination

1. The two definitions of application-specific discrimination used in the text - "discrimination based on application or class of application" and "discrimination based on criteria that depend on an application's characteristics" - describe the same concept. 2. In van Schewick's paper, "application" refers to a specific instance of a specific type of application. Thus, "discrimination based on application" is differential treatment of different instances of the same application type depending on which instance the user is using (e.g., Skype vs. Vonage). The specific instance of an application a user is using is also a characteristic of the application (i.e. it is a characteristic of the application whether it is Vonage or Skype).

Reputation systems

1. There are currently very few practical methods for assessing the quality of resources or the reliability of other entities in the online environment. This makes it difficult to make decisions about which resources can be relied upon and which entities it is safe to interact with. 2. Trust and reputation systems are aimed at solving this problem by enabling service consumers to reliably assess the quality of services and the reliability of entities before they decide to use a particular service or to interact with or depend on a given entity.
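A reputation system's core operation can be sketched as aggregating past feedback into a trust score that a consumer consults before transacting. The Laplace-style smoothing prior below is an assumption for illustration, not a standard:

```python
# Minimal reputation score: smoothed fraction of positive feedback.
# The prior of 1 pseudo-rating each way is an illustrative choice;
# it makes unknown newcomers score a neutral 0.5.
def reputation(positive, negative, prior=1):
    return (positive + prior) / (positive + negative + 2 * prior)

print(round(reputation(98, 2), 3))  # established, mostly positive seller
print(round(reputation(0, 0), 3))   # unknown newcomer -> neutral
```

A consumer (or marketplace) would then rank or filter counterparties by this score before deciding whom to trust.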

Reputation systems/ trust systems

1. There are currently very few practical methods for assessing the quality of resources or the reliability of other entities in the online environment. This makes it difficult to make decisions about which resources can be relied upon and which entities it is safe to interact with. 2. Trust and reputation systems are aimed at solving this problem by enabling service consumers to reliably assess the quality of services and the reliability of entities before they decide to use a particular service or to interact with or depend on a given entity.

IPv4 vs. IPv6

1. There are two standards for IP addresses: IP Version 4 (IPv4) and IP Version 6 (IPv6). All computers with IP addresses have an IPv4 address, and many are starting to use the new IPv6 address system as well. Here's what these two address types mean: 2. IPv4 uses 32 binary bits to create a single unique address on the network. 3. IPv6 uses 128 binary bits to create a single unique address on the network. At the dawn of IPv4 addressing, the Internet was not the large commercial sensation it is today, and most networks were private and closed off from other networks around the world. When the Internet exploded, having only 32 bits to identify a unique Internet address caused people to panic that we'd run out of IP addresses. 4. Under IPv4, there are 2^32 possible combinations, which offers just under 4.3 billion unique addresses. IPv6 raises that to a panic-relieving 2^128 possible addresses.
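The address-space arithmetic above can be checked directly, and Python's standard-library ipaddress module handles both families:

```python
import ipaddress

# IPv4 address space: 2^32, just under 4.3 billion addresses.
print(2**32)    # -> 4294967296
# IPv6 address space: 2^128, astronomically larger.
print(2**128)

# The stdlib parses both address families:
print(ipaddress.ip_address("192.0.34.65").version)  # -> 4
print(ipaddress.ip_address("2001:db8::1").version)  # -> 6
```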

The distinction between desktop/laptops and smart phones

1. Think of a laptop as a desktop computer on the go. It performs basically the same functions as a desktop, with the convenience of being usable anywhere; with Wi-Fi (Wireless Fidelity), connecting to the Internet is not a problem. Laptops have internal CD/DVD drives and built-in batteries that give them on average four to ten hours of continuous use, with a sleep mode that saves battery power when the machine is left on but not in use. Laptops are generally scaled down and weigh only a few pounds, so the keyboard and screen are smaller than a desktop's. The processing abilities of laptops are impressive, with some having a standard five hundred gigabytes of storage space or even up to one terabyte (1,000 gigabytes). External ports for peripherals such as a mouse, printer, and Internet connectivity are included. 2. A desktop, commonly referred to as a personal computer (PC), simply sits on a desk because it is not portable. It consists of several pieces of external hardware such as a keyboard, mouse, monitor, and tower with internal CD/DVD drives and hard drives. A standard configuration might have an AMD processor and five hundred gigabytes of storage space. It has all the external ports for a printer, scanner, and fax machine, and connects to the Internet via an Ethernet cable or through Wi-Fi. 3. A smartphone is a device that fits in the palm of your hand and can make calls, store files, send and receive emails and text messages, play games, record video and pictures, and handle other tasks such as processing credit-card transactions, no matter where you are in the world. Smartphones use an operating system from their maker; for example, Apple's iPhone uses iPhone OS.

Uses of Special Purpose Computer

1. This can be used to control a stoplight: one wire for each light (output), and you can work out the logic, e.g., show a red light for 25 seconds, have a circuit produce output, and have it remember that we're giving green to the people on Prospect Avenue. You subtract time from the counter, and then the circuit says to change the output. 2. It is tedious to work out this logic. 3. You could add an input that corresponds to a button for people to push: you just need to specify what you want. 4. This is the way traffic lights used to work. 5. These computers can do anything, as long as you express it in logic. 6. The drawback is that you have to make a chip for each function.
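As a sketch (the timing numbers beyond the 25-second red from the lecture are invented), the fixed logic such a special-purpose circuit hard-wires looks like a simple repeating cycle:

```python
# Hypothetical sketch of the fixed timing logic a special-purpose
# traffic-light circuit would hard-wire: one output wire per light,
# and a counter that decides when to switch.
def light_for(t, green_secs=25, yellow_secs=5, red_secs=30):
    """Return which light is on at second t of the repeating cycle."""
    cycle = t % (green_secs + yellow_secs + red_secs)
    if cycle < green_secs:
        return "green"
    elif cycle < green_secs + yellow_secs:
        return "yellow"
    return "red"

print(light_for(0))    # green
print(light_for(27))   # yellow
print(light_for(40))   # red
```

The point of the card is that building a dedicated chip for each such function is tedious, which is what motivates the general-purpose computer.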

ICANN

1. To reach another person on the Internet you have to type an address into your computer -- a name or a number. That address must be unique so computers know where to find each other. ICANN coordinates these unique identifiers across the world. Without that coordination, we wouldn't have one global Internet. 2. In more technical terms, the Internet Corporation for Assigned Names and Numbers (ICANN) coordinates the Internet Assigned Numbers Authority (IANA) functions, which are key technical services critical to the continued operations of the Internet's underlying address book, the Domain Name System (DNS). The IANA functions include: (1) the coordination of the assignment of technical protocol parameters including the management of the address and routing parameter area (ARPA) top-level domain; (2) the administration of certain responsibilities associated with Internet DNS root zone management such as generic (gTLD) and country code (ccTLD) Top-Level Domains; (3) the allocation of Internet numbering resources; and (4) other services. ICANN performs the IANA functions under a U.S. Government contract.

Network Neutrality

1. Treat all data on the internet equally A. Not block, discriminate, or charge differently B. by user, content, site, platform, app, etc. 2. Proponents A. Openness is the hallmark of the internet B. Net neutrality preserves competition C. Service providers have a near monopoly 3. Opponents A. Good to have a variety of services prices/plans B. Broadband space is sufficiently competitive C. Broadband industry is young and evolving

Four Kinds of Trust

1. Trust in competence 2. Trust in good will (you are honorable) 3. Secured trust (I have a hostage - R. Hardin etc.) 4. Phenomenological trust (A. Schutz: trust as the default position)

Trusted systems

1. Trusted systems are systems that can be trusted by outsiders against the people who use them. In the consumer information technology context, such systems are typically described as "copyright management" or "rights management" systems, although such terminology is loaded. 2. As critics have been quick to point out, the protections afforded by these systems need not bear any particular relationship to the rights granted under, say, U.S. copyright law. Rather, the possible technological restrictions on what a user may do are determined by the architects themselves and thus may (and often do) prohibit many otherwise legal uses. An electronic book accessed through a rights management system might, for example, have a limitation on the number of times it can be printed out, and should the user figure out how to print it without regard to the limitation, no fair use defense would be available. Similarly, libraries that subscribe to electronic material delivered through copyright management systems may find themselves technologically incapable of lending out that material the way a traditional library lends out a book, even though the act of lending is a privilege - a defense to copyright infringement for unlawful "distribution" - under the first sale doctrine

2014 Netflix throttling controversy

1. US cable giant Comcast has announced a deal with Netflix allowing Netflix's video-streaming service a more direct route through Comcast's network, which should improve streaming video quality for viewers. The first indications of the new deal between the companies came last week after App.net founder Bryan Berg observed more direct routes for Netflix data through Comcast's network. The Wall Street Journal reported on Sunday night that the change was the result of a formal, paid agreement between the two companies, but Comcast does not specify how much the deal is worth. 2. The result comes after a number of troubling moves by Comcast, which had seen Netflix speeds plummet on the network, as reflected in the service's monthly ISP rankings. Comcast has sworn it isn't throttling Netflix traffic, but the simple fact is that Netflix traffic has grown increasingly difficult to deliver onto Comcast's network, while other ISPs see little degradation. As analyst Rich Greenfield put it, "How come Time Warner is showing solid Netflix performance without paid peering?" 3. NB: Bandwidth throttling is the intentional slowing of Internet service by an Internet service provider.

Schema of Universal Circuit with Memory

1. The universal circuit produces outputs and data. The data enters the memory. Memory produces data, which, along with inputs and code, enters the universal circuit. The code and data come from the memory; the inputs do not. The universal circuit is the computer. 2. Turing proved the universality of this. Any logical computation that can be done can be done by this device. 3. Over decades, people figured out how to build this at low cost. The light switch in Robertson 002 probably has a general-purpose computer in it. Universal circuits are all over the place, e.g., a car has 20-30 computers in it. 4. Real computers: e.g., laptop, mobile phone (the computer has code that allows it to make calls). 5. Circuitry has many billions of switches, usually highly miniaturized; memory stores many billions of bits. 6. Recently, batteries have improved. Sensors like the microphone, camera, accelerometer (the phone knows which way is up), and barometer are all embedded in phones.
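The clocked loop described above can be sketched as a toy program. The two-instruction "instruction set" here is invented purely for illustration; the point is that one fixed machine runs different code to get different behavior:

```python
# Toy illustration of the universal-circuit-plus-memory loop: on every
# clock tick, the circuit reads code and data from memory (never from
# the inputs) and writes new data back into memory.
def run(code, data, inputs):
    pc = 0                      # program counter, part of "memory"
    for x in inputs:            # each input arrives on a clock tick
        op, arg = code[pc % len(code)]
        if op == "add":
            data = data + x + arg
        elif op == "mul":
            data = data * arg
        pc += 1
    return data

# Same universal machine, different code -> different behavior.
print(run([("add", 0)], 0, [1, 2, 3]))   # 6 (sums the inputs)
print(run([("mul", 2)], 1, [0, 0, 0]))   # 8 (doubles three times)
```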

"Network Neutrality and Quality of Service: What a Non-Discrimination Rule Should Look Like" by Barbara Van Schewick

1. Van Schewick makes the case for net neutrality regulations 2. Over the past ten years, the debate over "network neutrality" has remained one of the central debates in Internet policy. Governments all over the world, including the United States, the European Union, the United Kingdom, France and Germany, have been investigating whether legislative or regulatory action is needed to limit the ability of providers of Internet access services to interfere with the applications, content and services on their networks. Beyond rules that forbid network providers from blocking applications, content and services, non-discrimination rules are a key component of any network neutrality regime. Non-discrimination rules apply to any form of differential treatment that falls short of blocking. Policy makers who consider adopting network neutrality rules need to decide which, if any, forms of differential treatment should be banned. Network neutrality proponents generally agree that network neutrality rules should preserve the Internet's ability to serve as an open, general-purpose infrastructure that provides value to society over time in various economic and noneconomic ways. There is, however, a lot of uncertainty on how to get from a high-level commitment to network neutrality to a specific set of rules. The decision for a non-discrimination rule has important implications: non-discrimination rules affect how the core of the network can evolve, how network providers can manage their networks, and whether they can offer Quality of Service. Often, it is not immediately apparent how a specific non-discrimination rule affects network providers' ability to offer Quality of Service. At the same time, it is unclear which forms of Quality of Service, if any, a network neutrality regime should allow.
This paper proposes a framework that policy makers and others can use to choose among different options for network neutrality rules and uses this framework to evaluate existing proposals for non-discrimination rules and the non-discrimination rule adopted by the FCC in its Open Internet Order. In the process, it explains how the different non-discrimination rules affect network providers' ability to offer Quality of Service and which forms of Quality of Service, if any, a non-discrimination rule should allow.

Freemium model

1. Version 1: Give something away that will require many people to purchase something else. 2. Version 2: Give something away and hope that people will like it so much that they will pay for an even better version.

Stakeholders: The Antis

1. Visible: Anti-regulation Ideologues (Heartland Institute, National Review, Tea Party elements, Americans for Prosperity (Koch Bros.), Internet Freedom Association) 2. ISPs - National Cable & Telecommunications Association, AT&T, Verizon, etc. A. Long term dream: Internet as cable television (AOL 1998) B. Per-service fee structure (e.g., plans with menu of fees) C. Fees from content originators D. Competitive advantage for own telephone & video services 3. Technology providers (Cisco, Jupiter): Protect market for network management hardware and software

Fate of News Industry

1. WaPo sold to Jeff Bezos, Amazon founder 2. NYT sells Boston Globe at a 93 percent loss 3. There is now a newspaper death watch website

The History of Apache

1. Web server software developed at the University of Illinois 2. Turned into an open source project. This is not a Stallman-style free-software project: the license is not viral and does not require you to pass along the rights to others. The license only requires you to disclose the presence of Apache software. 3. Apache outcompeted commercial products. This is an existence proof that open source software can beat deep-pocketed competitors in some cases.

Apache

1. Web server software developed at the University of Illinois 2. Turned into an open source project. This is not a Stallman-style free-software project: the license is not viral and does not require you to pass along the rights to others. The license only requires you to disclose the presence of Apache software. 3. Apache outcompeted commercial products. This is an existence proof that open source software can beat deep-pocketed competitors in some cases. Web Server Market Share: Apache: 50%; nginx (also open source): 20%; Microsoft: 13%

Weighted centrality

1. Weighted centrality: the sum of the differences between the maximal centrality of any point and the centrality of every other point. The sum is then divided by its maximal possible value. Google's big innovation was to use a version of weighted centrality.
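A minimal sketch of the formula as stated, using degree as the centrality score (this is the Freeman-style degree version; Google's PageRank weights differently):

```python
# Sum the gaps between the most central node and every other node,
# then divide by the maximum possible sum (achieved by a star graph).
def degree_centralization(adj):
    degrees = [sum(row) for row in adj]   # degree of each node
    n = len(adj)
    c_max = max(degrees)
    observed = sum(c_max - c for c in degrees)
    possible = (n - 1) * (n - 2)          # maximal value, for a star
    return observed / possible

star = [[0,1,1,1], [1,0,0,0], [1,0,0,0], [1,0,0,0]]   # hub + 3 spokes
cycle = [[0,1,0,1], [1,0,1,0], [0,1,0,1], [1,0,1,0]]  # all nodes equal

print(degree_centralization(star))    # 1.0 (maximally centralized)
print(degree_centralization(cycle))   # 0.0 (no node stands out)
```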

"Reputation Formation and the Evolution of Cooperation in Anonymous Online Markets" by Andreas Diekmann, Ben Jann, Wojtek Przepiorka and Stefan Wehrli

1. What are the secrets of online business? 2. There are several: another ingredient in many systems, especially auctions, is trust and reputation (Diekmann et al. 2014) 3. Theoretical propositions stressing the importance of trust, reciprocity, and reputation for cooperation in social exchange relations are deeply rooted in classical sociological thought. Today's online markets provide a unique opportunity to test these theories using unobtrusive data. Our study investigates the mechanisms promoting cooperation in an online-auction market where most transactions can be conceived as one-time-only exchanges. Using a large dataset comprising 14,627 mobile phone auctions and 339,517 DVD auctions, our statistical analyses show that sellers with better reputations have higher sales and obtain higher prices. Furthermore, we observe a high rate of participation in the feedback system, which is largely consistent with strong reciprocity—a predisposition to unconditionally reward (or punish) one's interaction partner's cooperation (or defection)—and altruism—a predisposition to increase one's own utility by elevating an interaction partner's utility. Our study demonstrates how strong reciprocity and altruism can mitigate the free-rider problem in the feedback system to create reputational incentives for mutually beneficial online trade.

"Needles in the Haystack: Google and Other Brokers in the Bits Bazaar" by Hal Abelson, Ken Ledeen and Harry Lewis

1. What are the secrets of online business? 2. There are several: gathering and selling consumer information (Abelson et al.) and "access to eyeballs" is at the core of most of them

"The Economics of Giving it Away" by Chris Anderson

1. What are the secrets of online business? 2. There are several: giving things away for free (Anderson 2009) is a major tool. (Google provides free search, Facebook gives you a social network and tools to reach it; Pandora provides music while Spotify provides music and network services.) Another important element of many models is making users work for free, either producing value directly (Facebook) or through recommendation systems (Yelp, Amazon, eBay)

Analyzing a Policy

1. What is the issue? Why does it matter? 2. What are the major arenas in which the problem can be addressed? (Congress? Courts? Code? Markets?) 3. What are the policy options? 4. What values does each option promote or corrode? 5. Who are the stakeholders and what are their interests? What is the configuration of influence and power? 6. How feasible are the options politically and technically?

"Reachability"

1. When a directed graph is not strongly connected, it's important to be able to describe its reachability properties: identifying which nodes are "reachable" from which others using paths. 2. To define this notion precisely, it's again useful to draw an analogy to the simpler case of undirected graphs, and try to start from there. 3. For an undirected graph, its connected components serve as a very effective summary of reachability: if two nodes belong to the same component, then they can reach each other by paths; and if two nodes belong to different components then they can't. 4. All nodes within a strongly connected component can reach each other, and strongly connected components correspond as much as possible to separate "pieces," not smaller portions of larger pieces.
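The notion of reachability above can be made concrete with a breadth-first search over a small directed graph (the graph here is a made-up example):

```python
# Reachability in a directed graph: the set of nodes you can get to
# from a starting node by following edges, found via BFS.
from collections import deque

def reachable(graph, start):
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# A, B, C form a strongly connected component; D lies upstream of it.
g = {"A": ["B"], "B": ["C"], "C": ["A"], "D": ["A"]}
print(reachable(g, "A"))   # the set {A, B, C}: D is not reachable
print(reachable(g, "D"))   # all four nodes: D can reach the SCC
```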

Two ways in which the Internet is a network (physical and hyperlinked)

1. When most people think of the Internet, the first thing they think about is the World Wide Web. Nowadays, the terms "Internet" and "World Wide Web" are often used interchangeably—but they're actually not the same thing. 2. The Internet is the physical network of computers all over the world. 3. The World Wide Web is a virtual network of websites connected by hyperlinks (or "links"). Websites are stored on servers on the Internet, so the World Wide Web is a part of the Internet.

Stakeholders

1. Who represents the public interest? 2. Who defines the public interest?

Economics of Open Source (supply side)

1. Why develop? A. "Scratch your own itch," in Raymond's terms: you can fix a problem that bugs you, and you are able to do it, which is not the case with commercial software; e.g., Amazon used Linux in its data centers in the beginning. B. Demonstrate skill: your contribution is public. You can say, "I built the cryptography engine in Linux." And discussion of contributions is public, so people can hire you. 2. Why contribute your changes? You don't have to put in extra effort to keep updating the software. Of course, not all users operate in this mode. Linux has 50 million users; if only 0.1% of users contribute, you have a team of 5,000 programmers, which is bigger than the teams at Microsoft and Apple. Linux has 1,000 people working on it every day, and 75% of them are doing Linux at work: a lot of people get paid to make it better. 3. How the project bootstrapped from one Finnish grad student to millions is the even tougher question.

Examples of Networks

1. WiFi: radio-based network that relies on a central hub. Each computer has an individualized connection it is using. A packet goes to the access point, then to the next computer. 2. 3G/4G mobile network: there is a cell tower with antennas; the phone talks to the tower; again, one central radio point. 3. Local wired networks: a switch forwards information between devices. All these networks are local. The Internet is much bigger.

The stakes are high.

1. Will the Internet be a frontier - dangerous, but open to innovation, creativity, and economic initiative? Or will it be a set of walled gardens, each attractive and easy to use, but limited in the affordances it offers? 2. Will the Internet be a tool that we use? Or will we (and our attention) be the products, not the producers?

Aggregation

1. Winners A. eBay (here too), Facebook B. Chowhound, etc. C. Viral marketing 2. Losers A. Late movers B. Critics C. Traditional ad agencies 3. Strategies A. First-mover advantages + exploitation of network externalities (the larger the network, the greater its value) B. Crowdsourcing advice systems, rating systems, etc. can substitute for identity-based trust

The Hierarchy of Complexity

1. Wires/Switches (the most simple building block) 2. Circuits 3. Special purpose computers (i.e. device does computing) 4. General-purpose computers (this is the leap Turing made) 5. Real computers (the device you have)

Automatic updating

1. With automatic updating, the computer regularly checks - typically daily - for updates from the OS publisher and from the makers of any software installed on the PC. 2. With automatic updating, the OS and attendant applications become services rather than products. 3. Automatic updating works in concert with appliancization, allowing manufacturers to see when their software has been hacked or altered - and to shut down or reinstall the original OS when they have.

Creative commons license

1. You are free to: Share — copy and redistribute the material in any medium or format 2. The licensor cannot revoke these freedoms as long as you follow the license terms. 3. Under the following terms: Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use. 4. NonCommercial — You may not use the material for commercial purposes. 5. NoDerivatives — If you remix, transform, or build upon the material, you may not distribute the modified material.

An Example of Disintermediation

1. You used to need experts to: A. find the products B. create a market 2. Now eBay does it.

From Circuit to Special Purpose Computer

1. Zoom out and view circuit from macro level: input goes into the square circuit from the left and exits as output on the right. The output is some logical formula defined over the inputs 2. To turn this into a special purpose computer, need to add time and memory. Outputs enter memory, which in turn enters along with inputs into the circuit. A clock sends information to both the memory and the circuit, and is situated between the two. Every time the clock ticks, memory and inputs are delivered to the circuit, which produces outputs and memory. This is how the circuit remembers what it produced earlier.

Eric Raymond

1. A long-time hacker, active in the Internet culture since the 1970s, who got unexpectedly famous in the late 1990s as a result of events he has described elsewhere. 2. Either founded or re-invented (depending on who you ask and how some history is interpreted; he prefers 're-invented') the open source movement. If that term means nothing to you, think Linux. Linux is the open-source community's flagship product.

"Innovations in the Internet's Architecture that Challenge the Status Quo" by Christopher Yoo

1. A paper by Yoo, a leading opponent of net neutrality, that provides a history of peering 2. The network no longer adheres to the rigid and uniform hierarchy that characterized the early Internet and its predecessor, the NSFNET. Packets can now travel along radically different paths based on the topology of the portion of the network through which they travel. This is the inevitable result of reducing costs and experimenting with new structures. At the same time that network providers are experimenting with new topologies, they are also experimenting with new business relationships. Gone are the days when networks interconnected through peering and transit and imposed all-you-can-eat pricing on all end users. That fairly simple and uniform set of contractual arrangements has been replaced by a much more complex set of business relationships that reflect creative solutions to an increasingly complex set of economic problems. Again, these differences mean that the service that any particular packet receives and the amount that it pays will vary with the business relationships between the networks through which it travels. Although many observers reflexively view such deviations from the status quo with suspicion, in many (if not most) cases, they represent nothing more than the natural evolution of a network trying to respond to an ever-growing diversity of customer demands. Imposing regulation that would thwart such developments threatens to increase costs and discourage investment in ways that ultimately work to the detriment of the consumers that such regulation is ostensibly designed to protect.

Mobile phone replaces older devices

1. alarm clock 2. watch 3. landline phone

Pets.com

1. b. 1998; d. 1999 2. Slogan: "Because pets can't drive" 3. High point: well-received Super Bowl ad. 4. Low point: marketing budget exhausted after the well-received Super Bowl ad. 5. Lesson learned: if you sell kitty litter, don't offer free shipping.

How to be a mobile carrier

1. Buy/license spectrum 2. Build radio towers: divide the area into "cells." How to build the best cells: if you build one giant cell with 100 channels, you can have only 100 users. If you build many small overlapping cells, you can reuse the same spectrum in cells that don't overlap, thereby increasing the number of users. Nevertheless, small cells can be expensive, because you must negotiate with landowners and figure out ways communities can accommodate cell towers. Another disadvantage of faraway towers is that they drain phones' battery life, since phones must expend more energy to communicate with a distant tower. Also, spectrum is licensed regionally, so to be a good mobile carrier, you need to make deals with companies that have infrastructure in other areas. Because of these high costs, we really have only about 2.5 mobile carriers. Luckily, the allocation of spectrum and the siting of towers are good leverage points for policy. For carriers, the relationship with the government is central.
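A back-of-envelope sketch of why many small cells beat one big cell (the cell count is invented; the reuse factor of 7 is the classic textbook cluster size, not from the lecture):

```python
# With one giant cell and 100 channels, you serve 100 users. Split the
# area into small cells and reuse the same channels in cells that
# don't overlap, and capacity multiplies.
channels = 100
one_big_cell = channels
print(one_big_cell)          # 100 simultaneous users

cells = 50          # hypothetical small cells covering the same area
reuse_factor = 7    # classic cluster size: 7 neighboring cells share the band
per_cell = channels // reuse_factor
print(cells * per_cell)      # 700 simultaneous users
```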

Ideas of Freedom on Net Still Big

1. e.g. Arab Spring 2. Also freedom to mix up old stuff e.g. Mickey Mouse as Wolverine. Lessig wrote book called Remix, which celebrated this remixing capacity. Fan fiction is an example of this.

"Bow-tie structure of the web"

1. The Web contains a giant strongly connected component 2. IN: nodes that can reach the giant SCC but cannot be reached from it — i.e., nodes that are "upstream" of it. 3. OUT: nodes that can be reached from the giant SCC but cannot reach it — i.e., nodes that are "downstream" of it. 4. The "tendrils" of the bow-tie consist of (a) the nodes reachable from IN that cannot reach the giant SCC, and (b) the nodes that can reach OUT but cannot be reached from the giant SCC. 5. Disconnected: finally, there are nodes that would not have a path to the giant SCC even if we completely ignored the directions of the edges. These belong to none of the preceding categories. 6. The bow-tie picture of the Web provides a high-level view of the Web's structure, based on its reachability properties and how its strongly connected components fit together. From it, we see that the Web contains a central "core" containing many of its most prominent pages, with many other nodes that lie upstream, downstream, or "off to the side" relative to this core. It is also a highly dynamic picture: as people create pages and links, the constituent pieces of the bow-tie are constantly shifting their boundaries, with nodes entering (and also leaving) the giant SCC over time. But subsequent studies suggest that the aggregate picture remains relatively stable over time, even as the detailed structure changes continuously.
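A minimal sketch of how IN, OUT, and the core are computed in practice (the tiny four-page "web" is invented): a node is in the SCC if the core can reach it and it can reach the core; IN and OUT are the one-directional leftovers.

```python
# Classify nodes relative to a core SCC, bow-tie style.
def reachable(graph, start):
    """All nodes reachable from start by following directed edges."""
    seen, stack = {start}, [start]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

def classify(graph, core_node):
    reverse = {}                      # flip every edge
    for u, vs in graph.items():
        for v in vs:
            reverse.setdefault(v, []).append(u)
    downstream = reachable(graph, core_node)    # core can reach these
    upstream = reachable(reverse, core_node)    # these can reach core
    scc = downstream & upstream
    return {"SCC": scc, "IN": upstream - scc, "OUT": downstream - scc}

# Tiny web: A<->B is the core, X links in, Y is only linked to.
g = {"A": ["B", "Y"], "B": ["A"], "X": ["A"], "Y": []}
print(classify(g, "A"))   # SCC {A, B}, IN {X}, OUT {Y}
```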

The Long Tail

1. On-line retailing + 2. Cheap production + 3. Robotics in warehousing + 4. GPS in trucking fleets = 5. Robustness of the long tail strategy. When demand is highly dispersed, selling a few copies of lots and lots of different things may work. (But you need recommendation systems for consumers to find the products they want.)

Cloud (including Noam's view of cloud computing's regulatory implications)

1. What we call a 'cloud' today is really just a continuation of a concept that was earlier called 'time sharing', 'grid computing', 'utility computing', 'thin clients', 'terminal computing', and 'network computer'. The words change, the players rotate, but the plot stays familiar. The basic idea is constant: for a user to obtain computing resources such as storage, processing, databases, software, networks, platforms, etc., from somewhere else. 2. Eli Noam: more and more functions will move to a few powerful cloud providers, threatening interoperability and innovation. 3. A few companies will potentially have immense control over, and information about, the Internet and how we use it, unless regulations protect competition. 4. Why? High fixed costs + low marginal costs + high network externalities = pressures toward industrial concentration

How Networks Work

A basic local network 1. Three computers, each represented by a box with a line connecting to a central line, which is a wire 2. This is the simplest network, which connects computers to each other: you can also do this wirelessly. WiFi is represented in the same way. This is also a local network

Network Externalities (explained)

A network has positive externalities if the value of that network increases as a function of the number of persons (or nodes of any kind) that it includes.
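One common formalization of this idea — an illustration, not from the card itself — is Metcalfe's law: the value of a network grows like the number of possible pairwise connections, n(n-1)/2, so value grows much faster than membership.

```python
# Metcalfe-style sketch: possible pairwise connections in an
# n-person network, as a proxy for the network's value.
def possible_connections(n):
    return n * (n - 1) // 2

for n in [2, 10, 100]:
    print(n, possible_connections(n))   # 2->1, 10->45, 100->4950
```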

"Issue domain"

A set of policies that 1. are linked by common values, interests and stakeholders; and 2. choices about each have implications for choices about the others Key Issue Domains: •Regulation of Markets •Intellectual Property •Equality and Access •Security •Privacy •Freedom of Expression •Governance •Sovereignty •eGovernment •Social Media

Path-Dependent Nature of Technology

ARPANET as a case in point 1. As first envisioned by Rand Corporation scientists, ARPANET was meant to have military value, which meant it was produced to be durable, robust, and hard-to-kill... 2. When libertarian hackers use that kind of technology, they are able to foster very different values (decentralization, free speech, easy mobilization of collective action, not to mention less noble forms of hacking) based on: A. Open architecture B. Distributed computing C. Redundant functions

Instruction set

An instruction set is a group of commands for a CPU in machine language. The term can refer to all possible instructions for a CPU or a subset of instructions to enhance its performance in certain situations.

Edge providers

Any individual or entity that provides any content, application, or service over the Internet, and any individual or entity that provides a device used for accessing any content, application, or service over the Internet.

AT&T/FaceTime Case Study

Apple FaceTime • High-quality video chat service • Originally available only over WiFi

With shifts from PCs to portable devices, Zittrain fears the substitution of "appliances" for the generative web.

Appliances: 1. offer fewer affordances 2. are less programmable 3. are less interoperable 4. are less generative

Complex Interrelationships

Between mobile service providers and network equipment vendors

BitTorrent

BitTorrent is a protocol for the practice of peer-to-peer file sharing that is used to distribute large amounts of data over the Internet. BitTorrent is one of the most common protocols for transferring large files.

Music Industry reality

But what's bad for major integrated recording companies is not necessarily bad for music.

One Implication of Mobile Technology

Can reach a person anywhere. This affects how people plan where and when to meet. Before mobile phones, needed to meet people in exact location.

Foresight is limited by undue optimism

Cases of undue optimism: 1. Picture phones in the Jetsons 2. NY Mayor Wagner and a friend talking with Mrs. Ladybird Johnson on a picturephone, 1964: this is Skype today. Its creators tossed out the technology, not realizing that people would become less formal and not care about being seen. 3. Elektro & Sparko: Elektro was built in 1937/38 by Westinghouse at their factory in Mansfield, with the idea that everyone would have robots in the next 20-30 years. He was over seven feet tall and had a vocabulary of over 700 words. He first appeared at the 1939 New York World's Fair and returned there in 1940 with Sparko the dog, his operators putting on a twenty-minute show every hour during the Fair's two summer runs. Elektro was retired in the late 50s to be a static display at Palisades Theme Park in Ocean Side, California, and after appearing in a few movies was sold for scrap. Even today, robot AI is poor, though robots are very good at manufacturing.

Network centrality

Centrality = (popularity weighted by between-ness and/or by the popularity of the nodes that send ties) A. Popularity (number of incoming ties) B. Between-ness (how many other nodes must go through a given node = power, influence)

A group of insightful legal scholars from Harvard, Stanford and Yale have developed an analysis of this phenomenon that clarifies the relationship between technology, code, law, markets and norms. If one considers their work together, it adds up to a thorough analysis of the Internet's potential as a force for liberty or tyranny.

Common Themes: 1. Something has been lost in the shift from the "Navigational Web" to the "Transactional Web" (Easley & Kleinberg) 2. Architecture and Code embody values 3. Tension or misfit between values embodied in early web and those of existing law and commerce 4. Shift from regulation through norms to regulation through laws that use code 5. From anonymity to identity and accountability 6. From privacy to monitoring 7. Degradation of "End to End Principle" 8. From dumb pipes to smart pipes 9. The policy challenge: Providing necessary security while minimizing loss of privacy and autonomy.

Memory

Computer memory: device that is used to store data or programs (sequences of instructions) on a temporary or permanent basis for use in an electronic digital computer. Computers represent information in binary code, written as sequences of 0s and 1s. Each binary digit (or "bit") may be stored by any physical system that can be in either of two stable states, to represent 0 and 1. Such a system is called bistable. This could be an on-off switch, an electrical capacitor that can store or lose a charge, a magnet with its polarity up or down, or a surface that can have a pit or not. Today capacitors and transistors, functioning as tiny electrical switches, are used for temporary storage, and either disks or tape with a magnetic coating, or plastic discs with patterns of pits are used for long-term storage.
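A small sketch of the binary-code idea: every character is stored as a fixed pattern of bits, and any bistable system (charge/no charge, pit/no pit, polarity up/down) can hold one bit.

```python
# Show the 8-bit pattern that stores each character of a short string.
for ch in "Hi":
    print(ch, format(ord(ch), "08b"))
# H 01001000
# i 01101001
```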

How to Make a Big Network

Connect smaller networks. 1. Imagine two computers connected to one wire that connects to a third computer, which is also connected to a network of three computers. The computer linking these two networks together is known as the "gateway" or "router." The gateway allows the connection of multiple networks. 2. This idea of connecting networks was originally called an "internetwork." As soon as this capability came about, people started connecting everything. The "Internet" is formed by these internetworks. 3. To get on the Internet, you need to find someone who's already on and use a gateway to link up your network. 4. Internet connectivity is not centrally managed. It's actually tough to make the Internet work at a large scale.

Values Motivate Preferences

Constitutive values of first U.S. communication infrastructure were the values of democracy (Paul Starr)

What is "Net Neutrality" 2nd cut: Consensus definition

Corporations and governments may not interfere with users' right to access any part of the World Wide Web for any legal purpose. 1. They may not interfere with the transmission of particular content, particular sites, classes of websites, or classes of web traffic. 2. They may not interfere by: A. blocking B. discriminating with technology C. discriminating through pricing

The Critical Role of DPI

Deep Packet Inspection can be used for: -Analyzing data flows -Responding to government information requests -Network security -Facilitating services like VoIP that require special treatment -(DPI and "packet sniffing") BUT! DPI also gives operators the opportunity to block content they don't like; block competing services; and gather data on customers.
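A toy contrast between a header-only "dumb pipe" and DPI -- packet fields, payloads, and function names are all invented:

```python
# Sketch: a plain router reads only the header; DPI reads the payload too,
# which is what makes blocking-by-content possible.
packets = [
    {"dst": "10.0.0.5", "payload": "GET /index.html"},
    {"dst": "10.0.0.5", "payload": "BitTorrent handshake"},
]

def forward_header_only(pkt):
    """A 'dumb pipe' looks only at the destination header."""
    return pkt["dst"]

def dpi_allows(pkt, blocked_keyword):
    """A 'smart pipe' inspects the payload and can drop what it dislikes."""
    return blocked_keyword not in pkt["payload"]

print([forward_header_only(p) for p in packets])       # both forwarded
print([dpi_allows(p, "BitTorrent") for p in packets])  # [True, False]
```

The same inspection capability supports the legitimate uses listed above (security, VoIP prioritization) and the troubling ones (blocking competitors, profiling customers); the code is identical, only the policy differs.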

Another example of complex relationship between commercial and state interests

Deep-Packet Inspection •Backdoors in pipes mandated under powers given to the Justice Department under the Communications Assistance for Law Enforcement Act (CALEA) of 1994 •But backdoors can be used by ISPs to manage traffic and gain commercial advantage •They also make ISPs vulnerable to demands that they serve as IP law enforcement agents The result has been the degradation of the "End to End Principle" -- from dumb pipes with smart terminals to smart pipes that know what the smart terminals are doing.

Values I: Circulation of Information and a Vigorous Civil Society

Democracy requires publics with access to information about government and the ability to share ideas among one another. Implies: 1. Freedom of the press; 2. An inexpensive, national postal system, free from politics (common carriage); 3. Indirect subsidy of the press through cheap postal service (special rates)

Policy Issues and Policy Arenas

Different policy issues tend to live in different combinations of policy arenas.

DNS and Names

Domain Name System e.g. www.google.com 1.DNS: system that maps names to IP addresses 2. DNS is administratively controlled 3. It works by delegation e.g. any name that ends with .princeton.edu is delegated to the University 4. This system exists to make things easier for people
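Delegation can be sketched with toy zone tables; all the names, server labels, and the IP address below are made up for illustration:

```python
# Each zone either answers a name directly or delegates a suffix
# (like "princeton.edu.") to another zone's table.
ROOT = {"edu.": "edu-servers"}
EDU = {"princeton.edu.": "princeton-servers"}
PRINCETON = {"www.princeton.edu.": "128.112.0.1"}  # made-up address

ZONES = {"root": ROOT, "edu-servers": EDU, "princeton-servers": PRINCETON}

def resolve(name):
    """Walk delegations from the root until an address is found."""
    table = ZONES["root"]
    while True:
        # find the longest suffix this zone knows about
        match = max((k for k in table if name.endswith(k)), key=len)
        answer = table[match]
        if match == name:        # exact match: this is the address
            return answer
        table = ZONES[answer]    # otherwise follow the delegation

print(resolve("www.princeton.edu."))  # 128.112.0.1
```

This mirrors the point in the card: the root never needs to know Princeton's machines; it only needs to know who to delegate `.edu` names to.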

Secrets of Online Commerce

E-Commerce Retail Sales, 1998-2014: grew more than ninefold between 2000 and 2014

The Story of Linux

Finnish grad student Linus Torvalds 1. Built a GPL-licensed OS kernel (UNIX-like) and emailed it to friends, saying they were welcome to work on it. 2. The kernel grew; Linus had successfully built a kernel. 3. Linux kernel (Linus) + GNU Utilities (Stallman) = Linux. Linux grew organically from one guy to thousands of Linux programmers. The development team is as big as those behind Windows or Mac OS.

Linux

Finnish grad student Linus Torvalds 1. Built a GPL-licensed OS kernel (UNIX-like) and emailed it to friends, saying they were welcome to work on it. 2. The kernel grew; Linus had successfully built a kernel. 3. Linux kernel (Linus) + GNU Utilities (Stallman) = Linux. Linux grew organically from one guy to thousands of Linux programmers. The development team is as big as those behind Windows or Mac OS. Linux Market Share 1. Desktop: 1% 2. Handheld devices: 53%, as Linux is at the base of Android and many mobile gaming devices. 3. Servers: 37%. Linux is the leader in this market. That is to say, Linux has outcompeted commercial products. That's pretty amazing.

Stallman's Legal Strategy

GNU General Public License (GPL) 1. Because software is a creative work, it is covered by copyright automatically -- no registration needed. 2. The GPL is a statement Stallman put on his own software. 3. The GPL gives anyone the right to share or modify the software, but shared/modified versions must carry the same license. 4. This is called a "viral license." Stallman is taking his legal right to prevent you from sharing software and using it to allow sharing. He's using the force of copyright to protect sharing. This is some serious legal judo. 5. Also under the GPL, you must provide the source code. If you violate the GPL, Stallman will sue you for copyright infringement. Once you say you accept the GPL, you must abide by it. Stallman objects to the refusal to share.

Routing

Given a destination IP address, how does a packet get there? 1. Part of the answer: one hop at a time. The packet gets closer each time it moves to a device that is closer to the destination. 2. How do routers learn good paths? Answer: gossip. Routers speak with each other, e.g., "I know how to get to one place in one hop, another in six hops, and yet another in nine hops." Neighbors speak about their capabilities, and information floods through the routers. 3. What happens if part of the network goes away? The implications will spread through the gossip network, but in the meantime incoming packets will meet stale gossip. The network can adjust, but that will not always work -- e.g., if a bridge collapses under a car, the car will not get to its destination.
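The "gossip" process is essentially distance-vector routing; here is a minimal sketch over an invented four-router chain:

```python
# Each router repeatedly learns from its neighbors' tables ("gossip")
# until no table changes -- a toy distance-vector computation.
neighbors = {
    "R1": ["R2"],
    "R2": ["R1", "R3"],
    "R3": ["R2", "R4"],
    "R4": ["R3"],
}

INF = float("inf")
# dist[r][d] = best known hop count from router r to destination d
dist = {r: {d: (0 if r == d else INF) for d in neighbors} for r in neighbors}

changed = True
while changed:                      # keep gossiping until routes stabilize
    changed = False
    for r in neighbors:
        for n in neighbors[r]:      # learn from each neighbor's table
            for d in neighbors:
                if dist[n][d] + 1 < dist[r][d]:
                    dist[r][d] = dist[n][d] + 1
                    changed = True

print(dist["R1"]["R4"])  # 3 hops: R1 -> R2 -> R3 -> R4
```

If a link fails, real protocols rerun exactly this kind of update, which is why route changes take time to propagate and packets can be lost in the meantime.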

Promoting a Virtuous Cycle

Good networks encourage the use of mobile devices, which in turn encourages people to develop applications; use of those applications then encourages further investment in well-functioning networks

Governments and Online Behavior

Government has little power to regulate individual on-line behavior (without the help of social norms), but lots of power to regulate companies that can regulate individual behavior

IP mapping services

Greater regulation has entailed a movement from anonymity and freedom to identity and accountability. 1. IP Mapping software enables states to locate activity in space (essential for identifying violation of state or national laws; useful for retail)

Scaling to Critical Mass: Adobe

How can you take over an emerging software market? 1. Give away your reader. 2. Charge for the software (which has more value because everyone can read it on the free reader). (See Freemium strategy.) 3. Bundle features that make more specialized products obsolete (scanner drivers, OCR capability for forms). 4. Enjoy your near monopoly.

The policy challenge: Providing necessary security while minimizing loss of privacy and autonomy.

How do we increase user privacy, autonomy, and generativity along with security/reliability? Increasingly, a critical challenge for policy is defining "need." National security generates legitimate needs -- but what protections can be retained without sacrificing national security needs? Commercial systems need information to serve their users and manage their networks. Does Google need unlimited access to personal data because its business model depends upon it? 1. Use ID devices that give websites only as much information as they need. 2. Let consumers see the data on them that websites and aggregators are sharing. 3. Let consumers take their personal data with them when they leave websites. 4. Limit warrantless searches. 5. Place time limits on personal-data retention. 6. Encourage software registration for automatic updates to enhance security. 7. Extend warranties for users who opt for more secure configurations.

The Politics of Elite Division

Issues make it onto the political agenda when (a) lots of people care deeply about them or when (b) elites are split. 1. It's hard to get people to care about highly technical issues like network neutrality. 2. Network neutrality stays on the agenda because of conflict among major corporations --- the content and service providers vs. the Internet access providers -- over the new political economy of communications and information and entertainment. 3. If edge providers and access providers merge (Comcast/NBC) or reach accommodation (Verizon/Google) the issue will be resolved in favor of large companies.

Technologies are Malleable and dead ends are common

It's not just great minds that think alike... 1. Smithville problems: 1) sudden bumps, 2) only one monorail track, which led to confrontations when riders headed in different directions! 2. Jinnosuke Kajino planned a similar bicycle railroad in Japan, in a plan dated August 1889. The plan never materialized, and even the structure of the railroad bicycle is unclear from what survives.

Commercial success of Linux

Linux Market Share 1. Desktop: 1% 2. Handheld devices: 53%, as Linux is at the base of Android and many mobile gaming devices. 3. Servers: 37%. Linux is the leader in this market. That is to say, Linux has outcompeted commercial products. That's pretty amazing.

Targeting Eyeballs: Winning Formula

More targeted ads: 1. selling keywords for immediate response; 2. using aggregated data on user behavior (not demographics) History 2003: Yahoo acquires overture.com; 2007, Yahoo acquires yieldmanager.com, adrevolver.com 2005: Google starts Google Analytics, 2007: Google acquires Doubleclick.net, feedburner.com and admobs.com 2007: Microsoft acquires aquantive.com 2004: AOL acquires advertising.com; 2007, AOL acquires tacoda.net, adsonar.com

The .com Boom

NASDAQ has continued to grow, even after boom: it's higher than it was before the boom.

Key tenets of 2010 Internet Policy

On Dec. 21, 2010, after more than a year of investigation and debate, the FCC passed (by a 3-2 party-line vote) Preserving the Open Internet and Broadband Industry Practices Report and Order FCC-2010 (Open Internet Order). These rules became effective on November 20, 2011. 1. First principle: Transparency: all providers must clearly disclose network management practices and terms to consumers. 2. Second principle: No blocking: "Fixed broadband providers may not block lawful content, applications, services, or non-harmful devices; mobile broadband providers may not block lawful websites, or block applications that compete with their voice or video telephony services." 3. Third principle: No unreasonable discrimination: "Fixed broadband providers may not unreasonably discriminate in transmitting lawful network traffic." This includes a near prohibition of "pay-for-priority" tiering of services to content providers. 4. "Reasonable network management" is OK: including transparent tiered plan offerings to subscribers; also application-agnostic discrimination (limiting heavy users during congestion), spam blocking, and protection against cyber-attack. 5. More lenient rules for cellular devices.

The FCC's Open Internet Order

On Dec. 21, 2010, after more than a year of investigation and debate, the FCC passed (by a 3-2 party-line vote) Preserving the Open Internet and Broadband Industry Practices Report and Order FCC-2010 (Open Internet Order). These rules became effective on November 20, 2011. 1. First principle: Transparency: all providers must clearly disclose network management practices and terms to consumers. 2. Second principle: No blocking: "Fixed broadband providers may not block lawful content, applications, services, or non-harmful devices; mobile broadband providers may not block lawful websites, or block applications that compete with their voice or video telephony services." 3. Third principle: No unreasonable discrimination: "Fixed broadband providers may not unreasonably discriminate in transmitting lawful network traffic." This includes a near prohibition of "pay-for-priority" tiering of services to content providers. 4. "Reasonable network management" is OK: including transparent tiered plan offerings to subscribers; also application-agnostic discrimination (limiting heavy users during congestion), spam blocking, and protection against cyber-attack. 5. More lenient rules for cellular devices.

Open Internet Advisory Committee

Open Internet Advisory Committee (2012) - Track effects of the Open Internet Order - Provide recommendations to the FCC Mobile broadband working group - Mobile broadband is crucial to the Internet - Yet, the technology is immature Special treatment in Open Internet Order - Transparency - No blocking of competing applications - No discrimination except for management practice

FCC and Open Internet

Openness: "the absence of any gatekeeper blocking lawful uses of the network or picking winners and losers online" 1. Open Internet Order (2010) A. Transparency B. No blocking C. No unreasonable discrimination 2. Verizon v. FCC (2014) A. FCC has no authority to enforce these rules B. because providers are not classified as common carriers

Introduction to Computers

Path-dependence 1. Technology reflects a long history of contingent choices 2. Although there's a big menu of steps forward, the menu is limited: some things are easier to do than others. The menu is also the result of past choices. 3. Aspects of the menu are written in part by nature. We need to understand the technical details.

All new technologies are strange

People struggled with books in the same way we struggle with the internet

Unlocking

Phones are typically locked into the network of a carrier. Carriers want to ensure they keep your business, and they do not like it when customers unlock their phones and gain the ability to join other mobile networks.

Who Imagines and Defends the Public Interest?

Policy research and advocacy groups play a critical role in (a) Gathering information; (b) Presenting information; (c) Clarifying values; (d) Transforming issue preferences into coherent positions. (e) Advocating for philosophically consistent policy packages. (f) Presenting particular interests as the "public interest."

Google's Ad Revenue

Rakes in more ad dollars than US print media

Internet registries

Regional Internet Registries (RIRs) are nonprofit corporations that administer and register Internet Protocol (IP) address space and Autonomous System (AS) numbers within a defined region. RIRs also work together on joint projects.

Registry

Regional Internet Registries (RIRs) are nonprofit corporations that administer and register Internet Protocol (IP) address space and Autonomous System (AS) numbers within a defined region. RIRs also work together on joint projects.

Arthur Clarke

Right on technological developments, wrong on the social implications of those developments, as technology develops based on what people are inclined to do with it

Search engine optimization

Search engine optimization is a methodology of strategies, techniques and tactics used to increase the number of visitors to a website by obtaining a high-ranking placement on the search engine results page (SERP) of search engines such as Google, Bing, and Yahoo.

Free software

Stallman's Position 1. Morally wrong to prevent someone from modifying or sharing software 2. Okay to get paid for stuff, but you have to allow people to modify and share software. Stallman sees software as being similar to science in that it needs to be advanced. 3. Stallman's economic argument: software has high fixed costs and zero marginal costs; if it's free to give it to people, you should. 4. Stallman embedded this idea in the term "free software." Free has two meanings: free speech and free beer. Software should be free as in speech, not as in beer. Free is also known as "libre." One can also call it FOSS (free and open-source software) or FLOSS (free/libre and open-source software). Stallman was the first to reach this position, and still believes in it strongly.

Chris Anderson's Strategies

The Long Tail

Why has net neutrality been such a thorny issue?

The Politics of Elite Division

3 characteristics of Web 2.0 (E&K citing Tim O'Reilly): (a) collective creation and maintenance of web content; (b) movement of people's personal online data from own computers to the cloud; (c) "growth of linking styles that emphasize online connections between people, not just between documents."

The increasing richness of Web content, which we've encountered through the distinction between navigational and transactional links, fueled a series of further significant changes in the Web during its second decade of existence, between 2000 and 2009. Three major forces behind these changes were (i) the growth of Web authoring styles that enabled many people to collectively create and maintain shared content; (ii) the movement of people's personal on-line data (including e-mail, calendars, photos, and videos) from their own computers to services offered and hosted by large companies; and (iii) the growth of linking styles that emphasize on-line connections between people, not just between documents. Taken together, this set of changes altered user experience on the Web sufficiently that technologists led by Tim O'Reilly and others began speaking in 2004 and 2005 about the emergence of Web 2.0

GNU

The name "GNU" is a recursive acronym for "GNU's Not Unix." GNU Project: 1. Build a GPL licensed "clone" of UNIX 2. NB: UNIX: Pronounced yoo-niks, a popular multi-user, multitasking operating system developed at Bell Labs in the early 1970s. Created by just a handful of programmers, UNIX was designed to be a small, flexible system used exclusively by programmers. 3. Operating systems consists of a kernel as well as utilities. The kernel is the core of the OS, while utilities make the OS useful. 4. Stallman was an amazing programmer, but while the utilities were good, the kernel was bad.

ARPAnet

The precursor to the Internet, ARPANET was a large wide-area network created by the United States Defense Advanced Research Projects Agency (ARPA). Established in 1969, ARPANET served as a testbed for new networking technologies, linking many universities and research centers. The first two nodes that formed the ARPANET were UCLA and the Stanford Research Institute, followed shortly thereafter by the University of Utah. 1. As first envisioned by Rand Corporation scientists, ARPANET was meant to have military value, which meant it was produced to be durable, robust, and hard-to-kill... 2. When libertarian hackers use that kind of technology, they are able to foster very different values (decentralization, free speech, easy mobilization of collective action, not to mention less noble forms of hacking) based on: A. Open architecture B. Distributed computing C. Redundant functions

Further Complicating Matters

Then, to complicate matters, the tools used to combat threats from malware, IP-law violation, and cyberterrorism become weapons that can be used against the populations they are supposed to protect. 1. Software that can be used to make websites more secure can also be used to violate the privacy of people who use those websites A. e.g. LinkedIn faces lawsuit for alleged email hacking scheme. B. e.g. A major American computer security company has told thousands of customers to stop using an encryption system that relies on a mathematical formula developed by the National Security Agency (NSA). RSA, the security arm of the storage company EMC, sent an email to customers telling them that the default random number generator in a toolkit for developers used a weak formula, and they should switch to one of the other formulas in the product.

Reliability

There's no guarantee that a packet will arrive. 1. The best-effort principle: devices make their best effort to get a packet to its destination. A packet can also get duplicated. To cope with loss, the receiver sends a packet acknowledging receipt, and the sender resends if no acknowledgment is received. 2. There can be congestion in the network -- a surge so large that the network cannot handle it. Luckily, packets can be thrown away because of the no-guarantee principle. This makes the Internet more efficient. Traditional telecoms wanted to get every call through; that was not true of the founders of the Internet.
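The acknowledge-and-resend idea can be simulated over a lossy channel -- a toy model, with the drop rate and function names my own:

```python
import random

# Sketch: a "best effort" channel drops packets at random, so the sender
# keeps retrying until an acknowledgment comes back.
random.seed(1)  # deterministic for the example

def lossy_send(packet, drop_rate=0.5):
    """Best-effort delivery: returns the packet, or None if dropped."""
    return None if random.random() < drop_rate else packet

def reliable_send(packet, max_tries=20):
    """Resend until the receiver's acknowledgment also gets through."""
    for attempt in range(1, max_tries + 1):
        delivered = lossy_send(packet)
        if delivered is not None and lossy_send("ACK") is not None:
            return attempt          # number of tries it took
    raise TimeoutError("gave up: no acknowledgment received")

print(reliable_send("hello"))
```

Note that the acknowledgment itself can be lost, which is why real protocols also number packets so the receiver can discard duplicates from unnecessary resends.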

"Preserving the Open Internet" by Federal Communications Commission

This Report and Order establishes protections for broadband service to preserve and reinforce Internet freedom and openness. The Commission adopts three basic protections that are grounded in broadly accepted Internet norms, as well as our own prior decisions. 1. First, transparency: fixed and mobile broadband providers must disclose the network management practices, performance characteristics, and commercial terms of their broadband services. 2. Second, no blocking: fixed broadband providers may not block lawful content, applications, services, or non-harmful devices; mobile broadband providers may not block lawful Web sites, or block applications that compete with their voice or video telephony services. 3. Third, no unreasonable discrimination: fixed broadband providers may not unreasonably discriminate in transmitting lawful network traffic. These rules, applied with the complementary principle of reasonable network management, ensure that the freedom and openness that have enabled the Internet to flourish as an engine for creativity and commerce will continue. This framework thus provides greater certainty and predictability to consumers, innovators, investors, and broadband providers, as well as the flexibility providers need to effectively manage their networks. The framework promotes a virtuous circle of innovation and investment in which new uses of the network—including new content, applications, services, and devices—lead to increased end-user demand for broadband, which drives network improvements that in turn lead to further innovative network uses.

Once the Web became a commercial medium, regulation through norms no longer worked; since 2000, regulation through code has yielded to regulation through laws that use code.

This is particularly the case for Intellectual Property Rights enforcement 1. Industry use of IP addresses to identify downloaders (through BitTorrent sniffing using phony accounts) 2. Legally mandated use of take-down notices to pull allegedly infringing content off of web sites (in return for legal indemnity for cooperating sites) 3. Proposed use of DNS (Domain Name System) registrars to strip domain names from infringing websites (as a way of reaching sites outside of continental U.S.) 4. Graduated response programs through which IP holders enlist ISPs (with or without legal mandate) to contact and warn downloaders, inflicting a series of sanctions on the noncompliant.

Industry: Making Policy through Network Management

Three kinds of network management (Ed Felten) 1. Minimal discrimination -- against packets (based on a tiering system) only when the load is too heavy for all packets to make it 2. Non-minimal discrimination -- certain classes of packets restricted to some share of the system (e.g. 20% for BitTorrent) even when the system is otherwise free 3. Minimal delay discrimination (Van Schewick's "quality of service," QoS) -- tiering to delay some packets in favor of "high-priority" packets vulnerable to delay (e.g. VoIP telephony, which is vulnerable to "jitter"; or massively multiplayer games).
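Felten's first category -- discriminate only when the load is too heavy -- can be sketched as a toy scheduler; the tier numbers and packet names are invented for illustration:

```python
# Minimal discrimination: under light load everything goes through
# identically; only when capacity is exceeded are low-tier packets dropped.
def transmit(packets, capacity):
    """packets: list of (name, tier), where a lower tier = higher priority."""
    if len(packets) <= capacity:          # no congestion: treat all alike
        return [name for name, _ in packets]
    ranked = sorted(packets, key=lambda p: p[1])
    return [name for name, _ in ranked[:capacity]]

queue = [("voip-1", 0), ("email-1", 2), ("voip-2", 0), ("video-1", 1)]
print(transmit(queue, capacity=4))  # no congestion: all four sent
print(transmit(queue, capacity=2))  # congestion: only high-priority VoIP
```

Non-minimal discrimination would cap a class (say, BitTorrent) even when `capacity` is not exceeded -- the same mechanism, applied regardless of load.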

How did we get here?

To become a wide-scale commercial medium, the Internet required changes in law and changes in code

Turing's 1936 Paper

Turing showed: 1. A universal circuit is possible 2. How to build one 3. The universal circuit is much simpler than you'd expect: coupled with memory, it can be simpler than the circuit it emulates
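The "universal circuit plus memory" idea can be illustrated by a tiny interpreter: one fixed program that, given a stored description, emulates any particular circuit. The description format here is invented for the sketch:

```python
# One fixed interpreter; the "memory" is the circuit description it reads.
def run_circuit(description, inputs):
    """Evaluate gates in order; each gate reads earlier wire values."""
    wires = dict(inputs)
    for out, op, a, b in description:
        if op == "AND":
            wires[out] = wires[a] & wires[b]
        elif op == "OR":
            wires[out] = wires[a] | wires[b]
        elif op == "NOT":
            wires[out] = 1 - wires[a]
    return wires

# A stored description of an XOR circuit -- the interpreter never changes.
xor = [("na", "NOT", "a", None), ("nb", "NOT", "b", None),
       ("t1", "AND", "a", "nb"), ("t2", "AND", "na", "b"),
       ("out", "OR", "t1", "t2")]

print(run_circuit(xor, {"a": 1, "b": 0})["out"])  # 1
print(run_circuit(xor, {"a": 1, "b": 1})["out"])  # 0
```

Swapping in a different description emulates a different circuit with the same interpreter, which is the essence of the universal machine: hardware can stay fixed while behavior lives in memory.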

Architecture and code embody values

Values 1. Starr - Democracy: Government transparency, informed citizenry, personal privacy 2. Lessig - "Chicago" values (privacy, autonomy, the market of ideas) vs. "Harvard" values (accountability, safety, authority) 3. DeNardis - Governance through architecture, with values embedded in the architecture: e.g., rules governing the tradeoff of authentication vs. encryption, or regulating deep packet inspection (DPI) weigh the values of privacy vs. security. 4. Benkler - Value of freedom and creativity embedded in the core technology vs. those of hierarchy and property expressed through law. 5. Zittrain - "Generativity" (versatility, adaptiveness, ease of mastery, accessibility)

Network Neutrality Values

Van Schewick's Framework: •Positive Values -User choice -Innovation without permission -Application-blindness -Low cost of innovation •Constraints -Minimal constraint on evolution of network -Transparency to companies affected by policy -Low cost of regulation

Economics of Open Source (demand side)

Why prefer open source 1. Advantage #1: low/zero price of open source software. Zero price products can be tried out easily. It can be tough to justify spending even $1. 2. Advantage #2: not locked in by vendor. Not at mercy of vendor. Support is expensive to come by, but with open source software, can get support from 3rd parties, and software can be tailored to needs: job can be put out to bids. And even if open source vendor goes out of business, can hire someone to keep software going. 3. Advantage #3: can customize product to fit needs.

Change has been driven by tension or misfit between values embodied in early web and those of existing law and commerce

Yochai Benkler's View of the Struggle*: 1. New technology built on new network principles (information is a public good that wants to be free, with previous information and human capital the only important inputs; use creates value) vs. 2. Law, regulation, business models and political alliances represent an older institutional ecology (when information required large investment in production and distribution, with strong IP necessary to permit firms to reclaim costs).

Findings of research on cooperation (Diekmann et al.)

eBay's Dilemma (Diekmann et al.) 1. Enable buyers to establish phenomenological trust -- to view the site as legitimate and to default to trusting the competence and good will of the people who ran it. 2. Enable buyers to "distinguish between trustworthy and non-trustworthy sellers" when they can't talk to the seller or see the goods 3. Encourage sellers to be trustworthy (secured trust) 4. Drive out untrustworthy sellers. Solutions 1. Did it work? A. Diekmann et al.: more than 2/3 of buyers gave feedback B. Positive feedback increased the chance of sale (for new phones & DVDs) and also increased the price commanded for the same item 2. Issues A. Reciprocal or one-way? B. Problems of vengeance, perfidy, and identity? C. How much information?

AT&T and FaceTime: A Timeline

• Jun'12: Apple announces FaceTime over cellular - Carrier restrictions may apply • Aug'12: AT&T limits use of FaceTime over cellular - Limited to customers with the Mobile Share plan - Sprint and Verizon announce support on all data plans • Aug'12: Some advocates & press denounce -AT&T violated Open Internet Order - FaceTime competes with telephony service - Shouldn't discriminate by data plan • Aug'12: AT&T responds in a blog -AT&T's policy is transparent - AT&T has no video chat app - FCC doesn't regulate preloaded apps • Sep'12: Public interest groups respond - Intent to file an FCC complaint • Oct'12: AT&T customer files FCC complaint - Blocking on his "unlimited" data plan • Nov'12: AT&T relaxes FaceTime limitations - Supporting FaceTime on some plans over LTE • In '13: AT&T rolls out FaceTime over cellular -On all data plans (including unlimited plans)

Technical Background in 2 Slides (2)

• Packet switching networks (messages separated into small packets, which are sent separately through the fastest paths and then reassembled at the end point) • Simple protocols (TCP/IP: Transmission Control Protocol, for managing the transmission of information, and Internet Protocol, for identifying recipient machines), with other protocols layered on these (like Border Gateway Protocol for peering -- organizing flows among networks) and, especially, in the "application layer" (e.g., VoIP and P2P on top) • End to end principle (modified as pipes became smarter due to demand for security & censorship). Net neutrality became an issue with packet-sniffing technologies that enabled conveyors of information (ISPs) to identify senders and type of packet; and in the last 5-10 years with the use of deep packet inspection, which permits surveillance of content.
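Packet switching can be sketched in a few lines: split a message into numbered packets, let them arrive out of order, and reassemble by sequence number (the packet format is invented):

```python
import random

# Toy model of packet switching: the network may deliver packets in any
# order, so each carries a sequence number for reassembly.
def packetize(message, size=4):
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    return "".join(data for _, data in sorted(packets))

packets = packetize("net neutrality")
random.shuffle(packets)        # packets take different paths, arrive scrambled
print(reassemble(packets))     # net neutrality
```

Everything beyond this -- retransmission, ordering guarantees, congestion control -- lives in TCP at the endpoints, which is the end-to-end principle in miniature: the network itself only moves packets.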

AT&T/FaceTime Issues

• Pre-loaded application - Available to all users of popular phone - Accessed via device's core calling features • High bandwidth usage - Heavy load in both directions -Asymmetric network capacity - Limited adaptation in the face of congestion • Staged deployment - Rapid adoption could lead to unpredictable load - Initially limit the number of users accessing an app • Enforcement point - Usage limited on the device, not in the network

Earlier Statements Revised

•"Information Wants to be Free..." -Stewart Brand, 1st Silicon Valley Hackers Conference, 1984 "Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather... Your legal concepts of property, expression, identity, movement, and context do not apply to us. They are all based on matter, and there is no matter here." John Perry Barlow - "A Declaration of the Independence of Cyberspace" (Davos, 1996) BUT •... Information also wants to be expensive. That tension will not go away." -Stewart Brand, 1st Silicon Valey Hackers Conference, 1984 "We all get older and smarter" -- John Perry Barlow, 2004

Background: AT&T Case and subsequent reorganization of the industry

•1983 AT&T Antitrust agreement -Breakup of firm into "baby bells" -Common carriage requirements for telephone service and competition in long-distance market -Permission for regional bells to compete in all markets •Mergers and Recombinations (Telecommunications Act of 1996) -Bell Atlantic + NYNEX (1997) = Bell Atlantic + GTE (2000) = Verizon + MCI (2005) = Verizon as ISP + pay TV + phone service + mobile wireless -Southern Bell + South Central Bell = BellSouth; BellSouth + AT&T (formerly Southwestern Bell/SBC) + Cingular Wireless (2005) = AT&T as ISP, provider of pay TV, phone service, and mobile wireless -Time Warner Cable (spun off from Time Warner 2005) started Road Runner High Speed Online in 2005, purchased the Paragon (1995), Adelphia (2006), and Insight (2012) cable systems, purchased NaviSite (cloud, hosting) in 2011, and formed a joint venture with Sprint Nextel (cell service) in 2005. -Comcast + NBC/Universal (2011) = Comcast as a major cable company, ISP, and content provider

Three Mechanisms of Creative Destruction through the Internet

•Aggregation •Disintermediation •Hypersegmentation

A Network Is:

•Any entity that can be conceived of as a set of nodes (objects, "points") connected by a set of ties ("lines," edges). •Such entities include: -The Internet -Neural networks -Social networks (human or animal groups)

Industry: Policy through Code II (Van Schewick)

•Application-specific -Favoring a specific application (Skype vs. Vonage, Hulu vs. YouTube) -Discriminating against particular types of applications (e.g., those using BitTorrent) -Discriminating in favor of latency-sensitive applications •Application-agnostic -Slowing flow to high-volume users during congestion -Tiered pricing to users to regulate usage volume -"Reasonable network management" exception

Brand X case

•At first, high-speed internet service was provided by telecoms, and was accordingly treated as a telecommunications service. •But when cable systems got into the act (2002), based on historical precedent, their Internet service was treated as an information service. •June 2005: National Cable & Telecommunications Association v. Brand X Internet Services agreed that cable service is an information service (so the cable company didn't have to let Brand X use its system).

National Cable & Telecommunications Association v. Brand X Internet Services

•At first, high-speed internet service was provided by telecoms, and was accordingly treated as a telecommunications service. •But when cable systems got into the act (2002), based on historical precedent, their Internet service was treated as an information service. •June 2005: National Cable & Telecommunications Association v. Brand X Internet Services agreed that cable service is an information service (so the cable company didn't have to let Brand X use its system).

Information? Communications?

•At first, high-speed internet service was provided by telecoms, and was accordingly treated as a telecommunications service. •But when cable systems got into the act (2002), based on historical precedent, their Internet service was treated as an information service. •June 2005: In National Cable & Telecommunications Association v. Brand X Internet Services, the Supreme Court agreed that cable Internet service is an information service (so the cable company didn't have to let Brand X use its system). •August 2005: FCC redefines telecom ISPs as information services, too, relieving them from common carriage obligations. •At the same time, the FCC enunciated 4 principles (access to lawful content; consumers' right to run applications and services of their choice; right to connect to any legal device; right to competition among providers of services and content).

Self-Regulation

•BITAG (Broadband Internet Technical Advisory Group) •Internet Congestion Exposure Working Group (Internet Engineering Task Force) •Verizon/Google agreement (2010) -Attempt to head off FCC rules -Reasonably strong neutrality with two exceptions: •Applies only to ISP service and not to additional software services. What does this mean? •Does not apply to wireless networks. What was Google thinking?

Some Unhappy Statistics

•BLS found employment in the sound recording industry fell 40% from 2001 to 2012 •Global revenues from recorded music fell 40% between 1999 and 2011 •Retail record stores and recording studios declined even more dramatically. The majors blamed downloaders.

Changes in Film Industry: Challenges

•BitTorrent increased illegal circulation of films •Netflix, Amazon, et al., increased share of legal online distribution •Between 2001 and 2011, 10 percent decline in number of motion picture theatres - but increase in screens per theatre •Box office receipts flat in U.S. (but way up in Asia and Latin America)

Competence-destroying technologies

•Capitalism is periodically revitalized (in whole and in particular industries) by game-changing innovations •These innovations are sometimes referred to as competence-destroying technologies because they make previous ways of doing things (and the knowledge on which these are based) obsolete

Code

•Commercial applications would require a user-friendly graphical interface - i.e., a "browser" -Tim Berners-Lee proposed the Web at CERN in 1989 and built the first browser in 1990 -Marc Andreessen released Mosaic in 1993 at Univ. Illinois - commercialized as Netscape -Thomas Reardon built Internet Explorer (based on Spyglass Mosaic); bundled with Windows, IE temporarily took over the world (95% of market) •JavaScript (Netscape 1995; IE 1996) and cookies (Netscape 1994) for individualization & transactions

(Arenas) Policies are Debated and Policy is Made in Many Venues

•Congress •Federal Agencies (FCC, FTC, NTIA) •State and Local Government •The Courts •International Regulatory Regimes (e.g. WIPO) •Corporate Self-Regulation •Private Citizen Action •Computer Scientists Writing Code

Affordances of Digitization for Cultural Production Have Been Substantial

•Tied not to the Internet itself, but to user-owned devices: computers, soundboards and mixers, cameras and video editors •Production costs decline - more players enter -Photography -Digital art -Recorded music -Radio programming (podcasting) -Journalism (blogging) •Less of a distinction between "professional" and "amateur"

Disintermediation

•Creative: because it reduces transaction costs by replacing business intermediaries with automated and efficient on-line transactions •Destructive: because it is competence-destroying, making old business plans dysfunctional and old skills less useful.

But economists aren't sure that majors are correct in blaming downloaders

•Different methods yield different results •Most think that file-sharing accounts for part, but not all, of the decline (c. 15-30% of sales) •Other problems: -No new physical medium -No major new genre since rap -Legal services (Pandora, Spotify, Deezer, Grooveshark) don't generate as much revenue

Why Discrimination is a Bigger Problem than Blocking

•Discrimination can harm a site or service as much as blocking it: -If Skype is slow and ineffective, you won't use it -If it takes a long time to load Fox News, you will go to Google News instead •Discrimination is less visible, so: -The ISP is less likely to incur the wrath of customers -And customers are more likely to blame the service or website

Three Cases

•Film industry has fared pretty well •Newspaper industry has suffered terribly •Music - creative destruction par excellence

Established Papers Haven't Easily Moved Online

•Google News gets much of the benefit •On-line papers can't use attractive content to subsidize less attractive content •Physical papers mixed ads and content on the same page -- online sites can't do that •Subscription plans have failed miserably •Vicious cycle of cutbacks creating a less attractive product, leading to less revenue and more cutbacks •Insiders now say philanthropic or government aid is needed for newspapers to survive

The Politics of Ambiguity

•Highly technical issues requiring a grasp of technology, law, economics, and history •Simplified into a framing contest, in which the most idealistic and ideological parties are supported as public voices (along with politicians receiving support from industry groups) •Suppression of debate about economic issues, which involve rights, rents, and governance in the construction of a new industrial field.

Equality and Access

•How big a problem are the "digital divide" and "digital inequality" (in connection speeds, skills, networks)? •What is the best route to universal broadband: federal initiatives (tax breaks, common carriage mandates, free right of way) or user subsidies? [Pres. Bush: Broadband for all by 2007. Pres. Obama: Broadband for all by 2012.] •When do algorithms contribute to inequality - and what should we do about it?

Privacy

•ISPs - should ISPs be permitted to intercept and read subscribers' messages for business purposes? (U.S. v. Councilman 2005) Should Google? What does "read" mean? •If you quit Facebook, who should own and control the material you have posted there? •Internet of Things: If you use a service that allows you to monitor your own fitness information and share it with friends, should the service be able to use the information to send you ads? Sell it to an insurance company?

The decline of the majors

•In 1995 there were six major integrated record labels •In 1999 Universal purchased PolyGram and there were five •In 2004 Sony began a process leading to its absorption of BMG - by 2008 there were four •In 2012, Sony and Universal Music Group split up EMI between them; now just three: Sony, Universal, and Warner

Creative Destruction in Cultural Industries

•In traditional cultural industries - film, press, music - production and distribution were integrated: •Large firms employed or held under exclusive contract journalists, directors, editors, or recording artists and were in charge of: (a) production (b) distribution (c) marketing/promotion

What are the four instrumentalities through which we can regulate behavior on the Internet according to Lawrence Lessig?

•Law •Markets •Architecture ("code") •Norms

Why is Film Industry OK?

•Lots of downloading, but it doesn't seem to affect box office unless openings are delayed (7% in Europe, no effect in U.S.); downloads reduce rentals 5-10% but not sales (Danaher et al. 2010) •Film companies have been supple - they learned to license to cable, sell and rent through video stores, and deal with VHS long before the Internet •Greater bandwidth requirements for film gave them more time to prepare (and they learned from the music industry) •Film is organized on a project basis: less risk (cost-sharing) and fewer fixed costs; and majors can make money off independents through distribution deals •People listen to pirated music the same way they listen to legal music; but you can't watch a downloaded film in a movie theatre - and people seem to like theatres!

The Politics of Policy Making

•Majorities often have less at stake than minorities ("diffuse interests" vs. "concentrated interests") •Majorities are harder to organize than small minorities •Organizations, not people, influence policy - mobilization is everything •"Money is the mother's milk of politics" - Jesse Unruh, 1966 •People vote for packages and personalities --- very few issues determine many people's votes...

"Diffuse interests" vs. "concentrated interests" in the policy process

•Majorities often have less at stake than minorities ("diffuse interests" vs. "concentrated interests") •Majorities are harder to organize than small minorities •Organizations, not people, influence policy - mobilization is everything •"Money is the mother's milk of politics" - Jesse Unruh, 1966 •People vote for packages and personalities --- very few issues determine many people's votes... NB: It is common wisdom among those studying political influence that diffuse interests are rarely well represented in the political process. Interests such as consumers, senior citizens, good-government advocates, and women are spread widely throughout society and encompass diverse types of individuals with differing legislative priorities. Therefore, it is an extreme organizational and political challenge to mobilize any of these groups to compete effectively with narrow specialized interests in achieving legislative objectives.

Government Use of Code

•Manufacturing mandates: -Pipes that are accessible to Deep Packet Inspection (DPI) surveillance -Always-on GPS -TVs with V-Chips •Use of architecture (DeNardis): -Termination of web hosting and financial services to take down WikiLeaks -DNS (Domain Name System) for IP enforcement - can't seize a server but can disable its domain name through registrars (SOPA) -Central "kill switches" in authoritarian regimes (Egypt; SF BART 2011)

Regulation of Markets

•May ISPs like Verizon charge websites for premium service? Is it OK if Comcast slows down Netflix or YouTube? •Should VoIP be regulated in the same way as other telephone service? •Should wireless service providers be subject to the same rules as conventional ISPs?

Not Every Cultural Industry is Hurting

•Movie theatre revenues about the same in 2009 as in 1999 •Cable television revenues rose dramatically •Book sales declined but no more than in the decade before the Internet

Privacy Arenas

•Private Companies •Federal Trade Commission •Congress •National Security Agency •Courts •Internet Architecture Designers

We'll Use the Policy-Analysis Framework from the Lecture Two Weeks Ago

•Problems •Values •Arenas •Options •Stakeholders

That model was threatened when digitization

•Reduced the cost of distribution dramatically •Reduced production costs to varying degrees •Made it easy to decouple parts of the business that had previously been integrated •And in some cases made it easier for creative workers to manage the process themselves.

Film Industry: Benefits

•Reduction in cost of film production •More independents, more films - 50 percent increase in number of theatrical releases 2000-2010

Key Issue Domains

•Regulation of Markets •Intellectual Property •Equality and Access •Security •Privacy •Freedom of Expression •Governance •Sovereignty •eGovernment •Social Media

Private Control through Code

•Requiring users to enter a serial # to use software. •Cookies or browser fingerprinters that determine -What links you see -What recommendations you get -What price you are offered •Social media sites: -Is your screen identity tied to your real one? -Do your friends know who your other friends are? Do they know when you are on-line? What you are doing on-line? -What control do you have over who knows what about you?

"Affordances" of technology vs. determinism

•Technologies are socially constructed. There are no technological imperatives - technologies provide affordances rather than dictate behavior. •Technology does not develop along a single path - there are many false starts. Early choices can be consequential. Technological development is path dependent.

(Start of lecture notes) Main points:

•Technologies are socially constructed. There are no technological imperatives - technologies provide affordances rather than dictate behavior. •Technology does not develop along a single path - there are many false starts. Early choices can be consequential. Technological development is path dependent. •All new technologies are strange. Purposes must be discovered. Tacit knowledge must become commonplace. •Our ability to predict the technological future is limited: Foresight is subject both to undue optimism and to insufficient vision. •Technological change is a political process involving struggle between incumbents and challengers

The FCC's role: A Tale of Two Titles

•The 1934 Communications Act gives the FCC the authority to regulate communications and information services. •But telecommunications services are regulated under stringent Title 2 as common carriers, whereas information services are regulated under the more lenient Title 1. •Which is Internet service???

The 1934 Communications Act - Titles 1 and 2

•The 1934 Communications Act gives the FCC the authority to regulate communications and information services. •But telecommunications services are regulated under stringent Title 2 as common carriers, whereas information services are regulated under the more lenient Title 1. •Which is Internet service??? (Now clearly under Title 2)

Ad Revenues Devastated

•The Internet (Craigslist) killed demand for newspaper classified ads and want ads. •Online shopping sites have killed department stores, which used to be their major ad buyers.

Early network analysis focused on locations in networks

•The position of a node in a network can be characterized by -Popularity (number of incoming ties) -Between-ness (how many paths between other nodes must pass through a given node = power, influence) -Centrality (popularity weighted by between-ness and/or by the popularity of the nodes that send ties) On the Internet, each of these is a plausible means of ranking search hits. Google's big innovation was to use a version of weighted centrality.
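A rough sketch of two of these measures on a toy link graph (pages and links invented): popularity is just in-degree, and the weighted-centrality idea behind Google's ranking can be approximated with a PageRank-style power iteration. The damping factor 0.85 is the conventional PageRank default, not anything from the lecture.

```python
# Toy directed web graph: an edge u -> v means page u links to page v.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

# Popularity: number of incoming ties (in-degree).
popularity = {n: 0 for n in links}
for u, outs in links.items():
    for v in outs:
        popularity[v] += 1

# Weighted centrality via power iteration (a PageRank-style sketch):
# a node is central if central nodes point to it.
d = 0.85                                # damping factor
rank = {n: 1 / len(links) for n in links}
for _ in range(50):
    new = {n: (1 - d) / len(links) for n in links}
    for u, outs in links.items():
        for v in outs:
            new[v] += d * rank[u] / len(outs)
    rank = new

print(max(popularity, key=popularity.get))  # 'C': most incoming ties
print(max(rank, key=rank.get))              # 'C': central nodes point to it
```

Here the two measures agree, but they need not: a page linked once by a very central page can outrank a page with many links from obscure ones, which is exactly why weighted centrality was the better search signal.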

Let the Network Produce the Value

•User reviews add value to Amazon's retail platform. •Collaborative Filtering: Use information from consumers to predict what other consumers will like, adding value to the service directly (Netflix, Spotify).
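A minimal user-based collaborative-filtering sketch (all users, films, and ratings below are invented; production systems like Netflix's are far more elaborate): users who rated the same items similarly get more weight when predicting a rating the target user hasn't given yet.

```python
from math import sqrt

# Invented ratings: user -> {item: rating on a 1-5 scale}.
ratings = {
    "ann":  {"film1": 5, "film2": 4, "film3": 1},
    "bob":  {"film1": 4, "film2": 5, "film3": 2},
    "carl": {"film1": 1, "film2": 2, "film3": 5},
}

def cosine(u, v):
    """Cosine similarity over the items both users have rated."""
    shared = set(ratings[u]) & set(ratings[v])
    if not shared:
        return 0.0
    num = sum(ratings[u][i] * ratings[v][i] for i in shared)
    den = (sqrt(sum(ratings[u][i] ** 2 for i in shared)) *
           sqrt(sum(ratings[v][i] ** 2 for i in shared)))
    return num / den

def predict(user, item):
    """Similarity-weighted average of other users' ratings for the item."""
    others = [(cosine(user, v), ratings[v][item])
              for v in ratings if v != user and item in ratings[v]]
    total = sum(s for s, _ in others)
    return sum(s * r for s, r in others) / total if total else None

# "dave" rates film1 and film2 like ann and bob, so his predicted film3
# rating is pulled toward their low scores rather than carl's high one.
ratings["dave"] = {"film1": 5, "film2": 5}
print(round(predict("dave", "film3"), 2))
```

The design choice that matters here is letting the network produce the value: the service owns no opinions about film3, it only aggregates its users' ratings.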

Why Application Blindness? (Van Schewick's view)

•Van Schewick believes that Application-Class neutrality ("application blindness," "application agnosticism") is necessary to prevent ISPs from discriminating against whole classes of applications that compete with their "special services" •If we permit providers to discriminate against classes of services, we rely on them to decide how to categorize services. Should we classify services on the basis of protocol or latency-sensitivity? Doing the former hurts legal video services that run on P2P (Cox Communications).

The Main Event: Verizon v. FCC

•Verizon took FCC before DC Circuit Court of Appeals in Washington DC, claiming that FCC had no authority to regulate its Internet practices -First amendment: Free speech -Fifth amendment: Permanent easement on its system (prohibition of pay-for-priority) represents "illegal taking" •Relevant precedents cut both ways -Comcast v. FCC (2010, DC Circuit Court of Appeals) voided FCC judgment against Comcast's degradation of BitTorrent traffic (mooted by terms of Comcast/NBC merger agreement) -City of Arlington v. FCC (2013, Supreme Court) affirmed permissive standards for FCC use of congressional authority in rule-making (in case involving regulation of municipal licensing of wireless facilities) -Decision against FCC in January 2014 - will decisively shape future of FCC's role in regulating Internet

Cable TV's "Elevator Pitch"

•We're developing a video delivery service that provides literally hundreds of channels. Sure we'll carry the big networks, but mostly we'll be offering niche outlets that only a tiny fraction of our customers will want to watch. Our subscription prices will include rental fees for low-cost hardware like modems, set-top boxes and remote controls. We also plan to build in annual rate hikes that outpace inflation by about, say 400%. And for good measure, our arrangement with the creators of the content that we distribute will ensure that every couple of years we'll be locked in contentious and public renegotiation rights that interrupt service for our customers. If all goes as planned, we should be able to consistently deliver customer satisfaction levels that rank among the lowest of any industry. -Amadou Diallo, Forbes Magazine, May 2013

Questions

•What is net neutrality? (Not as easy as it sounds.) •Why is the issue so contentious? (Not as obvious as it sounds.) •Why is the issue so central to discussion of policy about broadband technology and the Internet, and why has it become so central now?

Intellectual Property

•When is file-sharing technology inherently illegal? •How many years of copyright protection provide the right balance between incentives for creators and public benefit? •Is "fair use" possible (e.g., of 3 seconds of a video) if you have to evade DRM technology to capture it?

Disintermediation in the Music Industry Are the Majors Still Relevant?

•http://bandcamp.com/ (24 million active users) •https://soundcloud.com/explore/electronic (200 million users by 2013) •http://members.cdbaby.com/ (3 million tracks) •Promotion: Twitter, YouTube, & many other platforms

