COM 200 EXAM 3 STUDY GUIDE

Slacktivism (Surowiecki):

"A way for people to feel virtuous without doing much." When you care enough to do the very least. Bucket challenges DID make a difference for ALS, but not all involvement is the same. Other examples are changing your profile picture to a flag or liking a post. This doesn't always translate over to an actual private involvement like a donation.

Political communication:

"The communicative activity of citizens, individual political figures, public and governmental institutions, the media, political campaigns, advocacy groups and social movements." -National Communication Association "Purposeful communication about politics." -Brian McNair Not just any small conversation, it has to be purposeful. So it includes all communication about politics that has a goal of achieving something or getting something from it. Must have a purpose. Not only verbal, but also includes the way candidates dress, have their hair, their tone, hand gestures, visual rhetoric. Some research in this area includes studying how conversations between citizens shape politics, communicating across political divides.

Political knowledge:

"the range of factual information about politics that is stored in long-term memory". Reflects an understanding of factual information about politics and government that individuals retain. The study of political knowledge tries to gauge what people need to know or be able to do in order to vote and/or affect government. Political knowledge informs political participation such as voting, conventional political activities (supporting election campaign, going door to door), unconventional/issue based political activities (going to a protest on a narrow political issue), non normative political activities (unauthorized/violent political activities). Political knowledge helps us vote and have a say in an informed way. And this can be built through engaging in social media, but it can also detract from it due to fake news. Little political knowledge and low interest are the most threatening to citizen participation, because if you don't know what is going on, you are less likely to vote and engage.

How to spot fake news video:

1. Be careful of pages with biased agendas. 2. Check photos using reverse image search: right-click the image and search it on Google to see when and where it was actually taken, and look at other copies to find its true source. 3. Sound can be faked. 4. Check for graphic manipulation.

Federal communications commission:

Created in 1934 to regulate interstate and international radio, satellite services, and broadcasting. Its goal is to ensure the widest dissemination of information in order to promote democratic ideals and innovation.

Deep Fakes:

Video or audio manipulated by a computer using artificial intelligence, for example altering somebody's mouth movements and voice so they appear to say things they never said.

Sock puppets (Wood):

A fictional identity created to bolster some point of view. These are managed by humans: actual people with fake identities who deceive others. They want access to a specific group and, once in that group, try to sway the conversation. They prey on in-group bias, the people who like them. They are paid after posting a certain number of posts, once they have met their quota.

Creating group identity through media: Identification

After being hailed, we see ourselves in characters in media and become involved and transported into the media. However, some people reject the identification and hailing, which leaves them feeling alienated.

Social comparison theory:

Assessing social and personal worth based on comparisons to others, especially those who are central to our definition of self. Every time we meet somebody or are exposed to media, we do this. Upward/downward comparisons: upward means looking up and seeing how you don't compare well, for example to celebrities. Theorists say celebrities on social media have only exacerbated upward comparisons, which can lead to depression. Possible positive effect: motivation to try something new (my friend took a class, so I'll try it). Possible negative effect: reinforcing low self-esteem (going on Instagram, looking at bikini bodies, and hating yourself).

Affordance of different media systems:

Available modalities
Message availability
Control over presentation

Examples of capacities of a technology that make it useful:

Carrying capacity
Message durability
Message distribution speed
Source control over message

Academic response

Center for an Informed Public at UW: resists misinformation to promote an informed society, bringing five professors from different areas together.

Examples of media regulation: Self regulation:

Comics Code: put in place by comic book makers; a self-censoring organization that received all comic books and approved content so as not to offend anyone. It also checked that parents and government authorities were never presented as bad. This was done to keep the government out. MPAA: Motion Picture Association of America. Movie creators also self-censored, trying to keep a clean moral tone while keeping the government out of it (example: the PG-13 rating). ESRB: Entertainment Software Rating Board, a nonprofit self-regulatory body that rates video games to keep the government out.

Uses and gratifications theory:

Consumers use media to satisfy specific needs or desires. Possible positive effect: finding healthy recipes. Possible negative effect: weight-loss sites reinforcing the thin ideal or pro-bulimia strategies. A person could bring low self-esteem into the search, and being led to negative content will affect them intensely.

"The medium is the message" video:

Deliberately paradoxical. When you get a message, the content is the message. However, the way it is delivered is also very important: the technology used changes both the message and us as receivers. Running the same message through different mediums changes it. Example: the movie vs. the book of Harry Potter, which are different experiences and different messages. Printed words encouraged emphasis on the visual; speeches encouraged the ear.

Russian disinformation video:

Disinformation has the goal of changing the perception of reality: forgeries, agents of influence, planting false stories. Stories planted in 80 countries' newspapers said the US created AIDS, and they did cause chaos. Reagan changed the response to disinformation because he began throwing punches instead of ignoring it; the group he created worked hard to debunk the AIDS report and several others. Social media companies have taken baby steps, but they can't debunk this alone, so the government should get involved. Disinformation warfare is still really dangerous. America is Russia's biggest enemy, and Russia is trying to create chaos. Disinformation is not new; it has been used as a tool of war and spying for a long time.

Regulatory capture:

Economic theory that regulatory agencies come to be dominated by the industries or interests they are charged with regulating. Instead of regulating Comcast, for example, the agency is staffed by people who pass deregulation that benefits Comcast. The FCC is meant to work for the public interest but instead acts in ways that benefit the industry it was meant to regulate.

Net neutrality:

The principle that information on the internet should be treated the same: internet service providers should treat all traffic more or less equally on their networks, shouldn't charge more or less for different data passing through, and can't move some data faster than other data. Even if a movie is more data than an email, providers can't charge Netflix extra. The FCC scrapped this rule (an example of deregulation), so content companies will probably be charged more, and we end up paying more for things that are critical to our lives.

FCC and media consolidation:

FCC: regulates interstate and international communication over radio, television, wire, satellite, and cable. It is overseen by Congress. It is meant to ensure security, the best use of the spectrum, and that we all get our media: it protects our access to information while also protecting room for innovation. The FCC sets the rules that companies providing internet service must follow.

Reasons for misinformation in society: The Facebook problem:

Facebook enabled a disinformation structure: it calls itself a tech company, not a media company. It is where people get info on a daily basis, but it doesn't care enough about what it feeds people and is not regulated the way ABC or radio stations are. Has self-regulation been ineffective? It hasn't worked very well for Facebook. Example: the 2016 presidential election, when our data was sold to campaigns. Example: the ethnic cleansing of Muslim Rohingya in Myanmar, where fake accounts spread hate against this group and it took Facebook a very long time to take the content down.

Responses to surveillance culture: Law and policy:

General Data Protection Regulation (GDPR): a privacy regulation created in response to what Facebook is doing with our data, used to give users protection. We should use computer science knowledge about AI for good. Five components for governments trying to give people their rights back: (1) AI systems must be transparent, to get away from black box technology. (2) There must be a deeply rooted right around where the info is collected; we must know what they are trying to do and where they are getting it from. (3) Consumers should be able to opt out of having their data scraped. (4) The purpose of the AI must be knowable. (5) If people want their data deleted, the company should delete it. California Consumer Privacy Act: the most sweeping data privacy law in the country. It gives residents the right to know what companies know about them and to ask companies not to sell that information.

Dispersed leadership:

Grassroots political organizing wants a networked culture, not a leader. They want a network society outside of institutions of power.

Why do we share it?

Real news can be boring; we want things that are novel, shocking, and emotional.

Machine learning video:

Machine learning helps us get from place to place, gives us suggestions, and translates. With traditional programming, people hand-code the solution to a problem step by step. With machine learning, computers learn the solution by finding a pattern in data. Our human biases become part of the technology we create; to improve the tech we need to address our own biases. Google example: photos of Black people appeared in searches for gorillas, and Google simply removed all gorilla pictures from the results. People criticized this and thought there should be something else they could do.
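To make the contrast concrete, here is a minimal, hedged Python sketch (not from the video; the function names and toy data are invented for illustration): one spam filter is hand-coded, the other "learns" its rule from labeled examples. The same bias point applies: whatever patterns are in the training data end up in the learned rule.

```python
# A minimal sketch, not from the video: all names and example data are invented.

# Traditional programming: a person hand-codes the rule step by step.
def is_spam_hand_coded(message: str) -> bool:
    # The programmer decides the rule directly.
    return "free money" in message.lower()

# Machine learning (very simplified): the computer finds a pattern in labeled examples.
def learn_spam_words(examples):
    """Return words that appear only in messages labeled as spam."""
    spam_words, other_words = set(), set()
    for text, is_spam in examples:
        (spam_words if is_spam else other_words).update(text.lower().split())
    return spam_words - other_words

training_data = [
    ("free money now", True),
    ("win free prizes", True),
    ("lunch at noon", False),
    ("see you at the meeting", False),
]
learned_words = learn_spam_words(training_data)

def is_spam_learned(message: str) -> bool:
    return any(word in learned_words for word in message.lower().split())

# The hand-coded rule reflects the programmer's choices; the learned rule reflects
# whatever patterns (and biases) are in the training data.
print(is_spam_hand_coded("Claim your FREE MONEY"))  # True
print(is_spam_learned("win a free trip"))           # True
```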

Bechdel test:

The importance of representation in production, writing, and directing; the idea of representation in media. A cartoonist put together gender criteria to encourage people to see how sexist movies are. New Bechdel tests go more in depth, using different criteria to assess whether movies represent minorities well. Representation could be a lot better.

Invisibilia: the online version of us vs. reality:

In the guy's teenage years, he took a picture with some friends. His friend shot a guy. Now, when he describes the picture, he repeats what he has been repeating to the police and lawyers; he doesn't even remember it right. He was arrested at a house party and charged with gang conspiracy, and he decided to plead guilty. They made gang signs and acted like gangsters, and some of his friends were criminals, but he didn't think he was one; they just looked like it online. Things we say and do online can be used against us and build a new profile of us. Police use social media to put Black teens in jail. The picture wasn't a gang picture: they were long-distance friends, because he had moved away from his neighborhood to create a future for himself.

What is media regulation in the US?

It is a balance between protecting free speech (keeping it open to the public) from government intrusion and protecting the public interest through government intrusion. It must protect and regulate at the same time: provide for entrepreneurial innovation and the right to make money from it, and protect public resources in the name of equality and democratic participation.

Possible responses to misinformation and disinformation:

Journalistic responses
Social media-company responses
Governmental responses
Academic response

Commercial surveillance:

Mark Zuckerberg: privacy is dead. Anything that has been digitized isn't private anymore, and big companies can tap into our information in minutes and know everything about us. Data is highly valuable: if you are not paying to use an app, you are the product. Any product that connects to WiFi is likely sending data back to the manufacturer. Example: Cayla is a doll with a microphone that connects to an app; it was outlawed in Germany, which bans hidden microphones in part because of its fascist past and the fear of it happening again. Example: Alexa/Echo knows your taste in music and books, traffic, weather, where you are going, the lighting of your house, etc. Example: the Ring, a doorbell with a camera embedded in it that lets you watch your neighbors; Amazon was sharing info with Facebook and Instagram without us knowing. Example: people hack into baby monitors, which are not very safe, so strangers can see your baby. Example: Roomba vacuums capture the dimensions of your house and send them back to the company. Example: a pet-feeding camera lets you press a button to feed your dog, but people can hack it and see inside your house. Fitbit study: companies required their workers to use Fitbits so they could see how much their workers moved; a panoptic effect, because the Fitbit was always watching, so people self-monitor. Smart vibrators: in long-distance relationships an app controls the vibrator; using the app, it was possible to connect a device to somebody's email and find their identity, and also to hack into somebody else's device.

How is it formed?

Media is one of the most consistent influences on a person's political knowledge. Different kinds of media support different levels of political knowledge. News audiences have a stronger baseline of political knowledge, partly due to higher levels of education.

8 Basic Propagandistic Devices:

Name calling
Glittering generality
Euphemisms
Transfer
Testimonial
Plain folks
Bandwagon
Card stacking
Fear appeals

Journalistic responses

Fact-checking websites such as Snopes. Hillary Clinton's and Trump's debate statements were fact-checked in real time.

Reasons for misinformation in society: Decline of public-service journalism: Pickard reading:

News deserts: how do we keep local communities informed? The New York Times is doing well, but public-service journalism isn't. New models for public funding of journalism are needed: public-service journalism needs new funding mechanisms such as ProPublica, a nonprofit funded by donors, a model most outlets avoid. PBS is subscriber-funded. We need to find a way to keep journalism from declining, and to force Facebook and other major media players to fund public-service journalism.

Who shares it the most?

Older Americans are more likely to share it through Facebook, potentially because of lower online literacy (the ability to identify fake news) or a decline in the cognitive ability to see the difference.

Selective exposure theory:

People purposefully select messages matching their beliefs. Possible effect: more partisanship and polarization, because it is too uncomfortable to leave your bubble. We want to keep cognitive dissonance down; reaching a conclusion we already held is easier. We prefer high-quality info, but our judgment and our "bubble" affect what we see as high quality.

Criticism of data dignity:

Criticisms and responses: (1) Data earnings will be too small; supporters respond that even a small family could earn $20,000 a year from the value of its data. (2) People will never pay for Facebook; but people pay for Netflix, and paying for things makes them better. (3) Poor people will be priced out of the internet; access could work almost like a public library. (4) Tech giants will never go for this; tech companies would stop dominating, but economies in general would grow so much that these companies would grow too, probably more than they would have otherwise.

What is the role of bots?

People, not bots, do most of the sharing; bots simply magnify it.

Panopticon:

Prisoners are aware of authority but never know for sure when they are being observed, which leads them to discipline themselves: a form of social control. Michel Foucault was interested in how power and control operate without the church and monarchies to enforce them. How does this apply to us? We live in surveillance cultures and take part in them. The panopticon is a metaphor: we assume we are being watched, we generate info to share with people, and we watch others.

3 privacy types:

Privacy fundamentalists
Privacy unconcerned
Privacy pragmatists

Right to privacy:

Privacy has value, and sometimes we "cash it out." Katz case: a man was making an illegal phone call in a phone booth about his gambling. He expected privacy there, and this helped his case: privacy accompanies a person, it isn't a fixed place. The case raises questions about where privacy applies and whether technology violates it. Main point: "Among the many things the tech industry has disrupted is 4th Amendment jurisprudence."

Family surveillance:

Social media has created more questions about how much parents should limit privacy. Because there are fewer physical places for teens to hang out, it is important for them to have social media so they can engage in the economy and gain status. There is intense pressure on parents to keep tabs on kids. Cultivation theory: "mean world syndrome." Different media cultivate different behaviors; the theory investigates the effect of exposure to media and its implications for a child's view of the world. Parents take precautions to protect kids because they become scared of the world and want to keep them safe (example: child trackers). Fear drives the need parents feel to surveil at all times, and this teaches kids that surveillance culture is okay. Parents don't understand risk assessment and think the danger is much greater than it is.

How to spot bots:

Stock image or missing profile picture. Long or difficult username. Simple and repetitive posts. Follows more people than follow it back. Posts more often than a human could. Shares the most extreme posts. Lots of filler content (cat videos, inspirational quotes).
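The checklist above can be read as a simple scoring heuristic. Below is a hedged Python sketch of that idea; the field names, thresholds, and example account are all invented for illustration and do not come from the course or any real platform.

```python
# A rough sketch only; the field names and thresholds are invented, not from the
# course or any real platform API.
def bot_score(account: dict) -> int:
    """Count how many of the warning signs listed above an account shows."""
    signs = [
        account.get("stock_or_missing_photo", False),
        len(account.get("username", "")) > 15,                        # long/difficult username
        account.get("repetitive_posts", False),                       # simple, repetitive posts
        account.get("follows", 0) > 3 * account.get("followers", 1),  # follows far more than followed
        account.get("posts_per_day", 0) > 50,                         # posts more often than a human
        account.get("shares_extreme_posts", False),
        account.get("mostly_filler_content", False),                  # cat videos, quotes, etc.
    ]
    return sum(signs)

suspect = {
    "username": "freedom_news_4u_20250101",
    "stock_or_missing_photo": True,
    "follows": 4800,
    "followers": 90,
    "posts_per_day": 120,
    "repetitive_posts": True,
    "shares_extreme_posts": True,
    "mostly_filler_content": False,
}
print(bot_score(suspect))  # 6 of the 7 signals are present, so look closer
```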

Government surveillance examples:

The Guardian video: governments collect massive amounts of info about us. Spy agencies keep our info for 1 month in the UK and 1 year in the US. How? By working with companies and tapping the cables that move info around, then sifting through the info and all messages and storing them, and by using relationships with tech companies to get emails and other private info. Snowden: in 2013, a government contractor working for the NSA revealed that it was gathering info about us without our knowledge, under its antiterrorism mandate. Snowden decided everybody needed to know this and gave the data to news outlets. FBI monitoring social media: the ACLU sued the FBI over the way it surveils our social media; the FBI was looking for a social media monitoring tool to help it develop AI. Boston Police and social media surveillance: the ACLU sued under a freedom of information act to find out what the Boston Police were doing; they were trying out new tech by tracking hashtags such as #blacklivesmatter to decide whether or not they wanted to use a particular tool. Privacy and surveillance culture: the New York Times used surveillance cameras and matched faces, which is legal; we are all in a police database that runs our pictures against criminals' pictures. Who's Got Your Back?: a report from the Electronic Frontier Foundation with a chart showing different companies and the extent to which they give our information to the government; it shows how commercial and government surveillance work together to use our data.

Third person effect theory:

The perception that media messages have a stronger influence on others than on one's self. Most often these assumptions are just assumptions, not the truth. Possible effect: censorship based on stereotypes of others; you take things away from people because you don't think they can handle them. Example: albums considered socially unacceptable were banned, and books about rape or racism are banned in schools.

Direct effects theory:

Theory that audiences passively accept media messages and would exhibit predictable reactions in response to those messages. This has largely been debunked: messages aren't just injected into people. Orson Welles had a radio program called War of the Worlds, at a time when everybody listened to the radio, about a town in New Jersey being invaded by aliens. Some listeners freaked out and grabbed their guns because they were scared, but a lot of people understood it wasn't real. Each listener, with their own sociocultural background, understood something different. We are not passive vessels; we adapt the media we take in to our own understanding.

Governmental responses

The US Congress could interact with social media companies to protect the democratic environment. It has asked that everyone on a social media site use a real name, but this goes against free speech. It has also pushed for APIs (application programming interfaces) for research, giving academics access to social media data to keep it fair and clean. It says social media should fight international threats.

Glittering generality:

Using virtuous words to elevate a candidate. Such as using democracy and freedom, creating a warm and fuzzy feeling, but not giving any specific details about the candidate.

Definition of Strategic Image/Impression Management:

What we do to protect ourselves from context collapse, and curate ourselves in social media. The effort to shape an idealized version of ourselves online. This can be considered the front stage.

Prosumers:

a consumer who is also a producer. We don't just consume media. Example: when we tweet as we watch a show. We consume the show and interact with it. Also affects how the media is produced. We create a group around the media, and this group has an identity. Fan production through remix and reuse culture.

Artificial intelligence:

a group of algorithms that can modify themselves and create new ones in response to new inputs and data; it adapts and grows based on the data it encounters. The difference between an algorithm and AI is like the difference between a car and a flying car: the algorithm is the car, AI is the flying car. It makes judgments about us for ads and video suggestions, but also other judgments, like whether we are likely to bully someone (reading). Predictim, the company that scanned a potential babysitter's lifelong social media presence, was based on artificial intelligence (reading).

Viral visual rhetoric:

a position, message or idea conveyed through images that goes viral online. Images have persuasive power. When things become viral, the repetition makes an argument.

Symbolic annihilation:

absence or underrepresentation in the media. Doesn't mean people are violently treated or killed in movies. Means only stereotypes are shown or different social identities aren't even displayed. Symbols in movies carry significant weight. If a group doesn't see themselves, this is problematic.

Homophily (Golbeck reading):

algorithms are programmed to find what we like. They are built from our likes and small patterns of behavior, taking data from social media, and predictions can come from not-so-obvious info. Homophily means we are friends with people like us: if my friends and their friends liked something, it is likely I will too, because we have a relationship. Algorithms can therefore predict our group attributes.
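As a toy illustration of the friends-like-me logic (a sketch only; the data and function below are invented, not an actual platform algorithm), a simple predictor can guess a user's next interest from what their friends already like:

```python
from collections import Counter

# A minimal sketch of homophily-based prediction; the data and names are invented,
# and real platform algorithms are far more complex.
friend_likes = {
    "ana":   {"cycling", "jazz"},
    "bo":    {"cycling", "cooking"},
    "carla": {"cycling", "jazz", "chess"},
}

def predict_interest(user_likes, friends):
    """Guess the interest most common among friends that the user hasn't liked yet."""
    counts = Counter(
        like
        for likes in friends.values()
        for like in likes
        if like not in user_likes
    )
    return counts.most_common(1)[0][0] if counts else None

# The user only likes jazz, but most of their friends like cycling,
# so homophily suggests cycling as the prediction.
print(predict_interest({"jazz"}, friend_likes))  # cycling
```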

Creating group identity through media: Hailing

all of us respond to different types of media ("hailing a taxi"). Media hails us in, and we check whether we see our social identity reflected in that media. Example: an ad about spring break hails you in because you are looking for spring break plans. Another example is Bruce Lee watching Breakfast at Tiffany's, with its racist portrayal of Asians as comic relief: Bruce Lee isn't laughing while everybody else is. Here, young white social identities are hailed in, but the Asian identity isn't.

Third party doctrine:

an individual has no reasonable expectation of privacy under the 4th Amendment for info voluntarily disclosed to a third party. Example: if you run your money through a bank, the bank is a third party, and your bank statements can be obtained and used in court.

Medium:

any material or electromagnetic spectrum that can be altered by a sender to communicate with a receiver. When we send a message through a medium, the medium is changed. Example: a piece of paper is a medium; once I add words to it, I alter it and it becomes a media system. Air waves are also a medium, and speaking alters them and creates communication.

Mediated social identity:

the article speaks about how we go through different templates to offer different versions of ourselves. People look to these to learn things about you. Social media encourages users, explicitly and implicitly, to compete for attention such as likes and comments. We are never really in control of how we are represented.

Political bots:

automated accounts that publish lots of content and infiltrate online communities to try to sway online conversations. Used to tweet and retweet to amplify something. This happens in any arena where people want to influence others for their own purposes. They pretend to be human. The bandwagon effect appears here: people are either silenced by a big group that seems to believe something, or they amplify it.

Carrying capacity:

bandwidth. Refers to the load a media system can handle per unit of time.

Privacy unconcerned:

benefits are far greater than any threats. Willing to give up a lot of info for even small benefits.

Plain folks:

candidates being seen as regular people, humble, just like citizens.

Affordances:

capacities of a technology that make that technology useful to its users. What media and technology affords to its users.

Citizen journalism:

citizens on the street stand with traditional journalists and use their cameras to set the narrative for a story. The people in the street dictate the terms of coverage. The 1st Amendment says we can gather our own info and share it.

Media consolidation:

concentration of media ownership among fewer companies. If one organization owns all the news outlets in a market, are we getting an accurate picture of the news? More owners means more takes on the news, so there should be a variety of owners.

Vertical integration:

controlling production, distribution and exhibition of media, from top to bottom. Example: Comcast. When it bought NBCUniversal in 2011, it went from being a cable/phone company to also owning content companies; they own an entire column.

Disinformation:

deliberately false information that is purposely spread to deceive.

Message availability

ease with which a person can both access and revisit a message; it varies with different interfaces. Example: writing for the ear on the radio is different from writing for a newspaper. When people are listening, they can't rewind to catch what they missed; in a newspaper they can just look back.

Possible effects:

electoral impact; making society more cynical; dampening civic action (people don't know who to vote for, so they give up and opt out). Can encourage extremism, because fake news is amplified when it is shocking.

Card stacking:

emphasizing one side and repressing another. Not balanced. Candidates are compared in an unfair way.

Parasocial relationships:

energize prosumers. Fans form personal relationships with the media and the characters in it. This drives our desire to support them and create fan cultures, and to meet other fans, creating fandoms. Fans can't have the character with them, but they can have other fans; they hail one another around the media.

Bandwagon:

everyone is voting for somebody, so you should too. Show people of influence who support that candidate to create peer pressure.

Control over presentation

the extent to which the author can exert different degrees of control over the presentation and format of the message. Examples: reading on a Kindle vs. reading a printed book; watching movies with subtitles on or off; volume loud or low. Konnikova article: talks about the different things Facebook allows you to do. You can just scan and stalk without interacting, instead of liking, creating, and interacting. Different affordances have different effects on us: scanning is negative, creating is positive. These effects are amplified by Facebook, not caused by it.

Fake news:

fabricated information that mimics news media content in form but not in norms or intent. Disinformation that looks like news in a media society; it doesn't have the same purpose or come from the same ethical norms. Second definition: articles that look like news content and appear to have gone through journalistic processes but are actually made up.

Misinformation:

false or misleading information. Can be used for ill purposes or not. Causes confusion and chaos.

Deregulation of media:

fewer controls on media transmission. Under regulation, companies can only own so many radio/TV stations; through the years these regulations have loosened, so fewer companies own more media outlets.

Context collapse:

flattening of multiple audiences into a single context. It refers to the infinite audience possible online, as opposed to the limited groups a person normally interacts with face to face. Friedersdorf reading: is there a loss of freedom/privacy in context collapse? He says technology makes context collapse harder to avoid; he thinks we have lost control of whether we want our worlds to collide or not. The front-stage self is performative, reflecting internalized norms for behavior and our role (example: being in class, in public). The backstage self is the more authentic self: here we are free of the expectations and norms that exist in public space and become our true selves. "Social life is a performance. We perform different parts of ourselves. We are teams of participants in the front stage and backstage." We exhibit different behaviors based on context, behaving differently in professional and intimate situations and settings.

Crowdfunding:

GoFundMe campaigns have raised funds for films or music outside of big studios, and most podcasts are crowdfunded. We directly give money to the media we support because they are important to our identity.

Mediascapes:

moves beyond nationalism: mediascapes transcend land boundaries, becoming global through media. Americans in Rome watching the Super Bowl create an international cultural group in which Italians watch it too. Through mediascapes we construct imagined worlds and social identities; Italians watching the Super Bowl form an idea of what it is like to be in American culture.

Shift from commodity culture to gift economies:

greater value placed on social motives rather than economic ones (as the Disney Channel would have, for example). Giving gifts and money is how this happens; it is a communal act of giving and taking.

Hashtag activism:

the hashtag started after the 2012 murder of a boy who was shot by a part-time security guard; the man didn't go to jail. It grew later with the killing of Michael Brown, who was shot while walking down the street. These events gave the movement traction. Use of the hashtag makes the movement grow; hashtags create emotion and a group around it.

Source control over message:

how much control the sender has in targeting the intended receiver. Example: if I want to target an audience, I'll be on cable tv, not a public show.

Message distribution speed:

how quickly it allows us to distribute a message. Telegraph was much faster than sending a letter on horseback.

Fairness doctrine

the idea was to keep the airwaves fair: if a station aired something of public importance, it had to make sure opposing views were equally heard. Nowadays, news organizations are still nominally expected to give equal time to qualified opponents, but it isn't really followed.

Where does it come from:

ideological (Russian disinformation during war, using fake news and conspiracy stories to foster distrust of democratic institutions such as politics and media in a society) and commercial (not meant to disrupt society or make it weaker, just to make money, such as crazy stories that evoke emotion and get you to click on a website).

Propaganda:

information, especially of a biased or misleading nature, used to promote or publicize a particular political cause or point of view. It is pointed in a particular direction. Not necessarily false, it is just manipulated.

Name calling:

link candidates to a negative symbol. Could be literal, such as saying somebody is a cheat, or symbolic, where you cast behaviors in negative ways, such as showing images of somebody being stingy.

Spreadability:

maps out a space where big companies no longer tightly control media distribution and many of us distribute media. Media users are liberated to spread media and create a place to belong; ease of replication and distribution. We still rely on big media such as Hollywood movies, but by interacting with them we gain power and don't need big production. Enabled by the ubiquity of social media.

Message durability:

how long a message lasts; the stability a media system provides over time.

Mediated communication:

a message negotiated in some way between sender and receiver. Media system examples: cave paintings, where the cave wall becomes a medium for what people were trying to communicate; Fitbits, where messages from my body (how many steps I take, how much sleep I get, etc.) are mediated through the Fitbit, which translates my body's messages into numbers and something tangible.

Connective action:

not one leader; it is horizontal, a net holding everybody together. This way people can personalize their goals, which helps the movement become stronger. Participants are connected to one another, not to a leader.

Networked society:

people feel disaffected from institutions that control their lives, but feel empowered through technologies to create their own identity. Local cultures feel left out of Hollywood movies, for example, and they push against that with the help of technology. Social networking has always existed outside of main power structures, but it is easier now. They develop society from the bottom up. There is more control in the hands of users, where users depend on shareability and spreadability.

Media:

plural of medium.

Privacy fundamentalists:

protecting privacy first. An ideological commitment to privacy.

Horizontal integration:

purchasing other outlets at the same level of production.

Euphemisms:

rather than arousing emotions through words, you use words that are kinder and softer: for example, saying "enhanced interrogation" instead of torture, or "collateral damage" instead of killed civilians.

Independent and grassroots media:

relies on funding from followers of that media, moving away from a commodity culture fixated on top-down production. People give away their work or money to keep the media going.

Fear appeals:

scare people into voting or supporting a candidate.

Available modalities

sensory modalities that communicate through different stimuli. Example: television vs. the movie theater. Watching something on your computer is different from watching it at the theater; there are different types of stimuli, and those affect your body and brain differently.

Algorithm:

a set of written instructions, like a coded recipe that gets executed when it encounters what it is looking for: Google when you search something, for example. Connection to surveillance: in the Predictim article, AI is what grinds up what we expose as our info, such as our likes, facial recognition, fingerprints, etc.
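For a concrete picture of "a coded recipe that gets executed when it encounters what it is looking for," here is a tiny hedged Python sketch (the function and data are invented for illustration, not taken from the reading):

```python
# A minimal sketch of the "coded recipe" idea; the example is invented.
# The steps run until they encounter what they are looking for.
def find_first_match(items, target):
    """Walk through items step by step and stop when the target is encountered."""
    for index, item in enumerate(items):
        if item == target:
            return index   # the recipe "fires" when it finds what it is looking for
    return -1              # the target was never encountered

search_history = ["weather", "recipes", "com 200 study guide", "news"]
print(find_first_match(search_history, "com 200 study guide"))  # 2
```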

Mediate:

to negotiate between two parties; a mediator is someone or something that does this, normally in legal fields.

The Oxygen of Amplification:

a study that helps journalists figure out what to write about and focus on in this tense atmosphere. Example: a live-streamed shooting of Muslims. Should people circulate it because it is shocking, or not amplify it in order to discourage the spread of hatred? Does amplification make it better or worse; should we give it oxygen?

Social media-company responses

take down offensive/fake content. They say they are tech companies, so they aren't regulated in the same way the FCC regulates broadcasters. They won't police political ads, for example; there is a tug between self-regulation and no regulation.

Technological determinism:

technology determines cultural values, social structures and history of a society. All society is driven by technology. Our decisions are made easier or harder through this, and humans are at the root of this. Emerged at the turn from 18th to 19th century, but discussed a lot related to social media. Katie Pierce teaches tech and society, and studies what technology is doing to us. Technology isn't the problem, humans are. Technology only amplifies what is already there. Konnikova article: our attention span isn't short because of Facebook, Facebook simply exacerbates that. Alternative is that we are learning from one another, and the info we learn is more important than the medium, which is Facebook.

Black box technology:

technology whose inner workings we cannot see or explain; it is unknowable and cannot be unpacked. Even those using the technology have no idea where the numbers are coming from.

Complicated by: convergence culture:

the flow of content across multiple media platforms, the cooperation between multiple media industries, and the migratory behavior of media audiences in search of the entertainment experiences they want. Typically we see differences between TV, print, and radio, but nowadays they go through the same means of distribution, for example phone apps, or we see all of them through a TV. Print, TV shows, and radio are all online; they are all converging through the internet. This makes it hard to explain why some have stricter media regulations if they all run through the same channels.

Communication is:

the mutual transfer or exchange of information. So, communication... Has intentional or unintentional meanings. We create these meanings. Meanings are mutually constructed. Meaning of the communication is both individual and social.

Surveillance:

the observation of a person or group of people, especially those held under suspicion. Holds a punishing aspect in our culture. Technology makes it easy to watch people, or gather data about people. Everyday objects are being turned into communication technologies (phones, watches, alarm clocks, etc).

Social identity:

the part of our identity defined by membership in social groups. Made up of two things: groups recognized by society, and how we view our own identity. Individually I may identify with my groups or not, conforming to or going against norms in social categories. Media frames how we gain this. The three traditional pillars of creating social identity are family, state, and church; now we should add media to this list.

Tokenism:

the policy or practice of making only a symbolic effort: inserting a minority member to tame the audience's demands while whiteness remains centered.

Media regulation:

the process by which a range of specific, often legally binding tools are applied to media systems and institutions to achieve established policy goals such as pluralism, diversity, competition and freedom. It can be positive or negative. It has traditionally been sector-specific, with different rules based on the type of media: print media, telephony, and broadcasting on TV and radio are the three main realms of media regulation. ABC is more regulated because it broadcasts over public airwaves; cable is private, so it is less regulated.

Mediation:

the process by which passing a message through a medium changes the message.

Media effects:

the things that occur, either in part or in whole, from media influence. Effects can occur immediately or over time; media doesn't randomly make you a bad person. Example: the movie Parasite, a South Korean film. A viewer who knew nothing about South Korean culture now at least has this movie's presentation of it: an immediate effect. Watching a show over time can have effects too. Effects can be positive or negative; they can educate or reinforce, keeping you stuck in place (avoiding cognitive dissonance attracts you because it keeps you in your comfort zone); they can be intentional or unintentional; and they can affect individuals or society.

Social constructivism:

theory of knowledge according to which human development is socially situated and knowledge is constructed through interaction with others.

Privacy pragmatists:

trading some privacy for some benefit.

Transfer:

transfer of authority from one trusted person or group to another. Candidates associate scientific knowledge with themselves; ending a speech with a prayer transfers the authority of the church to the person.

Testimonial:

when somebody gets others to speak on behalf of the candidate. Or saying "experts agree" and "leading authorities say". Shows somebody with value is in support of you.

Amplification:

when something acquires disproportionate importance or prevalence as a result of being selected for media coverage. In fake news, we need to amplify the right things, how can we tell?

Data dignity:

where people earn what their data is worth, but also pay for the data they use. We will need a universal way for everybody to pay and get paid, and to keep track of all data. MID (mediator of individual data): an organization bigger than an individual but smaller than a nation. Each MID will have its own terms for joining; some will be harder to join than others, and some might be organized around things like car insurance. Each will pay you at a different rate, building a stream of income from your data.

