Final CJRM

Challenges and Ethics

- where do we store this data?
- processing data with complex structures
- we are making data far faster than our ability to advance analytic techniques
- is this new form of data biased?
- do online behaviors directly correlate with offline behaviors?
- a lot of spam
Ethically, there are questions to be answered about whether it is right that individuals' data can be accessed without their direct consent (sign up for Twitter and now someone can access information about you that is not listed on that platform).

Limitations of Qualitative Data

- you have a responsibility to your participants
- while all researchers have bias, these are particularly important to consider in this area of research
- qualitative research also presents unique challenges

3 reasons why big data is now popular

Method accessibility, more data, and increased technological capacity

____ is the percent of individuals who have completed a survey, while __ is the percent of survey items completed

Response rate, item response rate

____, ____, and ____ are three types of qualitative interviewing

Semi structured, unstructured, focus group

Four concepts needed to read a regression table:

Significance values: how certain you are that the relationship exists.
Effect sizes: how impactful the variable is.
Direction of effect: whether the relationship is positive or negative (for odds ratios, above 1 indicates a positive association and below 1 a negative one; for linear coefficients, the sign of the estimate tells you).
Level of uncertainty: standard errors/confidence intervals, which show how certain we are in that relationship.

This statistical phenomenon is when the relationship between two variables is the opposite for subgroups within that population

Simpson's paradox
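Not part of the original notes: a toy pandas sketch (with made-up recovery numbers) showing how the aggregate comparison can flip relative to every subgroup comparison.

```python
import pandas as pd

# Hypothetical treatment data: within each severity subgroup the new
# treatment does better, but aggregated across subgroups it looks worse.
df = pd.DataFrame({
    "severity":  ["mild", "mild", "severe", "severe"],
    "treatment": ["new", "old", "new", "old"],
    "recovered": [81, 234, 192, 55],
    "total":     [87, 270, 263, 80],
})
df["rate"] = df["recovered"] / df["total"]

# Subgroup comparison: the new treatment wins in both severity groups
print(df.pivot(index="severity", columns="treatment", values="rate"))

# Aggregate comparison: the old treatment appears to win overall
overall = df.groupby("treatment")[["recovered", "total"]].sum()
print(overall["recovered"] / overall["total"])
```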

Bad survey qualities:

- No proper skip pattern: if you say no, it still takes you to an inappropriate part of the survey.
- Not using the proper, highest level of measurement.
- Response options that are not exhaustive or mutually exclusive.
- Questions that are vague or answers that aren't easy to understand (not all people share the same definitions for answers).
- Range of attributes for indicators isn't consistent.
- Using open-ended questions when you haven't explained what the point of the survey is or what the survey is looking for.
- No use of matrix questions.
- No neutral option in Likert scales.

Three types of regression:

- OLS regression:
  - Used for continuous, normally distributed variables.
  - The "go to" kind of regression.
  - Example: a scale of mental health averages.
- Logistic regression:
  - Next most common.
  - Used for categorical variables; at its base it measures dichotomous outcomes.
  - Yes vs. no, disagree vs. neutral.
- Poisson regression:
  - Measures counts.
  - Differences between higher and lower counts of something.
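A hedged sketch of the three regression types above, fit with statsmodels on simulated data; the variable names and coefficients are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
X = sm.add_constant(x)

# OLS: continuous, roughly normal outcome (e.g., a mental health scale)
y_cont = 2.0 + 0.5 * x + rng.normal(size=n)
print(sm.OLS(y_cont, X).fit().params)

# Logistic: dichotomous outcome (e.g., yes vs. no)
p = 1 / (1 + np.exp(-(0.2 + 0.8 * x)))
y_bin = rng.binomial(1, p)
print(sm.Logit(y_bin, X).fit(disp=0).params)

# Poisson: count outcome (e.g., number of incidents)
y_count = rng.poisson(np.exp(0.1 + 0.4 * x))
print(sm.GLM(y_count, X, family=sm.families.Poisson()).fit().params)
```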

Case Studies

- Case studies develop in-depth descriptions and analysis of a case or multiple cases (cases are topics or events); often studying an event, program, activity, or more than one individual.
- Comes from psychology, law, and political science.
- Data comes from multiple sources such as interviews, official documentation, news stories, and pretty much anywhere.
- About finding themes both between and within these cases.
Types
- Instrumental case study: focuses on one case.
- Collective case study: the most frequently used; involves looking at multiple cases.
- Intrinsic case study: the focus is the case itself, not the themes. This can include evaluating programs and how successful they are.
Genocide Example
- The Crime of All Crimes is a book about different genocides that analyzed the themes that made them similar and different.
- Information was gathered via interviews and newspaper articles depicting how/who was victimized.
- Found themes that distinguished genocides from one another but also what makes a genocide a genocide.

What is Evaluation Research and how is it used ?

- Evaluation research is the application of social research techniques to assess the conceptualization, design, implementation, and utility of social intervention programs.
- This type of research uses different research methods and allows us to see if what we want a program to accomplish is what is currently being accomplished.
- We use evaluation research to directly impact policy and practice, to address accountability, and to add to the field of evidence-based research.
Examples
- Does training help police better understand how to work with persons in crisis?
- Do tasers decrease the risk of injuries to officers and suspects?
- How do body-worn cameras impact police use of force?
- Do CCTVs reduce crime?

Steps for Conducting Evaluation Research

- Identify and engage stakeholders
- Develop research question
- Design methodology
- Gather data/evidence
- Analyze data
- Identify findings and conclusions
- Communicate findings and recommendations
Both "identify and engage stakeholders" and "communicate findings and recommendations" are unique to evaluation research.

Know the 8 ethical standards that are more universally used currently:

1. Achieving valid results
2. Honesty and openness
3. Maintain privacy and confidentiality
4. Avoid harming research participants
5. Obtain informed consent
6. Avoid deception in research
7. Protecting research participants
8. Benefits should outweigh foreseeable risks

Two aspects of getting informed consent:

1. Must be provided by individuals COMPETENT to consent
- Do they fully understand what's happening in the study, what the potential risks of the study are, etc.?
2. Must be voluntary
- Does this include research on students? Prisoners? Children? Pregnant women, fetuses, newborns? Non-English-speaking persons? Persons with intellectual or developmental disabilities? Is there a possibility of coercion in any of these circumstances?

Causality:

A cause is the explanation for some effect; it is not localized to a singular unit of analysis.
- Rarely do we actually observe this relationship; rather, it's often simply assumed.
Causal effects: when variation in one variable (IV; "X") leads to, or results in, variation in another variable (DV; "Y") when all other variables are held constant.

Normal distribution:

A completely symmetrical distribution that has only one mode. The mean, median, and mode all lie directly in the center. One standard deviation above and below the mean contains 68% of the sample.
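A quick simulated check of the 68% rule, assuming numpy is available.

```python
import numpy as np

rng = np.random.default_rng(1)
sample = rng.normal(loc=100, scale=15, size=100_000)

mean, sd = sample.mean(), sample.std()
within_one_sd = np.mean(np.abs(sample - mean) <= sd)
print(f"Share within 1 SD of the mean: {within_one_sd:.3f}")  # ~0.683
```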

___ is when researchers are unable to connect data to individual participants, while ___ is when identities are linkable to data, but the researcher purposefully keeps the data private

Anonymity and confidentiality

Anonymity v confidentiality:

Anonymity: Researchers cannot connect data to individual participants. Confidentiality: Researchers are able to link information with a subject's identity BUT promise to keep the data private.

Three components of a true experiment:

At least one experimental and one control group
- What is the point of a control group? You need a baseline to determine if the effect is actually a result of the cause.
- Experimental group: participants receive some form of treatment.
- Control group: participants to whom the experimental group can be compared; they receive no treatment or a different form of treatment.
Random assignment
- Participants are randomly assigned to either the experimental group or the control group. Confounding variables can impact the outcome of your experiment; with random assignment, confounding elements are also randomly distributed.
- Often accomplished in two ways:
  - Paired matching: match subjects based on characteristics relevant to the study. Pairs are established and then randomly assigned.
  - Block matching: create subgroups within your sample based on block variables (the variables the researcher believes will affect the experiment). The researcher then randomly assigns individuals within blocks and conducts the experiment within blocks.
Researcher manipulation of the treatment
- This is referred to as the treatment.
- Helps to isolate the effect of what you're interested in.
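Illustrative only: simple random assignment and block (stratified) assignment using the standard library, with sex as a stand-in blocking variable.

```python
import random

participants = [{"id": i, "sex": random.choice(["M", "F"])} for i in range(40)]

# Simple random assignment: shuffle, then split in half
random.shuffle(participants)
half = len(participants) // 2
experimental, control = participants[:half], participants[half:]

# Block matching: group by the blocking variable (here, sex stands in for
# whatever variable the researcher believes will affect the experiment),
# then randomly assign within each block.
blocks = {}
for p in participants:
    blocks.setdefault(p["sex"], []).append(p)

assignment = {}
for block, members in blocks.items():
    random.shuffle(members)
    mid = len(members) // 2
    for person in members[:mid]:
        assignment[person["id"]] = "experimental"
    for person in members[mid:]:
        assignment[person["id"]] = "control"

print(assignment)
```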

This nightmare-like animal has a long finger for hunting bugs

Aye Aye

Cherrypicking data, using misunderstood methodology, or sampling biased data are all examples of

Bad science

Benefits and downsides of surveys:

Benefits of a survey:
- Can ask about a variety of topics.
- Typically more likely to produce a generalizable sample because you can ask many people, ask about multiple things, etc.
Downsides/ethical considerations:
- Protection of respondents: think about the potential harm and disclose that in a cover letter to individuals, and make an effort to reduce any emotional trauma associated with survey participation.
- Confidentiality: socially stigmatized or illegal behaviors must be kept in strict confidence (use ID numbers instead of names).

Approaches to CSS

Big data as a new source of information
- behaviors, opinions, and latent traits
- interpersonal networks
- more affordable experimentation
Big data affects social behavior
- collective action movements
- social capital and interpersonal communication
- political attitudes and behaviors

Big Data

Big data is the collection of data that is so large that it is too difficult to maintain and analyze through traditional systems. Researchers can use big data to look at the spread of hate speech, to analyze videos and photos, and as a way to pair big data with traditional data (qualitative data, survey data, aggregate data, and experiments).

When subgroups of participants are made based on units of interest, and then people are randomly assigned within those groups...

Block matching

Why are surveys used?

Can be an incredibly efficient mode of data collection
- Can have multiple people doing them at once.
Can be administered through a variety of techniques
- Lets you get at a wider breadth of individuals; can survey large amounts of people; can easily get repeated measures.
Can be cost effective
- For the most part, cost effective, particularly in the age of the internet: you can administer surveys online without paying for postage.
- At most, paying for potential remuneration of people taking the survey and the time of people taking the survey.
Can increase the feeling of confidentiality
- Since you can give someone a survey and not gather identifying information, it increases the perception of confidentiality.

Evaluation Assessment

Can the program be evaluated given time and resources? This is often a preliminary (qualitative) study.
Designing an evaluation:
- Black box: identifies if the program gets results; common with impact and outcome evaluations; focuses on the change after program exposure (evaluation of a simple input-output model with no attempt to open the black box of program process).
- Theory driven: identifies how the program gets results; program theory is the model of how the program is supposed to produce desired effects; the evaluator devises an evaluation to measure how well the theory is implemented; researchers have a clear picture of how the program works and use the program theory as a guide.
  - Prescriptive models: the designers state how they believe the program will achieve desired effects.
  - Descriptive models: the evaluator derives program theory by observing what happens in the program.

Mean, median, mode are all measures of

Central tendency

Three primary ways researchers collected qualitative data:

Collecting non-numerical data to be analyzed in a non-numerical way:
Interviews
- Can vary in structure.
- Similar in structure to surveys: reach out to individuals, can collect face to face, over the phone, etc.
- Take a lot of time to conduct and analyze.
- Typically have much fewer participants.
- Likely going to be non-probability sampling.
- Exploratory research: purposive sampling, snowball sampling.
Ethnography
- Seldom used in most fields; a dying conceptualization of research.
- Participant and non-participant observation.
  - Participant observation: you're participating in the culture.
  - Non-participant observation: you're observing from afar, taking notes, not participating.
- Going out into the world, writing notes and reflections on the things you're doing.
- The most purposive of purposive sampling; impossible to do random selection of the environments you're going into.
- Takes months to years.
Document review
- Most affected by big data.
- Taking artifacts, things written, and analyzing things that have been written.
- Documents can be pulled from a variety of sources; where documents come from is dramatically influenced by your topic of research.
- Can lend itself to computational collection.
- Often going to be purposive sampling; almost entirely non-probability sampling (you can't get a simple random sample of everything written in the world).
- Using publicly available information reduces "friction" to get a research project started.
- Has been significantly impacted by big data: if scraping Twitter, social media, or internet webpages, you can get thousands of pages of text you couldn't before.

How do you maintain privacy and confidentiality?

How do we keep recorded data private?
- Minimizing risk of access by unauthorized persons
- Locking records
- Identifying codes
Laws allow for subpoena of research data:
- May require reporting in certain circumstances
- The data isn't legally protected; just because you are a researcher doesn't mean you can keep it from the law.
Confidentiality does not apply to:
- Observation in public places
- Information available in public records

____ and ____ are two components required for informed consent

Competency and voluntariness

This should appear at the beginning of all your surveys

Consent form

Stages/Methods of Conducting Qualitative Research

Content Analysis
- Can be used in narrative studies, ethnographies, and case studies.
- Provides contextual information and illuminates areas of further pursuit.
- Finding: determine what types of documents you will gather, lay out your sampling strategy, decide where you will search for documents, and store and organize documents.
- Analyzing: skim, thorough reading, and organize; your notes and the words from the text become your data.
- Examples: a content analysis of content analyses, and a less meta example of Disney movies and gender roles.
Interviews
- For phenomenological and grounded theory studies.
- Semi-structured interviews: loose guideline.
- Unstructured interviews: no guidelines; often characteristic of ethnographic work.
- Focus groups: interviewing multiple people at once; follows a semi-structured interview type.
- Finding interview participants
  - Sampling: you know you have enough people when you are not getting new information from new participants (saturation); typically use a purposive sampling method; the point of the interviews is to get deep, nuanced findings, not to be generalizable.
  - Making initial contact: typically use snowball sampling for hard-to-reach populations; make contact with organizations; building rapport is incredibly important.
  - Organize interview times.
- Do the interview: create rapport, walk through the consent form, ask about recording the interview, ensure your participant does most of the talking, and maintain decorum.
- Analyzing interview data: transcribe if possible, read through transcripts to find themes, and use quotes to convey your themes.
Observation
- A spectrum ranging from total observation to total participation.
- Covert observation is used to see how things interact (observing the environment while no one knows who you are or what you are doing).
- Known observation is when people know you but you do not participate in what you are observing.
- Some participation is when you are known and play a passive role in activities.
- Complete participation is where you are not known and simply participating in the environment.
- Entering the field begins by investigating the setting (a closed setting is something like an institution where complete participation would be used; an open setting is something like a college campus).
- You then must develop and maintain relationships, and then build trust and develop rapport.
- Jotting is when you write in shorthand, use voice notes, photographs, or videos, which can later be described in more detail with context, researcher interpretation, and deeper meanings.
Linking Different Data
- Research questions are best answered with multiple data sources (survey data, official data, census data).
Store Qualitative Data

___, ___, and ___ are the three methods of collecting qualitative data

Content analysis, interviews, observation

Bivariate Analyses types:

Correlation:
- An association between two concepts.
- A change in X is associated with a change in Y.
- Positive and negative correlation.
- Informs us whether there is a relationship between, for example, property and violent crime.
Statistical tests:
- T-tests: look at the difference in means between two groups. Tell us whether there is a significant difference between two different neighborhoods.
- ANOVA: looks at the differences in means across and between many groups. Tells us whether there is a significant difference between neighborhoods and within individual citizens within neighborhoods.
- Chi-square: observing the association between categories.
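A sketch of the bivariate tests above using scipy.stats on simulated data; all numbers and group labels are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
property_crime = rng.normal(50, 10, 200)
violent_crime = 0.5 * property_crime + rng.normal(0, 5, 200)

# Correlation: association between two continuous variables
print(stats.pearsonr(property_crime, violent_crime))

# T-test: difference in means between two groups (e.g., two neighborhoods)
group_a, group_b = rng.normal(10, 2, 50), rng.normal(12, 2, 50)
print(stats.ttest_ind(group_a, group_b))

# ANOVA: differences in means across three or more groups
group_c = rng.normal(11, 2, 50)
print(stats.f_oneway(group_a, group_b, group_c))

# Chi-square: association between two categorical variables
contingency = np.array([[30, 10], [20, 40]])
print(stats.chi2_contingency(contingency))
```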

This is the type of question that asks multiple questions in one

Double barrel

These four things are what you need to know to read regression results

Effect size, significance, directionality, and uncertainty

This is a manatee's closest living land relative

Elephants

Three criteria for causal effects:

Empirical association
- Mathematical association; the relationship is observable, not assumed.
- A change in the IV is related to a change in the DV.
- Also called correlation; however, correlation does not equal causation (think weird skull graphic).
- Positive correlation: X goes up as Y goes up.
- Negative correlation: X goes up as Y goes down.
Temporal ordering
- The cause must precede the effect in time; X has to come before Y. Hinges on the TYPE of research design.
- Longitudinal research: data is collected at two or more points in time. Helps convey cause and effect.
- Cross-sectional research: data is collected at one time point, a "snapshot" of the sample. Makes temporal ordering difficult to establish.
- If you only have cross-sectional data, how can you establish that X comes before Y? Ask questions like "in your childhood, did X happen?" and "in the last year, did Y happen?"
Nonspuriousness
- We want to find relationships that are not spurious. Spuriousness is when a separate variable "Z" explains the relationship between "X" and "Y".
- How to limit spuriousness: control for all outside variables that may be relevant.
- Example of spuriousness: ice cream sales and shark attacks (see the sketch below).
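The ice cream/shark attack example as a hedged simulation: temperature (the "Z" variable) drives both, and the apparent effect shrinks once it is controlled for (statsmodels assumed; all numbers invented).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 365
temperature = rng.normal(20, 8, n)
ice_cream = 5 + 2.0 * temperature + rng.normal(0, 5, n)
shark_attacks = 1 + 0.3 * temperature + rng.normal(0, 2, n)

# Bivariate model: ice cream sales appear to "predict" shark attacks
print(sm.OLS(shark_attacks, sm.add_constant(ice_cream)).fit().params)

# Add the confounder: the ice cream coefficient shrinks toward zero
X = sm.add_constant(np.column_stack([ice_cream, temperature]))
print(sm.OLS(shark_attacks, X).fit().params)
```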

Ethnography

Ethnographies describe and interpret a culture-sharing group; social observation with analysis happening through description of culture.
- Comes from anthropology and sociology.
Types
Non-participant observation
- Detached observer: the researcher simply observes without being involved.
- Pros: not intrusive, potentially capturing realistic behaviors, and the researcher is not doing anything unethical.
- Cons: may not capture desired events, actions may be misinterpreted (witnessing a culture you know nothing about), and observations may actually be atypical.
- Historically common with white researchers going to places like Africa and writing about what they observed.
Participant observation
- The researcher's identity is known but the level of participation varies.
- Observer as participant: the researcher plays a passive role in activities and their identity is known.
- Participant as observer: the researcher does not participate but their identity is known.
- Pros: ethical, and there is an open recording of data.
- Cons: hard to maintain objectivity, and reactive effects.
- Examples: In Search of Respect looked at how society and communities interact to create a sect of segregation and disadvantage, to better understand the underground economy of East Harlem; Lions of the North was a book written by a Jewish author about how music and culture interact in relation to the use of music as a recruitment tool for Nordic Nationalists.
Complete participant observation
- The researcher operates as a fully functioning member of a group but is unknown as a researcher.
- Pros: complete data validity and the capturing of natural behaviors.
- Cons: quite a few, including dangerousness and difficulty, but also major ethical considerations.
Examples
- Asylum study: Erving Goffman got hired as an assistant to a PT in an asylum and observed patients; this type of research is referred to as "going native," and an asylum is considered a total institution where individuals are fully defined and one can witness what that does to individuals; facilitates more generalizable research.
- The study of deviance and ethnography: much of what we know about deviance comes from fully participant observation ethnography, begging the question of whether it is good or bad that we have all this knowledge from this type of research.

_____ Is a type of qualitative research in which a researcher makes observations in the field, describing and interpreting culture

Ethnography

Comparing a group of lifetime drinkers with non-drinkers later in life to look at the health outcomes of alcohol consumption is an example of this type of quasi-experiment

Ex-post facto control group design

Seven threats to internal validity:

Experimental mortality: people dropping out of the survey or experiment.
- If the participants dropping out are related to key characteristics of the study, then it is unclear how the treatment affects the experimental group.
History: refers to a specific event during the course of an experiment that affects posttest comparisons between groups.
- Imagine an experiment in the real world in which some event may dramatically affect the course of the experiment. Example: policing style and how citizen perceptions change over that time; BLM protests happen in the middle of the study in one town. Not related to the experiment itself, but will impact the results.
Instrumentation: instruments are anything you're using to get your data. If the instrument changes during the course of the experiment, there are threats to internal validity.
- If devices are recalibrated or damaged during the experiment, internal validity is compromised.
Maturation: when the passage of time affects the experiment itself.
- Think of an experiment which requires active attention from participants. If the experiment lasts a long time, will participants become tired and unable to focus?
Selection bias: when bias is created by the selection of the experimental and control groups.
- Particularly a risk in experiments with a small number of participants.
- Getting more people to participate is specifically helpful for reducing this.
Statistical regression:
- Regression to the mean: extreme scores will eventually pull toward the mean; over time, people just get better at a thing.
- Is this effect because of the treatment or because of this natural tendency?
Testing:
- The presence of a pretest may affect the experiment.
- This is the rationale for not always administering a pretest.

How can surveys be used in different types of research?

Exploratory research
- Not common in exploratory research.
- Often used as a precursor to more rigorous qualitative data collection. Example: provides preliminary information which can guide later interviews (get demographic information before the interview).
Descriptive research
- Surveys lend themselves well to descriptive research.
- Through the summarizing of survey information, researchers can easily describe phenomena: rates, percentages, etc.
- Oftentimes where survey analysis begins.
Explanatory research
- Surveys equally find utility in explanatory research; potentially the most common source of explanatory research in criminology (and maybe psychology).
- Surveys are great at showing cause and effect (or association at minimum): you can ask questions about both the independent and dependent variable on the same survey and do analysis comparing the two.
- As surveys can be longitudinal, they can really get at the issue of causality.
Evaluation research
- Surveys can be administered AS your evaluation method.
- Surveys can be administered at two time points to explain how well a policy or program does (ask how people feel before and after).
- Evaluation similarly benefits from survey research in the same way explanatory research does.

Types of Evaluation Research

Formative evaluation, summative evaluation, and evaluation assessment

What is GIS?

GIS stands for Geographic Information System, which is the system used by most researchers to conduct spatial analysis

Goals of Policy Research and Demands for Evidence Based Policy

Goals:
- inform policymakers about addressing policies
- identify strengths and weaknesses of existing programs
- address positive and negative effects
Demands:
- policies based on the findings of rigorous research
- tools we have to determine good policies (systematic reviews and meta-analyses)
- evidence-based is the gold standard for impactful policy

These are the five components of a GIS:

Hardware, software, data, methods, people

5 Components of GIS

Hardware: the computer.
Software: the various computer programs such as ArcGIS, QGIS, and Python/R.
Data:
- Types are points, lines, polygons, images, and raster.
- Spatial data comes from a variety of sources, the most common being the census.
People:
- Specialists: those who create and analyze.
- Users: those who request the support of specialists.
- Consumers: those who use the GIS information.
Methodology:
- Cleaning: slow and painful.
- Geocoding: attaching spatial characteristics to the data.
- Analyses: descriptive distribution, clustering, and distances.
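A minimal sketch of the geocoding and distance steps, assuming geopandas is installed; the column names and coordinates below are made up.

```python
import pandas as pd
import geopandas as gpd

incidents = pd.DataFrame({
    "offense": ["burglary", "assault", "theft"],
    "lon": [-83.045, -83.050, -83.041],
    "lat": [42.331, 42.336, 42.329],
})

# "Geocoding" is simplified here to attaching point geometry to each record
gdf = gpd.GeoDataFrame(
    incidents,
    geometry=gpd.points_from_xy(incidents["lon"], incidents["lat"]),
    crs="EPSG:4326",  # latitude/longitude coordinate reference system
)

# Descriptive spatial analysis: project to meters and measure a pairwise distance
projected = gdf.to_crs(epsg=32617)  # UTM zone covering the example coordinates
print(projected.geometry.iloc[0].distance(projected.geometry.iloc[1]))
```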

Research and its use both in and out of the academy

In the academy
- produce knowledge
- advance theory
- inform future studies
- replication
- articles include blocks of text and fit as much information in as possible (really detailed methods sections)
Outside the academy
- inform policymakers and legislators
- inform practice for practitioners
- provide better information to the lay public

Types of Crime Mapping

Industry Mapping
- Administrative crime analysis: police mapping of their own information.
- Tactical crime analysis: mapping a crime in order to stop it (short-term use).
- Strategic crime analysis: mapping of crime trends in order to reduce crime (long-term use).
Academic Mapping
- Policing practices: looking at crime patterns to inform policing practices.
- Risk terrain modeling: looking at community characteristics to predict future crime.
- Aoristic analysis: understanding the 24-hour cycle of a phenomenon.

__, ____, and ____ are the three requirements of ethical research outlined by the Belmont report

Informed consent, assessment of risks and benefits, fair selection of study participants

Four threats to external validity:

Interaction of selection biases and experimental variables:
- When there are potential biases in who participates in the study.
- Does the nature of your study attract only specific individuals?
- The very nature of the thing you're studying may prevent you from making generalizations.
Interaction of experimental arrangements and experimental variables:
- Does the artificial environment of an experiment affect the generalizability of findings?
- Is there a downside to isolating the exact effect you're interested in?
- The sterile environment of experiments, and experiments' focus on accounting for other variables, may affect generalizability.
Interaction of testing and experimental variables:
- Testing may alter people's behaviors.
Reactivity threats:
- Instances in which the novelty of participating in research or being observed affects participants' behaviors.

Internal v external validity:

Internal validity: the ability to claim causality; the study measures what it's meant to measure. External validity: the ability to make generalizations from the results of our experiment.

This is the process of taking quick notes in the field while conducting ethnography

Jotting

Limitations and Ethical considerations

Limitations
- can be time consuming
- data limitations in that the data used is typically official
- different geographic units have different utility
- taking socially constructed phenomena and placing them in physically constructed space
Ethics
- should we know the locations of what we are studying?
- what are the ramifications of knowing geographic information?

Four ways surveys are distributed:

Mail:
- Oldest version; still used sometimes.
- Commonly just looks like folding a piece of paper with the survey into an envelope and mailing it to participants.
- Advantages: no interviewer bias or interviewer effect (the person takes it wherever they want); usually pretty cost-effective; since individuals take it themselves, you can usually cover more ground.
- Drawbacks: no one checks their mail; when people check mail, they may get rid of it as spam; time (you don't want to wait a long time to get the survey back); people forget to send it back.
Online:
- The prominent form today.
- Design the survey on a computer, administer it on a computer; can make QR codes, email it to people, etc.
- No interviewer effect; people take it on their own time, independently.
- Even more cost and time efficient.
- Gets rid of the mail-in drawback of having to input the data from paper into electronics.
- Limitations: skews younger; easy to ignore; confidentiality issues; can't access those who don't have internet access; respondents may not know how long it is and may stop partway because it takes too long; may think it's spam or fraud.
Telephone:
- Not the oldest, but still pretty much dead.
- Used to randomize phone numbers and have people call and administer the survey over the phone.
- Things are clearer on what a question means.
- People don't want to spend long amounts of time on the phone with random people; you can only ask so many questions before someone hangs up.
- Unlikely that people will answer the phone; people assume it's a robo-call.
Face to face:
- Has advantages: things are clear, and you are likely to get complete data. The surveyor will ask all questions and explain what they mean.
- Every answer is likely to be well understood by the person taking the survey.
- HOWEVER: respondents may not want to answer honestly to someone else (interviewer or surveyor bias).

This unethical study saw participants administering increasingly intense electric shocks to others in obedience to authority

Milgram experiment

What are the measures of central tendency?

Mode:
- The value which appears most frequently.
- The only measure of central tendency available at all levels of measurement.
Median: the middle value.
Mean: the average.
- The "center of gravity" for a distribution.
- Understanding the mean brings us to deviation: the distance from the mean to any given raw score.
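A tiny example of the three measures using Python's built-in statistics module (the scores are made up).

```python
import statistics

scores = [2, 3, 3, 4, 5, 5, 5, 7, 9]

print(statistics.mode(scores))    # most frequent value -> 5
print(statistics.median(scores))  # middle value -> 5
print(statistics.mean(scores))    # average -> ~4.78
```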

This type of qualitative research explores the life of an individual

Narrative

Types of Qualitative Research

Narrative Case Study Phenomenology Grounded Theory Ethnography

Narrative

Narrative:
- Narratives explore the life of an individual through interviews and documents like medical records and/or through observation of their life.
- Comes from the humanities.
Types of Narratives
- Biographical study: writing and recording of another's personal life.
- Autoethnography: writing and recording of your own experiences and actions.
- Life history: the portrayal of a person's entire life (The Jack-Roller: Shaw followed this individual throughout his life to determine the causes of and reasons for participation in a life of crime).
- Oral history: gathering and analyzing personal, oral stories (the Snorra Edda recorded Nordic myths and undocumented stories into one book).

Know the primary examples of unethical research:

Nazi Research
- German doctors conducted experiments on concentration camp prisoners.
- Exposure to diseases, exposure to mustard gas, being shot to analyze blood coagulation, being placed in low-pressure chambers or freezing water.
Tuskegee Study of Untreated Syphilis
- Tuskegee, Alabama.
- The purpose was to identify the natural course of syphilis.
- Six hundred impoverished Black men signed up to be involved in "bad blood" research; 399 of them had syphilis and were purposefully left untreated.
- Even after a cure was discovered, it was not delivered to those who were infected.
Milgram's Obedience to Authority Study
- Looked at how individuals obey authority.
- Volunteers believed they were delivering electric shocks to individuals who did not answer questions correctly.
- Conclusions were that obedience was surprisingly high and that even though volunteers exhibited severe distress, they would keep going.
- Unethical, but scientifically sound.
Stanford Prison Experiment
- Student volunteers at Stanford acted as both prisoners and correctional officers.
- Roles were internalized; guards started to conduct highly abusive behavior toward students acting as prisoners.
- Zimbardo ultimately acted as executive leader of the prison, not an objective researcher.
Tearoom Trade
- Interest in casual and fleeting same-sex acts engaged in by men who publicly identified as heterosexual.
- Looked at strangers meeting in park bathrooms.
- Tracked down the home addresses of these men by acting as the lookout.
- Collected more data under deception.

Three types of quasi-experiment and components of each:

Nonequivalent control group design:
- Still has an experimental group and a comparison group, which is not randomly assigned.
- Given the lack of random assignment, there is no equivalence between groups.
- Allows you to conduct an experiment when randomization is not possible.
Before-and-after designs:
- Commonly used to assess how effective laws and policies are; it's hard to compare these large-scale things to another thing, so compare before and after snapshots.
- There is no random assignment.
- Two types:
  - Fixed-sample panel design: data is collected from the sample at two separate points and compared; subjects are compared against themselves. Pre-post, but no control group.
  - Time series repeated measure panel design: you have a trend of something, implement your intervention, and compare the actual trend to the projected trend. (1) Identify the trend up to an intervention. (2) Project the trend. (3) Does the projected trend match the actual trend after the intervention? (See the sketch below.)
Ex post facto control group designs:
- Treatment and control groups are made after the treatment.
- Again, non-random assignment.
- Common in studies examining phenomena we cannot control (example: the effects of smoking).
- May be used when you can't MAKE people do something because it would be unethical.
- Simply uses people who did not receive the treatment as the control.
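Not from the notes: a rough sketch of the time-series (projected vs. actual trend) logic on simulated monthly counts, with statsmodels assumed.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
months = np.arange(36)
intervention_month = 24

# Simulated outcome: steady upward trend, then a drop after the intervention
counts = 100 + 2 * months + rng.normal(0, 5, 36)
counts[intervention_month:] -= 20

# Fit the pre-intervention trend, then project it past the intervention
pre = months < intervention_month
model = sm.OLS(counts[pre], sm.add_constant(months[pre])).fit()
projected = model.predict(sm.add_constant(months[~pre]))

actual_post = counts[~pre]
print("Mean projected vs. actual after intervention:",
      projected.mean().round(1), actual_post.mean().round(1))
```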

____ are surveys that ask about a wide variety of topics with less depth, while ___ ask about less topics with more depth

Omnibus, specific

Analyses in CSS

Online experiments
- Experiments are typically expensive and resource intensive.
- With big data and CSS, we can see which tweet gets retweeted more in order to answer a research question (costs no money and you can get exponentially more responses).
Text analysis
- Can look at how people with legislative powers engage with individuals who follow them, and whether party ideas flow in the direction of those in power or vice versa (who generates what).
- Can also look at how people with different political affiliations communicate across and within the aisle (people stay in one lane via online platforms but will communicate across the aisle when it is not publicized).
Network analysis
- How different types of news spread.
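A toy sketch of two of these analyses: term counts for text analysis and a small retweet network with networkx; every tweet, user, and edge below is invented.

```python
from collections import Counter
import networkx as nx

tweets = [
    ("rep_a", "crime policy reform now"),
    ("user_1", "crime rates and policy"),
    ("user_2", "reform policy debate"),
]

# Text analysis: simple term frequencies across the corpus
words = Counter(word for _, text in tweets for word in text.split())
print(words.most_common(3))

# Network analysis: who retweets whom (directed edges), then basic centrality
retweets = [("user_1", "rep_a"), ("user_2", "rep_a"), ("user_2", "user_1")]
G = nx.DiGraph()
G.add_edges_from(retweets)
print(nx.in_degree_centrality(G))
```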

These are three examples of analyses conducted in computational social sciences

Online experiments, text analysis, network analysis

Surveys can be administered through these four methods

Online, over the phone, mail-in, in person

Psychometrics

Psychometrics are statistics used in testing, measurement, and assessment. Commonly used in psychology, testing research, and assessment research. They are how we determine the reliability and validity of scales, and are useful if you are interested in how concepts cluster together or in how your survey questions relate to one another.
Analyses that fall under psychometrics:
- Factor analysis
- IRT (item response theory)
These see how many factors "hang together."
Example:
- Interested in understanding labor exploitation.
- Ask a bunch of questions related to labor exploitation.
- Want to know if different experiences hang together in specific ways (if you experience one type of victimization, are you more likely to experience another victimization similar to it?).
- The higher the number (the loading), the better the question gets at the overall cluster.
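Not in the notes: one way to check whether items "hang together," computing Cronbach's alpha by hand on simulated item responses (the underlying trait and item content are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(5)
latent = rng.normal(size=300)  # underlying trait (e.g., labor exploitation)
items = np.column_stack([latent + rng.normal(0, 0.8, 300) for _ in range(5)])

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)
k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha for the 5 items: {alpha:.2f}")  # closer to 1 = more cohesive
```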

The Interview Types (only need to know the 2 types not what distinguishes them)

Phenomenology:
- Understanding the essence of experience; describing the experience of a lived phenomenon, often through the study of individuals who shared an experience.
- Comes from philosophy and psychology.
- Primarily done through interviews and analyzing descriptions and significant statements.
Types
- Hermeneutical phenomenology: researcher interpretation.
- Transcendental phenomenology: perspective of participants.
Example
- Waiting for a Liver Transplant: interviewed people who were waiting for liver transplants and found commonalities between subjects.
Grounded Theory:
- Analyzing text through a coding structure and generating a theory.
- Comes from sociology.
- Studying processes or interactions.
Types
- Systematic: create a theory which explains the actions or beliefs of participants.
- Constructivist: attempts to address problems with the systematic approach.
Example
- An Exploratory Study of Labor Trafficking Among US Citizens: understanding how US citizens experience labor trafficking by interviewing individuals about their work experience, what jobs they've had, and their experience in those jobs.

Pros and Cons

Pros: big, always on, and non-reactive
Cons: incomplete, inaccessible, non-representative, dirty, and sensitive

What is Qualitative data and why is it used?

Qualitative data is non-numeric data (words). It scratches the surface of topics we may not know about, allowing us to push forward with these concepts; it provides more detail than simply quantifying a concept using numbers; and it is innately compelling and approachable. Rich and thick description is critical when writing up results, including the discussion of context within paragraphs: rich in the words you use and thick in your description of those words.

This type of study design includes both experimental and control groups, but does not use randomization to divide participants:

Quasi-experimental

Quasi-experiment:

Quasi-experiments DO NOT use random assignment: - May not be possible to randomize. Too many resources needed, too much time/money, etc... - Still have experimental group (treatment group), control group (comparison group). Just not constructed the same way. - Experiment can STILL be done, but with less explanatory power and more problems with validity. Quasi: Latin word meaning 'resembling' or "having some, but not all of the features of"

What makes for a good survey question?

Questions should be both valid and reliable. Questions should be clear and not misleading. Use a variety of question types if possible (open-ended and close-ended).
- Clear and meaningful language: keep your audience in mind; make it obvious what you're asking.
- Avoid confusing phrasing and vagueness (bad example: "How happy are you about things?").
- Avoid negative words and double negatives; it should not take more than a second to understand what a question means.
- Avoid double-barreled questions (questions that ask about more than one thing); respondents don't know what they're saying yes to.
- Avoid loaded questions.
- Use correct spelling.
- Mutually exclusive and exhaustive options (including neutrality and "I don't know").

Measures of variability include....

Range, variance, standard deviation

What are the measures of variability?

Range: the highest value minus the lowest value.
Mean deviation:
- Understanding variation while taking all scores into account.
- The sum of the absolute deviations from the mean divided by the number of scores.
- The bigger the number, the wider the distribution.
Variance and standard deviation:
- Move beyond the mean deviation; allow us to make more concrete conclusions about distributions and conduct more advanced analyses.
- Standard deviation: how far away from the mean the data deviates. The most commonly used measure.
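A small numpy example of the variability measures described above (the scores are made up).

```python
import numpy as np

scores = np.array([2, 4, 4, 4, 5, 5, 7, 9], dtype=float)

value_range = scores.max() - scores.min()
mean_deviation = np.mean(np.abs(scores - scores.mean()))
variance = scores.var(ddof=1)            # sample variance
standard_deviation = scores.std(ddof=1)  # square root of the variance

print(value_range, mean_deviation, variance, standard_deviation)
```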

This is a statistical method to look at the relationship between two variables while controlling for other variables

Regression

Regression:

Regression is deriving our best guess for our dependent variable while controlling for multiple independent variables. We control for things in order to get an accurate estimate of the relationship we're interested in; the more relevant variables you control for, the more accurate your estimate becomes. There are many forms of regression, and the characteristics of the data you have determine the specific method of regression you use.
- Regression is the most fundamental type of statistical analysis. Basic regression is for normally distributed, continuous data.

Experiment:

A research design that attempts to isolate and test one specific variable and determine causal effects. True experiments are the gold standard for showing causality in research. When done correctly, experiments are fairly generalizable. Involves a causal hypothesis: independent variable X will directly affect dependent variable Y.

Survey designs: Advantages and disadvantages:

Researchers assess the advantages and disadvantages of each survey type in terms of...
- Representativeness of sample
- Questionnaire and question design
- Distortion of answers
- Administrative goals of project

What is Spatial Analysis?

Spatial analysis is the conducting of analyses on data that has a space or geographic component. It allows for a more concrete understanding of place and lets us look at the nexus of space and time. Some examples: when you google "restaurants near me" and a list of places is generated; GPS; picking out an outfit based on the weather (spatially dependent behavior).

Why do we use spatial analysis?

Spatial analysis started in the 1800s when two researchers looked at changes in French crime and suicides across districts and time.
- Researchers then began to use mapping to advance important social science questions.
- This practice expanded to England, where researchers looked at the issue of poverty and how it clustered (becoming interested in how space interacts with personal characteristics to affect crime, delinquency, and victimization).
- Shaw and McKay's research on delinquency in Chicago looked at its geographic concentration and the relation of geographic conditions to crime.
Theory feeds into spatial analytics from criminology, geography, wildlife biology, and epidemiology.
- Theory is useful to help explain the patterns we can physically observe.
- We can determine appropriate data, formulate research questions, and focus our analyses.
- Some examples are routine activities theory and social disorganization theory (which deal with how space affects behavior).
Examples
- Autocorrelation of type of arrest and race of arrestee (see the sketch below).
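A hedged sketch of spatial autocorrelation: Moran's I computed by hand on a toy grid of rates (real projects would more likely use libraries such as libpysal/esda; the grid values here are random).

```python
import numpy as np

rng = np.random.default_rng(6)
grid = rng.random((4, 4))  # e.g., arrest rates per grid cell
n_rows, n_cols = grid.shape
values = grid.ravel()
n = values.size

# Rook-contiguity weights: cells sharing an edge are neighbors (weight 1)
W = np.zeros((n, n))
for r in range(n_rows):
    for c in range(n_cols):
        i = r * n_cols + c
        for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            rr, cc = r + dr, c + dc
            if 0 <= rr < n_rows and 0 <= cc < n_cols:
                W[i, rr * n_cols + cc] = 1

# Moran's I = (n / sum of weights) * (z' W z) / (z' z), with z as deviations from the mean
z = values - values.mean()
morans_i = (n / W.sum()) * (z @ W @ z) / (z @ z)
print(f"Moran's I: {morans_i:.3f}")  # near 0 = no spatial clustering
```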

Survey design and layout:

Starting with a consent form
- Voluntary, competent consent is a requirement for studies.
Usually begin with demographic questions
- Get descriptive information on who you're surveying.
Middle should be substantive questions
- What you're interested in.
Conclude with wrap-up questions
- Asking about following up later, contact information, how to send money, etc.
Create filter questions and skip patterns where needed
- Make sure that only people who have had certain experiences answer questions about those experiences.
- Create a method of skipping questions if they don't apply to you.
Should "pilot" your survey to make sure it works properly.

Summative Evaluation

Summative evaluation makes comprehensive statements about a program or policy, questioning whether the program should continue to be funded or be terminated.
2 types
- Outcome evaluation: measures the effectiveness of the program on the target population; did the program have the intended consequences? Often an experimental or quasi-experimental research design.
- Impact evaluation: does the program have the desired effect on society as a whole? The focus is not on the target population; often conducted using quantitative data.
Ethics
- Most issues concern outcome and impact studies.
- Issues related to participants: distribution of benefits, preserving confidentiality, and participant burden may be considerable.
- Issues related to evaluation design and conduct: research designs may be shaped by politics, and the intent of the evaluation may be political.

Five primary ways researchers collect quantitative data:

Surveys
- The most common form of quantitative data collection in social science research.
- Structured documents with a series of questions that ask about a variety of topics researchers are interested in.
- Help researchers answer quantitative questions.
- Not localized to one sampling technique: probability samples often use surveys; less common in non-probability samples, though they can be present in snowball sampling.
- Collected in person, digitally, by phone, etc.
Official data
- Often collected through cooperation with organizations your research project is collaborating with.
- Typically going to be either cluster or purposive sampling (there is a reason you're working with that organization).
- While useful, typically collected in addition to other types of data.
Administrative data
- Goes with official data.
- Very seldom are researchers solely trying to collect administrative data alone to analyze it, but using administrative records can help answer research questions.
Experiments
- Common in certain types of social science (public health, public policy); more used in the physical sciences.
- Almost exclusively probability sampling, though it is extremely challenging to construct a simple random sample.
- Collecting data from an experimental and a control group, specifically looking for causal relationships.
- Example: does body-worn camera use affect use of force? Three police departments; randomly select 40% of officers to wear body-worn cameras while the other 60% don't; run the intervention over the course of three months, collecting information on how many times officers are involved in use of force.
General secondary data
- Not much is needed regarding data collection, but it can be used in tandem with other forms of data collection.
- Important to know what data is going into the research project; if you don't review it beforehand, your secondary data may not answer the research question.
- If the secondary data was collected poorly, you have to deal with this and work with data that was not collected well.
More than one type can be collected for one project.

This statistical test looks at the difference between two groups

T-test

What are the Computational Social Sciences and Types of Research found

The computational social sciences involve the combination of readymades and custommades.
- Involves 5 key communities: social science, data science, business people, privacy advocates, and policy makers.
Readymades
- The Fountain: existed for a specific purpose and the artist repurposes it; e.g., movement data that you use to answer a research question.
Custommades
- Michelangelo's David: the artist made it for a specific reason; could also be the administering of a survey that answers a research question not yet asked.

When discussing computational social science, this art piece used as an example of "readymade"

The fountain

Formative Evaluation

This type of evaluation occurs early in the life of policy development and is done to assess whether a policy is possible.
2 types
- Needs assessment: understanding the needs of the target population; steps taken are to identify the population, conduct a gap analysis (estimate the size or extent of the problem), and identify interventions or programs that will be useful.
- Process evaluation: conducted when the program is in operation to determine if it was implemented the way it was intended to be implemented and to determine if the existing program has changed over time. You use logic models to show how a program is intended to work: inputs are the staff who promote the programs, outputs are the things that are given (e.g., lesson plans about drug use for DARE), short-term outcomes, long-term outcomes, and stakeholders (these individuals can provide feedback at any time and they typically have investments in programs or a vested interest in the program).
Ethics of Process Evaluations
- Sharing results: is it only with sponsors? With clients? With the general public?
- Scientific credibility: if the design is poor, then the results should not be used; if the client controls the data, then the researcher may not be able to avoid their use of that data.

What are the three types of experimental design?

Two-group posttest-only design:
- Randomly assigned experimental group and randomly assigned control group.
- The researcher manipulates the treatment and creates a comparison of the dependent variable, known as the posttest.
- Differences in posttest measures are assumed to be because of the treatment.
- Limitation: only collecting one time point of data, so you don't have baseline knowledge of how both groups did.
Two-group pretest-treatment-posttest design:
- Randomly assigned experimental and control groups.
- Give a pretest to both groups, conduct the experiment, then conduct a posttest.
- Can now compare between groups across time points: not only how much the treatment affects each group, but how much each group changed over time.
- Limitation: the test itself can affect people; what is on the pretest may impact your behavior, and if you take the test and understand what's being researched, that could impact the result.
Solomon four-group design:
- Four groups: two control groups and two experimental groups. One control group and one experimental group get a pretest, the others don't.
- The four groups: pretest, treatment, posttest; pretest, no treatment, posttest; treatment, posttest; no treatment, posttest.

Analyzing Qualitative Data

Typically involves reading and pulling out themes that exist within all of your pieces of data - this can be done on paper or using NVIVO (software)

What makes effective evaluation?

Utility: whether or not the evaluation provided satisfactory information to the client
Feasibility: whether an evaluation is viable or realistic given available resources
Propriety: whether an evaluation was conducted ethically, legally, and with regard to the welfare of participants and others
Accuracy: whether an evaluation offers correct findings

The 5 V's of Big Data

Volume
- You do not have a computer that could store the data; a large volume of data requiring analysis (175 zettabytes by 2025).
Velocity
- The data is being generated quickly: every 60 seconds there are 100,000 tweets, 700,000 status updates, 11 million messages, 200 million emails sent, etc.
Variety
- Structured: an Excel spreadsheet with columns of variables with ones and zeros.
- Unstructured: text, audio files, and video files (things that are not distinct in how you can analyze them).
- Semi-structured: a combination of the two, where you may have an Excel sheet with columns for book, author, text, etc.
Value
- What you gain from it; considered big data if it is valuable to the person analyzing it.
Veracity
- Is the data valid, and does it convey what you want it to convey?

Be able to formulate an argument as to what makes for ethical and unethical research. Think through how you might answer the following questions: What are the qualities of ethical research? How are ethics in research changing in more contemporary landscapes? Is research unethical if it is conducted poorly? Is it unethical to use secondary data that was collected unethically? Is it unethical to use data that was collected before the creation of the IRB? Is a study automatically ethical if it was approved by an IRB? Why or why not?

Write out

Two broad types of surveys:

Omnibus surveys:
- Cover many topics; can be used by many researchers; limited depth.
- AddHealth Survey: asks a ton of questions; multiple researchers can get at multiple ideas using that dataset.
Specific surveys:
- Cover narrower topics; deeper knowledge; not as wide a utility.
- For interest in very specific topics that can't be addressed in full with omnibus surveys; if interested in something specific, you have to design your own specific survey.
- Very deep, nuanced, rich information.

What are the terms in survey research?

Questionnaire:
- Self-administered survey instrument; contains the questions in a self-administered survey.
Interview schedule:
- The survey instrument containing the questions asked by the interviewer in an in-person or over-the-phone survey (a researcher is administering the questions to you).
Respondent:
- The individual taking the survey; the person who answers the survey questions.
Response rate:
- The percent of persons that complete the survey.
- You want a larger response rate to be generalizable; certain groups may not respond to the survey, which would cause bias if the response rate is low.
- The higher the proportion of individuals that respond to your survey, the better; however, you won't ever get a 100% response rate.
- How to get around it: send it again, offer money.
Item response rate:
- The percentage of completed items on a survey; how many questions on the survey were actually answered.
- If respondents leave things blank, you don't get all questions answered or a full understanding of the variables you're looking at.
- Statistical power: you need a certain number of people to have answered your question or responded to your survey to be able to analyze anything.
- How to get around it: make questions mandatory; if you have an index, decide which questions you don't mind as much if people skip, and how many you need answered to be meaningful.
Attrition rate:
- The loss of study participants over time.
- Why we care: if you're trying to study something longitudinally and you lose a ton of people over time, you have dwindling numbers of individuals about whom you can make causal arguments.
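An illustration, not from the notes, of computing response rate, item response rate, and attrition from made-up survey records with pandas.

```python
import numpy as np
import pandas as pd

invited = 200
wave1 = pd.DataFrame({
    "q1": [1, 2, np.nan, 4, 5],
    "q2": [3, np.nan, np.nan, 2, 1],
    "q3": [2, 2, 1, np.nan, 3],
})  # 5 people actually returned the survey

response_rate = len(wave1) / invited                    # surveys completed / invited
item_response_rate = wave1.notna().to_numpy().mean()    # share of items answered

returned_at_wave2 = 3  # of the 5 wave-1 respondents
attrition_rate = 1 - returned_at_wave2 / len(wave1)     # participants lost over time

print(f"Response rate: {response_rate:.1%}")
print(f"Item response rate: {item_response_rate:.1%}")
print(f"Attrition rate: {attrition_rate:.1%}")
```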

Why do we visualize statistics?

Data visualization is an important and powerful tool. It can help to simply convey aspects of the data, and it can also enhance a message the author is trying to convey using the data. Effective visualization helps to distill the information you think is important and present it in a way that's understandable.

How to read a regression table:

- Columns represent different outcomes (different models); rows represent the independent variables affecting those outcomes.
- What is an effect size, a standard error, and a confidence interval?
  - Effect size: ultimately the impact one variable has on another.
  - Standard error: the level of accuracy or uncertainty.
  - Confidence interval: essentially the distribution of our uncertainty.
- Number of observations: when they ran the analysis, how many observations were included in each model when all was said and done.
- P values: Karl Pearson developed the chi-square test and "p" values. For something to be considered significant, the p value should be less than 0.05; the 0.05 cutoff is a convention.
- On the far left are the results. Odds ratios can't have negative values, but a ratio less than one means the effect is negative.
  - Example: increases in support decrease overall levels of satisfaction. The effect size is a ratio under 1 and it is significant, so we're confident the relationship exists, and the effect is negative.
  - 1.40 (0.05): the statistical significance is in the parentheses and the effect size is outside the parentheses. A ratio under 1 is a negative association, over 1 is a positive association.
  - 2.04 (0.11): heightened satisfaction with your assignment increases satisfaction overall. Good effect size, but weak statistical significance.
- Stars correspond with the confidence level; only statistically significant at *, or p < 0.05.
- LOOK AT SLIDESHOW, stats part 2 slide 9.
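A sketch of where the table's numbers come from, fitting a logistic model on simulated data with statsmodels; "support" and "assignment satisfaction" are invented stand-ins for the slide example, and all coefficients are made up.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 400
support = rng.normal(size=n)
assignment_satisfaction = rng.normal(size=n)

# Simulated dichotomous outcome: overall satisfaction (yes/no)
logit_p = 1 / (1 + np.exp(-(-0.3 - 0.5 * support + 0.9 * assignment_satisfaction)))
satisfied = rng.binomial(1, logit_p)

X = sm.add_constant(np.column_stack([support, assignment_satisfaction]))
fit = sm.Logit(satisfied, X).fit(disp=0)

print(fit.summary())       # coefficients, standard errors, p-values, confidence intervals
print(np.exp(fit.params))  # odds ratios: below 1 = negative association, above 1 = positive
```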

What is a survey:

- A piece of paper with a series of questions.
- A tool used to gather data on a variety of topics.
- Can gather quantitative and qualitative data, though mostly used for quantitative data collection.
- A collection of information from a sample of individuals through their responses to questions.
- Not solely a scientific tool: many of us have taken non-scientific surveys. However, it is a primary tool for social scientists as it helps us get at social phenomena.

