SOC FINAL

Filter question

- A survey question used to identify a subset of respondents who are then asked other questions. - These filter questions create skip patterns. For example, respondents who answer no to one question are directed to skip ahead to another question, but respondents who answer yes go on to the contingent question.
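As a rough illustration (not from the text), here is a minimal Python sketch of how a filter question's skip pattern might be enforced; the question wording, dictionary keys, and function name are all hypothetical.

```python
# A minimal sketch of a filter question and its skip pattern.
# All question wording and keys are hypothetical.

def administer(answers):
    """Walk one respondent through a filter question and its contingent question."""
    record = {"employed": answers["employed"]}  # filter question
    if record["employed"] == "yes":
        # Only "yes" respondents go on to the contingent question.
        record["hours_per_week"] = answers["hours_per_week"]
    else:
        # "No" respondents skip ahead; the contingent item stays blank.
        record["hours_per_week"] = None
    return record

print(administer({"employed": "yes", "hours_per_week": 40}))
print(administer({"employed": "no"}))
```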

Omnibus survey

- A survey that covers a range of topics of interest to different social scientists. - Perhaps the most efficient type of survey is an omnibus survey, which includes a range of topics of interest to different social scientists or to other sponsors.

Double-barreled question

A single survey question that actually asks two questions but allows only one answer.

Unobtrusive Methods

Glossary: Unobtrusive measure: A measurement based on physical traces or other data that are collected without the knowledge or participation of the individuals or groups that generated the data. Krull: these measures are "nonreactive" and so do not alert the people they study.

Survey research

Research in which information is collected from a sample of individuals through their responses to a set of standardized questions.

Strengths & Weaknesses of Content Analysis

Strengths: - relatively quick and inexpensive - easy to repeat a portion of the study if necessary - permits study of processes over time - researcher seldom has any effect on the subject being studied

Solving Ethical Issues

To lessen any detrimental program impact: - minimize number in control group - use minimum sample size - test only new parts of the program, not entire program - compare treatments that vary in intensity, rather than presence & absence - vary treatments between settings, rather than among individuals in a single setting

Ethical Issues in Evaluation Research

• How can confidentiality be preserved when the data are owned by a government agency or are subject to discovery in a legal proceeding? • Who decides what burden an evaluation project can impose upon participants? • Can a research decision legitimately be shaped by political considerations? • Must findings be shared with all stakeholders or only with policy makers? • Will a randomized experiment yield more defensible evidence than the alternatives? • Will the results actually be used?

What are the ethical issues when using unobtrusive measures?

* Unobtrusive measure: A measurement based on physical traces or other data that are collected without the knowledge or participation of the individuals or groups that generated the data. - Unobtrusive measures can be based on physical traces, archives, or observations. - Although the potential harm to research participants may be delayed, it can still occur unless care is used to avoid disclosing identities—including covering faces in photos that are published. - Ethical concerns are multiplied when surveys are conducted or other data are collected in other countries. - If the outside researcher lacks much knowledge of local norms, values, and routine activities, the potential for inadvertently harming subjects is substantial. - Researchers who review historical or government documents must also try to avoid embarrassing or otherwise harming named individuals or their descendants by disclosing sensitive information

What are the ways in which social scientists use unobtrusive measures and what are the strengths and weaknesses of unobtrusive measures?

* Unobtrusive measure: A measurement based on physical traces or other data that are collected without the knowledge or participation of the individuals or groups that generated the data. KRULL NOTES: 4 kinds of unobtrusive measures: 1. Creative methods, 2. Content analysis, 3. Historical methods, 4. Comparative methods. - Creative methods draw on 4 categories of data that might provide unobtrusive measures: physical traces, archives, simple observation, and contrived observation. 1.) Physical traces: either the erosion or the accumulation of physical substances that can be used as evidence of activity. 2.) Archives: records of all sorts that are already being kept, aside from any social science purpose. - These may be quite formal, as in government records of births, deaths, marriages, tax records, building permits, crime statistics, and the like. - Archival data: written or visual records, not produced by the researcher. Weakness: even officially kept records, not to mention personal documents, often have built-in biases. 3.) Observation: fully developed, this is what we've called ethnography or field research, but even very brief observations can be revealing. Strength: even simple and obvious sorts of observations can be used to validate other sorts of measures. 4.) Contrived observation: observations of situations in which the researcher has deliberately intervened. Content analysis: the study of recorded human communications - diaries, newspapers, social media, photographs, magazines. - Content analysis is essentially a coding operation - coding is the process of transforming raw data into a standardized form.
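A minimal sketch of the coding operation described above, assuming an invented set of headlines and invented keyword-based code categories; it codes manifest content only and is not any standard content-analysis tool.

```python
# Minimal sketch: coding manifest content in a content analysis.
# The headlines and keyword-based code categories are invented for illustration.

headlines = [
    "Crime rates fall in major cities",
    "New jobs program launched downtown",
    "Police report rise in burglaries",
]

# Code categories work like variables; here each headline is coded for topic.
categories = {"crime": ["crime", "police", "burglar"],
              "economy": ["jobs", "economy", "employment"]}

def code_headline(text):
    text = text.lower()
    for category, keywords in categories.items():
        if any(word in text for word in keywords):
            return category
    return "other"

for headline in headlines:
    print(f"{code_headline(headline):8s} | {headline}")
```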

Data: Plural or singular?

*The word "data" is ALWAYS PLURAL *the word datum is singular, and the word data is plural. - "the data show"

Qualitative compared With Quantitative Data analysis

*Differences in the logic behind qualitative versus quantitative analysis. Qualitative data analysis has the following characteristics: • A focus on meanings rather than on quantifiable phenomena • Collection of much data on a few cases rather than little data on many cases • Study in depth and detail, without predetermined categories or directions, rather than emphasis on analyses and categories determined in advance • Conception of the researcher as an "instrument" rather than as the designer of objective instruments to measure particular variables • Sensitivity to context, rather than seeking universal generalizations • Attention to the impact of the researcher's and others' values on the course of the analysis, rather than presuming the possibility of value-free inquiry • A goal of rich descriptions of the world rather than measurement of specific variables

How/why did program evaluation evolve?

*evaluation research is conducted for a distinctive purpose: to investigate social programs (such as substance abuse treatment programs, welfare programs, criminal justice programs, or employment and training programs). *For each project, an evaluation researcher must select a research design and method of data collection that are useful for answering the particular research questions posed and appropriate for the particular program investigated. *When you review or plan evaluation research, you have to think about the research process as a whole and how different parts of that process can best be combined. *The development of evaluation research as a major enterprise followed on the heels of the expansion of the federal government during the Great Depression and World War II. - Large Depression-era government outlays for social programs stimulated interest in monitoring program output, and the military effort in World War II led to some of the necessary review and contracting procedures for sponsoring evaluation research. -However, not until the Great Society programs of the 1960s did evaluation begin to be required when new social programs were funded. - The World Bank and International Monetary Fund (IMF) began to require evaluation of the programs they fund in other countries. - More than 100 contract research and development firms began in the United States between 1965 and 1975, and many federal agencies developed their own research units.

What are problems to avoid when writing survey questions?

+ Be clear; avoid confusing phrasing + Minimize bias + Allow for disagreement + Don't ask questions they can't answer + Allow for uncertainty + Make response categories exhaustive and mutually exclusive

Design Alternatives:

+ Black box or program theory—Do we care how the program gets results? + Researcher or stakeholder orientation—Whose goals matter most? + Quantitative or qualitative methods—Which methods provide the best answers? + Simple or complex outcomes—How complicated should the findings be?

Narrative analysis

- A form of qualitative analysis in which the analyst focuses on how respondents impose order on the flow of experience in their lives and so make sense of events and actions in which they have participated - Narrative "displays the goals and intentions of human actors; it makes individuals, cultures, societies, and historical epochs comprehensible as wholes" - Narrative analysis focuses on "the story itself" and seeks to preserve the integrity of personal biographies or a series of events that cannot adequately be understood in terms of their discrete elements - The coding for a narrative analysis is typically of the narratives as a whole rather than of the different elements within them. - The coding strategy revolves around reading the stories and classifying them into general patterns

Gatekeeper

- A person in a field setting who can grant researchers access to the setting. - Discussion about these issues with key participants, or gatekeepers, should be honest and should identify what the participants can expect from the research, without necessarily going into detail about the researcher's hypotheses or research questions

Participant observation

- A qualitative method for gathering data that involves developing a sustained relationship with people while they go about their normal activities - The term participant observer actually represents a continuum of roles, ranging from being a complete observer who does not participate in group activities and is publicly defined as a researcher to being a covert participant who acts just like other group members and does not disclose his or her research role. - Participant observers seek to avoid the artificiality of experimental designs and the unnatural structured questioning of survey research - This method encourages consideration of the context in which social interaction occurs, of the complex and interconnected nature of social relations, and of the sequencing of events

Intensive (depth) Interviewing

- A qualitative method that involves open-ended, relatively unstructured questioning in which the interviewer seeks in-depth information on the interviewee's feelings, experiences, and perceptions. - relies on open-ended questions to develop a comprehensive picture of the interviewee's background, attitudes, and actions—to "listen to people as they describe how they understand the worlds in which they live and work" - a type of interviewing that qualifies as qualitative rather than quantitative research

Focus groups

- A qualitative method that involves unstructured group interviews in which the focus group leader actively encourages discussion among participants on the topics of interest. ADV: + good method for observing the process of social interaction + very flexible + relatively high measurement validity + quick results & relatively inexpensive DIS: - less control than individual intensive interviews - data can be somewhat difficult to analyze - moderators must be carefully trained & practiced - relatively low reliability - groups can be difficult to assemble

Double negative

- A question or statement that contains two negatives, which can muddy the meaning of the question. - Avoid negative phrases or words

Contingent question

- A question that is asked of only a subset of survey respondents - These filter questions create skip patterns. For example, respondents who answer no to one question are directed to skip ahead to another question, but respondents who answer yes go on to the contingent question.

Content Analysis

- A research method for systematically analyzing and making inferences from text. - Certain forms of archival observation have been systematically developed into what's called content analysis. - Content analysis, "the systematic, objective, quantitative analysis of message characteristics," is a method particularly well suited to the study of popular culture and many other issues concerning human communication. - Content analysis develops inferences from human communication in any of its forms, including books, articles, magazines, songs, films, and speeches.

Theoretical Sampling

- A sampling method recommended for field researchers by Glaser and Strauss. - A theoretical sample is drawn in a sequential fashion, with settings or individuals selected for study as earlier observations or interviews indicate that these settings or individuals are influential.

In-person interview

- A survey in which an interviewer questions respondents face-to-face and records their answers. - If money is no object, in-person interviewing is often the best survey design. - In-person interviewing has several advantages: Response rates are higher than with any other survey design; - questionnaires can be much longer than with mailed or phone surveys; - the questionnaire can be complex, with both open-ended and closed-ended questions and frequent branching patterns; - the interviewer can control the order in which questions are read and answered; - the physical and social circumstances of the interview can be monitored; - and respondents' interpretations of questions can be probed and clarified. - The interviewer, therefore, is well placed to gain a full understanding of what the respondent really wants to say. - However, researchers must be alert to some special hazards resulting from the presence of an interviewer. - Ideally, every respondent should have the same interview experience—that is, each respondent should be asked the same questions in the same way by the same type of person, who reacts similarly to the answers. - Careful training and supervision are essential

Phone survey

- A survey in which interviewers question respondents over the phone and record their answers. - 2 problems often threaten the validity of a phone survey: not reaching the proper sampling units (or coverage error) and not getting enough successfully completed responses to make the results generalizable. - Follow-up calls are needed; 3 tries is a good rule of thumb

Interactive voice response (IVR)

- A survey in which respondents receive automated calls and answer questions by pressing numbers on their touch-tone phones or speaking numbers that are interpreted by computerized voice recognition software. - Although they present some difficulties when many answer choices must be used or skip patterns must be followed, IVR surveys have been used successfully with short questionnaires and when respondents are highly motivated to participate. - When these conditions are not met, potential respondents may be put off by the impersonality of this computer-driven approach

Mailed (self-administered) survey

- A survey involving a mailed questionnaire to be completed by the respondent. - A mailed (self-administered) survey is conducted by mailing a questionnaire to respondents, who then take the survey by themselves. - The central problem for a mailed survey is maximizing the response rate. - Even an attractive questionnaire with clear questions will probably be returned by no more than 30% of a sample unless extra steps are taken. - A response rate of 30%, of course, is a disaster, destroying any hope of a representative sample. That's because people who do respond are often systematically different from people who don't respond—women respond more often, for instance, to most surveys; people with very strong opinions respond more than those who are indifferent; very wealthy and very poor people, for different reasons, are less likely to respond. - Fortunately, the conscientious use of systematic techniques can push the response rate to 70% or higher for most mailed surveys, which is acceptable. - Sending follow-up mailings to nonrespondents is the single most important technique for obtaining an adequate response rate. - The follow-up mailings explicitly encourage initial nonrespondents to return a completed questionnaire; implicitly, they convey the importance of the effort. - Standard procedure for the mailing process: a preliminary introductory letter, a well-packaged survey mailing with a personalized cover letter, a reminder postcard 2 weeks after the initial mailing, and then new cover letters and replacement questionnaires 2 to 4 weeks and 6 to 8 weeks after that mailing. - The cover letter, actually, is critical to the success of a mailed survey.

Group-administered survey

- A survey that is completed by individual respondents who are assembled in a group. - The response rate is usually high because most group members will participate. - Unfortunately, this method is seldom feasible because it requires a captive audience. - Whoever is responsible for administering the survey to the group must be careful to minimize comments that might bias answers or that could vary between different groups in the same survey. - A standard introductory statement should be read to the group that expresses appreciation for their participation, describes the steps of the survey, and emphasizes (in classroom surveys) that the survey is not the same as a test. - A cover letter like that used in mailed surveys also should be distributed with the questionnaires. - To emphasize confidentiality, respondents should be given envelopes in which to seal their questionnaires after they are completed. - Another issue of special concern with group-administered surveys is the possibility that respondents will feel coerced to participate and, therefore, will be less likely to answer questions honestly. - Also, because administering group surveys requires approval of the authorities—and this sponsorship is made quite obvious because the survey is conducted on the organization's premises—respondents may infer that the researcher is in league with the sponsor. - No complete solution to this problem exists, but it helps to make an introductory statement emphasizing the researcher's independence and giving participants a chance to ask questions about the survey. - The sponsor should keep a low profile and allow the researcher both control over the data and autonomy in report writing.

Needs assessment

- A type of evaluation research that attempts to determine the needs of some population that might be met with a social program. - A needs assessment attempts, with systematic, credible evidence, to evaluate what needs exist in a population. - Need may be assessed by social indicators, such as the poverty rate or the level of home ownership; interviews with local experts, such as school board members or team captains; surveys of populations potentially in need; or focus groups with community residents

Cost-effectiveness analysis

- A type of evaluation research that compares program costs with actual program outcomes. - How much does it cost to achieve a given effect?

Cost-benefit analysis

- A type of evaluation research that compares program costs with the economic value of program benefits. - Are the program's financial benefits sufficient to offset the program's costs? - A cost-benefit analysis must (obviously) identify the specific costs and benefits to be studied, but my "benefit" may easily be your "cost." - In addition to measuring services and their associated costs, a cost-benefit analysis must be able to make some type of estimation of how clients benefited from the program and what the economic value of this benefit was.
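A small worked sketch, with entirely invented numbers, contrasting the cost-effectiveness question (cost per unit of outcome) with the cost-benefit question (monetized benefits versus costs).

```python
# Invented numbers for illustration only.
program_cost = 500_000          # total program cost in dollars
clients_helped = 250            # program outcome (e.g., clients completing treatment)
benefit_per_client = 3_000      # estimated economic value of the benefit per client

# Cost-effectiveness: how much does it cost to achieve one unit of outcome?
cost_per_client = program_cost / clients_helped        # $2,000 per client helped

# Cost-benefit: do monetized benefits offset program costs?
total_benefit = clients_helped * benefit_per_client    # $750,000
net_benefit = total_benefit - program_cost             # $250,000
benefit_cost_ratio = total_benefit / program_cost      # 1.5

print(cost_per_client, net_benefit, benefit_cost_ratio)
```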

Outlier

- An exceptionally high or low value in a distribution

Key Informant

- An insider who is willing and able to provide a field researcher with superior access and information, including answers to questions that arise during the research

Stakeholder approaches to evaluation

- An orientation to evaluation research that expects researchers to be responsive primarily to the people involved with the program. - Stakeholder approaches encourage researchers to be responsive to program stakeholders. - In one stakeholder approach, termed utilization-focused evaluation, the evaluator forms a task force of program stakeholders who help to shape the evaluation project so that they are most likely to use its results. - One research approach, termed appreciative inquiry, eliminates the professional researcher altogether in favor of a structured dialogue about needed changes among program participants themselves. NEG: If stakeholders are ignored, researchers may find that participants are uncooperative, that their reports are unused, and that the next project remains unfunded.

Social Science approaches to evaluation

- An orientation to evaluation research that expects researchers to emphasize the importance of researcher expertise and maintenance of autonomy from program stakeholders. - Emphasizes researcher expertise and autonomy to develop the most trustworthy, unbiased program evaluation. - These approaches assume that "evaluators cannot passively accept the values and views of the other stakeholders" - The researcher derives a program theory from information on how the program operates and current social science theory, not from the views of stakeholders. - In one somewhat extreme form of this approach, goal-free evaluation, researchers do not even permit themselves to learn what goals the program stakeholders have for the program. + The researcher assesses and then compares the needs of participants to a wide array of program outcomes + The goal-free evaluator wants to see the unanticipated outcomes and to remove any biases caused by knowing the program goals in advance. NEG: If social science procedures are neglected, standards of evidence will be compromised, conclusions about program effects will likely be invalid, and results are unlikely to be generalizable to other settings.

Integrative approaches

- An orientation to evaluation research that expects researchers to respond to the concerns of the people involved with the program (the stakeholders), as well as to the standards and goals of the social scientific community - Integrative approaches attempt to cover issues of concern to both stakeholders and evaluators - Integrative approaches seek to balance responsiveness to stakeholders with objectivity and scientific validity. - Evaluators negotiate regularly with key stakeholders during the planning of the research; preliminary findings are reported back to decision makers so they can make improvements; and when the final evaluation is conducted, the research team may operate more autonomously, minimizing intrusions from program stakeholders. - Evaluators and clients thus work together.

Computer-assisted qualitative data analysis

- Analysis of textual, aural, or pictorial data using a special computer program that facilitates searching and coding text. - can dramatically accelerate the techniques used traditionally to analyze such text as notes, documents, or interview transcripts: preparation, coding, analysis, and reporting
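A minimal sketch of the kind of searching and coding such programs accelerate; this is not the API of any actual qualitative analysis package, and the interview excerpts and code categories are made up.

```python
# Illustrative only: a tiny keyword search over made-up interview excerpts,
# the sort of retrieval that dedicated qualitative analysis software speeds up.

excerpts = [
    "I felt supported by my supervisor during the transition.",
    "The paperwork was overwhelming and nobody explained it.",
    "My coworkers helped me figure out the new system.",
]

codes = {"support": ["supported", "helped"],
         "burden": ["overwhelming", "paperwork"]}

for text in excerpts:
    matched = [code for code, words in codes.items()
               if any(word in text.lower() for word in words)]
    print(matched, "->", text)
```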

Creating Code Categories

- Both deductive and inductive methods may be used - code categories work just like variables - different levels of measurement can be used in content analysis

Physical Traces

- Either the erosion or the accumulation of physical substances that can be used as evidence of activity. - For instance, footprints in snow indicate that someone has walked there. - Simply becoming aware of such traces (we might call it "seeing like a detective") can provide social scientists with valuable research data.

Foci of evaluation research

- Evaluation research is social research that is conducted for a distinctive purpose: to investigate social programs. - The evaluation process as a whole, and the feedback process in particular, can only be understood in relation to the interests and perspectives of program stakeholders. - The evaluation process can be modeled as a feedback system, with inputs entering the program, which generate outputs and then outcomes, which feed back to program stakeholders and affect program inputs. - Evaluation research is research for a client, and its results may directly affect the services, treatments, or punishments that program users receive. - There are five primary types of program evaluation: 1. needs assessment, 2. evaluability assessment, 3. process evaluation (including formative evaluation), 4. impact analysis (also termed summative evaluation), and 5. efficiency (cost-benefit) analysis. • Evaluation research raises complex ethical issues because it may involve withholding desired social benefits

Field Notes

- Field research: Research in which natural social processes are studied as they happen and left relatively undisturbed

stakeholders

- Individuals and groups who have some basis of concern with the program. - The evaluation process as a whole, and the feedback in particular, can be understood only in relation to the interests and perspectives of program stakeholders - They might be clients, staff, managers, funders, or the public. - The board of a program or agency, the parents or spouses of clients, the foundations that award program grants, the auditors who monitor program spending, the members of Congress—each is a potential program stakeholder, and each has an interest in the outcome of any program evaluation. - Some may fund the evaluation, some may provide research data, and some may review—or even approve—the research report - Who the program stakeholders are, and what role they play in the program evaluation, can have tremendous consequences for the research.

What are the challenges to conducting field research and how do researchers overcome these challenges?

- It is almost always a mistake to try to take comprehensive notes while engaged in the field—the process of writing extensively is just too disruptive. - The usual procedure is to jot down brief notes about highlights of the observation period. - These brief notes then serve as memory joggers when writing the actual field notes later. - It also helps to maintain a daily log in which each day's activities are recorded - With the aid of the jottings and some practice, researchers usually remember a great deal of what happened—as long as the comprehensive field notes are written immediately afterward or at least within the next 24 hours, and before they have been discussed with anyone else *Field notes: Notes that describe what has been observed, heard, or otherwise experienced in a participant observation study. -These notes usually are written after the observational session + Field researchers cannot help but be affected on a personal, emotional level by social processes in the social situation they are studying. - At the same time, those being studied react to researchers not just as researchers but as personal acquaintances—and often as friends. + Managing and learning from this personal side of field research is an important part of any project.

What are the various ways a survey can be administered and what are the strengths and weaknesses of each?(ie Mailed, group, phone, in-person)

- Mailed (self-administered) survey - Group survey - Phone survey - In person Interview - Electronic survey

Frequency distribution

- Numerical display showing the number of cases, and usually the percentage of cases (the relative frequencies), corresponding to each value or group of values of a variable.
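A minimal sketch, using a made-up set of responses, of building a frequency distribution with the number of cases and the relative frequencies (percentages) for each value.

```python
from collections import Counter

# Made-up responses to a single survey item.
responses = ["agree", "agree", "disagree", "neutral", "agree", "disagree"]

counts = Counter(responses)
n = len(responses)

# Frequency distribution: number of cases and percentage for each value.
for value, count in counts.most_common():
    print(f"{value:10s} {count:3d} {100 * count / n:5.1f}%")
```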

Contrived observation

- Observations of situations in which the researcher has deliberately intervened.

Formative evaluation

- Process evaluation that is used to shape and refine program operations. - Formative evaluation occurs when the evaluation findings are used to help shape and refine the program, for instance by being incorporated into the initial development of the service program. - Evaluation may then lead to changes in recruitment procedures, program delivery, or measurement tools

Confidentiality

- Provided by research in which identifying information that could be used to link respondents to their responses is available only to designated research personnel for specific research needs - Confidentiality is most often the primary focus of ethical concern in survey research. - Usually confidentiality can be protected readily; the key is to be aware of the issue.

Anonymity

- Provided by research in which no identifying information is recorded that could be used to link respondents to their responses - Few surveys can provide true anonymity, where no identifying information is ever recorded to link respondents with their responses.

What is a primary ethical concern with program evaluation?

- Researchers can be subjected to cross-pressures by stakeholders - answering to stakeholders can compromise scientific design standards - researchers may be pressured to avoid null findings or find their research findings ignored - eval reports might need to be overly simplified for a lay audience, and thus subject to some distortion - evaluation research can miss important outcomes or aspects of the program process

Quantitative data analysis (& Advantages)

- Statistical techniques used to describe and analyze variation in quantitative measures

Descriptive statistics

- Statistics used to describe the distribution of and relationship among variables.

Inferential statistics

- Statistics used to estimate how likely it is that a statistical result based on data from a random sample is representative of the population from which the sample is assumed to have been selected.
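A minimal sketch, assuming a hypothetical sample of ratings and a normal approximation, of the inferential step of putting a rough 95% confidence interval around a sample mean.

```python
import math
import statistics

# Hypothetical sample of ratings on a 1-7 scale.
sample = [4, 5, 3, 6, 5, 4, 7, 5, 4, 6, 5, 3]

mean = statistics.mean(sample)
standard_error = statistics.stdev(sample) / math.sqrt(len(sample))

# Rough 95% confidence interval for the population mean (normal approximation).
print(mean - 1.96 * standard_error, mean + 1.96 * standard_error)
```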

What are the strengths of using surveys for data collection?

- Survey research owes its popularity to three advantages: (1) versatility, (2) efficiency, and (3) generalizability. - The versatility of surveys is apparent in the wide range of uses to which they are put, including opinion polls, election campaigns, marketing surveys, community needs assessments, and program evaluations. - Surveys are efficient because they are a relatively fast means of collecting data on a wide range of issues at relatively little cost—ranging from about $10 to $15 per respondent in mailed surveys of the general population to $30 for a telephone survey and then as much as $300 for in-person interview surveys. - Because they can be widely distributed to representative samples, surveys also help in achieving generalizable results
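A quick worked sketch of the cost comparison above, assuming a hypothetical sample of 500 respondents and using the midpoint of the quoted mailed-survey range.

```python
# Hypothetical sample size; per-respondent costs are the figures quoted above.
n = 500
print("mailed:    $", n * 12.50)   # roughly $10-$15 per respondent
print("phone:     $", n * 30)
print("in-person: $", n * 300)
```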

Floaters

- Survey respondents who provide an opinion on a topic in response to a closed-ended question that does not include a "Don't know" option but who will choose "Don't know" if it is available - Many people, for instance, are floaters: respondents who choose a substantive answer even when they really don't know. Asked for their opinion on a law of which they're completely ignorant, a third of the public will give an opinion anyway, if "Don't know" isn't an option. But if it is an option, 90% of that group will pick that answer. - Because there are so many floaters in the typical survey sample, the decision to include an explicit "Don't know" option for a question is important, especially with surveys of less educated populations.
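A quick worked sketch of the floater proportions described above, applied to a hypothetical 900 respondents who know nothing about the law in question.

```python
# Hypothetical counts applying the proportions described above.
uninformed = 900                          # respondents who know nothing about the law
opine_without_dk = uninformed / 3         # about a third give an opinion anyway: 300
switch_with_dk = 0.9 * opine_without_dk   # ~90% of that group pick "Don't know" when offered: 270
print(opine_without_dk, switch_with_dk)
```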

Fence Sitter

- Survey respondents who see themselves as being neutral on an issue and choose a middle (neutral) response that is offered. - Fence-sitters, people who see themselves as being neutral, may skew the results if you force them to choose between opposites.

Grounded Theory

- Systematic theory developed inductively, based on observations that are summarized into conceptual categories, reevaluated in the research setting, and gradually refined and linked to other conceptual categories - The goal of many qualitative researchers is to create grounded theory—that is, to build up inductively a systematic theory that is "grounded" in, or based on, the observations.

Mean

- The arithmetic, or weighted, average computed by adding the value of all the cases and dividing by the total number of cases.

What is the general idea of comparative methods and historical methods?

- The central insight behind both historical and comparative research is that we can improve our understanding of social process when we make comparisons with other times and places. - Much historical research is qualitative. - Like other qualitative methods, qualitative historical research is inductive: it develops an explanation for what happened from the details discovered about the past. - In addition, qualitative historical research is case-oriented; it focuses on the nation or other unit as a whole, rather than only on different parts of the whole in isolation from each other - Related to this case orientation, qualitative historical research is holistic—concerned with the context in which events occurred and the interrelations between different events and processes: "how different conditions or parts fit together" - Qualitative historical research is also likely to be historically specific—limited to the specific time(s) and place(s) studied. - Qualitative historical research uses narrative explanations—in which the research tells a story involving specific actors and other events occurring at the same time or one that accounts for the position of actors and events in time and in a unique historical context - Comparative methods may be cross-sectional, such as when variation between country characteristics is compared, or longitudinal, in which developmental patterns are compared between countries

Cover letter (high quality characteristics?)

- The letter sent with a mailed questionnaire that explains the survey's purpose and auspices and encourages the respondent to participate. - The cover letter, actually, is critical to the success of a mailed survey. - This statement to respondents sets the tone for the entire questionnaire. The cover letter or introductory statement must establish the credibility of the research and the researcher, it must be personalized (including a personal salutation and an original signature), it should be interesting to read, and it must explain issues about voluntary participation and maintaining subject confidentiality. - A carefully prepared cover letter should increase the response rate and result in more honest and complete answers to the survey questions; a poorly prepared cover letter can have the reverse effects.

Central tendency

- The most common value (for variables measured at the nominal level) or the value around which cases tend to center (for a quantitative variable).

Mode (probability average)

- The most frequent value in a distribution; also termed the probability average.

Saturation Point

- The point at which subject selection is ended in intensive interviewing because new interviews seem to yield little additional information.

Median

- The position average, or the point, that divides a distribution in half (the 50th percentile)
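A minimal sketch, using hypothetical income data, of the three averages defined above (mean, median, mode) and of how a single outlier pulls the mean but not the median.

```python
import statistics

# Hypothetical incomes (in thousands); the last value is an outlier.
incomes = [32, 35, 35, 40, 42, 45, 250]

print(statistics.mean(incomes))    # arithmetic average, pulled up by the outlier (~68.4)
print(statistics.median(incomes))  # position average: the 50th percentile (40)
print(statistics.mode(incomes))    # probability average: the most frequent value (35)
```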

Black Box

- The process by which a program has an effect on outcomes is often treated as a "black box," but there is good reason to open the black box and investigate the process by which the program operates and produces, or fails to produce, an effect - Black box or program theory—Do we care how the program gets results - The focus of such research is whether cases have changed as a result of their exposure to the program between the time they entered as inputs and when they exited as outputs - The assumption is that program evaluation requires only the test of a simple input/output model - There may be no attempt to "open the black box" of the program process.

Data cleaning

- The process of checking data for errors after the data have been entered in a computer file
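A minimal sketch of one routine data-cleaning check, flagging values outside the range a variable can legitimately take; the records and valid ranges are invented.

```python
# Invented records and valid ranges, for illustration only.
records = [
    {"id": 1, "age": 34, "satisfaction": 4},
    {"id": 2, "age": 210, "satisfaction": 3},   # impossible age: likely a data-entry error
    {"id": 3, "age": 58, "satisfaction": 9},    # satisfaction scale runs 1-5
]

valid_ranges = {"age": (0, 120), "satisfaction": (1, 5)}

for record in records:
    for variable, (low, high) in valid_ranges.items():
        value = record[variable]
        if not low <= value <= high:
            print(f"Case {record['id']}: {variable}={value} is out of range ({low}-{high})")
```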

Ethnography

- The study and systematic recording of human cultures. - Ethnographic research can also be termed naturalistic because it seeks to describe and understand the natural social world as it really is, in all its richness and detail

Skip pattern

- The unique combination of questions created in a survey by filter questions and contingent questions - Skip patterns should be indicated clearly. - These filter questions create skip patterns. For example, respondents who answer no to one question are directed to skip ahead to another question, but respondents who answer yes go on to the contingent question.

Netnography (cyberethnography and virtual ethnography)

- The use of ethnographic methods to study online communities. - The researcher prepares to enter the field by becoming familiar with online communities and their language and customs, formulating an exploratory research question about social processes or orientations in that setting, and selecting an appropriate community to study. - Unlike in-person ethnographies, netnographies can focus on communities whose members are physically distant and dispersed.

Stages of Interviewing

7: 1.) Thematize the interviews by clarifying the purpose of the interview & the concepts to be explored 2.) Design the interviews by laying out the process through which you'll complete the interviews (contacting, sampling, ethical issues) 3.) Interviewing 4.) Transcribing 5.) Analyzing 6.) Verifying & checking facts 7.) Report the results of your analysis

Reflexivity

- Within sociology more broadly—the field of origin—reflexivity means an act of self-reference where examination or action "bends back on", refers to, and affects the entity instigating the action or examination. - It commonly refers to the capacity of an agent to recognize forces of socialization and alter their place in the social structure. - A low level of reflexivity would result in an individual shaped largely by their environment (or "society"). - A high level of social reflexivity would be defined by an individual shaping their own norms, tastes, politics, desires, and so on.

Archival data

- Written or visual records, not produced by the researcher. - Archival data can be enormously useful, but as always you should be aware in using all sorts of archives that they may not accurately sample or represent reality.

Visual sociology

- a method both to learn how others "see" the social world and to create images of it for further study.

Challenges/ Obstacles to evaluation research

- can miss important outcomes/aspects of the program process - researchers can be subject to cross-pressures by stakeholders - answering to stakeholders can compromise scientific design standards - researchers may be pressured to avoid null findings or find their research findings ignored - eval reports might need to be overly simplified for a lay audience, & thus subject to some distortion

Conversation analysis

- studies the sequence and details of conversational interactions, primarily to understand how people construct social realities through their talk - conversation analysis is a specific qualitative method for analyzing ordinary conversation. - Like ethnomethodology, from which it developed, conversation analysis focuses on how reality is constructed rather than on what it "is." **Three premises guide conversation analysis: 1. Interaction is sequentially organized, and talk can be analyzed in terms of the process of social interaction rather than in terms of motives or social status. 2. Talk, as a process of social interaction, is contextually oriented—it both is shaped by interaction and creates the social context of that interaction. 3. These processes are involved in all social interaction, so no interactive details are irrelevant to understanding it.

Mixed-Mode Surveys

- Using different survey designs at once - allows the strengths of one design to compensate for the weaknesses of another - maximizes participation from different types of respondents

9 Steps to Successful Field Research

1. Have a simple, one-sentence explanation of your project. - People will ask what you're doing, but no one cares to hear all your theories. 2. Be yourself. Don't lie about who you are. First, it's wrong. Second, you'll get caught and ruin the trust you're trying to build. 3. Don't interfere. They got along just fine before you came along, and they can do it again. Don't be a pest. 4. Listen, actively. Be genuinely interested in what they say. Movie stars, politicians, and other celebrities are used to having other people listen to what they say, but that's not true for most people. If you really care to listen, they'll tell you everything. 5. Show up, at every opportunity—3:00 in the morning, or if you have to walk 5 miles. Go to their parties and their funerals. Make a 5-hour trip for a 15-minute interview, and they'll notice—and give you everything you want. 6. Pay attention to everything, especially when you're bored. That's when the important stuff is happening, the stuff no one else notices. 7. Protect your sources, more than is necessary. When word gets around that you can be trusted, you won't believe what people will tell you. 8. Write everything down, that day. By tomorrow, you'll forget 90% of the best material, and then it's gone forever. 9. Always remember: It's not about you, it's about them. Don't try to be smart, or savvy, or hip; don't try to be the center of attention. Stop thinking about yourself all the time. Pay attention to other people

What are the weaknesses of using surveys for data collection?

Cost—in-person interviews are clearly the most expensive type of survey. Phone interviews are much less expensive, and surveying by mail is cheaper yet. Electronic surveys are now the least expensive method, because there are no interviewer costs; no mailing costs; and, for many designs, almost no costs for data entry. (Of course, extra staff time and expertise are required to prepare an electronic questionnaire.)

Qualitative Data Analysis

DEF: Techniques used to search and code textual, aural, & pictorial data and to explore relationships among the resulting categories. - Data are left in words - If the researcher is coding for latent content or if a qualitative assessment is desired, "negative case testing" should be used to analyze the data.

Are qualitative methods usually inductive or deductive?

INDUCTIVE Qualitative methods: Methods, such as participant observation, intensive interviewing, and focus groups, that are designed to capture social life as participants experience it rather than in categories the researcher predetermines. **These methods typically involve exploratory research questions, inductive reasoning, an orientation to social context, and a focus on human subjectivity and the meanings participants attach to events and to their lives.

Manifest Vs. Latent

Manifest content: The visible, surface content; analyzing it is analogous to using a standardized questionnaire. Latent content: The underlying meaning of the content. - Most researchers code for manifest content or code for both manifest & latent content.

What techniques Do Qualitative Data analysts Use?

Most approaches to qualitative data analysis take five steps: 1. Documentation of the data and data collection 2. Conceptualization and coding 3. Examining relationships to show how one concept may influence another 4. Authenticating conclusions by evaluating alternative explanations, disconfirming evidence, and searching for negative cases 5. Reflexivity

What are the different roles for the participant observer? Strengths and weaknesses of each?

Participant observation: A qualitative method for gathering data that involves developing a sustained relationship with people while they go about their normal activities. - The term participant observer actually represents a continuum of roles: 1.) being a complete observer who does not participate in group activities and is publicly defined as a researcher + In complete observation, researchers try to see things as they happen, without actively participating in these events. + Complete observation: A role in participant observation in which the researcher does not participate in group activities and is publicly defined as a researcher 2.) being a covert participant who acts just like other group members and does not disclose his or her research role. + Complete (covert) participation: A role in field research in which the researcher does not reveal his or her identity as a researcher to those who are observed + Complete participation: Some field researchers adopt a complete participation role in which one operates as a fully functioning member of the setting. + Most often, such research is also covert, or secret—other members don't know that the researcher is doing research - Weaknesses of complete (covert) participation: Covert participants don't disrupt their settings, but they do face other problems. They must write up notes from memory and must do so when it would be natural for them to be away from group members. - And researchers' spontaneous reactions to every event are unlikely to be consistent with those of the regular participants * Many field researchers develop a role between these extremes (mixed participation or observation), publicly acknowledging being a researcher but nonetheless participating in group activities. - Such fieldwork or field research, going out to where people really live and work, is a means for seeing the social world as the research subjects see it, in its totality, and for understanding subjects' interpretations of that world - Participant observers seek to avoid the artificiality of experimental designs and the unnatural structured questioning of survey research - This method encourages consideration of the context in which social interaction occurs, of the complex and interconnected nature of social relations, and of the sequencing of events - Through it, we can understand the mechanisms (one of the criteria for establishing cause) of social life.

What are the different techniques for qualitative data collection?

Qualitative methods refer to several distinct research activities: participant observation, intensive interviewing, and focus groups.

What are the strengths and weaknesses of qualitative methods?

Qualitative methods: Methods, such as participant observation, intensive interviewing, and focus groups, that are designed to capture social life as participants experience it rather than in categories the researcher predetermines. - These methods typically involve exploratory research questions, inductive reasoning, an orientation to social context, and a focus on human subjectivity and the meanings participants attach to events and to their lives. - Qualitative methods refer to several distinct research activities: participant observation, intensive interviewing, and focus groups. - Qualitative researchers typically begin with an exploratory research question about what people think and how they act, and why, in some social setting. This research approach is primarily inductive. • The designs focus on previously unstudied processes and unanticipated phenomena because previously unstudied attitudes and actions can't adequately be understood with a structured set of questions or within a highly controlled experiment. • Qualitative designs have an orientation to social context, to the interconnections between social phenomena rather than to their discrete features. • The designs focus on human subjectivity, on the meanings that participants attach to events and that people give to their lives. • The designs have a sensitivity to the subjective role of the researcher. Qualitative researchers consider themselves as necessarily part of the social process being studied and, therefore, keep track of their own actions in, and reactions to, that social process. *3 qualitative methods that illustrate the flexibility of this approach: ethnography, netnography, and ethnomethodology. - how to collect data using three different qualitative strategies: participant observation, intensive interviewing, and focus groups.

