IS Method

Postmodernism

"Views knowledge as relative and context dependent Questions the assumption of objectivity

Advantages and disadvantages of interviews

Advantages: learn about people's interpretations and points of view; compare between different interpretations; insight into important narratives; insight into local norms, ideals and values.
Disadvantages: difference between saying and doing; the knowledge gained depends on the questions asked; only targets explicit knowledge that can be put into words; complex social situations are hard to capture.

What a theory consists of

All theories are constrained by their specific critical bounding assumptions. Within these boundaries lies the stuff of theory.

From cross-section to multilevel design

It is possible to include/control for context in cross-sectional data. But can we really control all context effects through cross-sectional data? Cross-sectional data mostly ignores embeddedness and assumes each observation is independent.

Mixed Methods Research Questions for Qualitative Comparative Designs

A mixed methods question can also be designed such that it embeds a qualitative research question that involves comparison (i.e., pairwise sampling designs, subgroup sampling designs, nested sampling designs, multilevel sampling designs). Qualitative Comparative Analysis (QCA) is a means of analysing the causal contribution of different conditions (e.g. aspects of an intervention and the wider context) to an outcome of interest.
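Below is a minimal sketch of the crisp-set idea behind QCA: cases are grouped by their configuration of conditions and each configuration's consistency with the outcome is computed. The condition names (training, management) and data are invented for illustration; real QCA adds calibration, consistency/coverage thresholds and Boolean minimization.

```python
# Minimal crisp-set QCA-style truth table (hypothetical conditions/outcome).
# Each row is a case; 1 = condition present, 0 = absent.
import pandas as pd

cases = pd.DataFrame({
    "training":   [1, 1, 0, 1, 0, 0, 1, 0],   # condition A (hypothetical)
    "management": [1, 0, 1, 1, 0, 1, 0, 0],   # condition B (hypothetical)
    "adoption":   [1, 0, 1, 1, 0, 0, 0, 0],   # outcome of interest
})

# Group cases by their configuration of conditions and check how
# consistently each configuration leads to the outcome.
truth_table = (cases
               .groupby(["training", "management"])["adoption"]
               .agg(n="size", consistency="mean")
               .reset_index())
print(truth_table)
```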

Abduction

A form of inference that moves from a surprising observation to the most plausible explanation of it; it sits between induction and deduction and is often used to generate new concepts or tentative theories from data.

Observation vs. participation

Observation privileges eyesight and the "neutral" researcher. Participation requires total sensory presence and a dedicated researcher: engaging in activities and observing them; insider/outsider perspective; the ability to shift between different positions; the risk of "going native" or "staying native".

What to do about CMV bias? before analysis

Obtain measures of the predictor (IV) and criterion (DV) variables from different sources (not appropriate if both IV and DV are personal attitudes, feelings, perceptions). Temporal, proximal or psychological separation between variables: separating the constructs as far apart in the survey as possible, or administering parts of the survey days/months apart. Avoid complicated sentences. Conditional reasoning: how do we access something that is not even known to the individual himself/herself? The Conditional Reasoning Framework holds that the implicit (unknown) relies heavily on rationalizations and psychological defences.

Ontology

Ontology refers to assumptions about the nature of reality.

Multilevel research design- multilevel models

Partition variance so we can explain within-group and between-group variance simultaneously. Designed to bridge the micro and macro perspectives. Allows tests of multilevel theories. More flexibility in terms of measures.
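As a rough illustration of variance partitioning, the sketch below fits a random-intercept (null) model with statsmodels and computes the intraclass correlation. The variables (score, team) and the simulated data are assumptions for the example only.

```python
# Minimal random-intercept model sketch (hypothetical variables: individual
# "score" nested in "team"). Partitions variance into between-team and
# within-team components.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
teams = np.repeat(np.arange(30), 10)                  # 30 teams x 10 members
team_effect = rng.normal(0, 1.0, 30)[teams]           # between-team variance
score = 5 + team_effect + rng.normal(0, 2.0, 300)     # within-team variance
df = pd.DataFrame({"score": score, "team": teams})

# Null (intercept-only) multilevel model
m = smf.mixedlm("score ~ 1", df, groups=df["team"]).fit()
between = float(m.cov_re.iloc[0, 0])   # between-group variance
within = m.scale                       # within-group (residual) variance
icc = between / (between + within)
print(f"ICC = {icc:.2f}")              # share of variance lying between teams
```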

Survey

Produces a systematic set of data (variable-by-case data). Cross-sectional, time-series/panel or multilevel research design. Naturally occurring variance. Can provide descriptives for a set of cases. Aspires to find the causes of phenomena by comparing cases to each other, looking for patterns/systematic differences between groups of cases.

Non-probability sampling methods

Sometimes we do not necessarily want to generalize to the entire population; we want to describe or explore novel phenomena and therefore want to find a wide variety of units that might somehow be affected by the studied phenomenon.

TYPES OF INTERVIEWS

Structured, semi-structured, unstructured

Mixed Methods Research Questions for Experimental Research Designs

Studying different persons' experiences of a specific topic. The experiences of the experimental group would then be compared to those of the control group. The overall mixed methods research design would be concurrent in nature.

Subjectivism

Subjectivism incorporates assumptions of the arts and humanities (Table 4.1), asserting that social reality is made from the perceptions and consequent actions of social actors (people). Ontologically, subjectivism embraces nominalism (also sometimes called conventionalism). Nominalism, in its most extreme form, considers that the order and structures of the social phenomena we study (and the phenomena themselves) are created by us as researchers and by other social actors through the use of language, conceptual categories, perceptions and consequent actions.

Problems with survey

Superficial measures? Standardized instruments administered in a standard way. Self-reported data. Causality.

Internal validity

Transparency: clearly lay out all the steps made. Checking for representativeness (not statistical): who do I need to talk to to get a balanced view? Checking for researcher effects: do I influence people in some way? Triangulation: can I support my finding with other methods? Seeking disconfirmation: e.g. seek outliers, the logic of falsification.

Research designs include

Unit of observation: depends on the phenomenon we want to study; who do we want to do the research on? Condition/treatment/independent variable. Number of replications. Level of analysis: depending on the theoretical framework, where does the phenomenon of interest lie? Are the DV and IVs at the individual level or a higher level of analysis?

Quality Criteria across paradigms (quality in qualitative research)

Worthy topic, rich rigor, sincerity, credibility, resonance, significant contribution, ethical, meaningful coherence

Things to consider when doing interviews:

Do your interview questions/interview guide relate to your research question? Does your interview guide contain a good mixture of different kinds of questions (e.g. probing, specifying)? Does it include requests for information about the interviewee (e.g. age, profession)? Is the language in your questions clear, comprehensible, and free of unnecessary academic and technical terms? Do your questions offer a real prospect of capturing your interviewees' perspective rather than leading and imposing your frame of reference on them?

Critical Realism

a paradigm that holds things are real insofar as they produce effects

Quantitative advantages in mixed methods

Scope: many instances/cases represent populations; undeniability through numbers. Statistical generalisation: samples as representations of populations; transferable through prediction of similar instances. Validity (internal/external) through a systematic approach. Objective data analysis.

Positivism

the belief that knowledge should be derived from scientific observation

Connection to "The role of theory"

The paper is about the theory-building process, but focuses on explanation and prediction; this is also Type IV in Gregor's taxonomy. A major reason for the popularity and relevance of theory building from case studies is that it is one of the best ... bridges from rich qualitative evidence to mainstream deductive research (= Types III and IV in Gregor's taxonomy).

Induction

the process that moves from a given series of specifics to a generalization

Complementary strengths and weaknesses of qualitative and quantitative methods in MM

-

Definition of qualitative research question types

-

Quantitative research question types

-

Strengths of qualitative and quantitative methods

-

Weaknesses of qualitative and quantitative methods

-

Mixed method process for RQ

1. Determining the goal of the study
2. Formulating the research objective(s)
3. Determining the research/mixing rationale
4. Determining the research/mixing purpose
5. Determining the research question(s)
6. Selecting the sampling design
7. Selecting the mixed methods research design
8. Collecting the data
9. Analyzing the data
10. Validating/legitimating the data
11. Interpreting the data
12. Writing the mixed methods research report
13. Reformulating the research question(s)

Mixed Methods Research Questions for Causal-Comparative Research Designs

A causal-comparative design is a research design that seeks to find relationships between independent and dependent variables after an action or event has already occurred. The researcher's goal is to determine whether the independent variable affected the outcome, or dependent variable, by comparing two or more groups of individuals.

The conceptual framework

A map of what you are investigating. Explains the main things to be studied: a display of your main conceptual ideas, key factors and constructs, and how you think they are related. When multiple researchers are involved, it helps focus the work and create agreement on terms and on the subject and object of study. Central both as an outset for the study and as an outcome of the study; the map may change.

Pragmatism

A philosophy which focuses only on the outcomes and effects of processes and situations.

Variable def.

A variable may be defined as an observable entity which is capable of assuming two or more values

Aims and scope of ethnography

Aims at making sense of different people's experiences of everyday social and cultural processes through in-depth, in situ observation (Crang & Cook, 2008). Involves the ethnographer participating in people's daily lives for an extended period of time (Hammersley & Atkinson, 1995).

A Taxonomy of Theory Types in Information Systems Research

Analysis, explanation, prediction, explanation and prediction, design and action

External validity

Are your findings "transferable"? Can the result be transferred from one organization to another? Generalize to theory and not to populations. Replicating findings. Multiple case studies. Triangulate to theory: use various findings and put them together.

What can observations do?

Attitudes versus behaviour: too often we measure attitudes while drawing conclusions about behaviour or actual outcomes. Behaviour is difficult to access: surveys and interviews capture attitudes or reconstructed accounts, and people are not always good at giving accounts of what they do.

Axiology

Axiology refers to the role of values and ethics within the research process. This incorporates questions about how we, as researchers, deal with both our own values and those of our research participants.

Before, during and after the interview:

Before: provide the interviewee with information about the research (purpose, themes of the interview, no right or wrong answers). During: read and respond to the body language of the interviewee; use the interviewee's own words; do not end the interview on a controversial, sensitive topic. After: write down short reflections immediately after.

Pretesting:

Behaviour coding: monitoring pre-test interviews to note when respondents deviate from the script; includes codes for a misread question, a request for more information, or sounding unsure when starting the answer. Cognitive pretesting: respondents "think aloud" while answering; an obvious limitation is that subconscious processes are not accessible to the respondent, so they cannot be shared.

Problems when not taking account of context (at all) in cross-sectional data:

Completely misleading coefficients; underestimated or overestimated effects.
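A small simulated example (all numbers invented) shows how badly pooled cross-sectional coefficients can mislead when group context is ignored: within every group the slope is positive, but the pooled regression estimates a negative one.

```python
# Simulated illustration (hypothetical data): within each group x has a
# positive effect on y, but group means differ so that a pooled regression
# that ignores context estimates a negative slope.
import numpy as np

rng = np.random.default_rng(0)
x_parts, y_parts = [], []
for x_mean, y_mean in [(0, 10), (5, 5), (10, 0)]:
    x = x_mean + rng.normal(0, 1, 100)
    y = y_mean + 0.8 * (x - x_mean) + rng.normal(0, 1, 100)  # positive within-group slope
    x_parts.append(x)
    y_parts.append(y)

x_all, y_all = np.concatenate(x_parts), np.concatenate(y_parts)

pooled_slope = np.polyfit(x_all, y_all, 1)[0]
within_slopes = [np.polyfit(xp, yp, 1)[0] for xp, yp in zip(x_parts, y_parts)]
print("pooled slope:", round(pooled_slope, 2))                         # negative (misleading)
print("within-group slopes:", [round(s, 2) for s in within_slopes])    # all close to 0.8
```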

Mixed Methods Research Questions for Descriptive Research Designs

Concurrent: the overall mixed methods research design would be concurrent because the quantitative phase of the study did not inform or drive the qualitative phase, or vice versa. Sequential: the quantitative research design would then be descriptive in nature and the qualitative research design would most likely be phenomenological; the overall mixed methods research design would most likely be sequential instead of concurrent because the quantitative phase of the study would inform the qualitative phase.

Evaluating theory 2

Consistency: a theory must not contradict established empirical fact and should, as much as possible, also be able to predict established empirical fact.
Interestingness: a theory should make new and useful predictions.
Corroboration: the ratio of empirically corroborated to empirically falsified hypotheses should be as high as possible.
Accuracy: quantitative predictions made by a theory should fit independently collected empirical data as closely as possible.
Parsimony: the theory should be as simple as possible.

Construct def.

Constructs may be defined as "terms which, though not observational either directly or indirectly, may be applied or even defined on the basis of the observables". OR: a construct is a hypothesised property or entity in terms of which the elements of a system or population are assumed to differ, e.g. personality factors.

Sample size

Depends on the degree of accuracy we want to achieve and the extent of variation actually occurring in the population on the key variables we want to observe. The stronger the real relationship in the population, the smaller the sample needed. It also depends on the type of analysis: experiments and multilevel designs break the data down into subgroups, and an exploratory factor analysis for a completely new scale has its own requirements.
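For a concrete starting point, a standard power analysis can translate an assumed effect size, alpha and desired power into a required sample size. The values below (medium effect, 5% alpha, 80% power) are illustrative assumptions only.

```python
# Rough sample-size sketch using a power analysis for a two-group comparison
# (assumed effect size, alpha and power; adjust to your own study).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,   # assumed medium effect (Cohen's d)
                                   alpha=0.05,        # significance level
                                   power=0.8)         # desired power
print(f"~{n_per_group:.0f} respondents per group")    # roughly 64 per group
```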

Qualitative characteristics

Depth and richness of data: people's stories, their experiences; you can see their expressions. Deep knowledge of individual instances/cases. Understanding the role of context. Preservation and understanding of chronological flow. Grasping processes. Establishing links between events and outcomes. Causal explanations?

Qualitative advantages in mixed methods

Depth: few, theoretically relevant instances/cases; undeniability through detail. Analytical generalisation: generalise to theory; transferable through analogy (maybe it is the same in another organization, but we cannot generalize). Unique or general aspects? Validity: transparency in method choices; triangulation. Subjective data analysis?

What you get from interviews

Descriptions of past and present experiences and actions. Interpretations of the logics linking actions to meanings, aims and outcomes. Validation of immediate analyses.

When to do observations?

Discovery: some topics/people can only be studied in this way. When individuals are not the units of analysis, or when individuals are interacting or embedded in their context. When studying action, interactions and composites. Early-stage, unstructured exploration as a basis for later interviews etc.

Cognitive process when answering a question

Do all people understand the same question in the same way? Each person develops a pragmatic meaning that is shaped by their personal experiences: what is the researcher "really" asking, why is he asking that, and what would be an acceptable answer?
When constructing an answer, respondents tend to abide by conversational norms: each new answer should contain new information.
Cognitive heuristics: the speed of retrieving memories shapes judgments of frequency.
Reporting an answer in a predetermined way will necessarily influence the answer: differences between open and closed questions apply here, as do differences between ranking and rating scales and the type of administration.

Epistemology

Epistemology concerns assumptions about knowledge, what constitutes acceptable, valid and legitimate knowledge, and how we can communicate knowledge to others.

Ethnography definition

"Ethnography is the work of describing a culture. The central aim of ethnography is to understand another way of life from the native point of view. [...] Fieldwork, then, involves the disciplined study of what the world is like to people who have learned to see, hear, speak, think, and act in ways that are different. Rather than studying people, ethnography means learning from people" (Spradley 2016, p. 3)

Evaluating theory

Falsifiability: assesses whether it is possible to disprove the theory. A falsifiable theory fulfils: non-contradictory (X is a necessary and sufficient condition for Y and vice versa), non-tautological (what the theory predicts must not already be concluded in your assumptions and relationships/RQ), and measurable (all constructs in the theory can be validly measured). Utility: the degree to which the theory is useful ("usefulness").

Survey procedure

From RQ -> research design -> what do I want to know from the survey? Who is my population and how do I sample? List of control/independent/dependent variables (keeping the time limit in mind). Search for existing scales. Design the intro mail and the survey in Survey Exact. Piloting (skips, piping, wording, resolution, randomization). Any foreseeable bias? Revise the order (from general to specific, with difficult/uncomfortable questions at the end). Administer + reminders. Analysis.

A model for qualitative research design

Goals - why are you doing this study? Why is it necessary that we know about this problem? Conceptual Framework - What do you think is going on? Research questions - What do you want to understand? Methods - What will you actually do? Validity - How might you be wrong?

When are Unstructured interviews useful?

If the interviewer has limited prior knowledge. Sometimes used as a precursor to semi-structured interviews. When we don't know what is there. If long narratives are needed (process). In connection with other methods, e.g. participant observation.

Weaknesses and challenges of mixed methods

Increased cost in terms of time and resources. Alignment (qual and quant may drift). Conceptual differences. Researcher as a "jack of all trades and master of none"?

When are structured interviews useful?

Information gathering/fact checking. For hypothesis testing. When strong comparability is necessary/desired. Very specific deductive research processes where surveys are not available.

Stages of the process

Initial design of the entire study: sampling, timing, how to deliver. Construction and testing of the questionnaire/survey. Collection of data. Analysis and presentation of results.

Qualitative data collection methods

Interviews (this lecture deals exclusively with interviews): "what people say they do" and explanations of why people do what they (say they) do. Observations: "what people actually do". Document data: "unbiased by the researcher". (Feeling: the researcher's experience and subjectivity.)

Asking questions

Introducing questions > "Please tell me about when your interest in X first began."
Follow-up questions > "What do you mean by that?"
Probing questions > "Could you say something more about that?"
Specifying questions > "What effect did X have on you?"
Direct questions > "Are you happy with X?"
Indirect questions > "What do most fellow students in school think of X?"
Structuring questions > "I would now like to move on to a different topic."
Silence
Interpreting questions > "Is it correct to say that what you are suggesting is that ...?"

From cross-section to longitudinal design

It is possible to record past events in classical cross-sectional data (retrospective questions). But can we really analyse processes of change over time through cross-sectional data (establishing causality)? Once a decision is made, cognitive dissonance makes it seem a more "right" decision over time. Interdependence between individuals and the creation of institutionalized routines will hold the decision in place over time. Inertia is detected by a Markov effect, which assumes that the probability of a future event depends on the state attained in the previous event.
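A toy transition matrix (probabilities invented) illustrates the Markov idea of inertia: once the "keep the decision" state is reached, the process tends to stay there.

```python
# Tiny Markov-chain illustration of inertia (hypothetical probabilities):
# the state reached at t strongly predicts the state at t+1, so a decision,
# once made, tends to persist over time.
import numpy as np

# rows = current state, columns = next state; states: 0 = keep decision, 1 = change
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

state = np.array([1.0, 0.0])          # start right after making the decision (state 0)
for t in range(5):
    state = state @ P
    print(f"t={t + 1}: P(keep) = {state[0]:.2f}")
```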

Quantitative and qualitative in research design

See picture

Be aware of (questions) /biases

Labels: rating scales should include words next to each point, not just a number or a graphical display.
Giving respondents a NO OPINION option as a partial filter for non-reliable responses: a no-opinion option does not improve reliability.
Conversation conventions: respondents might interpret information conveyed later in a sentence as more important than information provided earlier.
Satisficing: a respondent conserves energy by compromising the standards of answering the questionnaire and trying to complete it as fast as possible. Conditions that foster satisficing: task difficulty, the respondent's ability, the respondent's motivation.
Acquiescence: the tendency to endorse any assertion made in a question, regardless of its content, perhaps due to the human tendency to want to appear agreeable or due to cognitive biases.

Tradeoff (Kvant)

Laboratory experiment: very good internal validity due to high control of the treatment setting. Field experiment: loss of control over the environment. Sample survey: potentially very good external validity if sampling randomly from the population, and also a good chance of statistical conclusion validity. Triangulation can be applied at different stages of the design: the context for data collection, sources of data, measures, and the research design itself.

Qualitative disadvantages

Labour intensiveness and extensiveness. Time demanding. Frequent data overload. Researcher reflexivity and integrity. Adequacy of sampling. Generalizability of findings. Credibility and quality of conclusions and their utility in the world.

Avoid (when writing questions)

Leading questions, ambiguous questions, vague terms, overly specific terms.

Question type

The main decision about question type concerns the response format.
Open formats allow respondents to answer exactly according to their own opinion, but some respondents might be less articulate, and coding the answers is time-consuming and expensive. Open-ended questions can offer prompts (a list of possible choices), e.g. "Where did you find out about this offer?"
Closed formats force respondents to choose from "acceptable" answers and also prime (influence) their answers. If respondents cannot find an appropriate answer, it can intimidate or anger them and they quit. The pre-set range should be well tested and always followed by the option "other".
Ensuring that the correct group of respondents is targeted will make it easier to create good closed-ended questions.

Objectivism

Objectivism incorporates the assumptions of the natural sciences, arguing that the social reality that we research is external to us and others. This means that, ontologically, objectivism embraces realism, which, in its most extreme form, considers social entities to be like physical entities of the natural world in so far as they exist independently of how we think of them, label them, or even of our awareness of them. Epistemologically, objectivists seek to discover the truth about the social world through the medium of observable, measurable facts, from which law-like generalisations can be drawn about the universal social reality. Axiologically, since social entities and social actors exist independently of each other, objectivists seek to keep their research free of values, which they believe could bias their findings.

Approaches to qualitative research

Phenomenology + narrative research: individual lived experience; wants to understand how people understand things. Grounded theory: go from case to case and develop an overall theory. Case study: study something in its context; culture, understanding. Ethnography: culture-sharing behaviour of individuals or groups. All of these approaches try to discover regularities and explore processes, activities and events.

What is the paradigmatic point of departure? (IS Research)

Positivism: logical positivism is mentioned in many places (e.g. p. 614; p. 615 in particular discusses this position in depth). Type III and Type IV theories include prediction with propositions and hypotheses, which is positivistic. Interpretivism: Type II, "theory of explaining": it "can be seen that forms of this type of theory correspond reasonably closely to some views of theory in the interpretivist paradigm" (p. 624). Critical theory: very briefly mentioned on p. 622: "Critical theory seeks to bring about improvements in the lives of human actors. Theory labeled normative has an ethical or moral dimension in addressing what should be done... Again, all of the theory types depicted here could have social or political implications". Pragmatism: not mentioned in the paper, but it could be argued that theory type V, "Design and action", implicitly subscribes to a pragmatic view.

Qualitative and quantitative methods grow out of different paradigms

Quantitative: realist ontology and objectivist epistemology. Qualitative: relativist ontology and subjectivist epistemology. Post-positivist approach to ontology and epistemology: social reality exists independently of our perception, yet different methods may be needed to uncover (incomplete) accounts of it. Pragmatism: an approach to knowing that uses a "what works" approach; recognizes the existence of the natural physical world as well as the social and psychological world (language, culture, human institutions, subjective thoughts); rejects a forced choice between positivism and constructionism.

Deduction (deductive reasoning)

Reasoning that begins with a general principle and concludes with a specific instance that demonstrates the general principle. Reasoning from the general to the particular.

From experiments to survey

Regression discontinuity design: as close to random assignment as possible; mimics an experiment by analysing the variance between the treatment and non-treatment groups before and after the intervention, basically comparing against "what would the regression look like if treatment was not received". Matching: as close to random assignment as possible; starting from the group that has already received the "treatment", the control group is constructed backwards by matching each treated individual with a non-treated case that is as similar as possible.
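The sketch below illustrates the core of the matching idea with a single hypothetical covariate (age): each treated case is paired with the most similar untreated case and the matched outcome differences are averaged. Real matching typically uses several covariates or propensity scores.

```python
# Minimal nearest-neighbour matching sketch (hypothetical covariate "age"):
# for each treated case, find the most similar untreated case and compare
# the matched outcomes. Not the full procedure, just the core idea.
import numpy as np

rng = np.random.default_rng(2)
age = rng.integers(20, 60, 200)
treated = rng.random(200) < 0.3
outcome = 50 + 0.5 * age + 5 * treated + rng.normal(0, 3, 200)   # true effect = 5

treated_idx = np.where(treated)[0]
control_idx = np.where(~treated)[0]

effects = []
for i in treated_idx:
    # match on the covariate: closest untreated case by age
    j = control_idx[np.argmin(np.abs(age[control_idx] - age[i]))]
    effects.append(outcome[i] - outcome[j])

print("matched estimate of treatment effect:", round(float(np.mean(effects)), 2))
```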

What theory is not:

Simple lists of concepts, constructs or variables. Category systems in one or more dimensions (taxonomies, typologies, 2×2 matrices). Metaphors (such as organisational learning, sensemaking or absorptive capacity). The truth. The ultimate goal of research.

Probability sampling methods

Simple random sampling, systematic sampling, stratified sampling, multi-stage cluster sampling.
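The sketch below shows what each method looks like on a hypothetical sampling frame (the stratum and cluster columns are invented); for multi-stage cluster sampling only the first stage, sampling clusters, is shown.

```python
# Sketches of the four probability sampling methods on a hypothetical frame.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
frame = pd.DataFrame({"id": range(1000),
                      "stratum": rng.choice(["A", "B", "C"], 1000),
                      "cluster": rng.integers(0, 50, 1000)})

# Simple random sampling: every unit has the same chance of selection
simple_random = frame.sample(n=100, random_state=3)

# Systematic sampling: random start, then every 10th unit
start = rng.integers(0, 10)
systematic = frame.iloc[start::10]

# Stratified sampling: sample the same fraction within each stratum
stratified = frame.groupby("stratum").sample(frac=0.1, random_state=3)

# Multi-stage cluster sampling, stage 1: randomly select whole clusters
chosen_clusters = rng.choice(frame["cluster"].unique(), size=5, replace=False)
cluster_sample = frame[frame["cluster"].isin(chosen_clusters)]
```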

Why use mixed methods?

Social reality is multi-faceted and complex, so methods need to be too: increase complexity, get a broader picture. Increased validity: the same findings with different methods (not all MM designs). Balance out the weaknesses of the two methods. Generate and validate theory in the same (overall) study.

Representativeness vs. large samples

Some individuals might be harder to reach, so no matter how large the sample is, it might still be biased. A low response rate from a specific group (rather than in general) can introduce bias. Small samples are still problematic, since the size of a sample is directly linked to the precision of estimates based on the sample (how confident we can be in the accuracy of the results).
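A quick calculation makes the precision point concrete: the margin of error for a proportion shrinks only with the square root of n, and no increase in n removes the bias from a group that never responds.

```python
# Precision grows with sample size, but bias does not go away:
# margin of error for an estimated proportion of 0.5 at a 95% confidence level.
import math

for n in (100, 400, 1600, 6400):
    moe = 1.96 * math.sqrt(0.5 * 0.5 / n)
    print(f"n={n:5d}: margin of error = +/-{moe:.1%}")
# Halving the margin of error requires quadrupling n; if a subgroup never
# responds, no value of n removes that bias.
```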

Theory process

Step 1: Observation.
Step 2: Empirical generalization. An empirical generalization is "an isolated proposition summarizing observed uniformities of relationships between two or more variables".
Step 3: Turning empirical generalizations into theories.
Step 4a: Hypothesis generation.
Step 4b: Hypothesis testing.
Step 5: Logical deduction. Next, we close the gap between theory and the empirical results. Logical deduction requires that we return to our original research question and ask ourselves if the results make sense or at least contribute to the theory on a more specific level. In general, there are three possible outcomes at this point: (1) "lend confirmation to" the theory by not disconfirming it; (2) "modify" the theory by disconfirming it, but not at a crucial point; or (3) "overthrow" the theory by disconfirming it at a crucial point in its logical structure, in its history of competition with rival theories.

Observation: What can you observe?

Systematic events: reports, team meetings, work tasks. Individual participants. Interactions and conversations. Practices. Material context. Your own behaviour.

Aim and contribution of Gregor

The aim of this essay is to examine the structural nature of theory in the discipline of Information Systems The paper contributes by showing that multiple views of theory exist and by exposing the assumptions underlying different viewpoints

The role of interviewer:

The balance of maintaining listening and concentration abilities while keeping dialogue The balance of understanding how you come across as an interviewer in an ideal way while not putting on an act The balance between being seen as friendly and trustworthy while keeping an emotional distance The balance between being informed and not coming across as knowing everything in advance

Common method variance bias

The covariance between two constructs/measures might be caused by using the same method to collect data on both constructs, rather than by the true relationship that we want to measure. In other words, the variation in respondents' answers is caused by the instrument/method we use to ask the questions, not by the respondents' actual characteristics, so the answers are biased. We may have asked two questions in a way that makes respondents perceive them as the same, even though they really have two different meanings; the two questions therefore end up with an inflated correlation, which is a problem. "Method"/instrument = similarities in item structure and wording. Method factors can negatively influence estimates of construct reliability and validity.
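A small simulation (constructs and loadings invented) makes the inflation concrete: two traits that are truly unrelated appear clearly correlated once their items share a common method factor.

```python
# Simulation of common method variance (hypothetical constructs): two
# unrelated traits measured with the same instrument pick up a shared
# method factor, so their observed correlation is inflated above zero.
import numpy as np

rng = np.random.default_rng(4)
n = 500
trait_a = rng.normal(size=n)          # true construct A
trait_b = rng.normal(size=n)          # true construct B, independent of A
method = rng.normal(size=n)           # shared method factor (same survey, same wording)

item_a = trait_a + 0.7 * method + rng.normal(0, 0.5, n)
item_b = trait_b + 0.7 * method + rng.normal(0, 0.5, n)

print("true correlation      :", round(float(np.corrcoef(trait_a, trait_b)[0, 1]), 2))  # ~0
print("observed (same method):", round(float(np.corrcoef(item_a, item_b)[0, 1]), 2))    # clearly > 0
```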

A theory consists of assumed causal relationships between constructs, and each causal relationship consists of

The effect that a theory tries to explain, one or more causes, a mechanism (interconnected processes that transform cause into effect), and boundary conditions.

Different views on theory

The empiricist view: Theories are inductive generalisations of observations, based on empirical regularities The rationalist view: Theories about the empirical world are systems of concepts and statements that may be based on past observations, but also shape our selection and interpretation of future observations The pragmatic view: Research is an iterative process. In the early stages, exploratory approaches are most useful because they enable us to better understand the scope of a phenomenon: gathering data from multiple sources, using complementary data collection methods, attempting to identify regularities in the data, inductive development of tentative theories

Research design as variance control (MAXMINCON)

The main function of a good research design is to explain and control variance (MAXMINCON).
Maximize systematic variance (variance you can predict): in all research designs we want as much variance caused by or associated with the interesting independent variables as possible. This is done by making sure the treatments are really different and making sure data collection occurs in a context where variance is present.
Control extraneous systematic variance (variance we are not interested in but still want to control): there will always be variables that cause variance in the dependent variable although we are not interested in their effects; they need to be minimized, nullified or at least isolated so we can pull them apart. This is done by eliminating the variance due to such a variable, by randomization (see the sketch below), or by building the variable into the design.
Minimize error or random variance (avoid anything polluting your results): random errors tend to balance each other out (mean zero), but because they are not systematic they are unpredictable (impossible to explain). This is done by increasing control of conditions and increasing the reliability of measures.
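The randomization sketch referenced above uses a hypothetical extraneous variable (experience) to show the "control extraneous variance" step: random assignment spreads the variable evenly across groups, so it cannot create systematic differences between them.

```python
# Small illustration of the "CON" part of MAXMINCON: randomization spreads an
# extraneous variable (hypothetical "experience") evenly across treatment
# groups, so it cannot create systematic differences between them.
import numpy as np

rng = np.random.default_rng(5)
experience = rng.integers(0, 20, 1000)           # extraneous variable we cannot manipulate
group = rng.permutation(np.repeat([0, 1], 500))  # random assignment to control/treatment

print("mean experience, control  :", round(float(experience[group == 0].mean()), 2))
print("mean experience, treatment:", round(float(experience[group == 1].mean()), 2))
# The two means are close: the extraneous variance is not confounded with treatment.
```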

Doing qualitative research

Theory building: regularities, sequences, "mechanisms"; look for patterns you can propose, and look for mechanisms. Thick description (Geertz 1973): theory-based (deep) understanding of empirical phenomena. Context sensitive: adjust the methods to the context/phenomenon; build the methods so they fit a specific context. Dependent on researcher judgment and experience, in data collection and in data analysis.

Why do ethnography

To gain in-depth knowledge about: social processes, actions and events as they unfold; perspectives and practices of people, "the insider's view"; people's knowledge or meanings that do not exist at a highly articulated or reflexive level. Helps gain a holistic picture of a social phenomenon.

5 different mixed method purposes

Triangulation: convergence of results; what we find here, we also find in another place. Complementarity: different facets of a phenomenon; the studies complement each other to give us more insight. Development: a development purpose (process); go through a development in one study and use the result in the second study. Initiation: use of discrepancies to start new research; look for differences and paradoxes (we need to study this more, because there is a problem). Expansion: test/focus on processes (qual?) and outcomes (quant?); expand our knowledge using different methods; broader knowledge.

Mixing qualitative and quantitative research: in what order

Use qualitative methods to understand mechanisms underlying relationships identified in a quantitative study Use qualitative findings to generate hypotheses to be tested with a large-n sample in a quantitative study Quantify qualitative data to validate observed patterns

Research quality (kvant)

Statistical conclusion validity: is the design precise and powerful enough to detect the relationship between variables if it indeed exists?
Internal validity: can we establish causality between the treatment and the effect (IV and DV)? Causality means that a change in one variable causes the change in another variable, other things being equal. It requires empirical association (covariance/correlation: if the IV goes up, so does the DV, and vice versa), temporal precedence of the IV (the cause must come before the effect), and absence of alternative explanations (nonspuriousness: the lack of other viable explanations for the relationship).
Construct validity: how good are our measurements? Do they represent the phenomena we want to capture?
External validity: to what extent can we generalize?
Ecological validity: would we find the same results about the phenomena in a real-life setting? Does the effect exist outside the lab?

Important questions/decisions researcher needs to make:

What information do I need to answer my research question? Who is my population? How do I capture the information I need in simple, understandable questions? For every question on the survey: can and will a respondent answer this? How do I ensure that all respondents answer the questions to the best of their abilities and complete the survey? How do I minimize response error?

When are semi-structured interviews useful?

When you have some idea of the phenomenon: relevant concepts and linkages between concepts. When comparisons are needed: the partly shared structure of the interviews makes comparison easier. When interviewing a "larger" number of individuals: the structure increases focus and shortens the interviews.

Interpretivism

a research perspective in which understanding and interpretation of the social world is derived from one's personal intuition and perspective

An iterative research process (kval)

ideas, theory, design, data collection, analysis, dissemination

Philosophy / Paradigm defintion

› "A paradigm is a set of basic and taken-for-granted assumptions which underwrite the frame of reference, mode of theorising and ways of working..." (Saunders et al. 2016: 723) › "... a set of basic beliefs (or metaphysics) that deals with ultimates or first principles. It represents a worldview that defines, for its holder, the nature of the "world..." (Guba & Lincoln, 1994: 107).

Important concepts in IS research

› Research is the systematic collection and interpretation of information with a clear purpose, to find things out (Saunders et al. 2016: 726) › (Social) Theory is a system of interconnected ideas. It condenses and organizes knowledge about the social world › Data is quantitative (numerical) and qualitative (non-numerical) information and evidence that have been carefully gathered according to rules or established procedures › Empirical is a description of what we can observe and experience directly through human senses (e.g. touch, sight, hearing, smell, taste) or indirectly using techniques that extend senses (e.g. attitudes, opinions, emotions, power, authority, quarks, black holes, space, force fields, gravity)

What is Social theory? (Gregor) (IS)

› Social theory is a system of interconnected ideas. It condenses and organizes knowledge about the social world › We can also think of it as a type of systematic "storytelling" that explains to us how some aspect of the social world works and why › Useful metaphor: theory is what helps us to see the forest instead of just a single tree (Neuman 2014: 57, 86) › The word theory will be used here rather broadly to encompass what might be termed elsewhere conjectures, models, frameworks, or bodies of knowledge (Gregor 2006: 614)

