Six Sigma: Measure Phase Notes


When to use check sheet?

When to use the check sheet ----> when data can be observed and collected repeatedly by the same person or at the same location. It is also an effective tool for collecting data on the frequency and patterns of events, problems, defects, and defect locations, and for identifying defect causes.

Key points for checklist

2) Check List a) A tool used to ensure all important steps or actions have been taken b) Often a standard form is used c) Not one of the seven quality tools d) E.g.: All items in case cart are present before surgery in OR

Three Measures of Dispersion

3 Measures of Dispersion 1) Range 2) Standard Deviation 3) Variance
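
A minimal Python sketch (sample values are made up) showing how the three dispersion measures are computed from the same data set:

```python
import numpy as np

data = np.array([4.8, 5.1, 5.0, 4.9, 5.3, 5.2, 4.7])  # hypothetical measurements

data_range = data.max() - data.min()   # Range: max minus min
variance = data.var(ddof=1)            # Sample variance (n - 1 in the denominator)
std_dev = data.std(ddof=1)             # Sample standard deviation (square root of variance)

print(f"Range = {data_range:.2f}, Variance = {variance:.4f}, Std Dev = {std_dev:.4f}")
```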

3 Methods to quantify measurement error in GR&R

3 Methods to quantify measurement error in GR&R 1) Range Method 2) Average & Range Method 3) ANOVA

3 Significance tests using interval data

3 Significance tests using interval data 1) T-test 2) F-test 3) Correlation Analysis

4 Tools used in Enumerative (Descriptive) Studies

4 Tools used in Enumerative (Descriptive) Studies 1) Chi-square 2) Binomial distribution 3) Poisson distribution 4) Box plots

3 Types of sampling for SPC data collection

3 Types of sampling for SPC data collection 1) Random sampling = each part has equal chance being selected 2) Systematic sampling = sample every nth part 3) Subgroup sampling

3 assumptions when 6 sigma is used as the spread due to measurement variability

3 assumptions when 6 sigma is used as the spread due to measurement variability: 1) Measurement errors are independent 2) Measurement errors are normally distributed 3) Measurement errors are independent of the magnitude of the measurement

4 benefits of SIPOC

4 benefits of SIPOC 1) Display of cross-functional activities in single, simple diagram 2) "Big picture" perspective to which additional detail can be added 3) Framework applicable to either large organizations or smaller projects. 4) Quickly familiarizes the team with the process at the organizational levels and project scope.

4 Examples of Sources of Common Cause Variation

4 Examples of Sources of Common Cause Variation 1) Method of sequencing in the process 2) Manufacturing equipment design 3) Natural variation in material supply 4) Measuring equipment design

4 Examples of Sources of Special Cause Variation

4 Examples of Sources of Special Cause Variation 1) Quality of incoming material 2) Operators with varying skills 3) Changes to process setting 4) Environmental variations

4 Tools in Process Analysis and Documentation

4 Tools in Process Analysis and Documentation: 1) Flow charts 2) Process Maps 3) Written Procedures 4) Work Instructions

4 Types of Continuous Distributions:

4 Types of Continuous Distributions: 1) Normal distribution 2) Uniform distribution 3) Exponential distribution 4) Weibull distribution

6 Applications of Binomial distribution

6 Applications of Binomial distribution 1) Estimating the probabilities of an outcome in any set of success or failure trials 2) Sampling for attributes (acceptance sampling) 3) Number of defective items in a batch size of n 4) Number of items in a batch 5) Number of items demanded from an inventory 6) Modeling situations with only 2 outcomes

5 Categories of Measurement System Errors

5 Categories of Measurement System Errors 1) Repeatability 2) Reproducibility 3) Bias 4) Stability 5) Linearity.

ANOVA advantage over Average & Range Method

ANOVA advantage over Average & Range Method: ANOVA can detect interactions (technician and parts) and can also separate the variation due to the process, repeatability, and reproducibility.

Advantage of Stem-and-Leaf Plot over Histogram?

Advantage of Stem-and-Leaf Plot over Histogram? *** Stem-and-leaf permits data to be read directly from the diagram, whereas a histogram may lose the individual data values as frequencies within the class intervals

SIPOC-- Advantage

Advantage of using SIPOC 1) Display of cross-functional activities in single, simple diagram 2) "Big Picture" perspective to which additional detail can be added. 3) Framework applicable to either large organizations or smaller processes.

Analytical (inferential) Statistics

Analytical (inferential) Statistics: 1) Uses data from a sample to make estimates or inferences about the population from which the sample was drawn

Application of Lognormal Distribution

Applications include simulations of: 1) Distribution of wealth 2) Machine downtimes 3) Duration of time 4) Phenomena that have a positive skew (tail to the right)

Application of Nelson Funnel

Application of Nelson Funnel Applied to tampering and common kneejerk reactions on the part of management, all of which are impediments to effective management and continual improvement, including: 1) Adjusting a process when a part is out of specifications 2) Making changes without the aid of control charts 3) Changing company policy based on the latest attitude survey 4) Modifying the quota to reflect current output 5) Using variances to set budgets 6) Relying on history passed down from generation to generation

Application of Process Capability Studies

Application of Process Capability Studies 1) Evaluation of new equipment 2) Reviewing tolerance based on inherent variability of a process 3) Assigning equipment to product 4) Routine process performance audits 5) Effects of adjustments during processing

Application of Scatter Diagram

Application of Scatter Diagram 1) Root cause analysis 2) Estimation of correlation 3) Making prediction using a regression-line fitted to the data.

4 Applications of probabilistic assessments

Application of probabilistic assessments of: 1) Mean time between failure (MTBF) 2) Arrival times 3) Time, distance or space between occurrences of the events of interest 4) Queuing or wait-line theories

Which Measurement Scale? Central Location is Arithmetic Mean

Interval Data Central Location is Arithmetic Mean

Attribute Data/ Discrete

Attribute Data 1) Data values can only be integers--eg. No. of defects, absent people, kind of performance.

Average & Range Method Application

Average & Range Method 1) Determine repeatability by examining variation between individual appraisers and within their measurement reading 2) Determine reproducibility by examining variation between the average of individual appraisers for all parts measured 3) Establish process variation by checking the variation between part averages that are averaged among technicians

Average Standard Deviation Chart Application

Average Standard Deviation Chart Application 1) Subgroups greater than 10 2) Less sensitive to detecting shifts 3) Provides economic balance between the cost of running SPC and the information that can be usefully derived.

Best approach to identifying the key characteristics that require SPC monitoring

Best approach to identifying the key characteristics that require SPC monitoring 1) SPC monitoring is expensive so need to carefully select process parameters and characteristics 2) To identify these parameters requiring process control -----> FMEA + Control plan + SPC planning

Difference between absolute value and true value with respect to a standard master at various measurement point of the measuring range

Bias Difference between absolute value and true value with respect to a standard master at various measurement point of the measuring range

Binomial Distribution

Binomial Distribution Basic assumptions: Discrete distribution Number of trials are fixed in advance Just two outcomes for each trial Trials are independent All trials have the same probability of occurrence
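
A short illustrative sketch of these assumptions in use (the trial count and defect rate below are invented), using scipy's binomial distribution:

```python
from scipy.stats import binom

n, p = 20, 0.05   # hypothetical: 20 independent trials, 5% chance of a defective on each

p_zero = binom.pmf(0, n, p)     # P(exactly 0 defectives in the sample)
p_le_2 = binom.cdf(2, n, p)     # P(2 or fewer defectives)

print(f"P(0 defectives) = {p_zero:.4f}")
print(f"P(<= 2 defectives) = {p_le_2:.4f}")
```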

Box-and-whisker plot ---> Purpose & Application

Box-and-whisker plot 1) Purpose= pictorial view of minimum, maximum, median, and interquartile range in one graph 2) Application= Provides more information than distribution plot but easier to interpret. Outliers are easily identified on the graph.

Box-and-Whisker Plots (Box Plots)

Box-and-Whisker Plots (Box Plots) ---John W. Tukey 1) Most simple and useful way to summarize data-----> 5 number summary of the data with a central line representing the median 2) Upper and lower quartiles of the data define the end of the box ---------> min and max data points are the end of the line "whiskers" 3) Notch widths are calculated so that if 2 median notches do not overlap-------> medians are different with 5% significance. 4) Variable width notches are proportional to the log of the sample size. 5) Outliers (asterisks)------ points that are more than 1.5 times the inter-quartile distance from each quartile.

Box-and-Whisker Plot

Box-and-Whisker Plot 1) Indicates variability of the median 2) Whiskers = min and max data points at the end of the lines extending from the box 3) Data median = line dividing the box 4) Asterisks = outlier data (Make sure you can label the parts)

CPK & Short-term Capability

CPK & Short-term Capability: Generally, a CPK greater than 1.33 indicates that a process or characteristic is capable in the short term. Values less than 1.33 tell you that the variation is either too wide compared to the specification or that the location of the variation is offset from the center of the specification. It may be a combination of both width and location.

Drift in average measurements of an absolute value

Calibration Drift in average measurements of an absolute value

Causes of Non-linearity in Gauge R& R

Causes of Non-linearity in Gauge R&R 1) Improperly calibrated instrument at the high and low ends of the operating range 2) Error in the master part measurements 3) Worn instrument 4) Design of the measurement instrument

Challenges in GR&R studies

Challenges in GR&R studies 1) One-sided specification 2) Skewed distribution 3) Fully automated equipment with little/no appraiser interaction 4) Destructive testing 5) New product introduction where only a few units are available 6) Multiple station comparison 7) Equipment the requires resetting or calibration after every measurement 8) Incomplete GR&R data

Check sheet vs. check list:

Check sheet vs. check list: People sometimes confuse a check sheet with a check list. The list we use for groceries and the report you get from the auto repair shop with things checked off after service (oil, filter, tire pressure, tread, etc.) are examples of a check list. The following table highlights some key differences between a check list and a check sheet. 1) Check Sheet a) A tally sheet to collect data on frequency of occurrence b) Custom designed by user c) One of seven quality tools d) E.g.: Check sheet to document reasons for interruptions in OR .

Significance test using nominal data

Chi-Square test Significance test using nominal data

Chi-square Distribution

Chi-square Distribution 1) Summing the squares of standard normal random variables 2) Distribution with regenerative property 3) A special case of the gamma distribution, with scale parameter 2 and shape parameter equal to half the degrees of freedom

Confidence Interval on Proportion

Confidence Interval on Proportion 1) The confidence level is the percentage of sample intervals expected to include the true error rate 2) Example ---> a 90% confidence level means that 9 out of 10 intervals will include the true error rate. 3) CI of the mean decreases as the sample size increases ---> as more samples are obtained, fewer values are required to create the confidence interval. 4) If the standard deviation is known, use the z-table ---> if unknown, then use t-tables 5) Process stability cannot be determined by the fact that a particular sample falls within the confidence interval. ----> Must confirm process stability with a statistical control chart.
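
A small sketch of the z-based (known/large-sample) case from item 4, with a made-up sample of 12 nonconforming units out of 400:

```python
import math
from scipy.stats import norm

defects, n = 12, 400          # hypothetical sample
p_hat = defects / n

z = norm.ppf(0.975)           # two-sided 95% confidence level
margin = z * math.sqrt(p_hat * (1 - p_hat) / n)

print(f"p-hat = {p_hat:.3f}, 95% CI = ({p_hat - margin:.4f}, {p_hat + margin:.4f})")
```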

Confidence Interval

Confidence Interval: 1) Assumes normal distribution and a stable, unchanging population 2) Measure stage ---> used to estimate the process average when a process control chart cannot be established because of lack of data. ---> Process average used for baseline estimates 3) Analyze Stage ---> Used to examine similarities or differences between sample means taken during various process conditions

Statistical Term: Distributions containing infinite (variable) data points that may be displayed on a continuous measurement scale.

Continuous Distributions: Distributions containing infinite (variable) data points that may be displayed on a continuous measurement scale.

Conversion of Attribute Data to Variable Data

Conversion of Attribute Data to Variable Data 1) When collecting data, there are opportunities for some types of data to be either variable or discrete. 2) Consider costs when deciding which type of data to collect. ---> Measuring instruments are more costly for variable data 3) Variable data requires storing of individual values and computation of the mean, std. deviation, and other population estimates 4) Attribute data requires minimal counts of each category so little data storage space is needed. 5) Manual Collection ---> Variable data requires more skill than attribute data.

Conversion of Attribute Data to Variable Measures

Conversion of Attribute Data to Variable Measures 1) Variable data more expensive (instrument & data storage costs) 2) Variable data requires storage of individual data and computation of mean, std. deviation, and population estimates 3) Variable data requires more skilled personnel

Cumulative Distribution Function

Cumulative Distribution Function 1) a function whose value is the probability that a corresponding continuous random variable has a value less than or equal to the argument of the function. 2) In the case of a continuous distribution, it gives the area under the probability density function from minus infinity to x. Cumulative distribution functions are also used to specify the distribution of multivariate random variables.

DOE & Process Capabilities

DOE & Process Capabilities DOE can be used to find process capabilities ---> objective is optimum values of process variables which yield the lowest process variation.

Data Coding

Data Coding Efficiency of data entry and analysis is frequently improved by data coding---sometimes it is more efficient to code data by adding, subtracting, multiplying, or dividing by a factor. 1) Types of Data Coding a) Substitution - ex. replace measurements in 1/8ths of an inch with integer +/- deviations from center. b) Truncation - ex. data set of 0.5541, 0.5542, 0.5547 - you might just remove the 0.554 portions. 2) Effects of Coding Data a) Will affect the mean to the extent that the mean must be uncoded for reporting purposes. b) Coding and uncoding of the mean will be exactly opposite. (Ex. Add X, subtract X; or multiply by X, divide by X.) c) The effect the coding has on standard deviation depends on how the data is coded.
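
A quick sketch (values borrowed from the truncation example above) showing the effect of coding on the mean and the standard deviation:

```python
import numpy as np

raw = np.array([0.5541, 0.5542, 0.5547])

# Code by subtracting a constant and rescaling (work in units of 0.0001)
coded = (raw - 0.5540) / 0.0001           # -> [1., 2., 7.]

# Uncoding the mean reverses the operations in the opposite order
mean_back = coded.mean() * 0.0001 + 0.5540
print(np.isclose(mean_back, raw.mean()))                          # True

# The standard deviation is affected only by the scale factor, not the subtraction
print(np.isclose(coded.std(ddof=1) * 0.0001, raw.std(ddof=1)))    # True
```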

Data Coding by Substitution

Data Coding by Substitution 1) Consider a dimensional inspection procedure in which the specification is: [(nominal) +/- (deviation from nominal)] 2) Measurement resolution = 1/8 inch 3) Inspectors record the deviations from nominal 4) Data can be coded as integers expressing the number of 1/8 increments deviating from nominal. 5) Example of data entry (32 - 3/8 inches) crammed onto a check sheet ----> Recommended solution = either make the check sheet blocks bigger at the risk of having fewer samples and plot points per page, or code the data as integer deviations.

Data Coding by Truncation of Repetitive Place Values

Data Coding by Truncation of Repetitive Place Values Measurements such as 0.55303, 0.55308, 0.55310 ----> The digits 0.553 repeat in all the observations, so the data can be recorded as the last digits expressed as integers

Dealing with Process out-of- control in process capability study

Dealing with Process out-of-control in process capability study 1) Center the process and determine the % of nonconformance outside specification limits 2) Process variation is less than the no. of nonconformance ---> reduce the # of nonconformances. 3) Reduce variation or revise specification limits.

Statistical Term: Distributions used to make decision and construct confidence intervals

Decision Distribution: Distributions used to make decision and construct confidence intervals

Definition of Process Analysis and Documentation

Definition of Process Analysis and Documentation 1) Process = set of interrelated resources and activities that transform inputs into outputs with the objective of adding value. 2) Activities within any process should be documented and controlled

Describe Critical-to-Cost metrics

Describe Critical-to-Cost metrics 1) Identifies areas of the process that raise the expense 2) Should include not only the typical cost of the task, but also the increased cost of errors in the performance of this task. 3) Should include the probability of the error. 4) If lag time is present ---> include the cost of keeping product on hand

Describe Critical-to-Quality Metrics

Describe Critical-to-Quality Metrics Definition = amount of money required to align a product/service with quality baselines 1) Yield -- amount of completed product divided by the amount of product that began the process ---> does not aid in locating process errors or tell which parts can be salvaged 2) If the yield = 95%, then the scrap rate is 5% 3) Rolled Throughput Yield --- average % of units with no defects --- measures the expected quality level after several steps ---- better than yield

Describe Process Definition

Describe Process Definition 1) First step in Measure Stage = create a comprehensive process-level map of the processes currently performed ---- aka the group describes all the activities that they plan to improve 2) Expert employee participation is needed to create a detailed process map because employees have a different conception of the process sequence than their superiors. 3) Employees often will have streamlined and modified the process without notifying management ---> harmful to criticize or judge the customization made by employees ------> best to focus on creating an accurate process map.

Describe the Critical-to-Schedule Metrics

Describe the Critical-to-Schedule Metrics 1) Cycle time= most common metric----aka. delivery time, order processing time, or downtime. 2) Requires distinction between NVA and VA activities to improve Cycle time 3) Cycle time is secondary in importance to cost or quality.

Describe the importance of identifying key decision points during Measure stage

Describe the importance of identifying key decision points during Measure stage 1) After creating a process map, must identify the important decisions made during the process. 2) Team members are alert to areas of inefficiency requiring excess decision-making. 3) Team should try to reduce the number of decisions made every time a process is performed. 4) Since the process has not yet been measured, all measurements must reflect the process as typically performed ----> to ensure proper targeting of improvement efforts. 5) Team leaders communicate info to stakeholders.

Describe the process performance indices, including their application to the measure stage of DMAIC

Describe the process performance indices, including their application to the measure stage of DMAIC 1) Process performance index ---> tells whether a particular batch of material will be satisfactory to the customer 2) Key difference between process performance index and process capability ---> Pp is limited to a single batch. 3) Process performance index used to create process baseline estimates for uncontrolled processes ------> requires large sample size. 4) Statistical control is preferred over the process performance index, but it cannot be relied on if the process lacks statistical control or has insufficient data.

Describe the use of enumerative statistics in process baseline estimation

Describe the use of enumerative statistics in process baseline estimation 1) In estimation of a process baseline, enumerative statistics are used to evaluate random samples from populations. 2) Enumerative statistics can determine whether samples represent the population, meaning that they were drawn without bias. 3) Confidence intervals of 95% are used. 4) Drawback of enumerative statistics ----> they are drawn from a static, unchanging population -- but Six Sigma needs information from dynamic processes and uses analytical statistics.

Descriptive Statistics

Descriptive Statistics Includes: 1) Central Tendency 2) Measures of Dispersion 3) Probability Density function 4) Frequency Distribution 5) Cumulative Distribution

Determining Stability & Normality:

Determining Stability & Normality: 1) Common Cause Variation ---> process output is stable and predictable 2) Special cause variation --> Unpredictable output and out-of-control

Developing Sampling Plans

Developing Sampling Plans 1) Depends on the purpose and whether there are customer or standards requirements for the study. 2) Process currently running and in control ---> use control chart to calculate process capability indices 3) Process normally distributed and in control ---> standard deviation = R-bar / d(2)
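
A minimal sketch of the R-bar/d2 estimate from item 3 (the subgroup ranges are invented; d2 = 2.326 is the standard constant for subgroups of size 5):

```python
# Hypothetical subgroup ranges from an X-bar/R chart, subgroup size n = 5
subgroup_ranges = [0.12, 0.09, 0.15, 0.11, 0.10, 0.14, 0.08, 0.13]

r_bar = sum(subgroup_ranges) / len(subgroup_ranges)
d2 = 2.326                     # control-chart constant for n = 5

sigma_within = r_bar / d2      # within-subgroup (short-term) standard deviation estimate
print(f"R-bar = {r_bar:.3f}, estimated sigma = {sigma_within:.4f}")
```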

Developing a SIPOC Diagram

Developing a SIPOC Diagram: 1) Team will create a process map that will be posted on the meeting room wall 2) 4-5 key steps in a process. 3) Process outputs --- product/service of this process. 4) Customers of the process outputs (end users) 5) Process inputs / materials 6) Suppliers of the process 7) Identification of preliminary customer requirements 8) Involvement of stakeholders, team leader, and champion for verification.

Summary of analytical and Enumerative Studies

Difference between Enumerative study & Analytic Study: 1) Analytical studies start with a hypothesis statement made about population parameters 2) Sample statistic used to test the hypothesis and either reject or accept the null.

Difference between Enumerative study & Analytic Study

Difference between Enumerative study & Analytic Study: Enumerative study = study in which action will be taken in the universe Analytic study= study in which action will be taken on a process to improve performance in the future.

Difference between natural process limits vs. specification limits

Difference between natural process limits vs. specification limits 1) Natural process control limits = process in statistical control and meets specification or is +/- 3 sigma 2) Specification limits = expectations of the customer or designer 3) Capable process = process variation is significantly lower than the width of the specification limits (upper and lower limits)

Difference between repeatability & reproducibility

Difference between repeatability & reproducibility Precision = describes the variation you see when you measure the same part repeatedly with the same device. It includes the following two types of variation: 1) Repeatability = Equipment measurement variation (same operator) 2) Reproducibility = appraiser measurement variation (different operator, same part) Both are expressed as standard deviation.

Statistical Term: Distributions resulting from countable (attribute) data that has a finite number of possible values

Discrete Distributions: Distributions resulting from countable (attribute) data that has a finite number of possible values

Discuss the identification of process metrics during the Measure Stage

Discuss the identification of process metrics during the Measure Stage: 1) Factors critical to schedule (CTS) ----> impact project completion date 2) Factors critical to quality (CTQ) ----> direct effect on the desired characteristics of the product/service. 3) Factors critical to cost (CTC) ----> impact materials, labor, delivery, overhead, inventory and/or cost to the consumer of the good or service. 4) Metrics should be customer-focused and transparent; that is, they should be objective 5) Metrics need to be repeatable and reproducible

Drawbacks of Range Method

Drawbacks of Range Method 1) Does not quantify repeatability & reproducibility separately ---> to do so, use ANOVA or Average and Range method

Effect of Gauge R& R?

Effect of Gauge R& R? 1) Measurement system play main role in process capability assessment ---> higher the gauge R& R, the higher the error in C(p) assessment

Ensuring Data Accuracy & Integrity

Ensuring Data Accuracy & Integrity Data should not be removed from a set without an appropriate statistical test or logic. Generally, data should be recorded in time sequence. Unnecessary rounding should be avoided. If done, should be late in the process. Screen the data to remove entry errors. Avoid emotional bias. Record measurements of items that change over time as quickly as possible after manufacture and again after the stabilization period. Each important classification identification should be recorded alongside the data. (Ex. Time, machine, operator, gage, lab, material, conditions, etc).

Ensuring Data Accuracy and Integrity

Ensuring Data Accuracy and Integrity Bad data corrupts the decision-making process --- prevention methods: 1) Avoid emotional bias relative to targets/tolerances when counting/measuring/recording 2) Time-sequenced data ---> record by order of capture 3) Avoid rounding (reduces measurement sensitivity) & calculate the average to one more decimal than individual readings 4) Use objective statistical tests to identify outliers 5) Select sampling plan based on analyst's specific needs & experience. 6) Screen & filter data to detect & remove data entry errors 7) To apply statistics assuming a normal population, ask if the expected dispersion can be represented by 8-10 resolution increments -----> If no, then the default statistic may be the count of observations which do or don't meet specification criteria. 8) Make sure that your data has labels

Enumerative (Descriptive) Studies

Enumerative (Descriptive) Studies 1) Classification of people into intervals of income, age, or health (Eg. census study) 2) Graphical tools

Errors in performing Gauge R& R?

Errors in performing Gauge R& R? 1) In process control situations, not sampling covering the tolerance spread ---> best to pick sample outside of specification limit rather than random samples 2) Knowledge bias in repetitive measurement trials due to failure to randomize samples during measurement 3) Inflated reproducibility errors due to untrained appraiser 4) Absent experimenter during study (remote located experimenter) --> can automate measurement but need to present during the human involvement 5) Publishing results with appraiser names (future uncooperative appraisers) 6)Assuming that GR&R is the same for all equipment or that results are valid forever

Exponential Distribution

Exponential Distribution 1) Basic assumptions: a) Family of distributions characterized by its mean b) Distribution of time between independent events occurring at a constant rate c) Mean is the inverse of the Poisson rate d) Shape can be used to describe failure rates that are constant as a function of usage

F-distribution Application

F-distribution 1) Test for equality of variance from two normal populations (ANOVA) 2) Type of cumulative distribution function

Factors that influence equipment measurement variation (repeatability)

Factors that influence equipment measurement variation (repeatability) 1) Design of measurement system

Factors that influence appraiser measurement variation (reproducibility)

Factors that influence appraiser measurement variation (reproducibility) 1) Setting of the work piece (special loading/unloading) 2) Operator training, skill, and knowledge 3) Consistency in measurement

Factors to consider when creating work instructions

Factors to consider when creating work instructions 1) Keep Controlled copies of work instructions where activities performed 2) Can use flow charts with work instructions to show relationship of process steps 3) Consider the level of detail and its appropriateness of the background, experience, and skills of personnel. 4) Involve the people that perform activities when creating work instructions

Function of Machine Capability Study

Function of Machine Capability Study 1) Determine the inherent process (machine) variation by excluding elements like batch-to-batch, stream-to-stream, and time-to-time variation 2) Minimize measurement variability (operator and equipment)

Function of Process Capability Study

Function of Process Capability Study 1) Demonstrates that process is centered within the specification limits 2) Process variation predicts the process is capable of producing parts within tolerance limits

Function of SIPOC

Function of SIPOC 1) Identifies essential work flows and sources of variation in work over time 2) Captures key components of success from suppliers through internal processes and on to key customers 3) Other tools (affinity diagrams, process mapping, flow charting) can be used to identify major blocks or steps in the process/system

Function of Written Instructions

Function of Written Instructions: 1) Who does what (personnel with specific skill set) 2) How it is done (step by step)

Function of Written Procedures

Function of Written Procedures Describes: 1) What is done during the process? 2) Why it is done (Business reason) 3) Where it is done ( Location/ process step) 4) When it is done (trigger)

Geometric Distribution

Geometric Distribution Basic assumptions: Discrete distribution Just two outcomes for each trial Trials are independent All trials have the same probability of occurrence Waiting time until the first occurrence

Geometric Distribution Application

Geometric Distribution Application: Number of failures before the first success in a sequence of trials with probability of success p for each trial Number of items inspected before finding the first defective item - for example, the number of interviews performed before finding the first acceptable candidate

Geometric Mean

Geometric Mean is a special type of average where we multiply the numbers together and then take a root: a square root for two numbers, a cube root for three numbers, etc. The geometric mean is NOT the arithmetic mean and it is NOT a simple average. It is the nth root of the product of n numbers: multiply the values together, then take the nth root, where n is the number of values you just multiplied. Quick example: What is the geometric mean of 2, 8 and 4? Solution: multiply those numbers together, then take the third root (cube root) because there are 3 numbers. Cube root (2 * 8 * 4) = cube root (64) = 4
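
The same worked example in a couple of lines of Python:

```python
import math

values = [2, 8, 4]
geo_mean = math.prod(values) ** (1 / len(values))   # nth root of the product of n values
print(round(geo_mean, 6))                           # 4.0, the cube root of 64
```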

Guideline for Collecting Data

Guidelines---Methods of Collecting Data 1) Formulate a clear problem statement 2) Define what is measured 3) List all the important characteristics to be measured 4) Select the right measurement technique 5) Construct an uncomplicated data form 6) Decide who will collect the data 7) Arrange for an appropriate sampling method 8) Decide who will analyze and report the results *** Data needs an operational definition to have meaning and the ability to control quality ** Manual data collection is laborious and prone to error, so it is inferior to automatic data collection. ***** Large amounts of data are difficult to analyze unless organized in a digestible format --- graphs, charts, histograms, and Pareto diagrams

Harmonic Mean

Harmonic Mean A kind of average. To find the harmonic mean of a set of n numbers, add the reciprocals of the numbers in the set, divide the sum by n, then take the reciprocal of the result. The harmonic mean of {a1, a2, a3, a4, . . ., an} is given by: Harmonic Mean = N / (1/a1 + 1/a2 + 1/a3 + 1/a4 + ... + 1/aN) Example: Find the harmonic mean of 1 and 100: H = 2 / [1/1 + 1/100] = 2 / [101/100] = 2 / 1.01 ≈ 1.98019802
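
The same calculation, both by hand and with the standard library's helper:

```python
from statistics import harmonic_mean

values = [1, 100]

manual = len(values) / sum(1 / v for v in values)   # 2 / (1/1 + 1/100)
print(manual, harmonic_mean(values))                # both ~1.98019802
```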

Histogram

Histogram 1) Approximation of the distribution's shape 2) Normal distribution if symmetrical curve = stable, predictable process 3) Frequency = no. of data points that fall within a given bar 4) Sampling error or lack of randomness if there are differences between the sample data in the histogram and the population data curve (Fig. 6.15 pg. VI-25 of Indiana Council notes) 5) Natural variation = bell-shaped curve is an approximation of the distribution shape, or variation that is close to the bell curve (see VI-36 Indiana Council notes)

Histograms

Histograms 1) Frequency column graphs that display a static picture of process behavior with a minimum of 50-100 data points 2) "Frequency" = data points fall within a given bar or interval 3) Stable process ---> predictable process; uni-modal or bell-shaped curve. 4) Unstable process ---> exotic shape ---> Exponential, Lognormal, Gamma, Beta, Weibull, Poisson, Binomial, Hypergeometric, Geometric

Histograms Shapes----Double-peaked or bimodal.

Histograms Shapes---Double-peaked or bimodal. The bimodal distribution looks like the back of a two-humped camel. The outcomes of two processes with different distributions are combined in one set of data. For example, a distribution of production data from a two-shift operation might be bimodal, if each shift produces a different distribution of results. Stratification often reveals this problem.

Histograms Shapes---Skewed distribution

Histograms Shapes---Skewed distribution 1) The skewed distribution is asymmetrical because a natural limit prevents outcomes on one side. 2) The distribution's peak is off center toward the limit and a tail stretches away from it. ****For example, a distribution of analyses of a very pure product would be skewed, because the product cannot be more than 100 percent pure. Other examples of natural limits are holes that cannot be smaller than the diameter of the drill bit or call-handling times that cannot be less than zero. These distributions are called right - or left-skewed according to the direction of the tail.

Histograms Shapes------Truncated or heart-cut.

Histograms Shapes---Truncated or heart-cut. 1) The truncated distribution looks like a normal distribution with the tails cut off. 2) The supplier might be producing a normal distribution of material and then relying on inspection to separate what is within specification limits from what is out of spec. The resulting shipments to the customer from inside the specifications are the heart cut.

Histograms Shapes---Normal Distribution

Histograms---Normal Distribution The number of columns in a histogram should equal the square root of the sample size ------- eg. 100 samples will show 10 bars 1) Normal. A common pattern is the bell-shaped curve known as the "normal distribution." 2) In a normal distribution, points are as likely to occur on one side of the average as on the other. ***Be aware, however, that other distributions look similar to the normal distribution. Statistical calculations must be used to prove a normal distribution.

How can process inputs and outputs be identified?

How can process inputs and outputs be identified? 1) A process can be measured and improved by identifying its inputs and outputs 2) Process inputs can be quantified as raw materials, human resources, or some upstream process. Process input requirements should be stated so key measures of input quality can be controlled. 3) Once process capabilities are known, output measures can be used to determine whether the process has remained in control.

How to conduct measurement correlation?

How to conduct measurement correlation? 1) Scatter diagram = detection of special causes due to multiple measurement systems 2) Analysis of "components of variation" = use of multiple appraisers and devices with randomized trials ---> If there is a significant P-value in the variance between measurement equipment, then investigate

Interpreting Gage R&R Report

Interpreting Gage R&R Report 1) EV (equipment variation)= standard deviation of repeatability [sigma (rat)] 2) AV= (apprasair variation ) = standard deviation of reproducibility [sigma (rpd)] 3) GRR= standard deviation of measurement system variation[sigma (M)] 4) PV= standard deviation of part-to-part variation [sigma (p)]

How to interpret Run Charts?

Interpreting Run Charts: 1) Subgroup size = 1 ---> use run charts 2) Subgroup size greater than 1 ---> calculate the means or medians and connect them with a line.

Data arranged in order and differences can be found. There is no inherent starting point and ratios are meaningless

Interval Data Data arranged in order and differences can be found. There is no inherent starting point and ratios are meaningless

Key reason for conducting ongoing measurement system analysis?

Key reason for conducting ongoing measurement system analysis? To understand the uncertainty of the measurement system.

Accuracy of measurement at various measurement points of measuring range in the equipment

Linearity Accuracy of measurement at various measurement points of measuring range in the equipment

Linearity & bias in Gauge R& R

Linearity & bias--Measures of Accuracy Linearity: a measure of how the size of the part affects the bias of a measurement system. It is the difference in the observed bias values through the expected range of measurement. 1) Equipment is accurate at one point of measurement but not at other point of measurement across the measurement range. 2) How biased the measuring equipment is compared to the "master" 3) If R-sq = 0.0% ---> Non-linearity

Scatter Diagrams--Linearity and Implications

Linearity and Implications 1) Judges the possibility that bias error found in a particular system of measurement will be present throughout the equipment's entire operating range. 2) Measure Stage ---> assessing the accuracy of a measurement system within the range of values likely to be observed during the process. 3) Procedure for analysis of linearity ---> the operating range is examined in multiple parts and a reference value is obtained ---> an average of the ranges is calculated after repeated measurements are taken ----> bias is calculated by subtracting the average from the reference value. ----> A coefficient of determination greater than 70% will mean that measurements are adequate.

Locational Data

Locational Data 1) Not attribute or variable data 2) Tells about location 3) Eg. Map of USA with sales area or map of all the Walmarts in the USA.

Lognormal Distribution

Lognormal Distribution Basic assumptions: Asymmetrical and positively skewed distribution that is constrained by zero. Distribution can exhibit many pdf shapes Describes data that has a large range of values Can be characterized by its mean and standard deviation

Long-term Capability

Long-term capability indices (PP and PPK): The same capability indices that you calculate for short-term variation can also be calculated for long-term, or total, variation. To differentiate them from their short-term counterparts, these long-term capability indices are called PP and PPK. (The P stands for "performance.") The only difference in their formulas is that you use σLT in place of σST. These long-term capability indices are important because no process or characteristic operates in just the short term. Every process extends out over time to create long-term performance.
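
A hedged sketch of the Pp/Ppk calculation using simulated long-term data and invented specification limits:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=10.02, scale=0.04, size=500)   # simulated long-term (overall) data
usl, lsl = 10.15, 9.85                                # hypothetical specification limits

mean = data.mean()
sigma_lt = data.std(ddof=1)    # overall (long-term) standard deviation

pp = (usl - lsl) / (6 * sigma_lt)
ppk = min((usl - mean) / (3 * sigma_lt), (mean - lsl) / (3 * sigma_lt))
print(f"Pp = {pp:.2f}, Ppk = {ppk:.2f}")
```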

Machine Capability- Exceptions

Machine Capability 1) Short-term capability of a machine and calculated the same as process capability except: a) Historical data from control chart not to be used b) Multiple machines producing the same part ---> capability of machine should be determined independently. c) Machine capability should come from consecutive part measurements from same machine at or near the same time

Main Objective of the Measure Stage

Main Objective of the Measure Stage: 1) Data gathering to complete the project. 2) Team defines each relevant process in great detail. 3) Group of metrics for the process is developed 4) Measurement analysis is conducted to identify and quantify any common errors in the metric that is selected. 5) Estimation of process baselines to determine a reasonable starting point.

Main objective of process capability study

Main objective of process capability study To determine whether the process is in statistical control and is capable of meeting specifications.

Measurement Correlation

Measurement Correlation 1) Used when measurements are taken simultaneously with multiple devices of the same type for parts coming from multiple streams of manufacturing (eg. instrument calibration)

Measurement Scales----Ratio

Measurement Scales----Ratio 1) Central location = Geometric or Harmonic Mean (the two are not the same thing) 2) Dispersion of Percent Variation 3) Significance Test = Any Test 4) Data is an extension of interval data that includes an inherent zero starting point; both differences and ratios are meaningful

Measurement Scales---Interval

Measurement Scales---Interval 1) Central location = Arithmetic Mean 2) Dispersion of Standard Deviation 3) Significance Test= F-test, t-test, Correlation Analysis 4) Data is arranged in order and differences can be found. However, there is no inherent starting point and ratios are meaningless.

Measurement Scales---Nominal

Measurement Scales---Nominal 1) Central location = Mode 2) Dispersion of Information only 3) Significance Test= Chi-Square 4) Data consists of names or categories without any ordering scheme

Measurement Scales---Ordinal Data

Measurement Scales---Ordinal Data 1) Central location = Median 2) Dispersion of Percentages 3) Significance Test= Sign or Run Test 4) Data arranged in some order but differences between values cannot be determined or are meaningless

Why is measurement system discrimination important?

Measurement system discrimination Ability to detect changes in the measured characteristic--important to be able to measure process variation and quantify value (eg. mean) of individual parts.

Measures of Central Tendency

Measures of Central Tendency Different ways of characterizing the central value of a collection of data includes-------> Mean, Mode, Median Central Limit Theorem----> Probability distribution of the sample means approach normal distribution as the number of sample sizes increases, provided that they are simple random samples of uniform size.---> used for small sample sizes and when true distribution is unknown
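
A quick simulation sketch of the central limit theorem (the exponential parent population and sample size are arbitrary choices): sample means drawn from a skewed population still cluster around the population mean with spread roughly sigma/sqrt(n).

```python
import numpy as np

rng = np.random.default_rng(0)

population = rng.exponential(scale=2.0, size=100_000)   # skewed parent population
sample_means = [rng.choice(population, size=30).mean() for _ in range(2_000)]

print(f"Population mean: {population.mean():.3f}")            # ~2.0
print(f"Mean of sample means: {np.mean(sample_means):.3f}")    # also ~2.0
print(f"Std of sample means: {np.std(sample_means):.3f}")      # ~2.0 / sqrt(30) ≈ 0.37
```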

What is a method?

Method: An UNWRITTEN process that must be followed consistently---usually an internal procedure for a particular activity that is not required by any standard.

Multiple Regression ---1st order and higher order

Multiple Regression ---1st order and higher order 1) First Order ----> value of the dependent variable is influenced by each factor by itself as well as combinations of 2 factors ----> produces a straight line over small regions, so best used for only targeted data (flexing of the plane is due to the impact of interacting factors) 2) Higher-Order -----> includes squares and cubes of the values, which produce a response surface with peaks and valleys -----> best for defining the area AROUND the stationary point and for evaluating how current operating parameters influence the response.

Nelson Funnel

Nelson Funnel Experiment (Deming) 1) Describes the adverse effects of tampering with a process by making changes to it without first making a careful study of the possible causes of the variation in that process. 2) A marble is dropped through a funnel onto a sheet of paper, which contains a target. The objective of the process is to get the marble to come to a stop as close to the target as possible. ---> As efforts to adjust the marble to hit the mark increase, variation increases.

Data consisting of names or categories only. No ordering scheme is possible

Nominal Data Data consisting of names or categories only. No ordering scheme is possible

Which Measurement Scale? Central Location is mode

Nominal Data Central Location is mode

Normal Probability Plots

Normal Probability Plots 1) Most of the points are near the center line, average 2) Some points are at the minimum and maximum points 3) When all the special causes of variation are eliminated, the process will produce a product that when sampled will produce a bell-shaped curve.

Np Chart/ Control Chart

Np Chart/ Control Chart 1) Assesses attribute data to measure the number of times a condition exists in each sample; the condition may occur only once per unit and the sample size is constant. 2) Depicts a stable process 3) Measure Stage --- used to estimate the process baseline (a variable control chart is more commonly used) 4) Improve Stage --- used to find the number of errors in the process sample --- but a small error rate makes the np chart ineffective

Objective of Statistical inference?

Objective of Statistical inference? To draw conclusions about population characteristics based on the information contained in sample

Which Measurement Scale? Central Location is Median

Ordinal Data Central Location is Median

Data arranged in some order but differences between values cannot be determined or are meaningless

Ordinal Data Data arranged in some order but differences between values cannot be determined or are meaningless

Pattern and Trend Analysis

Pattern and Trend Analysis 1) Control charts or trend charts can be used to display changes in data patterns. 2) Data can be either summary (static) or time sequence (dynamic) 3) Trend charts show patterns that indicate if a process is running normally or whether desirable or undesirable changes are occurring.

Percent Agreement Application

Percent Agreement 1) GR&R can be used for attribute data (yes/no, pass/fail) 2) Attribute agreement study ---> a) human variation in judgement or evaluation ("Within appraiser" variation) b) due to automatic measurement gauging where parts are automatically screened as good/ bad by machine

Define Percent Agreement

Percent Agreement: 1) Agreement between the measurement system and either the reference value or the true value of the measured variable 2) If the X (independent variable) variation is known ---> the correlation coefficient indicates the % change in the dependent variable (Y) due to change in the independent variable 3) r = 0 ---> 0% agreement between the measurement system and either the reference value or the true value of the measured variable 4) r = +/- 1 ---> 100% agreement

Percent GR&R to Tolerance

Percent GR&R to Tolerance 1) For product control situation where the measurement result and decision criteria determine "conformance or nonconformance" (that is 100% inspection or sampling), samples (or standard but be selected, but need not cover the entire process range) 2) Assessment of the measurement system is based on tolerance.

Percent GR&R to Variation

Percent GR&R to Variation For process control situation where the measurement result and decision criteria determine "process stability, direction, and compliance with the natural process variation" (that is SPC, process monitoring, capability, ad process improvement), the availability of samples over the entire operating range becomes important. Independent estimate of process variation (process capability study) is recommended when assessing the adequacy of the measurement stem for process control (that is % GR&R to variation)

Poisson Distribution

Poisson Distribution Basic assumptions: 1) Discrete distribution 2) Length of the observation period (or area) is fixed in advance 3) Events occur at a constant average rate 4) Occurrences are independent 5) Rare events

Poisson Distribution Application

Poisson Distribution Application 1) Number of events in an interval of time (or area) when the events are occurring at a constant rate 2) Number of items in a batch of random size 3) Design reliability tests where the failure rate is considered to be constant as a function of usage 4) Distribution of defect counts
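
A brief sketch applying the first and fourth items (defect counts per unit; the 1.2 defects-per-unit rate is invented):

```python
from scipy.stats import poisson

mu = 1.2   # hypothetical constant average defect rate per unit

p_zero = poisson.pmf(0, mu)             # P(a unit has no defects)
p_three_plus = 1 - poisson.cdf(2, mu)   # P(a unit has 3 or more defects)

print(f"P(0 defects) = {p_zero:.4f}, P(>= 3 defects) = {p_three_plus:.4f}")
```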

Term: All possible observations of similar items from which a sample is drawn

Population: All possible observations of similar items from which a sample is drawn

Precision/ Tolerance (P/T)

Precision/ Tolerance (P/T) Ratio between the estimated measurement error (precision) and tolerance of the characteristic measured 1) Best for (P/T) to be small --> less measurement variability 2) Assume Measurement errors are independent, normally distributed, and independent of the magnitude of measurement when 6sigma is the standard dev
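
A one-line sketch of the ratio with invented numbers, using the 6-sigma measurement spread named above:

```python
sigma_ms = 0.002          # hypothetical measurement-system standard deviation
usl, lsl = 10.05, 9.95    # hypothetical specification limits

p_to_t = 6 * sigma_ms / (usl - lsl)   # estimated measurement spread over tolerance
print(f"P/T = {p_to_t:.1%}")          # smaller is better
```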

Precision/ Total Variation (P/TV)

Precision/ Total Variation (P/TV) 1) Best for (P/TV) to be minimized to reduce measurement variation on assessment of process variation.

Probability Density Function

Probability Density Function (PDF) ***Describes the behavior of a random variable ---- For continuous variables, the pdf is the probability that a variate assumes the value x, expressed in terms of an integral between two points. 1) Smooth curve = represents population data 2) "Shape" of distribution ---- grouped frequency distribution. 3) Sampling errors / lack of randomness / incorrect model = data points that lie above the "smooth curve" outline of the bell-shaped curve (aka the difference between the histogram bars/sample data and the smooth curve/population data) 4) Since the area under the probability density function represents probability, the total area under the curve must equal one.

Problems due to not Coding:

Problems due to not Coding: a) Inspectors trying to squeeze too many digits into small blocks on a check sheet form b) Reduced throughput and increased errors by clerks at keyboards reading and entering large sequences of digits for a single observation. c) Insensitivity of analytic results due to rounding of large sequence of digits.

Process Analysis & Documentation

Process Analysis & Documentation 1) Set of inter-related resources and activities which transform inputs into outputs with the objective of adding value 2) Tools used- flow charts, process maps, written procedures, and work instructions.

Process Mapping

Process Mapping 1) Key advantage = provides a visual presentation of the process being described with the use of symbols rather than the clutter of words 2) Used to outline new procedures and review old procedures for viability and thoroughness. 3) Multiple flowchart styles ----> person-to-person, action-to-action, and conceptual

Process input, Outputs, and Feedback

Process input, Outputs, and Feedback 1) Must measure a process before it is improved 2) Process is measured by identifying process input variables, process outputs, and documenting their relationship via cause-and-effect diagram, relational matrices, flow charts and other tools 3) Measurement of Process Inputs (raw materials, human resources, or results from upstream process) can be used to optimize and control an upstream process. 4) Process input requirements should be stated so key measures of input quality can be controlled.---> Once the process capabilities are known, output measures can be used to determine if the process has remained in control. 5) Feedback from downstream process measurements can be used to improve an upstream process. 6) Planned experimentation ----isolating the effects of different, independent process variables----and Designing for Six Sigma ---eliminating potential sources of error-----are applied to complex inter-relationships found in the organizational feedback system

Proficiency Testing / Round-robin Testing

Proficiency Testing / Round-robin Testing Measurement system compared against the mean or standard deviation of multiple other devices, all reporting measurements of the same or similar artifacts. 1) If two or more artifacts are tested by each device and replicate readings taken ---> variation due to artifacts and instruments can be separated 2) Best used when no national standard reference is available

Properties of Chi-Square Distribution

Properties of Chi-Square Distribution 1) Non-negative (it is a sum of squared, non-negative values) 2) Non-symmetric 3) DF when working with a single population variance is [n-1]

Properties of Student t- distribution

Properties of Student t-distribution 1) Bell-shaped but with smaller sample sizes showing increased variability (flatter) ---> less peaked than the normal distribution with thicker tails 2) As sample size increases, the distribution approaches the normal distribution. For n > 30, the difference is negligible 3) Unknown standard deviation 4) Unimodal, symmetrical population distribution 5) Variance greater than 1, but the variance approaches 1 as the sample size increases.

Random Sampling

Random Sampling Definition---every unit in the population has the same chance of being selected. 1) Best for time or economic constraints. 2) Requires samples to be representative of the lot and not just the product 3) Sampling without randomness make the plan ineffective. ----Sampling sequence must be random as well

Which Measurement Scale? Central Location is Geometric Mean

Ratio Central Location is Geometric Mean

An extension of interval data that includes an inherent zero starting point. Both differences and ratios are meaningful

Ratio Data An extension of interval data that includes an inherent zero starting point. Both differences and ratios are meaningful

Recording Check Sheets

Recording Check Sheets 1) Used to collect measured or counted data 2) Data is collected by making tick marks -- (tally sheet) 3) Caution -- leave enough room for individual measurements to be written. 4) Measured data = physically measured information --- eg. pH, air pressure, or amount of downtime in hours.

Regression Analysis

Regression Analysis 1) Identifies how a dependent variable is influenced by one or more independent variables 2) Measure Stage ----> evaluates the degree to which a measurement system is linear 3) Analyze Stage ----> used to explore connections between metrics and process factors. 4) Improve Stage ----> used to confirm connections after improvements have been made

Variation in measuring equipment when measured by one appraiser in the same setting at the same time

Repeatability Variation in measuring equipment when measured by one appraiser in the same setting at the same time ---Expressed as standard deviation

Repeatability & Gage R&R

Repeatability & Gage R&R ** R= repeatability 1) Individual R averages= differences between appraisers 2) If R (a) < (R (b) = appraiser R (a) did a better at getting the same repeated measurement of the same part than B; appraiser B had wider variation

Variation in measurement when measured by two or more appraisers multiple times

Reproducibility Variation in measurement when measured by two or more appraisers multiple times

Reproducibility & Gage R&R

Reproducibility & Gage R&R * X-bar= Reproducibility 1) If X-bar (a) and X-bar (b) are very close but X-bar (c) is quite different ---> Appraiser C's equipment measurement has some bias.

Requirements for characteristics to be measured in process capability study

Requirements for characteristics to be measured in process capability study: 1) Characteristics should indicate a key factor in the quality of the product/process 2) It should be possible to adjust the value of the characteristic 3) Operating conditions that affect the characteristic should be defined and controlled.

Risk of using SIPOC

Risk of using SIPOC 1) Making the diagram too detailed may dilute the focus of the Six Sigma project 2) Best to limit to 4-7 process blocks to keep clarity

Role of Feedback in measuring process

Role of Feedback in measuring process 1) Feedback from downstream process measurements can be used to improve an upstream process 2) The interrelationships in the organizational feedback system are the target for planned experimentation and designing for six sigma 3) Designing for six sigma = the use of planned experimentation to isolate several process variables and eliminate potential sources of error.

Run Charts ---> Purpose & Application

Run Charts 1) Purpose = visual indicator of nonrandom patterns 2) Application = real-time feedback required for variables data. Shows how stably a process is behaving and is used to detect special causes

Scatter Diagram --Purpose & Application

Scatter Diagram -- 1) Purpose = Detects possible correlation or association between two variables (cause and effect)

Scatter Diagrams

Scatter Diagrams 1) Correlation originates from the following: a) Cause-effect Pattern b) Relationship between one cause and another cause 2) Not all scatter diagrams have linear relationship 3) Regression line ---" Best Fit Line"---> must analyze the scatter diagram before making a decision in correlation statistics.

Sequential Sampling

Sequential Sampling 1) Similar to multiple sampling plan except Sequential sampling can continue indefinitely 2) Sampling usually ended after the number inspected has exceeded 3 times the sample size 3) Used for costly or destructive testing with sample sizes of one and are based on the probability ratio test

Short Term Capability

Short Term Capability 1) Control limits based on short-term processes 2) Less variation with smaller data quantities 3) When sample size is too small ---> control charts with false, out-of-control patterns

Significance test using ordinal data

Sign or Run Test: Significance test using ordinal data

Sources of Variation

Sources of Variation (Fig. 14.8, pg. 197) 1) Overall Variability ---> Either part-to-part variability or measurement system variability 2) Measurement System variability ---> either due to Variation in gauge (repeatability) or variation due to operator (reproducibility) 3) Reproducibility ---> Either due to Operator or Operator by part

Drift in absolute value over time

Stability Drift in absolute value over time

Define Stable Process

Stable Process 1) Absence of special cause variation after 20-30 subgroups plotted

Stem and Leaf Plot

Stem and Leaf Plot 1) Best for plotting variable & categorical data 2) Stems= data grouped by class intervals 3) Leaves= smaller interval data

Stem and Leaf Plots

Stem and Leaf Plots -- John Tukey 1) Manual method for plotting both categorical and variable data sets. 2) Data are grouped by class intervals as "Stems" ------ smaller data increments = "Leaves" 3) Allows data to be read directly from the diagram, whereas a histogram may lose individual values as frequencies within the class interval.

Stem-and-leaf plot ---> Purpose & Application

Stem-and-leaf plot 1) Purpose = provides numerical information about the contents of the cells in a frequency distribution. 2) Application = quickly identify any repetitive data within the class interval.

Steps for process capability studies

Steps for process capability studies 1) Measurement system verification ---- remove sources of variation via process capability assessment to prepare for process stability monitoring 2) Identify rational subgrouping of 5 samples ---> 5 consecutive samples taken at equal intervals from a process, with the average/range plotted to observe stability when the subgroup size is one. 3) Normality required for continuous data

Steps in conducting Gage R&R

Steps in conducting Gage R&R (pg 190 ASQ) 1) Inform appraisers of the measurement criteria & inspection method (ensure training) 2) Handpick study samples to cover the spread (avoid random sampling) 3) Study location not visible to appraisers, but the experimenter is present during the R&R study 4) Multiple appraisers perform measurements one-by-one. 5) The randomized data collection sheet is entered into the tabular calculation sheet.

Steps in determining whether Normal distribution for process capability study

Steps in determining whether Normal distribution for process capability study 1) Histogram using original data (not the averages) from the control chart 2) If the tails are symmetrical and data points cluster around the median ---> Normal distribution 3) If not normal ---> can transform non-normal data to normal data using the Box-Cox transformation or Johnson transformation (see the sketch below)
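A minimal sketch, assuming SciPy is available, of applying the Box-Cox transformation to non-normal data (the data here are made up):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    skewed = rng.exponential(scale=2.0, size=200)   # hypothetical non-normal (right-skewed) data

    transformed, lam = stats.boxcox(skewed)         # lambda chosen by maximum likelihood
    print(f"Box-Cox lambda = {lam:.2f}")
    # The transformed values can then be re-checked for normality
    # (e.g., with a histogram or a probability plot).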

Steps involved in Statistical inference

Steps involved in Statistical inference 1) Define problem and choose between one-tail and two-tail hypothesis. 2) Select test distribution and critical value of the test statistic reflecting the degree of uncertainty that can be tolerated. 3) Calculate test statistic value and make inference by comparing calculated value with critical value.
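As a sketch of those three steps (hypothetical data), a two-sample, two-tail t-test is run here and the calculated statistic is compared with the critical value:

    from scipy import stats

    # Step 1: two-tail hypothesis -- are the two sample means different?
    before = [5.1, 5.3, 4.9, 5.2, 5.0, 5.4]
    after  = [5.6, 5.8, 5.5, 5.9, 5.7, 5.6]

    # Step 2: t-distribution, alpha = 0.05 (two-tail), df = n1 + n2 - 2
    alpha = 0.05
    df = len(before) + len(after) - 2
    t_critical = stats.t.ppf(1 - alpha / 2, df)

    # Step 3: calculate the test statistic and compare with the critical value
    t_stat, p_value = stats.ttest_ind(before, after)
    print(f"|t| = {abs(t_stat):.2f}, critical = {t_critical:.2f}, p = {p_value:.4f}")
    if abs(t_stat) > t_critical:
        print("Reject the null hypothesis")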

Stratified Sampling

Stratified Sampling Basic assumption of sampling ----> the sample is selected from a homogeneous lot ----> Heterogeneous lot --- e.g., car parts in a pile but made by different machines, under different conditions, or on different lines 1) Stratified sampling ----> selection of random samples from each group/process that is different from other similar groups or processes 2) Results in a mix of samples that can be biased if the proportion of samples does not reflect the relative frequency of the groups. -----> Importance? a) The user must be made aware of the possibility of stratified groups b) Data reports must state that the observations are relevant only to the sample drawn and may not necessarily reflect the overall system.

Student t- distribution

Student t-distribution A continuous probability distribution that arises when estimating the mean of a normally distributed population in situations where the sample size is small and the population standard deviation is unknown. 1) Combination of a standard normal random variable and a chi-square random variable: t = Z / sqrt(X²/k) 2) Z = standard normal random variable; X² is a chi-square random variable with k degrees of freedom

Student t- distribution application

Student t-distribution application 1) Hypothesis testing = statistical significance of the difference between two sample means 2) Confidence intervals around the mean 3) Linear regression analysis **Replaces the normal distribution when the standard deviation is unknown. *****Compensates for the error in the estimated standard deviation.
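A minimal sketch (hypothetical sample) of a t-based confidence interval around a mean, used when sigma is unknown and the sample is small:

    import numpy as np
    from scipy import stats

    sample = np.array([9.8, 10.2, 10.1, 9.9, 10.4, 10.0, 9.7])   # hypothetical measurements
    mean = sample.mean()
    sem = stats.sem(sample)                                       # standard error of the mean
    df = len(sample) - 1

    low, high = stats.t.interval(0.95, df, loc=mean, scale=sem)   # 95% confidence interval
    print(f"95% CI for the mean: ({low:.2f}, {high:.2f})")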

Purpose and Application of Tally

Tally: 1) Purpose = quick count of the total quantity and by class interval. Provides a visual idea of the distribution shape. 2) Application = used to count defect quantity by type/class/category

Test for validity of normality assumption?

Test for validity of normality assumption? Chi-square test
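One way to sketch that chi-square goodness-of-fit check (made-up data and an assumed choice of 8 bins; other normality tests also exist):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    data = rng.normal(50.0, 2.0, size=200)          # hypothetical measurement data

    counts, edges = np.histogram(data, bins=8)      # observed counts per class interval
    mu, sigma = data.mean(), data.std(ddof=1)       # parameters estimated from the sample
    expected = np.diff(stats.norm.cdf(edges, mu, sigma)) * len(data)
    expected *= counts.sum() / expected.sum()       # make observed and expected totals match

    # ddof=2 because two parameters (mu, sigma) were estimated from the data
    chi2, p = stats.chisquare(counts, expected, ddof=2)
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")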

F-Distribution & Student's t-distribution

The F-distribution shares one important property with the Student's t-distribution: Probabilities are determined by a concept known as degrees of freedom. Unlike the Student's t-distribution, the F-distribution is characterized by two different types of degrees of freedom — numerator and denominator degrees of freedom. The F-distribution has two important properties: 1) It's defined only for positive values. 2) It's not symmetrical about its mean; instead, it's positively skewed.
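For example, assuming SciPy, the right-tail critical value for an F-test at alpha = 0.05 with 5 numerator and 10 denominator degrees of freedom can be looked up like this:

    from scipy import stats

    alpha = 0.05
    dfn, dfd = 5, 10                               # numerator and denominator degrees of freedom
    f_critical = stats.f.ppf(1 - alpha, dfn, dfd)
    print(f"F critical value: {f_critical:.2f}")   # roughly 3.33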

Three Types of Checksheets

Three Types of Checksheets 1) Location or concentration diagram:----> When you rent a car, you probably receive a document with the sketch of the car which allows you to circle any damages, dents or scratches on the car with a corresponding mark on the diagram. 2) Graphical or Distribution check sheet: --------->Using the graphical form, the person collecting the data is able to visualize the distribution of the data. For example, the number of people in line at the registration desk at 15 minute intervals could be counted to determine the staffing needs and the size of the waiting room. 3) Tabular check sheet or tally sheet-------> : The tally sheet is commonly used to collect data on quality problems and to determine the frequency of events. For example, the tally sheet is useful for understanding the reasons patients are arriving late for appointments, causes for delays in getting the lab results back, etc. It is also useful in determining frequency of occurrence, such as number of people in the line for blood tests at 6:00 am, 6:15 am, etc., to understand staffing needs.

Three types of Decision Distribution:

Three types of Decision Distribution: 1) t-distribution 2) F-distribution 3) Chi-square distribution

Three types of Flow Charts

Three types of Flow Charts: 1) Person-to-person 2) Action-to-action 3) Conceptual

Tools for documenting process inputs and outputs ?

Tools for documenting process inputs and outputs ? 1) Cause-and-effect diagrams 2) Relational matrices 3) Flow charts

Two elements of statistical inference

Two elements of statistical inference 1) inference 2) measure of validity

Three key advantages of Process Mapping

Three key advantages of Process Mapping: 1) Depicts the process using symbols, arrows, and words without the clutter of sentences 2) Can be used to outline new procedures 3) Can be used to review old procedures for viability and thoroughness

Two measures for evaluating the acceptability of the measurement system

Two measures for evaluating the acceptability of the measurement system 1) Precision/ Tolerance (P/T) 2) Precision/ Total Variation (P/TV)
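A small arithmetic sketch of those two ratios under one common convention (6 x gauge sigma over the tolerance for P/T; gauge sigma over total sigma for P/TV); the numbers, and the choice of 6 rather than 5.15 sigma, are assumptions for illustration:

    # Hypothetical gage study results
    sigma_measurement = 0.02      # std dev of the measurement system (gauge R&R)
    sigma_total = 0.10            # total observed std dev (part-to-part + measurement)
    usl, lsl = 10.5, 9.5          # specification limits

    p_t = 6 * sigma_measurement / (usl - lsl)      # Precision / Tolerance
    p_tv = sigma_measurement / sigma_total         # Precision / Total Variation
    print(f"P/T = {p_t:.0%}, P/TV = {p_tv:.0%}")   # smaller is better for both ratios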

Three types of Discrete Distribution

Three types of Discrete Distribution 1) Binomial distribution 2) Poisson distribution 3) Hypergeometric distribution

Types of Process Inputs

Types of Process Inputs: 1) Needs 2) Ideas 3) Exceptions 4) Requirements 5) Information 6) Data 7) Documents 8) Resources

Types of Check sheet

Types of check sheets: Commonly used check sheets are 1) tabular check sheets or tally sheets, 2) location check sheets and 3) graphical or distribution check sheets.

Types of process outputs

Types of process outputs 1) Designs 2) Decisions 3) Measurements 4) Products/Services 5) Authorizations 6) Actions 7) Solutions 8) Proposals

Ultimate Goal of SIPOC

Ultimate Goal of SIPOC 1) The ultimate goal is to identify essential work flows and sources of variation in the work over time 2) Can be adapted to a number of essential support processes. 3) Process mapping, flow charting, and affinity diagrams can be used to identify the major blocks or steps in a process or system.

Use of Analytical statistics for estimation of process baseline

Use of Analytical statistics for estimation of process baseline 1) Analytical statistics --- used to distinguish special and common cause variation in dynamic, moving processes. 2) Statistical process control --- operational definition of special cause variation --- notes the location and level of variation.

Variable Data/ Continuous

Variable Data Any real number ---- 4.69, -1.4 ---> measurable data that tell how long, what volume, or how much 1) Measured data are more precise and more informative than counted data, but are more expensive to collect.

Variable Width Box Plot

Variable Width Box Plot A common variant that shows more information than the standard fixed-width box plot is the variable width box plot. This is only valuable when comparing more than one box plot, since the width is irrelevant if only a single box plot is displayed. One common convention is to make the width of each box proportional to the square root of the number of observations in that sample.
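A sketch with matplotlib (hypothetical groups), setting each box width proportional to the square root of its sample size:

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(2)
    groups = [rng.normal(10, 1, n) for n in (20, 80, 200)]   # hypothetical samples of different sizes

    widths = np.sqrt([len(g) for g in groups])
    widths = 0.5 * widths / widths.max()                     # scale so the widest box is 0.5

    plt.boxplot(groups, widths=widths)
    plt.xticks([1, 2, 3], ["n=20", "n=80", "n=200"])
    plt.show()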

Variable vs. Attribute Data

Variable vs. Attribute Data 1) Characteristics: a) Variable --- measurable, continuous "measuring data" b) Attribute --- discrete (good/bad); countable 2) Types of Data a) Variable --- length, volume, time b) Attribute --- no. of defects, scrap items, defectives.

Variations due to random sampling

Variations due to random sampling 1) Batch-to-batch variation 2) Within-batch variation

Ways to document Work Instructions

Ways to document Work Instructions 1) Written instructions 2) Checklists 3) Flowcharts 4) Photographs 5) Drawn pictures 6) Videos 7) Electronic screen shots 8) Electronic software-driven process steps

Weibull Distribution

Weibull Distribution Basic assumptions: 1) Family of distributions 2) Can be used to describe many types of data 3) Fits many common distributions (normal, exponential, and lognormal) 4) The differing factors are the scale and shape parameters

Weibull Distribution 4 Applications

Weibull Distribution Application 1) Lifetime distributions 2) Reliability applications 3) Failure probabilities that vary over time 4) Can describe burn-in, random, and wear-out phases of a life cycle (bathtub curve)
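A minimal sketch, assuming SciPy, of fitting a Weibull to hypothetical time-to-failure data and reading off the shape (beta) and scale parameters:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    failures = rng.weibull(1.5, size=100) * 1000      # hypothetical times to failure (hours)

    # Fix the location at 0 so only shape and scale are estimated
    shape, loc, scale = stats.weibull_min.fit(failures, floc=0)
    print(f"beta (shape) = {shape:.2f}, eta (scale) = {scale:.0f} hours")
    # beta < 1: burn-in failures, beta ~ 1: random failures, beta > 1: wear-out (bathtub curve)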

What are Process Maps and Flowcharts Used for?

What are Process Maps and Flowcharts Used for? 1) Provide a visual depiction of process information for customers and suppliers (SIPOC) 2) Serve as the first step in improving a process when used in conjunction with FMEA and value stream mapping.

What factors should be considered when determining which process should be documented?

What factors should be considered when determining which process should be documented? 1) Effect on quality 2) Risk of customer dissatisfaction 3) Statutory and/or regulatory requirements 4) Economic Risk 5) Effectiveness and efficiency 6) Competence of personnel 7) Complexity of process

What is the difference between Written procedures and Written Instructions?

What is the difference between Written procedures and Written Instructions? Procedures = describe the process at general level Work Instructions = provide details and step-by-step sequence of activities.

What is the difference between process map and flowchart?

What is the difference between process map and flowchart? 1) Process maps = Broad perspective of problems/ opportunities for improvement; Provides additional information about each step (Eg. cost, setup time, cycle time, inventory, types of defects); Can be used to detect non-value-added steps and complexities. 2) Flowcharts = show each step in the process, decision points, inputs, and outputs.; creates a picture of the actual steps in the process or system as it actually operates. Flow charts = analytical tool for monitoring process over TIME; used for training operators and supervisors.

When can the Binomial distribution be used?

When can the Binomial distribution be used? 1) When the sample size is greater than 50 (cannot be used with small sample sizes) 2) Sample size (n) is less than 10% of the population size N (n < 0.1 N); population size (N) > 50 3) Proportion defective is equal to or greater than 0.1
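For instance, assuming those conditions hold, the probability of finding defectives in a sample can be computed directly from the binomial model (the numbers are hypothetical):

    from scipy import stats

    n = 60        # sample size (> 50)
    p = 0.10      # proportion defective (>= 0.1)

    p_zero = stats.binom.pmf(0, n, p)             # probability of 0 defectives in the sample
    p_two_or_less = stats.binom.cdf(2, n, p)      # probability of at most 2 defectives
    print(f"P(0 defectives) = {p_zero:.4f}, P(<=2 defectives) = {p_two_or_less:.4f}")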

When the natural process limits are compared with specification limits, when would you: Accept Losses

When the natural process limits are compared with specification limits, when would you: Accept Losses ---> After centering the process and reducing variability, you will be left with scrap and rework as losses.

When the natural process limits are compared with specification limits, when would you: Center the process

When the natural process limits are compared with specification limits, when would you: Center the process ---> When you need to bring the bulk of the product within the specification limits, or when the process spread is the same as the specification limit spread

When the natural process limits are compared with specification limits, when would you: Change specification limits

When the natural process limits are compared with specification limits, when would you: Change specification limits---> When specification limits are too tight and need to be relaxed or modified

When the natural process limits are compared with specification limits, when would you: Do Nothing?

When the natural process limits are compared with specification limits, when would you: Do Nothing ----> When process limits fall within specification limits

When the natural process limits are compared with specification limits, when would you: Reduce Variability

When the natural process limits are compared with specification limits, when would you: Reduce Variability ---> When it is possible to partition the variation (within piece, batch-to-batch) and work on the largest offender
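These comparisons of the natural process spread (about 6 sigma) with the specification width are often summarized with the Cp and Cpk capability indices; a small sketch with hypothetical numbers:

    # Hypothetical process statistics and specification limits
    mean, sigma = 10.02, 0.15
    usl, lsl = 10.5, 9.5

    cp = (usl - lsl) / (6 * sigma)                       # potential capability (spread only)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)      # actual capability (spread and centering)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
    # Cp >= Cpk always; centering the process moves Cpk up toward Cp,
    # and reducing variability raises both.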

When to use Weibull Distribution?

When to use Weibull Distribution? Mainly for reliability data when the underlying distribution is unknown; estimates the shape parameter beta and the mean time between failures or failure rates

Why are processes measured?

Why are processes measured? 1) Process input variables are measured to control variation in output responses. 2) The effects of variations of inputs on outputs are explored using tools (e.g., scatter diagrams, relationship matrices, fishbone diagrams) 3) Thorough understanding of process inputs and outputs is a key step for process improvement.

Why are work procedures and work instructions a necessity?

Why are work procedures and work instructions a necessity? Worldwide demand for ISO 9001 compliance makes work instructions and work procedures a necessity. ISO requires internal procedures to control nonconforming product so that it is prevented from inadvertent use or installation. (Assigned to the Quality Dept.) They help drive consistency in business and manufacturing. A consistent approach to process management helps yield improvements, root cause analysis, and traceability.

Why is measurement system analysis important?

Why is measurement system analysis important? 1) All data from process is filtered via measurement system 2) Measurement system analysis is the most cost-effective way to reduce variation

Work Instructions

Work Instructions 1) Procedures describe the process at general level, while work instructions provide details and step-by-step sequence of activities. 2) Flow charts may be used with work-instructions to show relationships of process steps. 3) Controlled copies of work instructions are kept in the area where the activities are performed. 4) Level of detail in the work instruction should be appropriate for background, experience, and skill of the personnel. 5) Wording and terminology should match that used by those doing the work.

Written Procedures

Written Procedures 1) ISO 9001 --- states that internal procedures shall control nonconforming product so that it is prevented from inadvertent use or installation. ------> Quality Department responsibility, but functions are carried out by multiple departments 2) Procedures should be developed by those having responsibility for the process of interest.

X-Bar & R charts

X-Bar & R charts 1) Best for short runs ---> 3-10 pieces without adjustment 2) Inflated A2 and D4 values are used to establish control limits 3) Small runs with a limited amount of data ---> use X and MR charts 4) Moving range (MR) = piece-to-piece variability 5) X = individual data values 6) As the number of data points increases, the calculated process capability will approach the true capability
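A small sketch of computing X-bar and R control limits from subgroup data using the standard constants for subgroup size n = 5 (A2 = 0.577, D3 = 0, D4 = 2.114); the measurements are hypothetical:

    import numpy as np

    # Hypothetical subgroups of size 5 (rows = subgroups)
    subgroups = np.array([
        [5.1, 5.0, 5.2, 4.9, 5.1],
        [5.0, 5.3, 5.1, 5.0, 5.2],
        [4.9, 5.1, 5.0, 5.2, 5.0],
        [5.2, 5.0, 5.1, 5.1, 4.9],
    ])

    A2, D3, D4 = 0.577, 0.0, 2.114           # standard constants for n = 5
    xbar = subgroups.mean(axis=1)            # subgroup means
    r = np.ptp(subgroups, axis=1)            # subgroup ranges

    xbarbar, rbar = xbar.mean(), r.mean()
    print(f"X-bar chart: UCL={xbarbar + A2*rbar:.3f}, CL={xbarbar:.3f}, LCL={xbarbar - A2*rbar:.3f}")
    print(f"R chart:     UCL={D4*rbar:.3f}, CL={rbar:.3f}, LCL={D3*rbar:.3f}")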

What is Z-score?

Z-Score 1) Measure of how many standard deviations below or above the population mean a raw score is. 2) AKA. standard score and it can be placed on a normal distribution curve. 3) Z-scores range from -3 std devs (which would fall to the far left of the normal distribution curve) to +3 std devs (which would fall to the far right of the normal distribution curve). 4) In order to use a z-score, must know the mean μ and the population standard deviation σ.
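A short sketch (hypothetical numbers) of computing a z-score and the corresponding percentile under a normal distribution:

    from scipy import stats

    mu, sigma = 100.0, 15.0     # hypothetical population mean and standard deviation
    x = 121.0                   # raw score

    z = (x - mu) / sigma
    percentile = stats.norm.cdf(z)                         # area to the left of the z-score
    print(f"z = {z:.2f}, percentile = {percentile:.1%}")   # z = 1.40, about 91.9%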

