Implementing Six Sigma: Smarter Solutions Using Statistical Methods, 2nd edition by Forrest W. Breyfogle – Ebook PDF Instant Download/Delivery. ISBN: 0471476323, 9780471476320
Full download of Implementing Six Sigma: Smarter Solutions Using Statistical Methods, 2nd edition after payment
Product details:
ISBN-10 : 0471476323
ISBN-13 : 9780471476320
Author: Forrest W. Breyfogle
Includes new and expanded coverage of Six Sigma infrastructure building and benchmarking.
Provides plans, checklists, metrics, and pitfalls.
Implementing Six Sigma: Smarter Solutions Using Statistical Methods, 2nd edition – Table of contents:
PART I S(4)/IEE DEPLOYMENT AND DEFINE PHASE FROM DMAIC
1 Six Sigma Overview and S(4)/IEE Implementation
1.1 Background of Six Sigma
1.2 General Electric’s Experiences with Six Sigma
1.3 Additional Experiences with Six Sigma
1.4 What Is Six Sigma and S(4)/IEE?
1.5 The Six Sigma Metric
1.6 Traditional Approach to the Deployment of Statistical Methods
1.7 Six Sigma Benchmarking Study
1.8 S(4)/IEE Business Strategy Implementation
1.9 Six Sigma as an S(4)/IEE Business Strategy
1.10 Creating an S(4)/IEE Business Strategy with Roles and Responsibilities
1.11 Integration of Six Sigma with Lean
1.12 Day-to-Day Business Management Using S(4)/IEE
1.13 S(4)/IEE Project Initiation and Execution Roadmap
1.14 Project Benefit Analysis
1.15 Examples in This Book That Describe the Benefits and Strategies of S(4)/IEE
1.16 Effective Six Sigma Training and Implementation
1.17 Computer Software
1.18 Selling the Benefits of Six Sigma
1.19 S(4)/IEE Difference
1.20 S(4)/IEE Assessment
1.21 Exercises
2 Voice of the Customer and the S(4)/IEE Define Phase
2.1 Voice of the Customer
2.2 A Survey Methodology to Identify Customer Needs
2.3 Goal Setting and Measurements
2.4 Scorecard
2.5 Problem Solving and Decision Making
2.6 Answering the Right Question
2.7 S(4)/IEE DMAIC Define Phase Execution
2.8 S(4)/IEE Assessment
2.9 Exercises
PART II S(4)/IEE MEASURE PHASE FROM DMAIC
3 Measurements and the S(4)/IEE Measure Phase
3.1 Voice of the Customer
3.2 Variability and Process Improvements
3.3 Common Causes versus Special Causes and Chronic versus Sporadic Problems
3.4 Example 3.1: Reacting to Data
3.5 Sampling
3.6 Simple Graphic Presentations
3.7 Example 3.2: Histogram and Dot Plot
3.8 Sample Statistics (Mean, Range, Standard Deviation, and Median)
3.9 Attribute versus Continuous Data Response
3.10 Visual Inspections
3.11 Hypothesis Testing and the Interpretation of Analysis of Variance Computer Outputs
3.12 Experimentation Traps
3.13 Example 3.3: Experimentation Trap—Measurement Error and Other Sources of Variability
3.14 Example 3.4: Experimentation Trap—Lack of Randomization
3.15 Example 3.5: Experimentation Trap—Confused Effects
3.16 Example 3.6: Experimentation Trap—Independently Designing and Conducting an Experiment
3.17 Some Sampling Considerations
3.18 DMAIC Measure Phase
3.19 S(4)/IEE Assessment
3.20 Exercises
4 Process Flowcharting/Process Mapping
4.1 S(4)/IEE Application Examples: Flowchart
4.2 Description
4.3 Defining a Process and Determining Key Process Input/Output Variables
4.4 Example 4.1: Defining a Development Process
4.5 Focusing Efforts after Process Documentation
4.6 S(4)/IEE Assessment
4.7 Exercises
5 Basic Tools
5.1 Descriptive Statistics
5.2 Run Chart (Time Series Plot)
5.3 Control Chart
5.4 Probability Plot
5.5 Check Sheets
5.6 Pareto Chart
5.7 Benchmarking
5.8 Brainstorming
5.9 Nominal Group Technique (NGT)
5.10 Force-Field Analysis
5.11 Cause-and-Effect Diagram
5.12 Affinity Diagram
5.13 Interrelationship Digraph (ID)
5.14 Tree Diagram
5.15 Why-Why Diagram
5.16 Matrix Diagram and Prioritization Matrices
5.17 Process Decision Program Chart (PDPC)
5.18 Activity Network Diagram or Arrow Diagram
5.19 Scatter Diagram (Plot of Two Variables)
5.20 Example 5.1: Improving a Process That Has Defects
5.21 Example 5.2: Reducing the Total Cycle Time of a Process
5.22 Example 5.3: Improving a Service Process
5.23 Exercises
6 Probability
6.1 Description
6.2 Multiple Events
6.3 Multiple-Event Relationships
6.4 Bayes’ Theorem
6.5 S(4)/IEE Assessment
6.6 Exercises
7 Overview of Distributions and Statistical Processes
7.1 An Overview of the Application of Distributions
7.2 Normal Distribution
7.3 Example 7.1: Normal Distribution
7.4 Binomial Distribution
7.5 Example 7.2: Binomial Distribution—Number of Combinations and Rolls of Die
7.6 Example 7.3: Binomial—Probability of Failure
7.7 Hypergeometric Distribution
7.8 Poisson Distribution
7.9 Example 7.4: Poisson Distribution
7.10 Exponential Distribution
7.11 Example 7.5: Exponential Distribution
7.12 Weibull Distribution
7.13 Example 7.6: Weibull Distribution
7.14 Lognormal Distribution
7.15 Tabulated Probability Distribution: Chi-Square Distribution
7.16 Tabulated Probability Distribution: t Distribution
7.17 Tabulated Probability Distribution: F Distribution
7.18 Hazard Rate
7.19 Nonhomogeneous Poisson Process (NHPP)
7.20 Homogeneous Poisson Process (HPP)
7.21 Applications for Various Types of Distributions and Processes
7.22 S(4)/IEE Assessment
7.23 Exercises
8 Probability and Hazard Plotting
8.1 S(4)/IEE Application Examples: Probability Plotting
8.2 Description
8.3 Probability Plotting
8.4 Example 8.1: PDF, CDF, and Then a Probability Plot
8.5 Probability Plot Positions and Interpretation of Plots
8.6 Hazard Plots
8.7 Example 8.2: Hazard Plotting
8.8 Summarizing the Creation of Probability and Hazard Plots
8.9 Percentage of Population Statement Considerations
8.10 S(4)/IEE Assessment
8.11 Exercises
9 Six Sigma Measurements
9.1 Converting Defect Rates (DPMO or PPM) to Sigma Quality Level Units
9.2 Six Sigma Relationships
9.3 Process Cycle Time
9.4 Yield
9.5 Example 9.1: Yield
9.6 Z Variable Equivalent
9.7 Example 9.2: Z Variable Equivalent
9.8 Defects per Million Opportunities (DPMO)
9.9 Example 9.3: Defects per Million Opportunities (DPMO)
9.10 Rolled Throughput Yield
9.11 Example 9.4: Rolled Throughput Yield
9.12 Example 9.5: Rolled Throughput Yield
9.13 Yield Calculation
9.14 Example 9.6: Yield Calculation
9.15 Example 9.7: Normal Transformation (Z Value)
9.16 Normalized Yield and Z Value for Benchmarking
9.17 Example 9.8: Normalized Yield and Z Value for Benchmarking
9.18 Six Sigma Assumptions
9.19 S(4)/IEE Assessment
9.20 Exercises
10 Basic Control Charts
10.1 S(4)/IEE Application Examples: Control Charts
10.2 Satellite-Level View of the Organization
10.3 A 30,000-Foot-Level View of Operational and Project Metrics
10.4 AQL (Acceptable Quality Level) Sampling Can Be Deceptive
10.5 Example 10.1: Acceptable Quality Level
10.6 Monitoring Processes
10.7 Rational Sampling and Rational Subgrouping
10.8 Statistical Process Control Charts
10.9 Interpretation of Control Chart Patterns
10.10 x and R and x and s Charts: Mean and Variability Measurements
10.11 Example 10.2: x and R Chart
10.12 XmR Charts: Individual Measurements
10.13 Example 10.3: XmR Charts
10.14 x and R versus XmR Charts
10.15 Attribute Control Charts
10.16 p Chart: Fraction Nonconforming Measurements
10.17 Example 10.4: p Chart
10.18 np Chart: Number of Nonconforming Items
10.19 c Chart: Number of Nonconformities
10.20 u Chart: Nonconformities per Unit
10.21 Median Charts
10.22 Example 10.5: Alternatives to p-Chart, np-Chart, c-Chart, and u-Chart Analyses
10.23 Charts for Rare Events
10.24 Example 10.6: Charts for Rare Events
10.25 Discussion of Process Control Charting at the Satellite Level and 30,000-Foot Level
10.26 Control Charts at the 30,000-Foot Level: Attribute Response
10.27 XmR Chart of Subgroup Means and Standard Deviation: An Alternative to Traditional x and R Charts
10.28 Notes on the Shewhart Control Chart
10.29 S(4)/IEE Assessment
10.30 Exercises
11 Process Capability and Process Performance Metrics
11.1 S(4)/IEE Application Examples: Process Capability/Performance Metrics
11.2 Definitions
11.3 Misunderstandings
11.4 Confusion: Short-Term versus Long-Term Variability
11.5 Calculating Standard Deviation
11.6 Process Capability Indices: C(p) and C(pk)
11.7 Process Capability/Performance Indices: P(p) and P(pk)
11.8 Process Capability and the Z Distribution
11.9 Capability Ratios
11.10 C(pm) Index
11.11 Example 11.1: Process Capability/Performance Indices
11.12 Example 11.2: Process Capability/Performance Indices Study
11.13 Example 11.3: Process Capability/Performance Index Needs
11.14 Process Capability Confidence Interval
11.15 Example 11.4: Confidence Interval for Process Capability
11.16 Process Capability/Performance for Attribute Data
11.17 Describing a Predictable Process Output When No Specification Exists
11.18 Example 11.5: Describing a Predictable Process Output When No Specification Exists
11.19 Process Capability/Performance Metrics from XmR Chart of Subgroup Means and Standard Deviation
11.20 Process Capability/Performance Metric for Nonnormal Distribution
11.21 Example 11.6: Process Capability/Performance Metric for Nonnormal Distributions: Box-Cox Transformation
11.22 Implementation Comments
11.23 The S(4)/IEE Difference
11.24 S(4)/IEE Assessment
11.25 Exercises
12 Measurement Systems Analysis
12.1 MSA Philosophy
12.2 Variability Sources in a 30,000-Foot-Level Metric
12.3 S(4)/IEE Application Examples: MSA
12.4 Terminology
12.5 Gage R&R Considerations
12.6 Gage R&R Relationships
12.7 Additional Ways to Express Gage R&R Relationships
12.8 Preparation for a Measurement System Study
12.9 Example 12.1: Gage R&R
12.10 Linearity
12.11 Example 12.2: Linearity
12.12 Attribute Gage Study
12.13 Example 12.3: Attribute Gage Study
12.14 Gage Study of Destructive Testing
12.15 Example 12.4: Gage Study of Destructive Testing
12.16 A 5-Step Measurement Improvement Process
12.17 Example 12.5: A 5-Step Measurement Improvement Process
12.18 S(4)/IEE Assessment
12.19 Exercises
13 Cause-and-Effect Matrix and Quality Function Deployment
13.1 S(4)/IEE Application Examples: Cause-and-Effect Matrix
13.2 Quality Function Deployment (QFD)
13.3 Example 13.1: Creating a QFD Chart
13.4 Cause-and-Effect Matrix
13.5 Data Relationship Matrix
13.6 S(4)/IEE Assessment
13.7 Exercises
14 FMEA
14.1 S(4)/IEE Application Examples: FMEA
14.2 Implementation
14.3 Development of a Design FMEA
14.4 Design FMEA Tabular Entries
14.5 Development of a Process FMEA
14.6 Process FMEA Tabular Entries
14.7 Exercises
PART III S(4)/IEE ANALYZE PHASE FROM DMAIC (OR PASSIVE ANALYSIS PHASE)
15 Visualization of Data
15.1 S(4)/IEE Application Examples: Visualization of Data
15.2 Multi-vari Charts
15.3 Example 15.1: Multi-vari Chart of Injection-Molding Data
15.4 Box Plot
15.5 Example 15.2: Plots of Injection-Molding Data
15.6 S(4)/IEE Assessment
15.7 Exercises
16 Confidence Intervals and Hypothesis Tests
16.1 Confidence Interval Statements
16.2 Central Limit Theorem
16.3 Hypothesis Testing
16.4 Example 16.1: Hypothesis Testing
16.5 S(4)/IEE Assessment
16.6 Exercises
17 Inferences: Continuous Response
17.1 Summarizing Sampled Data
17.2 Sample Size: Hypothesis Test of a Mean Criterion for Continuous Response Data
17.3 Example 17.1: Sample Size Determination for a Mean Criterion Test
17.4 Confidence Intervals on the Mean and Hypothesis Test Criteria Alternatives
17.5 Example 17.2: Confidence Intervals on the Mean
17.6 Example 17.3: Sample Size—An Alternative Approach
17.7 Standard Deviation Confidence Interval
17.8 Example 17.4: Standard Deviation Confidence Statement
17.9 Percentage of the Population Assessments
17.10 Example 17.5: Percentage of the Population Statements
17.11 Statistical Tolerancing
17.12 Example 17.6: Combining Analytical Data with Statistical Tolerancing
17.13 Nonparametric Estimates: Runs Test for Randomization
17.14 Example 17.7: Nonparametric Runs Test for Randomization
17.15 S(4)/IEE Assessment
17.16 Exercises
18 Inferences: Attribute (Pass/Fail) Response
18.1 Attribute Response Situations
18.2 Sample Size: Hypothesis Test of an Attribute Criterion
18.3 Example 18.1: Sample Size—A Hypothesis Test of an Attribute Criterion
18.4 Confidence Intervals for Attribute Evaluations and Alternative Sample Size Considerations
18.5 Reduced Sample Size Testing for Attribute Situations
18.6 Example 18.2: Reduced Sample Size Testing—Attribute Response Situations
18.7 Attribute Sample Plan Alternatives
18.8 S(4)/IEE Assessment
18.9 Exercises
19 Comparison Tests: Continuous Response
19.1 S(4)/IEE Application Examples: Comparison Tests
19.2 Comparing Continuous Data Responses
19.3 Sample Size: Comparing Means
19.4 Comparing Two Means
19.5 Example 19.1: Comparing the Means of Two Samples
19.6 Comparing Variances of Two Samples
19.7 Example 19.2: Comparing the Variance of Two Samples
19.8 Comparing Populations Using a Probability Plot
19.9 Example 19.3: Comparing Responses Using a Probability Plot
19.10 Paired Comparison Testing
19.11 Example 19.4: Paired Comparison Testing
19.12 Comparing More Than Two Samples
19.13 Example 19.5: Comparing Means to Determine If Process Improved
19.14 S(4)/IEE Assessment
19.15 Exercises
20 Comparison Tests: Attribute (Pass/Fail) Response
20.1 S(4)/IEE Application Examples: Attribute Comparison Tests
20.2 Comparing Attribute Data
20.3 Sample Size: Comparing Proportions
20.4 Comparing Proportions
20.5 Example 20.1: Comparing Proportions
20.6 Comparing Nonconformance Proportions and Count Frequencies
20.7 Example 20.2: Comparing Nonconformance Proportions
20.8 Example 20.3: Comparing Counts
20.9 Example 20.4: Difference in Two Proportions
20.10 S(4)/IEE Assessment
20.11 Exercises
21 Bootstrapping
21.1 Description
21.2 Example 21.1: Bootstrapping to Determine Confidence Interval for Mean, Standard Deviation, P(p)
21.3 Example 21.2: Bootstrapping with Bias Correction
21.4 Bootstrapping Applications
21.5 Exercises
22 Variance Components
22.1 S(4)/IEE Application Examples: Variance Components
22.2 Description
22.3 Example 22.1: Variance Components of Pigment Paste
22.4 Example 22.2: Variance Components of a Manufactured Door Including Measurement System Component
22.5 Example 22.3: Determining Process Capability/Performance Using Variance Components
22.6 Example 22.4: Variance Components Analysis of Injection-Molding Data
22.7 S(4)/IEE Assessment
22.8 Exercises
23 Correlation and Simple Linear Regression
23.1 S(4)/IEE Application Examples: Regression
23.2 Scatter Plot (Dispersion Graph)
23.3 Correlation
23.4 Example 23.1: Correlation
23.5 Simple Linear Regression
23.6 Analysis of Residuals
23.7 Analysis of Residuals: Normality Assessment
23.8 Analysis of Residuals: Time Sequence
23.9 Analysis of Residuals: Fitted Values
23.10 Example 23.2: Simple Linear Regression
23.11 S(4)/IEE Assessment
23.12 Exercises
24 Single-Factor (One-Way) Analysis of Variance (ANOVA) and Analysis of Means (ANOM)
24.1 S(4)/IEE Application Examples: ANOVA and ANOM
24.2 Application Steps
24.3 Single-Factor Analysis of Variance Hypothesis Test
24.4 Single-Factor Analysis of Variance Table Calculations
24.5 Estimation of Model Parameters
24.6 Unbalanced Data
24.7 Model Adequacy
24.8 Analysis of Residuals: Fitted Value Plots and Data Transformations
24.9 Comparing Pairs of Treatment Means
24.10 Example 24.1: Single-Factor Analysis of Variance
24.11 Analysis of Means
24.12 Example 24.2: Analysis of Means
24.13 Example 24.3: Analysis of Means of Injection-Molding Data
24.14 Six Sigma Considerations
24.15 Example 24.4: Determining Process Capability Using One-Factor Analysis of Variance
24.16 Nonparametric Estimate: Kruskal–Wallis Test
24.17 Example 24.5: Nonparametric Kruskal–Wallis Test
24.18 Nonparametric Estimate: Mood’s Median Test
24.19 Example 24.6: Nonparametric Mood’s Median Test
24.20 Other Considerations
24.21 S(4)/IEE Assessment
24.22 Exercises
25 Two-Factor (Two-Way) Analysis of Variance
25.1 Two-Factor Factorial Design
25.2 Example 25.1: Two-Factor Factorial Design
25.3 Nonparametric Estimate: Friedman Test
25.4 Example 25.2: Nonparametric Friedman Test
25.5 S(4)/IEE Assessment
25.6 Exercises
26 Multiple Regression, Logistic Regression, and Indicator Variables
26.1 S(4)/IEE Application Examples: Multiple Regression
26.2 Description
26.3 Example 26.1: Multiple Regression
26.4 Other Considerations
26.5 Example 26.2: Multiple Regression Best Subset Analysis
26.6 Indicator Variables (Dummy Variables) to Analyze Categorical Data
26.7 Example 26.3: Indicator Variables
26.8 Example 26.4: Indicator Variables with Covariate
26.9 Binary Logistic Regression
26.10 Example 26.5: Binary Logistic Regression
26.11 Exercises
PART IV S(4)/IEE IMPROVE PHASE FROM DMAIC (OR PROACTIVE TESTING PHASE)
27 Benefiting from Design of Experiments (DOE)
27.1 Terminology and Benefits
27.2 Example 27.1: Traditional Experimentation
27.3 The Need for DOE
27.4 Common Excuses for Not Using DOE
27.5 Exercises
28 Understanding the Creation of Full and Fractional Factorial 2(k) DOEs
28.1 S(4)/IEE Application Examples: DOE
28.2 Conceptual Explanation: Two-Level Full Factorial Experiments and Two-Factor Interactions
28.3 Conceptual Explanation: Saturated Two-Level DOE
28.4 Example 28.1: Applying DOE Techniques to a Nonmanufacturing Process
28.5 Exercises
29 Planning 2(k) DOEs
29.1 Initial Thoughts When Setting Up a DOE
29.2 Experiment Design Considerations
29.3 Sample Size Considerations for a Continuous Response Output DOE
29.4 Experiment Design Considerations: Choosing Factors and Levels
29.5 Experiment Design Considerations: Factor Statistical Significance
29.6 Experiment Design Considerations: Experiment Resolution
29.7 Blocking and Randomization
29.8 Curvature Check
29.9 S(4)/IEE Assessment
29.10 Exercises
30 Design and Analysis of 2(k) DOEs
30.1 Two-Level DOE Design Alternatives
30.2 Designing a Two-Level Fractional Experiment Using Tables M and N
30.3 Determining Statistically Significant Effects and Probability Plotting Procedure
30.4 Modeling Equation Format for a Two-Level DOE
30.5 Example 30.1: A Resolution V DOE
30.6 DOE Alternatives
30.7 Example 30.2: A DOE Development Test
30.8 S(4)/IEE Assessment
30.9 Exercises
31 Other DOE Considerations
31.1 Latin Square Designs and Youden Square Designs
31.2 Evolutionary Operation (EVOP)
31.3 Example 31.1: EVOP
31.4 Fold-Over Designs
31.5 DOE Experiment: Attribute Response
31.6 DOE Experiment: Reliability Evaluations
31.7 Factorial Designs That Have More Than Two Levels
31.8 Example 31.2: Creating a Two-Level DOE Strategy from a Many-Level Full Factorial Initial Proposal
31.9 Example 31.3: Resolution III DOE with Interaction Consideration
31.10 Example 31.4: Analysis of a Resolution III Experiment with Two-Factor Interaction Assessment
31.11 Example 31.5: DOE with Attribute Response
31.12 Example 31.6: A System DOE Stress to Fail Test
31.13 S(4)/IEE Assessment
31.14 Exercises
32 Robust DOE
32.1 S(4)/IEE Application Examples: Robust DOE
32.2 Test Strategies
32.3 Loss Function
32.4 Example 32.1: Loss Function
32.5 Robust DOE Strategy
32.6 Analyzing 2(k) Residuals for Sources of Variability Reduction
32.7 Example 32.2: Analyzing 2(k) Residuals for Sources of Variability Reduction
32.8 S(4)/IEE Assessment
32.9 Exercises
33 Response Surface Methodology
33.1 Modeling Equations
33.2 Central Composite Design
33.3 Example 33.1: Response Surface Design
33.4 Box-Behnken Designs
33.5 Mixture Designs
33.6 Simplex Lattice Designs for Exploring the Whole Simplex Region
33.7 Example 33.2: Simplex-Lattice Designed Mixture Experiment
33.8 Mixture Designs with Process Variables
33.9 Example 33.3: Mixture Experiment with Process Variables
33.10 Extreme Vertices Mixture Designs
33.11 Example 33.4: Extreme Vertices Mixture Experiment
33.12 Computer-Generated Mixture Designs/Analyses
33.13 Example 33.5: Computer-Generated Mixture Design/Analysis
33.14 Additional Response Surface Design Considerations
33.15 S(4)/IEE Assessment
33.16 Exercises
PART V S(4)/IEE CONTROL PHASE FROM DMAIC AND APPLICATION EXAMPLES
34 Short-Run and Target Control Charts
34.1 S(4)/IEE Application Examples: Target Control Charts
34.2 Difference Chart (Target Chart and Nominal Chart)
34.3 Example 34.1: Target Chart
34.4 Z Chart (Standardized Variables Control Chart)
34.5 Example 34.2: ZmR Chart
34.6 Exercises
35 Control Charting Alternatives
35.1 S(4)/IEE Application Examples: Three-Way Control Chart
35.2 Three-Way Control Chart (Monitoring within- and between-Part Variability)
35.3 Example 35.1: Three-Way Control Chart
35.4 CUSUM Chart (Cumulative Sum Chart)
35.5 Example 35.2: CUSUM Chart
35.6 Example 35.3: CUSUM Chart of Bearing Diameter
35.7 Zone Chart
35.8 Example 35.4: Zone Chart
35.9 S(4)/IEE Assessment
35.10 Exercises
36 Exponentially Weighted Moving Average (EWMA) and Engineering Process Control (EPC)
36.1 S(4)/IEE Application Examples: EWMA and EPC
36.2 Description
36.3 Example 36.1: EWMA with Engineering Process Control
36.4 Exercises
37 Pre-control Charts
37.1 S(4)/IEE Application Examples: Pre-control Charts
37.2 Description
37.3 Pre-control Setup (Qualification Procedure)
37.4 Classical Pre-control
37.5 Two-Stage Pre-control
37.6 Modified Pre-control
37.7 Application Considerations
37.8 S(4)/IEE Assessment
37.9 Exercises
38 Control Plan, Poka-yoke, Realistic Tolerancing, and Project Completion
38.1 Control Plan: Overview
38.2 Control Plan: Entries
38.3 Poka-yoke
38.4 Realistic Tolerances
38.5 Project Completion
38.6 S(4)/IEE Assessment
38.7 Exercises
39 Reliability Testing/Assessment: Overview
39.1 Product Life Cycle
39.2 Units
39.3 Repairable versus Nonrepairable Testing
39.4 Nonrepairable Device Testing
39.5 Repairable System Testing
39.6 Accelerated Testing: Discussion
39.7 High-Temperature Acceleration
39.8 Example 39.1: High-Temperature Acceleration Testing
39.9 Eyring Model
39.10 Thermal Cycling: Coffin–Manson Relationship
39.11 Model Selection: Accelerated Testing
39.12 S(4)/IEE Assessment
39.13 Exercises
40 Reliability Testing/Assessment: Repairable System
40.1 Considerations When Designing a Test of a Repairable System Failure Criterion
40.2 Sequential Testing: Poisson Distribution
40.3 Example 40.1: Sequential Reliability Test
40.4 Total Test Time: Hypothesis Test of a Failure Rate Criterion
40.5 Confidence Interval for Failure Rate Evaluations
40.6 Example 40.2: Time-Terminated Reliability Testing Confidence Statement
40.7 Reduced Sample Size Testing: Poisson Distribution
40.8 Example 40.3: Reduced Sample Size Testing—Poisson Distribution
40.9 Reliability Test Design with Test Performance Considerations
40.10 Example 40.4: Time-Terminated Reliability Test Design—with Test Performance Considerations
40.11 Posttest Assessments
40.12 Example 40.5: Postreliability Test Confidence Statements
40.13 Repairable Systems with Changing Failure Rate
40.14 Example 40.6: Repairable Systems with Changing Failure Rate
40.15 Example 40.7: An Ongoing Reliability Test (ORT) Plan
40.16 S(4)/IEE Assessment
40.17 Exercises
41 Reliability Testing/Assessment: Nonrepairable Devices
41.1 Reliability Test Considerations for a Nonrepairable Device
41.2 Weibull Probability Plotting and Hazard Plotting
41.3 Example 41.1: Weibull Probability Plot for Failure Data
41.4 Example 41.2: Weibull Hazard Plot with Censored Data
41.5 Nonlinear Data Plots
41.6 Reduced Sample Size Testing: Weibull Distribution
41.7 Example 41.3: A Zero Failure Weibull Test Strategy
41.8 Lognormal Distribution
41.9 Example 41.4: Lognormal Probability Plot Analysis
41.10 S(4)/IEE Assessment
41.11 Exercises
42 Pass/Fail Functional Testing
42.1 The Concept of Pass/Fail Functional Testing
42.2 Example 42.1: Automotive Test—Pass/Fail Functional Testing Considerations
42.3 A Test Approach for Pass/Fail Functional Testing
42.4 Example 42.2: A Pass/Fail System Functional Test
42.5 Example 42.3: A Pass/Fail Hardware/Software System Functional Test
42.6 General Considerations When Assigning Factors
42.7 Factor Levels Greater Than 2
42.8 Example 42.4: A Software Interface Pass/Fail Functional Test
42.9 A Search Pattern Strategy to Determine the Source of Failure
42.10 Example 42.5: A Search Pattern Strategy to Determine the Source of Failure
42.11 Additional Applications
42.12 A Process for Using DOEs with Product Development
42.13 Example 42.6: Managing Product Development Using DOEs
42.14 S(4)/IEE Assessment
42.15 Exercises
43 S(4)/IEE Application Examples
43.1 Example 43.1: Improving Product Development
43.2 Example 43.2: A QFD Evaluation with DOE
43.3 Example 43.3: A Reliability and Functional Test of an Assembly
43.4 Example 43.4: A Development Strategy for a Chemical Product
43.5 Example 43.5: Tracking Ongoing Product Compliance from a Process Point of View
43.6 Example 43.6: Tracking and Improving Times for Change Orders
43.7 Example 43.7: Improving the Effectiveness of Employee Opinion Surveys
43.8 Example 43.8: Tracking and Reducing the Time of Customer Payment
43.9 Example 43.9: Automobile Test—Answering the Right Question
43.10 Example 43.10: Process Improvement and Exposing the Hidden Factory
43.11 Example 43.11: Applying DOE to Increase Website Traffic—A Transactional Application
43.12 Example 43.12: AQL Deception and Alternative
43.13 Example 43.13: S(4)/IEE Project: Reduction of Incoming Wait Time in a Call Center
43.14 Example 43.14: S(4)/IEE Project: Reduction of Response Time to Calls in a Call Center
43.15 Example 43.15: S(4)/IEE Project: Reducing the Number of Problem Reports in a Call Center
43.16 Example 43.16: S(4)/IEE Project: AQL Test Assessment
43.17 Example 43.17: S(4)/IEE Project: Qualification of Capital Equipment
43.18 Example 43.18: S(4)/IEE Project: Qualification of Supplier’s Production Process and Ongoing
43.19 Exercises
PART VI S(4)/IEE LEAN AND THEORY OF CONSTRAINTS
44 Lean and Its Integration with S(4)/IEE
44.1 Waste Prevention
44.2 Principles of Lean
44.3 Kaizen
44.4 S(4)/IEE Lean Implementation Steps
44.5 Time-Value Diagram
44.6 Example 44.1: Development of a Bowling Ball
44.7 Example 44.2: Sales Quoting Process
44.8 5S Method
44.9 Demand Management
44.10 Total Productive Maintenance (TPM)
44.11 Changeover Reduction
44.12 Kanban
44.13 Value Stream Mapping
44.14 Exercises
45 Integration of Theory of Constraints (TOC) in S(4)/IEE
45.1 Discussion
45.2 Measures of TOC
45.3 Five Focusing Steps of TOC
45.4 S(4)/IEE TOC Application and the Development of Strategic Plans
45.5 TOC Questions
45.6 Exercises
PART VII DFSS AND 21-STEP INTEGRATION OF THE TOOLS
46 Manufacturing Applications and a 21-Step Integration of the Tools
46.1 A 21-Step Integration of the Tools: Manufacturing Processes
47 Service/Transactional Applications and a 21-Step Integration of the Tools
47.1 Measuring and Improving Service/Transactional Processes
47.2 21-Step Integration of the Tools: Service/Transactional Processes
48 DFSS Overview and Tools
48.1 DMADV
48.2 Using Previously Described Methodologies within DFSS
48.3 Design for X (DFX)
48.4 Axiomatic Design
48.5 TRIZ
48.6 Exercise
49 Product DFSS
49.1 Measuring and Improving Development Processes
49.2 A 21-Step Integration of the Tools: Product DFSS
49.3 Example 49.1: Notebook Computer Development
49.4 Product DFSS Examples
50 Process DFSS
50.1 A 21-Step Integration of the Tools: Process DFSS
PART VIII MANAGEMENT OF INFRASTRUCTURE AND TEAM EXECUTION
51 Change Management
51.1 Seeking Pleasure and Fear of Pain
51.2 Cavespeak
51.3 The Eight Stages of Change and S(4)/IEE
51.4 Managing Change and Transition
51.5 How Does an Organization Learn?
52 Project Management and Financial Analysis
52.1 Project Management: Planning
52.2 Project Management: Measures
52.3 Example 52.1: CPM/PERT
52.4 Financial Analysis
52.5 S(4)/IEE Assessment
52.6 Exercises
53 Team Effectiveness
53.1 Orming Model
53.2 Interaction Styles
53.3 Making a Successful Team
53.4 Team Member Feedback
53.5 Reacting to Common Team Problems
53.6 Exercise
54 Creativity
54.1 Alignment of Creativity with S(4)/IEE
54.2 Creative Problem Solving
54.3 Inventive Thinking as a Process
54.4 Exercise
55 Alignment of Management Initiatives and Strategies with S(4)/IEE
55.1 Quality Philosophies and Approaches
55.2 Deming’s 7 Deadly Diseases and 14 Points for Management
55.3 Organization Management and Quality Leadership
55.4 Quality Management and Planning
55.5 ISO 9000:2000
55.6 Malcolm Baldrige Assessment
55.7 Shingo Prize
55.8 GE Work-Out
55.9 S(4)/IEE Assessment
55.10 Exercises