Seven Quality Control Tools and Seven New Management Tools

One of the basic principles of quality management is to make decisions based on facts. This is achieved most fully by modeling processes, both production and management processes, with the tools of mathematical statistics. However, modern statistical methods are difficult to understand and to apply widely in practice without in-depth mathematical training of all participants in the process. By 1979 the Japanese Union of Scientists and Engineers (JUSE) had assembled seven fairly easy-to-use visual methods for process analysis. Despite their simplicity, they maintain a connection with statistics and give professionals the opportunity to use their results and, if necessary, to refine them.

Cause-and-effect diagram (Ishikawa diagram)

The 5M version of the diagram considers the quality components "man", "machine", "material", "method" and "control"; the 6M version adds the "environment" component. For the qualimetric analysis problem being solved, it is necessary to determine:

· for the "man" component, the factors related to the convenience and safety of performing operations;

· for the "machine" component, the relationships of the structural elements of the analyzed product with one another that are associated with performing the operation;

· for the "method" component, the factors related to the productivity and accuracy of the operation performed;

· for the "material" component, the factors associated with the absence of changes in the properties of the product's materials while the operation is performed;

· for the "control" component, the factors associated with reliably recognizing errors while the operation is performed;

· for the "environment" component, the factors associated with the impact of the environment on the product and of the product on the environment.

Example of Ishikawa diagram

Check sheets (checklists)

Check sheets can be used for both qualitative and quantitative control.

Histograms

A histogram is a variant of a bar chart that displays how often the quality indicator of a product or process falls within each of a series of value ranges.

The histogram is constructed as follows:

  1. Determine the highest value of the quality indicator.
  2. Determine the lowest value of the quality indicator.
  3. Define the histogram range as the difference between the largest and smallest values.
  4. Determine the number of histogram intervals. The approximate formula (number of intervals) = √N, where N is the number of quality-indicator values, is often used. For example, if the number of values is 50, the number of histogram intervals is about 7.
  5. Determine the interval length: (histogram range) / (number of intervals).
  6. Divide the histogram range into intervals.
  7. Count the number of results falling in each interval.
  8. Determine the frequency for each interval: (number of hits) / (total number of quality-indicator values).
  9. Build the bar chart.
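The steps above can be sketched in a few lines of Python; the sample data here is hypothetical, generated only to illustrate the procedure:

```python
import math
import random

# Hypothetical sample of 50 quality-indicator measurements.
random.seed(1)
values = [round(random.gauss(10.0, 0.5), 2) for _ in range(50)]

x_max, x_min = max(values), min(values)       # steps 1-2
hist_range = x_max - x_min                    # step 3
k = round(math.sqrt(len(values)))             # step 4: ~sqrt(N) intervals
width = hist_range / k                        # step 5

# steps 6-7: count how many results fall in each interval
counts = [0] * k
for v in values:
    i = min(int((v - x_min) / width), k - 1)  # clamp the maximum into the last bin
    counts[i] += 1

# step 8: relative frequencies per interval
freqs = [c / len(values) for c in counts]
```

With `counts` and `freqs` in hand, step 9 is just plotting the bars with any charting tool.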

Scatter plots

Scatter plots are graphs, like the ones shown below, that show the correlation between two different factors.

Scatter plot: there is practically no relationship between the quality indicators.

Scatter plot: there is a direct relationship between the quality indicators.

Scatter plot: there is an inverse relationship between the quality indicators.
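A quick way to quantify what a scatter plot shows is the sample correlation coefficient: a value near +1 indicates a direct relationship, near -1 an inverse one, and near 0 practically none. A minimal sketch, using two hypothetical quality indicators:

```python
import math

def pearson_r(xs, ys):
    """Sample correlation coefficient between two quality indicators."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: process temperature vs. part strength.
temp = [150, 155, 160, 165, 170, 175]
strength = [32, 33, 35, 36, 38, 39]
r = pearson_r(temp, strength)  # close to +1: a direct relationship
```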

Pareto Analysis

Pareto analysis is named after the Italian economist Vilfredo Pareto, who showed that most capital (80%) is in the hands of a small number of people (20%). Pareto developed logarithmic mathematical models describing this inhomogeneous distribution, and the mathematician M. O. Lorenz provided graphic illustrations.

The Pareto rule is a "universal" principle that is applicable in many situations, and without a doubt in solving quality problems. Joseph Juran noted the "universal" applicability of the Pareto principle to any group of causes producing a given consequence, with most of the consequences caused by a small number of causes. Pareto analysis ranks individual areas by significance or importance and calls for identifying, and eliminating first, those causes that give rise to the greatest number of problems (nonconformities).

Pareto analysis is usually illustrated by a Pareto diagram (see the figure below), in which the x-axis shows the causes of quality problems in descending order of the problems they cause, and the y-axis shows the problems themselves in quantitative terms, both as counts and as a cumulative percentage.

The diagram clearly shows the area for priority action, outlining the causes responsible for the largest number of errors. Preventive measures should therefore be aimed at solving these problems first.

Pareto chart
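The data preparation behind a Pareto chart, sorting causes by count and accumulating percentages, can be sketched as follows; the defect counts are hypothetical:

```python
# Hypothetical counts of nonconformities by cause.
causes = {"scratches": 95, "cracks": 42, "misalignment": 18, "stains": 10, "other": 5}

# Sort causes in descending order of the problems they cause.
ordered = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)

total = sum(causes.values())
cumulative, running = [], 0
for cause, count in ordered:
    running += count
    cumulative.append((cause, count, 100 * running / total))
# The first entries crossing roughly 80% mark the "vital few" causes to attack first.
```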

Stratification

Basically, stratification is the process of sorting data according to certain criteria or variables; the results are often shown in the form of charts and graphs.

We can classify a data set into different groups (or categories) with common characteristics, called stratification variables. It is important to establish which variables will be used for sorting.

Stratification is the basis for other tools such as Pareto analysis or scatterplots. This combination of tools makes them more powerful.

The figure shows an example of analyzing the source of defects. All defects (100%) were classified into four categories: by supplier, by operator, by shift and by equipment. Analysis of the data presented below clearly shows that in this case the greatest contribution to the defects comes from "supplier 1".

Data stratification.
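Stratification amounts to grouping and counting by a chosen variable. A minimal sketch, with hypothetical defect records:

```python
from collections import defaultdict

# Hypothetical defect records; each record carries its stratification variables.
defects = [
    {"supplier": "supplier 1", "shift": "day"},
    {"supplier": "supplier 1", "shift": "night"},
    {"supplier": "supplier 2", "shift": "day"},
    {"supplier": "supplier 1", "shift": "day"},
]

def stratify(records, variable):
    """Count records grouped by the chosen stratification variable."""
    groups = defaultdict(int)
    for r in records:
        groups[r[variable]] += 1
    return dict(groups)

by_supplier = stratify(defects, "supplier")
by_shift = stratify(defects, "shift")
```

Combining the grouped counts with a Pareto chart, as the text notes, makes both tools more powerful.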

Control charts

Control charts are a special type of chart, first proposed by W. Shewhart in 1925. Control charts have the form shown in Fig. 4.12. They reflect the nature of changes in quality indicators over time.

General view of the control chart

Control charts for quantitative characteristics

Control charts for quantitative characteristics are usually paired charts: one shows the change in the mean value of the process, the other the spread of the process. The spread can be calculated either from the process range R (the difference between the largest and smallest values) or from the process standard deviation S.

Nowadays x̄-S charts are commonly used; x̄-R charts are used less frequently.
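For illustration, the limits of an x̄-R chart can be computed from subgroup means and ranges using the standard Shewhart constants; here I assume subgroups of size 5, for which A2 = 0.577, D3 = 0, D4 = 2.114. The measurement data are hypothetical:

```python
# Hypothetical subgroups of 5 measurements each.
subgroups = [
    [10.1, 10.0, 9.9, 10.2, 10.0],
    [10.0, 9.8, 10.1, 10.0, 9.9],
    [10.2, 10.1, 10.0, 10.1, 10.3],
]
A2, D3, D4 = 0.577, 0.0, 2.114  # Shewhart constants for subgroup size n = 5

xbars = [sum(g) / len(g) for g in subgroups]          # subgroup means
ranges = [max(g) - min(g) for g in subgroups]         # subgroup ranges
xbar_bar = sum(xbars) / len(xbars)                    # centre line of the x̄ chart
r_bar = sum(ranges) / len(ranges)                     # centre line of the R chart

ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar
```

A point falling outside its limits signals that the process may be out of statistical control.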

Control charts for qualitative (attribute) characteristics

Chart for the proportion of defective products (p-chart)

The p-chart tracks the proportion of defective products in the sample. It is used when the sample size is variable.

Chart for the number of defective items (np-chart)

The np-chart tracks the number of defective products in the sample. It is used when the sample size is constant.

Chart for the number of defects in the sample (c-chart)

The c-chart tracks the number of defects in the sample.

Chart for the number of defects per product (u-chart)

The u-chart tracks the number of defects per product in the sample.
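As a sketch of how attribute-chart limits work, the following computes 3-sigma p-chart limits for hypothetical inspection data; since the p-chart allows a variable sample size, each point gets its own limits:

```python
import math

# Hypothetical inspection data: (sample size, number of defectives) per sample.
samples = [(200, 8), (180, 6), (220, 11), (190, 7)]

total_n = sum(n for n, _ in samples)
p_bar = sum(d for _, d in samples) / total_n  # centre line: overall defective fraction

def p_limits(n, p=p_bar):
    """3-sigma limits for a sample of size n; limits vary when n varies."""
    sigma = math.sqrt(p * (1 - p) / n)
    return max(0.0, p - 3 * sigma), p + 3 * sigma

# Each point: (observed fraction defective, lower limit, upper limit).
points = [(d / n, *p_limits(n)) for n, d in samples]
```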

Control chart form

The simple quality control tools discussed above (the seven quality control tools) are designed to analyze quantitative quality data. They make it possible to solve 95% of analysis and quality-management problems in various fields using fairly simple but scientifically based methods. They rely mainly on the techniques of mathematical statistics, yet are accessible to all participants in the production process and are used at almost all stages of the product life cycle.

However, when creating a new product, not all facts are numerical in nature. There are factors that admit only a verbal description. These factors account for approximately 5% of quality problems. Such problems arise mainly in managing processes, systems and teams, and solving them requires, alongside statistical methods, the results of operations research, optimization theory, psychology, and so on.

Therefore, in 1979 JUSE (the Union of Japanese Scientists and Engineers) developed, on the basis of these sciences, a very powerful and useful set of tools to facilitate quality management when analyzing such factors.

The “Seven Tools of Management” include:

1) affinity diagram;

2) interrelationship diagram (diagram of relationships or dependencies);

3) tree (system) diagram (decision tree);

4) matrix diagram or quality table;

5) arrow diagram;

6) process decision program chart (PDPC), a diagram for planning the implementation of a process;

7) priority matrix (matrix data analysis).



The collection of initial data is usually carried out during brainstorming sessions involving both specialists in the field under study and non-specialists, who are often able to generate productive ideas on issues that are new to them.

Each participant may speak freely on the topic under discussion, and their proposals are recorded. The results of the discussion are then processed and means of solving the problem are proposed.

The scope of the seven new quality tools is rapidly expanding. These methods are applied in areas such as office work and management, education and training, and so on.

It is most effective to apply the "Seven New Tools":

· when developing new products and preparing projects;

· when developing measures to reduce defects and complaints;

· to increase reliability and safety;

· to ensure the production of environmentally friendly products;

· to improve standardization, etc.

Let's take a brief look at these tools.

1. Affinity diagram (AD) allows you to identify the main violations of a process by combining homogeneous verbal data. The procedure for creating an affinity diagram consists of the following steps:

§ determining the topic for data collection;

§ creating a group to collect data from consumers;

§ recording the received data on cards (self-adhesive sheets) that can be moved freely;

§ grouping (systematizing) homogeneous data into areas of various levels;

§ forming a common opinion among group members on the distribution of the data;

§ creating a hierarchy of the selected areas.

2. Relationship diagram (RD) helps determine the relationship between the main causes of process disruption and the problems existing in the organization.

The procedure for creating a relationship diagram consists of the following steps:

· a group of specialists is formed who establish and group data on the problem;

· the identified causes are placed on cards and the links between them are established. When comparing causes (events), ask: "Is there a connection between these two events?" If there is, then ask: "Which event causes the other to occur?";

· draw an arrow between two events, showing the direction of influence;

· after identifying the relationships between all events, count the number of arrows emanating from each and entering each event.

The event with the largest number of outgoing arrows is the initial event.
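Counting incoming and outgoing arrows can be automated once the links are listed. A minimal sketch with hypothetical events:

```python
# Hypothetical cause-and-effect links for a relationship diagram:
# (source, target) means an arrow from source to target.
links = [
    ("poor training", "operator errors"),
    ("poor training", "wrong settings"),
    ("wrong settings", "defects"),
    ("operator errors", "defects"),
    ("worn tooling", "defects"),
]

events = {e for pair in links for e in pair}
outgoing = {e: sum(1 for s, _ in links if s == e) for e in events}
incoming = {e: sum(1 for _, t in links if t == e) for e in events}

# The event with the most outgoing arrows is a candidate initial (root) event.
root = max(events, key=lambda e: outgoing[e])
```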

3. Tree diagram (TD). After the most important problems, characteristics, etc. have been identified using the relationship diagram, the tree diagram is used to find methods of solving them. The tree diagram indicates the paths and tasks at various levels that must be addressed to achieve a given goal.

The tree diagram is used when:

1. consumer wishes are being converted into organizational performance indicators;

2. a sequence for solving problems must be established to achieve the goal;

3. secondary tasks must be solved before the main task;

4. the facts defining the main problem must be identified.

Creating a tree diagram includes the following steps:

§ a group is organized that, based on the affinity and relationship diagrams, defines the research problem;

§ possible root causes of the identified problem are identified;

§ the main cause is singled out;

§ measures are developed to eliminate it completely or partially.

4. Matrix diagram (MD) allows you to visualize the relationships between various factors and the degree of their closeness, which increases the efficiency of solving problems that must take such relationships into account. Factors analyzed using a matrix diagram may include:

§ quality problems and the reasons for their occurrence;

§ problems and ways to solve them;

§ consumer properties of products and their engineering characteristics;

§ properties of the product and its components;

§ process quality characteristics and its elements;

§ characteristics of the organization’s performance;

§ elements of the quality management system, etc.

The matrix diagram method, like the other new quality tools, is usually implemented by a team tasked with a quality improvement objective. The degree of closeness of the relationship between factors is assessed either by expert judgment or by correlation analysis.

5. Arrow diagram. After a preliminary analysis of the problem and the ways of solving it, carried out with the affinity, relationship, tree and matrix diagrams, a work plan for solving the problem is drawn up, for example, for creating a product. The plan must contain all stages of the work and information about their duration. To make the work plan easier to develop and control by increasing its visibility, an arrow diagram is used. An arrow diagram can take the form of either a Gantt chart or a network graph. The network graph uses arrows to show clearly the sequence of actions and the influence of each operation on the progress of subsequent operations, so it is more convenient for monitoring the progress of work than a Gantt chart.

6. Process decision program chart (PDPC). It is applied for:

§ planning and estimating the timing of complex processes in scientific research;

§ the production of new products;

§ solving management problems with many unknowns, when various solution options must be provided for and the work program may need adjusting.

The PDPC diagram depicts a process to which the Deming cycle (PDCA) is applicable; applying the Deming cycle to a specific process simultaneously improves that process where necessary.

7. Matrix data analysis (priority matrix).

This method, along with the relationship diagram and, to a certain extent, the matrix diagram, is intended to single out factors that have a priority impact on the problem being studied. The peculiarity of the method is that the task is solved by multivariate analysis of a large number of experimental data that often characterize the relationships under study only indirectly. Analysis of the relationships between these data and the factors being studied makes it possible to identify the most important factors, for which relationships with the output indicators of the phenomenon (process) under study are then established.

SELF-TEST QUESTIONS

1. List the seven simple quality control tools. What are they used for?

2. What are check sheets and Pareto charts used for?

3. What factors influencing quality are presented in the Ishikawa diagram?

4. What is determined using a histogram, a scatter plot and stratification?

5. What simple tool is used to judge whether a process is in control?

6. What is the purpose of the "seven new quality control tools"? List them.

7. At what stages is it most effective to apply the “Seven New Quality Tools”?

The seven simple quality control tools deal with quantitative data; the seven new quality tools introduced above address the remaining, mostly verbal factors and are examined in more detail below.

Affinity diagrams are used to classify ideas (causes, indicators, problems, consequences, etc.) into groups united by the common character or nature of these ideas.

To determine the causes of a problem, the working group uses brainstorming to identify possible causes, which are collected as disparate data.

Ideas that share a common focus are then systematized into groups. This work is done without discussion, and names are not yet assigned to the common characteristics.

If there are similarities between some groups, they can be combined into one larger group. At this stage, in the course of a general discussion, the composition of the groups is agreed upon: some ideas are reformulated, combined or separated, and individual items may be transferred to other groups. A common feature is identified for each group.

The composition of the data in each group is then reviewed, and the group names and the final version of each generalizing feature are formulated.

The clarity and ease of data presentation provided by the affinity diagram is its indisputable advantage.

But the diagram also has a significant drawback: the subjectivity of distributing the data by related characteristics. This deficiency shows up most seriously in individual work; brainstorming and teamwork reduce the subjectivity somewhat but do not eliminate it.

The relationship diagram is intended for ranking related factors (conditions, causes, indicators, etc.) according to the strength of the links between them. It serves as a tool for identifying the most important, priority factors within each group. Conclusions are drawn on the basis of expert assessments made during brainstorming.

  • 1) write down each problem on a separate sheet of paper and attach the sheets in a circle on a poster;
  • 2) start from the top sheet and, moving clockwise, ask: "Is there a connection between these two events?" If there is, then ask: "Which event causes the other to occur?";
  • 3) draw an arrow between the two events, showing the direction of influence;
  • 4) after identifying the relationships between all events, count the number of arrows leaving and entering each event.

The event with the largest number of outgoing arrows is the initial event. The team usually identifies two or three initial events and discusses which one to focus on first, taking into account factors such as the organization's constraints, resources and experience.

Tree diagram. After the most important problems, characteristics, etc. have been identified using the relationship diagram, the tree diagram is used to find methods of solving these problems, ensuring product characteristics, and so on.

When searching for the causes of a problem, the "why-why" method is used. Members of the team solving the problem ask: "Why did it happen?" and obtain a list of first-level causes. The question "Why?" is then addressed to each first-level cause to obtain a list of second-level causes, and so on. The relationships between a problem (characteristic, etc.) and its causes at various levels are depicted as a multi-stage tree structure; a schematic example of such a diagram is shown in Fig. 8.22.

Fig. 8.22.

The benefits of a tree diagram are related to its clarity and its ease of use and understanding. In addition, a tree diagram can easily be combined with other quality tools, complementing them.

The disadvantages of this tool include the subjectivity of the arrangement of elements at a particular level of detail (especially when the work is performed individually).
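A why-why tree is naturally represented as nested mappings. The sketch below, with hypothetical causes, walks the tree to collect the deepest-level causes, which are the candidates for corrective action:

```python
# A why-why tree as nested dicts: each key is a cause, its value the
# next level of causes (all names are hypothetical).
tree = {
    "high defect rate": {
        "machine misadjustment": {
            "no calibration schedule": {},
            "worn fixtures": {},
        },
        "operator errors": {
            "insufficient training": {},
        },
    },
}

def leaves(node):
    """Collect the deepest-level causes of a why-why tree."""
    out = []
    for cause, children in node.items():
        out.extend(leaves(children) if children else [cause])
    return out

root_causes = leaves(tree["high defect rate"])
```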

The matrix diagram allows you to visualize the relationships between various factors and the degree of their closeness. The analysis covers quality problems and the reasons for their occurrence, problems and ways to eliminate them, consumer properties of products and their engineering characteristics, properties of a product and its components, performance characteristics of the organization and elements of the quality management system, etc.

The matrix diagram shown in Fig. 8.23 is the most common. It is called the L-form; it represents the relationship between two groups of factors and is widely used in structuring the quality function (QFD), which is why it is also called the quality table. Information about the degree of closeness of the relationship between various factors, presented using special symbols, allows these relationships to be modeled more accurately and various factors and processes to be managed more effectively.


Fig. 8.23. a1, a2, ..., an and b1, b2, ..., bn are components of the studied objects A and B, characterized by different closeness of connections

Arrow diagram. After a preliminary analysis of the problem and the ways of solving it, a work plan for solving the problem is drawn up, for example, for creating a product. The plan must contain all stages of the work and information about their duration. To make the work plan easier to develop and control by increasing its visibility, an arrow diagram is used. An arrow diagram can take the form of either a Gantt chart or a network graph.

Figure 8.24 shows, in the form of a Gantt chart, the order and timing of the work involved in building a turnkey house within 12 months.

The network graph for the same work is shown in Fig. 8.25. The numbers at the nodes of the graph correspond to the serial numbers of the operations shown in Fig. 8.24. Here the final operation, "final inspection and delivery of the house", is divided in Fig. 8.25 into two operations: 11, final inspection, and 12, handing over the house. The numbers under the arrows of the network graph give the duration (in months) of the operation whose number appears at the node from which the arrow originates.


Fig. 8.24.


Fig. 8.25.

The process decision program chart, PDPC (Process Decision Program Chart), is used for planning and estimating the timing of complex processes in scientific research, in the production of new products, and in solving management problems with many unknowns, when various solution options must be provided for and the work program may need adjusting. In this case a program is drawn up first and, if deviations from the planned points arise at intermediate stages of its implementation, attention focuses on activities that bring the process back into line with the program. When an unforeseen situation arises during execution that could not be taken into account in advance, a new program free of the previous shortcomings must be drawn up.

Figure 8.26 shows an example of a PDPC diagram for planning the process of selecting and controlling suppliers.

The benefits of a decision diagram are clear. It makes it possible to see possible risks in the work execution plan and to select corrective actions to reduce those risks. The disadvantages of this quality tool include its high labor intensity when the plan contains a significant number of tasks.

The priority matrix is a tool for ranking the importance of data and information obtained by brainstorming or from matrix diagrams. It makes it possible to identify important data in situations where there are no objective criteria for determining significance, or where the people involved in decision-making hold different opinions about the priority of the data. The main purpose of a priority matrix is to arrange different sets of elements in order of importance and to express the relative importance of elements through numerical values.

The priority matrix can be constructed in three ways, depending on the method used to determine the criteria by which data priority is assessed: the analytical method, the consensus method of determining criteria, and the matrix method.

The analytical method is used when the number of criteria is relatively small (no more than six), the full agreement of all participating experts must be obtained, the number of experts does not exceed eight, and large losses are possible in the event of a prioritization error.

The consensus method of determining criteria is used when there are more than eight experts, a significant number of criteria (from 6 to 15) and a large amount of data to rank (about 10-20 elements).

Fig. 8.26. PDPC diagram for the process of selecting and controlling suppliers

The matrix method is used mainly when there are strong relationships between the elements being ranked and finding the element with the greatest influence is critical for solving the problem.

The procedure for constructing the priority matrix is basically the same for all three options; the differences lie in how the significance of the criteria is determined.

The priority matrix is ​​constructed in the following order.

  • 1. The main goal for which the priority matrix is being built is determined.
  • 2. A team of experts is formed to work on the task. The experts must understand the scope of the problem being solved and be familiar with teamwork methods (for example, brainstorming or the Delphi method).
  • 3. A list of possible solutions to the problem is compiled. The list can be drawn up using other quality tools, such as brainstorming or the Ishikawa diagram.
  • 4. The composition of the criteria is determined. Initially it can be quite large; the priority matrix will include only some of these criteria, since the list will later be reduced to the most important and significant ones.
  • 5. A weighting coefficient is assigned to each criterion, depending on the method selected.

For the analytical method:

  • a rating scale is established for each criterion;
  • a definition of significance is given for each numerical value of the scale. To make the differences in weighting coefficients more noticeable, a 1-3-9 scale is usually used, where 1 is low significance, 3 is medium significance and 9 is high significance.

For the consensus method:

  • a certain number of points is established that the experts must distribute among the criteria; the number of points must be no less than the number of criteria;
  • each expert distributes the allotted points among the criteria;
  • the total number of points for each criterion is determined; this value becomes the criterion's weighting coefficient.

For the matrix method:

  • the criteria are arranged in the form of an L-matrix;
  • a scale is established for the pairwise comparison of criteria (for example, "0": criterion A is less significant than criterion B; "1": criteria A and B are equivalent; "2": criterion A is more significant than criterion B);
  • a pairwise comparison of all criteria is carried out;
  • the weighting coefficient of each criterion is determined (calculated as the sum of all values in the matrix row).
  • 6. The most significant criteria are selected. This can be done by discarding the criteria with the lowest weighting coefficients. If the number of criteria is small, all of them can be kept for further work.
  • 7. A method is established for calculating the significance of each of the solutions in the priority matrix (defined in step 3) against the selected criteria (defined in step 6).
  • 8. Each solution is evaluated against each criterion.
  • 9. Each score is multiplied by the weighting coefficient of the corresponding criterion, and the resulting values are summed for each solution, giving its final priority score. The final score can be left as is or converted into percentages.
  • 10. The resulting list of solutions is sorted in order of priority. If necessary, the priority of the solutions can be presented as a Pareto chart.
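The scoring in steps 7-9 reduces to a weighted sum. A minimal sketch, with hypothetical criterion names, weights on the 1-3-9 scale, and illustrative scores:

```python
# Criterion weights on the 1-3-9 analytical scale (names are hypothetical shorthand).
criteria_weights = {"effort": 3, "cost": 9, "staff": 1, "waste reduction": 9}

# Illustrative 1/3/9 scores of each candidate solution against each criterion.
scores = {
    "train craftsmen":   {"effort": 3, "cost": 9, "staff": 1, "waste reduction": 3},
    "change technology": {"effort": 3, "cost": 1, "staff": 1, "waste reduction": 9},
}

def priority(solution_scores):
    """Step 9: sum of (score x criterion weight) over all criteria."""
    return sum(criteria_weights[c] * v for c, v in solution_scores.items())

# Step 10: sort solutions by priority, highest first.
ranked = sorted(scores, key=lambda s: priority(scores[s]), reverse=True)
```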

Example 8.2

Build a priority matrix.

  • 1. We determine the purpose of compiling the priority matrix: to reduce the number of defects in the product.
  • 2. We form a team of experts: for this example the team consists of three people, each familiar with developing solutions by brainstorming.
  • 3. We make a list of possible solutions to the problem (generated by the team of experts):
    • change the manufacturing technology;
    • increase the number of control points;
    • conduct training for the craftsmen;
    • change the product design.
  • 4. We determine the composition of the criteria for assessing the priority of the solutions:
    • no more than 100 person-hours are required to implement the solution;
    • low cost of implementing the solution;
    • no more than 50 personnel involved;
    • reduction in waste costs by at least 1.5 times.
  • 5. We assign a weighting coefficient to each criterion, considering the criteria under each of the three methods: analytical, consensus and matrix.

For the analytical method.

For the consensus method, we establish that each expert can distribute 4 points among the criteria.

For the matrix method.

  • 6. We determine the most significant criteria: since only four criteria were selected for the example, we keep all of them.
  • 7. We select a method for calculating the significance of each of the solutions proposed earlier (in step 3). To determine significance we use the 1-3-9 scale, where 9 is a highly significant solution, 3 a significant one and 1 an insignificant one.
  • 8. We assess the significance of each solution against each criterion, using the analytical method; the weighting coefficients of the criteria were determined in step 5.

Table: each solution (change the manufacturing technology; increase the number of control points; conduct training for the craftsmen; change the product design) is scored against each criterion. The criterion weighting coefficients are: no more than 100 person-hours to implement the solution, weight 3; low cost of implementing the solution, weight 9; no more than 50 personnel involved, weight 1; reduction in waste costs by at least 1.5 times, weight 9.

9. We determine the priority of each solution: the score of each solution is multiplied by the weighting coefficient of each criterion, and the resulting values are summed.

Table: the score of each solution against each criterion is multiplied by the criterion's weighting coefficient, and the weighted values are summed to give the total priority score of each solution.

  • 10. We distribute solutions in order of priority:
    • conduct training of craftsmen - 118;
    • change manufacturing technology - 100;
    • increase the number of control points - 90;
    • change the design of the product - 72.

Compared with other ranking methods, the priority matrix makes it possible to assess the significance of the data more objectively and to quantify that significance.

At the same time, the disadvantage of this quality tool is also obvious: it is very labor-intensive, especially when a large amount of data must be ranked against a large number of criteria.
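The weighted-sum calculation behind the priority matrix is easy to script. The sketch below uses the criterion weights from step 8 and one hypothetical 1-3-9 score assignment consistent with the totals in step 10 (the individual cell scores are not uniquely determined by those totals):

```python
# Priority matrix: total(solution) = sum of (score x criterion weight).
# Criterion order: person-hours <= 100, low cost, personnel <= 50, waste reduction.
weights = [3, 9, 1, 9]

# Hypothetical 1-3-9 scores chosen to reproduce the totals from step 10.
scores = {
    "conduct training for foremen":          [9, 9, 1, 1],
    "change the manufacturing technology":   [3, 9, 1, 1],
    "increase the number of control points": [9, 3, 9, 3],
    "change the product design":             [3, 3, 9, 3],
}

totals = {name: sum(s * w for s, w in zip(row, weights))
          for name, row in scores.items()}
ranking = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
for name, total in ranking:
    print(f"{total:4d}  {name}")
```

Running this reproduces the priority order from step 10: 118, 100, 90, 72.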

These seven new tools are intended to complement other widely used statistical quality control methods. What matters is the joint use of the well-known quality control methods and the seven new quality control tools.

Test questions and assignments

  • 1. Describe the features of statistical quality control methods.
  • 2. List the types of control charts for statistical regulation of technological processes.
  • 3. What is the difference between control based on a quantitative characteristic and control based on an alternative characteristic?
  • 4. Specify the procedure for constructing the control chart.
  • 5. How is the data obtained on the control chart interpreted?
  • 6. Draw an example of a control chart and explain the purpose of all the lines on it.
  • 7. How to control a technological process using control charts?
  • 8. What is the reproducibility index and what does it reflect?
  • 9. What kind of quality control is called selective?
  • 10. Provide a diagram of defect levels. What is the difference between them?
  • 11. What is a sampling plan?
  • 12. What is the difference between the supplier’s risk and the consumer’s risk during selective product control?
  • 13. Provide a diagram of one-stage and two-stage control plans. Explain the procedure for their implementation.
  • 14. What are the operational characteristics of a sampling plan?
  • 15. When and for what purpose are the “seven tools” of quality control used?
  • 16. When and for what purpose are the seven new quality control tools used?
  • 17. Describe the method of layering or stratification. For what purpose is it used in quality management?
  • 18. What types of graphs do you know? For what purpose are they used in quality management?
  • 19. Describe the Pareto diagram. For what purpose is it used in quality management?
  • 20. Describe a cause-and-effect diagram. For what purpose is it used in quality management?
  • 21. Describe the check sheet and histogram. For what purpose are they used in quality management?
  • 22. Describe a scatter plot. For what purpose is it used in quality management?
  • 23. Describe the diagrams used in quality management: affinity, relationships, tree, matrix, arrow, process diagram, priority matrix. For what purpose are they used in quality management?

Seven Essential Quality Tools is the name given to a set of very simple graphical techniques identified as the most useful for solving everyday quality issues. They are called essential because even people with little or no statistical training can understand these principles and apply them in their daily work.

I have often seen even highly qualified personnel ignore modern quality tools such as design of experiments, hypothesis testing, or multivariate analysis. Yet most professionals would find it useful to know that the majority of quality issues can be solved with these seven essential quality tools.

The purpose of this article is to review these basic tools and their effective use. Getting the best results from any of these tools does not require special qualifications; the quality specialist must simply provide complete, objective, and sufficient information.

Tool #1: Ishikawa diagrams

(also called "fishbone" or "cause-and-effect" diagrams) show the root cause(s) of a particular event. A common way to build a truly informative fishbone diagram is to use the 5 Whys method together with the cause-and-effect diagram. The branches are usually grouped into six categories:

  1. People - Personnel involved in the process; stakeholders, etc.
  2. Methods - Processes for performing tasks and specific requirements for performing them, such as policies, procedures, rules, regulations and laws
  3. Machinery - Any equipment, computers, tools, etc. needed to perform the job
  4. Materials - Raw materials, parts, pens, paper, etc. used to produce the final product
  5. Measurements - Data obtained from the process that are used to evaluate its quality
  6. Environment - Conditions, such as location, time, temperature, and culture, in which the process is carried out

Tool #2: Checklist

It is a structured, prepared form for collecting and analyzing data - a versatile tool that can be adapted to a wide variety of purposes. The data collected may be quantitative or qualitative. When the information is quantitative, the checklist is called a tally sheet.

The defining characteristic of a checklist is that data are entered on it as marks ("checkmarks"). A typical checklist is divided into columns, and marks made in different columns have different meanings. The data are read from the location and number of marks on the sheet. Checklists typically use a "header" that answers five questions: Who? What? Where? When? Why? Develop operational definitions for each of these questions:

  1. Who filled out the checklist?
  2. What was collected (what each mark, lot identification number, or number of items in the lot represents)
  3. Where did the data collection take place (equipment, premises, tools)
  4. When the data was collected (hour, shift, day of week)
  5. Why this data was collected
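As a minimal illustration, the tallying that a paper checklist performs by hand can be sketched in code; the defect categories and shifts below are hypothetical:

```python
from collections import Counter

# Each entry is one "checkmark": (defect category, shift) - hypothetical data.
marks = [
    ("scratch", "day"), ("dent", "day"), ("scratch", "night"),
    ("scratch", "day"), ("crack", "night"), ("dent", "day"),
]

# Tally the marks per category and per shift, as the columns
# of a paper checklist would.
by_category = Counter(category for category, _ in marks)
by_shift = Counter(shift for _, shift in marks)

print(by_category.most_common())  # [('scratch', 3), ('dent', 2), ('crack', 1)]
print(by_shift.most_common())     # [('day', 4), ('night', 2)]
```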

Tool #3: Histogram

A histogram is a display of statistical information in which rectangles show the frequency of data items falling into successive numerical intervals of equal size. In the most common form, the independent variable is plotted on the horizontal axis and the dependent variable on the vertical axis.

The main purpose of a histogram is to clarify the data presented. It is a useful tool for sorting processed data into the areas or bars of a histogram to establish the frequency of particular events or categories of data, helping to highlight the most frequent ones. Typical applications of histograms in root cause analysis include presenting data to determine the dominant cause and understanding the distribution of various problems, causes, consequences, etc. A Pareto chart (explained later in the article) is a special type of histogram.


Tool #4: Pareto chart

A Pareto chart is an important decision-making tool. Since organizational resources are limited, it is important for process owners and stakeholders to understand the root causes of errors, defects, etc. The Pareto chart excels here by clearly ranking the root causes of defects. The chart is also known for illustrating the 80:20 principle.

The chart, named after the economist and political scientist Vilfredo Pareto, is a type of graph that combines bars and a line graph: individual values are shown in descending order by the bars, and the accumulated sum by the line. The left vertical axis usually represents the frequency of occurrences; the right vertical axis, the cumulative percentage of the total. Since the causes are arranged in descending order of importance, the cumulative function is concave. For example, to reduce the number of instances of tardiness by 78%, it is enough to eliminate the first three causes.
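The ranking and cumulative-percentage calculation behind a Pareto chart can be sketched as follows; the cause counts are hypothetical and chosen so that the top three causes account for 78% of occurrences, as in the tardiness example above:

```python
# Hypothetical tardiness causes and occurrence counts.
causes = {"overslept": 35, "traffic jams": 25, "transport delays": 18,
          "family matters": 10, "bad weather": 7, "other": 5}

total = sum(causes.values())
ranked = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)

# Print each cause with its cumulative share - the line of the Pareto chart.
cumulative = 0
for name, count in ranked:
    cumulative += count
    print(f"{name:18s} {count:3d}  cumulative {100 * cumulative / total:5.1f}%")
```

The first three rows together reach 78%, which is where the focused improvement effort would go.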

Tool #5: Scatter plot (scatter diagram)

A scatter plot is often used to identify potential relationships between two variables, where one may be considered an explanatory variable and the other a dependent variable. It gives a good visual picture of the relationship between the two variables and supports analysis via the correlation coefficient or a regression model. The data are displayed as a set of points, each positioned by the value of one variable on the horizontal axis and the value of the second variable on the vertical axis.

A scatter plot is used when there is a variable under the experimenter's control. If a parameter systematically increases and/or decreases as another is changed, it is called the control parameter or independent variable and is usually plotted along the horizontal axis. The measured or dependent variable is usually plotted along the vertical axis. If there is no dependent variable, either variable can be plotted on either axis, and the scatter plot will show only the degree of correlation (not a cause-and-effect relationship) between the two variables.
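The strength of the relationship seen in a scatter plot is quantified by the correlation coefficient. A small sketch with hypothetical paired measurements (temperature vs. defect count):

```python
import math

# Hypothetical paired data: temperature (x) and defect count (y).
x = [150, 160, 170, 180, 190, 200]
y = [12, 10, 9, 7, 6, 4]

# Pearson correlation coefficient: covariance over the product of spreads.
n = len(x)
mx, my = sum(x) / n, sum(y) / n
cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
sx = math.sqrt(sum((a - mx) ** 2 for a in x))
sy = math.sqrt(sum((b - my) ** 2 for b in y))
r = cov / (sx * sy)
print(f"r = {r:.3f}")  # close to -1: a strong negative relationship
```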


Tool #6: Stratification

Stratification is a method of sampling from a population. In statistical surveys, when subpopulations within the overall population differ, it is advisable to sample each group (stratum) separately. Stratification is the process of dividing members of the population into homogeneous subgroups before sampling.

The strata must be mutually exclusive: each population unit must be assigned to only one stratum. The strata must be exhaustive: no population unit can be excluded. A simple random sample or a systematic sample is then taken within each stratum.

This often improves the representativeness of the sample by reducing sampling error. It can produce a weighted mean with less variability than the arithmetic mean of a simple random sample of the population. I often tell the groups I oversee that correct selection procedures are more important than simply having a sufficient sample size!
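Proportional stratified sampling as described above can be sketched as follows; the strata and their sizes are hypothetical:

```python
import random

# Hypothetical population divided into mutually exclusive, exhaustive strata.
strata = {
    "shift_A": list(range(0, 600)),     # 600 units
    "shift_B": list(range(600, 900)),   # 300 units
    "shift_C": list(range(900, 1000)),  # 100 units
}

def stratified_sample(strata, total_size, seed=0):
    """Proportional allocation: each stratum contributes in proportion to its size."""
    rng = random.Random(seed)
    population = sum(len(units) for units in strata.values())
    sample = {}
    for name, units in strata.items():
        k = round(total_size * len(units) / population)
        sample[name] = rng.sample(units, k)  # simple random sample within the stratum
    return sample

sample = stratified_sample(strata, total_size=50)
print({name: len(chosen) for name, chosen in sample.items()})
# {'shift_A': 30, 'shift_B': 15, 'shift_C': 5}
```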


Tool #7: Control charts, also known as Shewhart charts or process behavior charts

A control chart is a special kind of run chart that makes it possible to distinguish significant changes from the natural variability of the process.

If control chart analysis shows that the process is under control (i.e., stable, varying only due to causes inherent to the process), then no corrections or changes to the process control parameters are required or desired. Moreover, data from this process can be used to predict the future performance of the process.

If a map shows that an observed process is out of control, analysis of the map can help identify sources of variation that can then be addressed to bring the process back under control.

A control chart can be seen as part of an objective, disciplined approach that supports correct decisions about process control, including whether process control parameters need to be changed. Parameters should not be adjusted for a process that is in control, as this degrades process performance. A process that is stable but operating outside a given range (the scrap rate, for example, may be statistically controllable yet above the required norm) must be improved through focused efforts to understand the causes of current performance and fundamentally improve the process.
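A minimal individuals (I) chart sketch, with hypothetical data containing one shifted point; sigma is estimated from the average moving range, as is conventional for Shewhart individuals charts (d2 = 1.128 for subgroups of two):

```python
# Hypothetical measurements from a process with one out-of-control point.
data = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 12.5, 10.0, 10.1]

center = sum(data) / len(data)                                # center line (CL)
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128     # MR-bar / d2
ucl = center + 3 * sigma                                      # upper control limit
lcl = center - 3 * sigma                                      # lower control limit

signals = [(i, x) for i, x in enumerate(data) if not lcl <= x <= ucl]
print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  signals={signals}")
```

Here the point at index 7 falls above the upper control limit, so the process would be investigated for an assignable cause.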

When I manage simple Six Sigma projects (commonly called yellow belt projects), where the issues are not complex and the project team consists of people with 3 to 5 years of experience in the process, I strongly advocate using these simple tools to resolve process issues.

As a rule of thumb, any process exhibiting variation of 1-2% in terms of standard deviation can be improved by simple analysis using these tools. Only when process variability exceeds 2.5-3% standard deviation should intermediate or advanced tools be brought in to identify and resolve process issues. I also recommend that any introductory Six Sigma education and training course use the seven quality control tools to create fertile ground for developing green and black belts within the organization.

Material prepared by Andrey Garin
based on materials from foreign publications

The purpose of the "Seven Basic Quality Control Tools" method is to identify the problems that must be addressed first, based on monitoring the current process and on collecting, processing, and analyzing the facts obtained (statistical material), in order to subsequently improve the quality of the process.

The essence of the method: quality control (comparing the planned quality indicator with its actual value) is one of the main functions of the quality management process, and the collection, processing, and analysis of facts is the most important stage of this process.

Of the many statistical methods, only seven have been selected for wide application - ones that are understandable and easy for specialists in various fields to use. They make it possible to identify and display problems in a timely manner, establish the main factors from which to begin acting, and distribute efforts so as to resolve these problems effectively.

The expected result is a solution to up to 95% of all problems arising in production.

The Seven Essential Quality Control Tools are a set of tools that make it easier to control ongoing processes and that provide various kinds of facts for analyzing, adjusting, and improving the quality of processes.

1. Checklist - a tool for collecting data and automatically organizing it to facilitate further use of the collected information.

2. Histogram - a tool for visually evaluating the distribution of statistical data, grouped by the frequency with which the data fall into a certain (predetermined) interval.

3. Pareto chart - a tool that makes it possible to objectively present and identify the main factors influencing the problem under study and to distribute efforts for resolving it effectively.

4. Stratification method (data stratification) - a tool for dividing data into subgroups according to a certain criterion.

5. Scatter diagram (scatter plot) - a tool for determining the type and closeness of the relationship between pairs of corresponding variables.

6. Ishikawa diagram (cause-and-effect diagram) - a tool for identifying the most significant factors (causes) influencing the final result (the consequence).

7. Control chart - a tool for monitoring the progress of the process and influencing it (with appropriate feedback), preventing deviations from the requirements imposed on the process.

Checklists (or data collection forms) are special forms for collecting data. They facilitate the collection process, promote accuracy, and automatically lead to certain conclusions, which is very convenient for quick analysis. The results are easily converted into a histogram or Pareto chart. Checklists can be used for both qualitative and quantitative control. The form of the checklist may differ depending on its purpose.


To find the right way to achieve a goal or solve a problem, the first step is to collect the necessary information, which will serve as the basis for further analysis. It is desirable that the collected data be presented in a structured, easy-to-process form. For this purpose, and to reduce the likelihood of errors during data collection, a checklist is used.

A checklist is a form designed to collect data and automatically organize it, which makes it easier to further use the collected information.

At its core, a checklist is a paper form preprinted with the controlled parameters, against which the necessary and sufficient data are entered using notes or simple symbols. In other words, a checklist is a means of recording data.

The form of the checklist depends on the task and can be very varied, but in any case it is recommended to indicate:

Topic, object of research (usually indicated in the title of the control sheet);

Data recording period;

Data source;

The position and surname of the employee registering the data;

Legend for the recorded data;

Data logging table.

When preparing checklists, make sure that the simplest possible means of filling them in are used (numbers, symbols), that the number of controlled parameters is as small as possible (but sufficient for analyzing and solving the problem), and that the form of the sheet is as clear and convenient as possible to fill in, even for unqualified personnel. The procedure for developing a checklist:

1. Formulate the purpose and objectives for which the information is being collected.

2. Select quality control methods that will be used to further analyze and process the collected data.

3. Determine the time period during which the research will be conducted.

4. Develop measures (create conditions) for conscientious and timely entry of data into the checklist.

5. Assign responsibility for data collection.

6. Develop a form for the checklist.

7. Prepare instructions for performing data collection.

8. Instruct and train workers in collecting data and entering it into the checklist.

9. Organize periodic data collection reviews.

The most pressing issue when solving a problem is the reliability of the information collected by staff. Finding a solution based on distorted data is very difficult, if not impossible. Taking measures (creating conditions) so that employees record true data is a necessary condition for achieving the goal.

Fig. Checklist examples

Electronic forms can also be used.

At the same time, the disadvantages of an electronic checklist compared to a paper one include:

- greater complexity of use;

- the need to spend more time entering data.

On the plus side:

- ease of data processing and analysis;

- high speed of obtaining the necessary information;

- the ability to simultaneously access information from many people.

However, most of the collected data has to be duplicated in paper form. The problem is that this reduces productivity: the time saved on analyzing, storing, and retrieving the necessary information is largely offset by the double work of recording the data.

Histogram - a tool that allows you to visually depict and easily identify the structure and nature of changes in the obtained data (i.e., to assess their distribution), which are difficult to notice when the data are presented in a table.

By analyzing the shape of the resulting histogram and its location relative to the tolerance interval, one can make a conclusion about the quality of the product in question or the state of the process being studied. Based on the conclusion, measures are developed to eliminate deviations in product quality or process state from the norm.

Depending on the method of presenting (collecting) the initial data, the method of constructing a histogram is divided into 2 options:

Option I. To collect statistical data, checklists of product or process indicators are developed. When developing the checklist form, you must immediately decide on the number and size of the intervals according to which the data will be collected and on the basis of which the histogram will then be constructed. This is necessary because, once the checklist has been filled in, it is almost impossible to recalculate the indicator values for other intervals. The most that can be done is to ignore intervals into which no value falls and to merge 2, 3, etc. adjacent intervals without fear of distorting the data. As you can see, with such restrictions it is almost impossible, for example, to turn 11 intervals into 7.

Construction method:

1. Determine the number and width of intervals for the control sheet.

The exact number and width of the intervals should be chosen for ease of use or according to statistical rules. If there are tolerances for the measured indicator, aim for 6-12 intervals within the tolerance and 2-3 intervals outside it. If there are no tolerances, estimate the possible spread of the indicator values and likewise divide it into 6-12 intervals. In either case, the width of the intervals must be the same.

2. Develop checklists and use them to collect the necessary data.

3. Using the completed checklists, calculate the frequency (i.e., how many times) of the obtained indicator values ​​in each interval.

Typically, a separate column is allocated for this, located at the end of the data recording table.

If an indicator value coincides exactly with an interval boundary, add half a count to each of the two intervals that share that boundary.

4. To construct a histogram, use only those intervals that contain at least one indicator value.

If there are empty intervals between the intervals in which the indicator values ​​fall, then they also need to be plotted on a histogram.

5. Calculate the average of the observation results.

The arithmetic mean of the resulting sample must be plotted on the histogram.

The standard formula used for the calculation is:

x̄ = (x₁ + x₂ + … + x_N) / N

where xᵢ are the obtained values of the indicator and N is the total number of data points in the sample.

How to apply this formula when the exact indicator values x₁, x₂, etc. are unknown is not explained anywhere. In our case, to roughly estimate the arithmetic mean, I can suggest using my own methodology:

a) determine the average value for each interval using the formula:

x_j = (x_j max + x_j min) / 2

where j indexes the intervals selected for constructing the histogram, x_j max is the value of the upper boundary of interval j, and x_j min is the value of its lower boundary.

b) determine the arithmetic mean of the sample using the formula:

x̄ = (x₁·v₁ + x₂·v₂ + … + x_n·v_n) / (v₁ + v₂ + … + v_n)

where n is the number of intervals selected for constructing the histogram and v_j is the frequency with which the sample results fall within interval j.
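The midpoint formula (a) and the weighted mean (b) can be checked numerically; the interval boundaries and frequencies below are hypothetical:

```python
# Grouped-data mean: each interval's midpoint weighted by its frequency.
intervals = [(10.0, 10.5), (10.5, 11.0), (11.0, 11.5), (11.5, 12.0)]
frequencies = [3, 8, 6, 3]

midpoints = [(lo + hi) / 2 for lo, hi in intervals]  # formula (a)
mean = (sum(m * v for m, v in zip(midpoints, frequencies))
        / sum(frequencies))                          # formula (b)
print(f"approximate sample mean: {mean:.3f}")
```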

6. Construct the horizontal and vertical axes.

7. Draw the boundaries of the selected intervals on the horizontal axis.

If in the future you plan to compare histograms that describe similar factors or characteristics, then when drawing a scale on the abscissa axis, you should be guided not by intervals, but by data units.

8. On the vertical axis, draw a scale of values in accordance with the selected scale and range.

9. For each selected interval, construct a column whose width is equal to the interval, and whose height is equal to the frequency of observation results falling into the corresponding interval (the frequency has already been calculated earlier).

Draw a line on the graph corresponding to the arithmetic mean value of the indicator under study. If there is a tolerance zone, draw lines corresponding to the boundaries and center of the tolerance interval.

Option II. The statistics have already been collected (for example, recorded in logbooks) or will be collected in the form of accurately measured values. In this case we are not limited by any initial conditions, so we can choose and change the number and width of the intervals at any time according to current needs.

Construction method:

1. Compile the received data into one document in a form convenient for further processing (for example, in the form of a table).

2. Calculate the range of the indicator values (sample range) using the formula:

R = x max − x min

where x max is the largest value obtained and x min is the smallest value obtained.

3. Determine the number of histogram intervals.

To do this, you can use a table calculated from the Sturges formula: n = 1 + 3.322 · lg N, where N is the sample size.

You can also use a table calculated based on the formula:

4. Determine the width (size) of the intervals using the formula:

h = R / n

where R is the sample range and n is the number of intervals.

5. Round the result up to a convenient value.

Please note that the entire sample must be divided into equally sized intervals.

6. Determine the boundaries of the intervals. First define the lower boundary of the first interval so that it is less than x min. Add the interval width to it to obtain the boundary between the first and second intervals. Then keep adding the interval width to the previous value to obtain the second boundary, then the third, and so on.

After performing these actions, you should make sure that the upper limit of the last interval is greater than xmax.

7. For the selected intervals, calculate the frequency of occurrence of the values ​​of the indicator under study in each interval.

If an indicator value coincides exactly with an interval boundary, add half a count to each of the two intervals that share that boundary.

8. Calculate the average value of the indicator under study using the arithmetic-mean formula: x̄ = (x₁ + x₂ + … + x_N) / N.

Then follow the histogram construction procedure given above for Option I, starting from step 5.
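The Option II steps can be sketched end to end; the sample values below are hypothetical:

```python
import math

# Step 1: hypothetical measured values.
values = [2.1, 2.3, 2.2, 2.5, 2.4, 2.6, 2.2, 2.3, 2.7, 2.4,
          2.3, 2.5, 2.1, 2.4, 2.6, 2.3, 2.2, 2.5, 2.4, 2.8]

# Step 2: sample range R = x_max - x_min.
r = max(values) - min(values)

# Step 3: number of intervals by the Sturges formula.
n_bins = round(1 + 3.322 * math.log10(len(values)))

# Steps 4-6: interval width and boundaries (lower edge slightly below x_min).
width = r / n_bins
edges = [min(values) - width / 2 + i * width for i in range(n_bins + 1)]
while edges[-1] <= max(values):  # the last boundary must exceed x_max
    edges.append(edges[-1] + width)

# Step 7: frequency of values in each interval.
freq = [0] * (len(edges) - 1)
for x in values:
    for i in range(len(freq)):
        if edges[i] <= x < edges[i + 1]:
            freq[i] += 1
            break

# Step 8: arithmetic mean of the sample.
mean = sum(values) / len(values)
print("intervals:", n_bins, "frequencies:", freq, "mean:", round(mean, 3))
```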

Histogram analysis is likewise divided into 2 options, depending on whether a technological tolerance is specified.

Option I. Tolerances for the indicator are not specified. In this case, we analyze the shape of the histogram:

Regular (symmetrical, bell-shaped) shape. The mean value of the histogram corresponds to the middle of the data range. The maximum frequency also occurs in the middle and gradually decreases towards both ends. The shape is symmetrical.

This form of histogram is the most common. It indicates the stability of the process.

Negatively skewed distribution (positively skewed distribution). The mean value of the histogram is located to the right (left) of the middle of the data range. The frequencies decrease sharply when moving from the center of the histogram to the right (left) and slowly to the left (right). The shape is asymmetrical.

This shape is formed either if the upper (lower) limit is adjusted theoretically or by a tolerance value, or if the right (left) value cannot be achieved.

Distribution with a cliff on the right (distribution with a cliff on the left). The mean value of the histogram is located far to the right (left) of the middle of the data range. The frequencies decrease very sharply when moving from the center of the histogram to the right (left) and slowly to the left (right). The shape is asymmetrical.

This form is often found in situations of 100% product control due to poor process reproducibility.

Comb (multimodal type). Every second (or every certain) interval has a lower (higher) frequency.

This shape arises either when the number of observations falling into an interval fluctuates from interval to interval, or when a particular data-rounding rule is in effect.

A histogram that does not have a high central part (plateau). The frequencies in the middle of the histogram are approximately the same (for the plateau, all frequencies are approximately equal).

This form occurs when several distributions with means close to each other are combined. For further analysis, it is recommended to use the stratification method.

Double peak type (bimodal type). Around the middle of the histogram, the frequency is low, but there is a frequency peak on each side.

This form occurs when two distributions with means that are far apart are combined. For further analysis, it is recommended to use the stratification method.

A histogram with a gap (a "pulled-out tooth"). The shape of the histogram is close to the normal distribution, but there is an interval with a frequency lower than in both adjacent intervals.

This shape occurs when the interval width is not a multiple of the unit of measurement, when scale readings are read incorrectly, etc.

Distribution with an isolated peak. Along with the normal histogram shape, a small isolated peak appears.

This form is formed when a small amount of data is included from another distribution, for example, if the controllability of the process is impaired, errors occurred during measurement, or data from another process was included.

Option II. There is a technological tolerance for the indicator under study. In this case, both the shape of the histogram and its location in relation to the tolerance zone are analyzed. Possible options:

The histogram looks like a normal distribution. The average value of the histogram coincides with the center of the tolerance zone. The width of the histogram is less than the width of the tolerance field with a margin.

In this situation, the process does not need to be adjusted.

The histogram looks like a normal distribution. The average value of the histogram coincides with the center of the tolerance zone. The width of the histogram is equal to the width of the tolerance interval, which is why there are concerns about the appearance of substandard parts from both the upper and lower tolerance margins.

In this case, it is necessary either to consider changing the technological process in order to reduce the width of the histogram (for example, by increasing equipment accuracy, using higher-quality materials, or changing the processing conditions) or to expand the tolerance range, since the quality requirements for the parts are otherwise difficult to meet.

The histogram looks like a normal distribution. The average value of the histogram coincides with the center of the tolerance zone. The width of the histogram is greater than the width of the tolerance interval, so substandard parts are found beyond both the upper and lower tolerance limits.

In this case, it is necessary to implement the measures described in paragraph 2.

The histogram looks like a normal distribution. The width of the histogram is less than the width of the tolerance field with a margin. The average value of the histogram is shifted to the left (right) relative to the center of the tolerance interval, and therefore there are concerns that substandard parts may be located on the side of the lower (upper) limit of the tolerance zone.

In this situation, it is necessary to check whether the measurement tools used are introducing a systematic error. If the measuring instruments are working properly, the process should be adjusted so that the center of the histogram coincides with the center of the tolerance field.

The histogram looks like a normal distribution. The width of the histogram is approximately equal to the width of the tolerance field. The average value of the histogram is shifted to the left (right) relative to the center of the tolerance interval, with one or more intervals outside the tolerance zone, which indicates the presence of defective parts.

In this case, it is initially necessary to adjust the technological operations so that the center of the histogram coincides with the center of the tolerance field. After this, measures must be taken to reduce the histogram span or increase the size of the tolerance interval.

The center of the histogram is shifted to the upper (lower) tolerance limit, and the right (left) side of the histogram near the upper (lower) tolerance limit has a sharp break.

In this case, we can conclude that products whose indicator values fell outside the tolerance were excluded from the batch or were deliberately passed off as falling within the tolerance limits. It is therefore necessary to identify the reason for this phenomenon.

The center of the histogram is shifted to the upper (lower) tolerance limit, and the right (left) side of the histogram near the upper (lower) tolerance limit has a sharp break. In addition, one or more intervals are outside the tolerance range.

The case is similar to case 6, but the histogram intervals outside the tolerance range indicate that the measuring instrument was faulty. It is therefore necessary to verify the measuring instruments and to re-instruct workers on the rules for performing measurements.

The histogram has two peaks, although the values ​​of the indicator were measured for products from the same batch.

In this case, we can conclude that the products were produced under different conditions (for example, materials of different grades were used, equipment settings were changed, or the products were made on different machines). It is therefore recommended to use the stratification method for further analysis.

The main characteristics of the histogram are in order (corresponding to case 1.), while there are defective products with indicator values ​​outside the tolerance range, which form a separate “island” (isolated peak).

This situation may have arisen as a result of negligence in which defective parts were mixed with good ones. In this case, it is necessary to identify the causes and circumstances leading to the occurrence of this situation, and also take measures to eliminate them.
