Purpose
The purpose of the study is to examine how risk factors contribute to the occurrence of defects in a process. By analyzing these risk factors in relation to process quality, the study aims to help organizations prioritize their resources and efforts toward addressing the most significant risks. These challenges, integrated with the emerging concept of Quality 4.0, necessitate a comprehensive risk assessment technique.
Design/methodology/approach
Fuzzy logic integrated with an analytic network process is used in the process failure mode and effects analysis for conducting risk identification and assessment under uncertainty. Through a mathematical model, the linkage of risk with Six Sigma is established and, finally, a value–risk matrix is developed for illustrating and analysing risk impact on process quality.
Findings
A case study on fused filament fabrication demonstrates the proposed methodology’s applicability. The results show its effectiveness in assessing risk factors’ impact on Six Sigma metrics: defects per million opportunities/sigma level.
Practical implications
By integrating qualitative assessments and leveraging available data, this approach enables a more comprehensive understanding of risks and their utilization for an organization’s quality improvement initiatives.
Originality/value
This approach establishes a risk-centric Six Sigma assessment method in accordance with the requirement of ISO 9001:2015 and in the context of Quality 4.0.
1. Introduction
In today’s competitive business landscape, ensuring high process quality is crucial for organizations to deliver reliable products and services and maintain customer satisfaction (Lee et al., 2013). Traditional quality evaluation methods often focus solely on measuring the adherence to predefined specifications and standards. These methods often overlook the underlying risks that can impact process quality and lead to undesirable outcomes or defects.
ISO 9001:2015 places significant importance on recognizing and dealing with potential risks and opportunities that could impact an organization’s ability to deliver consistent, compliant products or services and achieve its quality goals. The standard mandates organizations to identify and address risks and opportunities in both their internal and external environments and take appropriate actions to manage or benefit from them (ISO 9001, 2015). By integrating risk assessment methodologies into the quality evaluation, organizations can proactively identify, analyse and manage risks that may undermine process quality. Organizations in the era of Quality 4.0, which involves implementing Industry 4.0 technology in the realm of quality, acknowledge the significance of risk-based Six Sigma evaluation. Six Sigma evaluation is a systematic and data-driven approach used to measure the performance and effectiveness of processes within an organization. It involves the use of statistical tools and methodologies to identify and reduce defects, variations and inefficiencies in processes, ultimately aiming to achieve a higher level of process quality and customer satisfaction. However, Industry 4.0’s advancements will lead to the generation of more in-process data, rendering traditional Six Sigma approaches unsuitable. To enhance process performance, it is often vital to have in-depth knowledge about the specific product and its manufacturing process (Giannetti and Ransing, 2016). The use of tools such as failure mode and effects analysis (FMEA) in this context indeed emphasizes continuous improvement in value delivery. Despite their benefits, these tools can still be affected by evaluators’ opinions and backgrounds during data collection and processing, potentially leading to imprecise information when adapting to the context of Industry 4.0 (de Andrade et al., 2022).
This paper delves into fuzzy set theory, multi-criteria decision-making (MCDM) methods, process FMEA (PFMEA) for integrating risk assessment into the Six Sigma evaluation and the value–risk matrix (VRM) for visually representing and analysing how risks influence the quality of a process. The study aims to address the concept of risk-based process quality evaluation, highlighting its significance and potential benefits for organizations. It addresses the limitations of traditional PFMEA by proposing a modified fuzzy PFMEA approach that integrates Six Sigma metrics, fuzzy set theory and MCDM techniques. Fuzzy set theory is used to manage uncertainties, ambiguities, inconsistencies and subjectivity in the data, whereas the MCDM method computes the weights of risk factors by taking into account their interdependencies, aiding data gathering and analysis in the Industry 4.0 context. Additionally, by translating the occurrence of failure modes into Six Sigma metrics, the approach can more accurately estimate the actual number of defects, leading to better-informed decisions that reflect real manufacturing performance and conditions. The justification for translating the occurrence variable (O) into Six Sigma metrics is further explained in Section 4. Furthermore, the VRM analysis offers the organization a structured method for visually evaluating and ranking options according to how risks may affect the quality of the process. This combination of risk assessment techniques, integrated with a process quality metric under limited numerical data, has not been addressed in the literature so far. Finally, the paper presents a case study to demonstrate the application and effectiveness of the risk-based approach.
This paper is organized as follows. Section 2 reviews the literature, Section 3 presents the research gaps, objectives and contributions to the literature, Section 4 describes the methodology, Section 5 discusses a case study to explain the proposed methodology, Section 6 presents the results and discussion, and Section 7 concludes the paper with the limitations and future scope of the proposed work.
2. Literature review
In the context of this study, the literature review explores three main areas: process quality evaluation through the Six Sigma metric, risk and Six Sigma analysis, and PFMEA. The PFMEA part is further divided into two parts: limitations of traditional FMEA and modified FMEA.
Process quality evaluation through Six Sigma metric
Process quality evaluation is a crucial aspect of facilitating performance monitoring and tracking of goals. Traditional performance measurements relying on accounting figures such as sales turnover, profit, debt and return on investment can certainly highlight performance issues, but they often fall short of explaining the underlying causes of these problems (Lin and Li, 2010). The proposed approach relies on Six Sigma assessment because it focuses on identifying and addressing the fundamental root cause of a problem, with the goal of redesigning the process for long-lasting improvements.
Six Sigma stands as a cornerstone for standardized and reliable quality evaluation. However, its efficacy hinges on the availability and thorough analysis of extensive data sets, a requirement amplified by the challenges of Industry 4.0 and the emergence of Quality 4.0 principles. Saghaei et al. (2014) examine Six Sigma’s role in supply chains, emphasizing the rolled throughput yield (RTY) metric for entity enhancement. Using numerical examples and sensitivity analysis, the study stresses data collection and advanced modelling to boost supply chain performance, underscoring RTY’s pivotal role in data-driven improvement efforts. Complementing this perspective, Ketan and Nassir (2016) applied the Six Sigma define-measure-analyze-improve-control (DMAIC) methodology to the aluminium extrusion process to reduce variation and improve capability. Through detailed data gathering and analysis using tools such as suppliers-inputs-process-outputs-customers charts, Pareto charts and X̅-R charts, the study increased process yield from 46% to 81%, highlighting the critical role of comprehensive data analysis in successful Six Sigma applications. Similarly, Gupta et al. (2018) used the DMAIC method to address process variations in the tyre manufacturing industry. Through data gathering and analysis using quality tools such as cause–effect diagrams and statistical process control, the study successfully reduced the standard deviation and enhanced the process capability index (Cpk) from 0.94 to 2.66, demonstrating the significance of extensive data utilization for process improvement. Likewise, Darmawan et al. (2020) aimed to evaluate the quality of raw materials and implement improvements based on data obtained from 48 group samples, using the calculation of defects per million opportunities (DPMO). The results revealed that the proposed approach would be beneficial for understanding the workings of raw material processing. These studies demonstrate the application of data-driven methodologies to enhance quality in different domains, emphasizing the importance of systematic analysis for process improvement.
As evidenced in the literature, process quality evaluation in terms of Six Sigma metrics offers a standardized and reliable method for quantifying quality, enabling consistency across a variety of processes and initiatives. However, process quality evaluation through Six Sigma metrics (such as yield, RTY, sigma level, DPMO, Cp/Cpk) also requires a considerable amount of data and experimentation for analysing the situation.
Moreover, there has been a growing focus on the concept of Quality 4.0 in recent literature. The emergence of Quality 4.0 has increased the volume of data required, and biases arising in data gathering and analysis have become a crucial issue (de Andrade et al., 2022). In response to the challenges posed by Industry 4.0’s influx of in-process data, Giannetti and Ransing (2016) proposed a novel algorithm to embed risk-based thinking in quantifying uncertainty within manufacturing operations. By using penalty functions and a quantile regression tree approach, the method allows for the identification and visualization of optimal ranges associated with quality improvements. This work highlights that, in Industry 4.0, Six Sigma encounters challenges because of increased data streams and complex manufacturing processes; traditional statistical methods may be inadequate for handling noisy, sparse data, limiting process improvement opportunities.
Although the Six Sigma method provides a standardized metric across various industries, its limitations in managing the increasing volume of data sets, as well as inconsistent, incoherent, ambiguous and uncertain data in the Industry 4.0 scenario, underscore the necessity for a new framework to address these challenges effectively.
Risk and Six Sigma analysis
The prevailing ideology in risk analysis endeavours is centred on mitigating potential losses and increasing profits (He et al., 2019). Owing to the release of the ISO 9001:2015 standard, which emphasizes the concept of risk, there is a growing awareness of the need for risk-based quality assessment. In this sub-section, literature related to risk and Six Sigma-based analysis is reviewed because of its importance in quality management initiatives.
Usman Tariq (2013) developed a risk management framework that combines Six Sigma tools and techniques to handle undesired effects during project execution. This framework emphasizes continuous improvement by monitoring and regulating processes throughout the project’s lifecycle, thereby reducing defects to a minimum. However, this work primarily focuses on controlling risk factors and evaluating sigma metrics pre- and post-implementation, without delving into the quantitative relationship between specific risks and sigma metrics.
Expanding on the integration of risk management and Six Sigma, Alharthi et al. (2014) applied Six Sigma within the entertainment media industry to address safety concerns. Their approach successfully reduced employee accidents from 25 to 7 incidents per year, highlighting the practical benefits of integrating Six Sigma into risk management. Despite these successes, their study does not quantify how safety risks correlate with changes in process quality. This restricted predictive capacity can result in reactive rather than proactive approaches to risk management.
In the medical industry, Westgard and Westgard (2017) introduced a risk-based statistical quality control (SQC) procedure. Their work underscores the practical significance of error models in managing quality within medical laboratories. However, while they demonstrate the utility of risk-oriented SQC methods, their study does not establish a quantitative link between risk factors and sigma metrics, which is essential for precise risk management.
Al Khamisi et al. (2019) introduced a knowledge-based system designed to facilitate the implementation of Lean Six Sigma principles in a health-care setting, effectively enhancing the performance of quality management. However, in the absence of quantitative benchmarks, establishing meaningful goals for project performance becomes challenging. Sanni-Anibire et al. (2020) developed a risk assessment approach that can be used to enhance the safety performance of construction projects, and a Six Sigma evaluation is then conducted to assess the average project safety performance.
In the literature, a reasonable body of research has addressed risk-based Six Sigma assessment by controlling the risk factors and evaluating the sigma metrics before and after implementation. However, the formulation of a quantitative relationship between the occurrence of risk and the sigma metric remains limited.
Process failure mode and effects analysis
There are several risk assessment tools available in the literature, with FMEA being one of them. In contrast to other risk assessment tools that primarily focus on finding solutions after a failure has already occurred, FMEA’s functions involve the identification of potential failures and the assessment of their associated risks before the occurrence of failure (Qin et al., 2020).
The PFMEA, a type of FMEA, is a structured approach used to assess potential failure modes (FMs) and their effects on a process, with the goal of prioritizing and implementing preventive actions (Johnson and Khan, 2003). It focuses on identifying FMs, determining their severity (S), likelihood of occurrence (O) and ability to be detected (D), and calculating the risk priority number (RPN) as a measure of risk.
Limitations of traditional failure mode and effects analysis
The traditional FMEA has certain limitations, which are mentioned as follows:
Subjectivity: The data for ranking of severity, occurrence and detection may include uncertain, inconsistent and imprecise information.
Same risk value: It is possible for two FMs to have the same RPN value. For instance, failure mode “A” with a severity ranking of 8, an occurrence ranking of 5 and a detection ranking of 2 may have the same RPN as another failure mode with a severity ranking of 5, an occurrence ranking of 8 and a detection ranking of 2 (Baghbani et al., 2019).
Interdependency of risk factors: Conventional PFMEA does not consider the interdependency of the risk factors while calculating their impact on the FMs.
Reliance on experience: Traditional PFMEA relies on engineers’ experience to assess the occurrence of FMs. However, this approach lacks sufficient research data for comparison with other quality metrics, leading to limited and subjective analysis (Yeh and Chen, 2014). To address these limitations of the conventional/traditional PFMEA, a modified fuzzy PFMEA is proposed in this research work.
Modified failure mode and effects analysis.
As mentioned in the previous sub-section, the conventional FMEA computational framework exhibits several limitations, prompting a need for innovative approaches in the current literature. Fuzzy set theory is a mathematical framework that allows for representing and handling uncertainty, subjectivity and vagueness in data (Nguyen et al., 2014). Instead of relying solely on numerical values, a triangular fuzzy number is used to represent the linguistic opinions provided by experts and the survey data obtained for the severity and occurrence ratings. Fuzzy logic integrated with an MCDM technique is an effective approach for evaluating the weightage of the factors in this scenario. Fattahi et al. (2020) proposed a fuzzy MCDM-based FMEA model for risk assessment. While this model uses S, O, D, time, cost and profit as primary risk factors, it acknowledges that each hazard may have unique contributing factors that need to be considered for a more precise assessment.
Building on this, Boral and Chakraborty (2021) introduced an approach combining interval type-2 fuzzy analytic hierarchy process (AHP), the Decision-Making Trial and Evaluation Laboratory and Measurement of Alternatives and Ranking according to Compromise Solution. This approach ranks risks associated with human error using five crucial risk factors: safety, occurrence, consequences, economic impact and detection. Although interval type-2 fuzzy sets improve decision-making accuracy, this method still considers only a limited number of constant risk factors for ranking failure modes. Additionally, AHP does not account for the interdependencies between risk factors.
Khalilzadeh et al. (2021) further expanded the FMEA methodology by integrating fuzzy MCDM techniques with a multi-objective programming model for risk assessment during the planning phase. This approach enhances precision by incorporating multiple criteria beyond the traditional S, O and D factors. However, it still primarily focuses on these core risk factors, without fully addressing additional contributing factors.
Similarly, Yener and Can (2021) proposed a three-stage intuitionistic fuzzy risk assessment approach based on FMEA. Their method highlights the real constraints faced by organizations in risk assessment, revealing the limitations of existing frameworks in capturing a comprehensive risk profile. While this approach introduces intuitionistic fuzzy logic to enhance decision-making, it predominantly relies on conventional risk factors and does not fully explore the aggregated impact of other significant contributors.
De Andrade et al. (2022) studied a technique called FMEA enhancement through MCDM – AHP and TOPSIS (FEMATO), based on MCDM methods, aimed at enhancing accuracy and minimizing bias in the FMEA evaluation process across various organizations. Although FEMATO represents a significant step toward addressing biases inherent in traditional FMEA processes, it still focuses on a limited set of risk factors and does not consider the interdependency between the risk factors.
Bhattacharjee et al. (2022) proposed a risk-expected value approach, using an interval number-based MCDM technique instead of the traditional RPN method for calculating the weightage of risk factors. While these studies present innovative methodologies that enhance the traditional FMEA framework by incorporating advanced fuzzy and MCDM techniques, there remains a gap in comprehensively accounting for a broader range of risk factors. Although the literature adequately assesses and ranks FMs with respect to S, O and D, the aggregated impact of additional contributing factors such as environmental conditions, human factors and operational complexities is not sufficiently addressed. Moreover, the interdependencies among the risk factors used for ranking the FMs are not adequately considered in the literature. Because decision-makers can focus on mitigating risks that have a cascading effect on others, thereby addressing root causes rather than symptoms, these interdependencies must be addressed in the analysis. These gaps underscore the need for further research and the development of more holistic risk assessment models that can capture the full spectrum of risks associated with various failure modes.
3. Research gaps, objectives and contributions
The existing frameworks and methodologies, along with the research gaps in process quality evaluation through the Six Sigma metric, risk and Six Sigma analysis and PFMEA, have already been deliberated in the foregoing literature review in Section 2. The research gaps that have emerged from the review are consolidated and summarized in this section, followed by the framing of the research questions, objectives and contributions of this research work, for better comprehension of the readers:
From the literature review on process quality evaluation presented in Section 2, it is observed that traditional performance measurements often highlight issues without explaining underlying causes. While Six Sigma aims to identify root causes, it relies heavily on extensive, high-quality data, which is challenging with the increased data volumes of Industry 4.0. Issues of data biases and inconsistencies are prominent, and traditional statistical methods used in Six Sigma may not adequately handle the noisy, sparse data typical of modern manufacturing processes. Additionally, there are limited methods available for situations where experimental data is scarce, with imprecise data, such as during new product conceptualization or product design modifications.
The literature review on integrating risk analysis with Six Sigma for process quality evaluation reveals that, despite increased awareness of risk-based quality assessment due to ISO 9001:2015, studies often fall short in establishing quantitative relationships between specific risks and sigma metrics. This limitation hinders predictive and proactive risk management. Furthermore, the absence of quantitative benchmarks complicates the setting of meaningful performance goals. As risks are potential causes that lead to defects, each with an associated probability and impact, it is crucial to understand these dynamics thoroughly. However, many approaches remain reactive rather than proactive because of restricted predictive capacity. Additionally, there is a lack of methodologies for handling scarce or imprecise data, particularly during new product development. The increased data volume and complexity in Industry 4.0 environments further challenge traditional Six Sigma methods. This underscores the need for new frameworks that integrate advanced data analysis and risk-based thinking. Addressing these gaps is essential to enhance Six Sigma’s effectiveness across diverse industries.
Based on the literature review conducted in Section 2 on FMEA/PFMEA methods for risk evaluation, it is evident that despite advancements, many studies focus on a limited number of constant risk factors such as severity, occurrence, detection, time, cost and profit, without fully addressing the unique contributing factors of each hazard. Additionally, methods such as AHP do not account for interdependencies between risk factors. Current approaches, while improving decision-making accuracy, often remain reactive rather than proactive because of restricted predictive capacities. There is also a notable lack of methodologies for handling imprecise or scarce data, particularly in new product development. Furthermore, traditional Six Sigma methods do not adequately address the challenges posed by increased data volume and complexity in Industry 4.0 environments. Further to our observations, an important phenomenon in risk assessment research is that different types of risk factors are often interrelated. It is desirable to consider these interactions when assessing the risk impacts of FMs by assigning weights to different risk factors, which is again limited in the literature. These gaps underscore the need for more holistic risk assessment models that comprehensively account for a broader range of risk factors and their interdependencies, facilitating a more proactive and precise risk management approach.
This paper intends to answer the following research questions:
What advancements can be made in risk evaluation methodologies, particularly in FMEA/PFMEA methods, to address the limitations related to interdependencies between risk factors and the handling of scarce or imprecise data, while also considering the complexities of Industry 4.0 environments?
What strategies can be devised to establish quantitative relationships between specific risks and sigma metrics, thereby enabling predictive and proactive risk management within the Six Sigma framework?
How can traditional performance measurements be enhanced to better explain the underlying causes of issues in modern manufacturing processes?
The questions outlined provide clear direction for the objectives this paper aims to achieve.
To identify process FMs followed by the impact assessment of associated risk factors through a modified fuzzy PFMEA approach.
To develop a comprehensive risk-centric process quality evaluation framework that integrates Six Sigma metrics with advanced data analysis techniques to address the increased data volume and complexity of Industry 4.0 environments.
To illustrate and prioritize the risk based on their potential impact on the process quality for conducting risk management planning.
The contributions of the research work are summarized and presented as follows:
The proposed work evaluates process quality through a risk-centric approach using qualitative data. This method enables focused resource allocation and efforts to address potential risks, leading to reduced time and cost expenditures. Quality managers can use this methodology to ensure that new designs meet quality standards and regulatory requirements without waiting for extensive field data. Unlike traditional methods such as environmental stress screening testing, acceptance sampling, control charts, first article inspections and customer feedback, which rely on costly and time-consuming physical prototypes and field tests, this approach uses qualitative data from expert opinions across cross-functional departments. This allows for the early detection and correction of potential issues, thereby lowering development costs and accelerating time-to-market.
The proposed work contributes to enhancing the traditional risk assessment approach, addressing prevalent issues such as data biases and inconsistencies, especially in the context of modern manufacturing in Industry 4.0. Traditional statistical methods may struggle to handle the noisy, sparse data typical of this environment. By integrating fuzzy logic and MCDM techniques, this research expands the scope of considered risk factors and significantly enhances decision-making accuracy. Additionally, leveraging highly experienced expert opinion through the proposed mathematical model, which integrates the occurrence of risk with sigma metrics, ensures effective analysis of risk impact on process quality, enabling proactive risk management and improved process reliability. Furthermore, it enhances traditional statistical methods to handle the complex data environments of modern manufacturing, thus reinforcing the reliability of Six Sigma applications. Manufacturing organizations can minimize production disruptions and reduce defect-related costs, improving overall operational efficiency.
Particularly in the risk assessment section, by considering the individual impacts of each cause/risk factor and their interdependencies and assigning appropriate weights, the proposed approach accurately reflects the relative significance of each factor in influencing the failure modes. The weightage reflects the contribution of each risk factor, relative to the other risk factors, towards the overall risk assessment of the failure mode. This allows for a more nuanced understanding of the risks involved and enables better-informed decision-making and risk-management strategies. Hence, the proposed MCDM-based PFMEA approach for determining the risk impact of FMs on the quality of the manufacturing process is a significant contribution to the literature, as such studies are limited.
4. Methodology
The flowchart in Figure 1 presents the proposed framework for this research paper. The research methods, the justification for their selection and their implementation are outlined in the subsequent steps. The research is carried out in three phases, where the first phase presents the identification of risk and assessment through modified fuzzy PFMEA method. The second phase evaluates the manufacturing process quality utilizing the risk impact value assessed in Phase 1. The third phase visually illustrates the categorization of alternatives by plotting a graph between the expected process quality (EPQ) and risk value obtained from Phases 1 and 2, which can eventually be helpful for providing risk mitigation strategy to the industry.
Phase 1: risk identification and assessment through modified fuzzy process failure mode and effects analysis method
Industry 4.0, the Fourth Industrial Revolution, emphasizes a holistic and integrated approach heavily reliant on data from various sources. This shift also introduces the concept of Quality 4.0, which integrates traditional quality management practices with digital technologies to enhance quality assurance and control. However, initial data often contains biases, inaccuracies and other issues that hinder effective analysis, interpretation and decision-making. As discussed earlier in Section 2, traditional FMEA has several limitations, with one of the most significant issues being its subjectivity, especially in the context of Industry 4.0. To address these challenges, the proposed approach enhances traditional FMEA by improving the handling of data, addressing risk factor interdependencies and establishing correlations with sigma metrics. The procedure for risk identification and assessment within a process through the proposed modified fuzzy PFMEA approach is as follows:
Step 1. The system overview is thoroughly analysed, and a process flow chart of the selected case (Section 5) is constructed to understand the overall system. A process flowchart assists in illustrating the progression of a product throughout the entire procedure, starting from its input to reaching the final output and facilitates in the identification of process steps and their interactions.
A literature survey and brainstorming are used as proactive ways to identify all the possible ways the process could fail for the selected case (Section 5). Initially, a literature survey of the existing studies on the process FMs of the selected case (Section 5) is carried out to identify the FMs and their associated risk factors. The search is carried out in the Scopus database using keywords specific to the considered manufacturing process. The survey includes peer-reviewed journal articles, conference papers and book chapters. The details of the literature survey in this regard form part of Section 5. The identified FMs and risk factors are further discussed with the experts of the organization to shortlist and finalize them in the brainstorming session.
A cross-functional team of experts in the organization, consisting of manufacturing engineers, design engineers, quality control inspectors, production personnel and customer representatives, is set up. The qualifications and experience of the team are provided in Table 1. All the participants are brought into the discussion room and provided with notepads. The FMs and risk factors identified through the extensive literature survey are discussed with the experts, who are asked to suggest additions and deletions of risks. The experts note down their ideas in the provided notepads and, finally, the process FMs and risk factors are finalized from the discussion.
Step 2. The influence of each failure mode is determined using the risk impact assessment method. This assessment is divided into two parts: the estimation of severity and occurrence of process FMs. Fuzzy-based ratings are assigned to the severity of each risk factor. The evaluation of severity based on linguistic terms and their corresponding fuzzy numbers is presented in Table 2. The qualitative grouping of failure mode occurrences (Standard, 1980) is placed in Table 3. Since the conventional PFMEA relies on the expertise of engineers to assess the occurrence of failure factors, this approach is limited in providing information on the relation between the standard occurrence probability and the corresponding defects in terms of Six Sigma metrics such as DPMO. By translating the occurrence failure probabilities to Six Sigma quality metrics values, the proposed method allows for a more realistic and context-specific understanding of failure occurrences in actual processes. Hence, the DPMO values (Group and DeCarlo, 2007) corresponding to respective FM occurrence values with their fuzzy ratings also form part of Table 3.
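To make the rating step concrete, a minimal Python sketch of one possible encoding is given below: linguistic severity terms are represented as triangular fuzzy numbers (TFNs) and qualitative occurrence categories are paired with representative DPMO values. The specific numbers are illustrative placeholders only, not the scales of Tables 2 and 3; the DPMO values shown simply correspond to roughly 6σ, 5σ, 4σ and 3σ performance.

```python
# Illustrative encoding of linguistic ratings as triangular fuzzy numbers (TFNs).
# NOTE: the numeric values below are placeholders for demonstration only;
# the actual rating scales are defined in Tables 2 and 3 of the paper.

from typing import Dict, Tuple

TFN = Tuple[float, float, float]  # (lower, modal, upper)

# Severity: linguistic term -> TFN on an assumed normalized 0-1 scale.
SEVERITY_TFN: Dict[str, TFN] = {
    "very low":  (0.0, 0.1, 0.2),
    "low":       (0.1, 0.3, 0.5),
    "moderate":  (0.3, 0.5, 0.7),
    "high":      (0.5, 0.7, 0.9),
    "very high": (0.8, 0.9, 1.0),
}

# Occurrence: qualitative category -> (TFN of occurrence rating, representative DPMO).
# DPMO values roughly correspond to 6, 5, 4 and 3 sigma performance (illustrative).
OCCURRENCE_TFN: Dict[str, Tuple[TFN, float]] = {
    "remote":   ((0.0, 0.05, 0.1), 3.4),
    "low":      ((0.1, 0.20, 0.3), 233.0),
    "moderate": ((0.3, 0.45, 0.6), 6_210.0),
    "high":     ((0.6, 0.75, 0.9), 66_807.0),
}

def defuzzify(tfn: TFN) -> float:
    """Centroid defuzzification of a triangular fuzzy number."""
    a, b, c = tfn
    return (a + b + c) / 3.0

if __name__ == "__main__":
    s = defuzzify(SEVERITY_TFN["high"])
    o_tfn, dpmo = OCCURRENCE_TFN["moderate"]
    o = defuzzify(o_tfn)
    print(f"crisp severity = {s:.3f}, crisp occurrence = {o:.3f}, DPMO ~ {dpmo}")
```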
Step 3. This step is dedicated to estimating the risk factor weights. In the literature, various MCDM techniques have been proposed to address complex problems involving multiple interacting factors. One such technique is the AHP developed by Saaty (1987). However, AHP lacks the ability to incorporate interdependencies among the factors. To overcome this limitation, the analytic network process (ANP), also developed by Saaty (2004), is used in this approach to effectively handle interdependencies and enhance the decision-making process. The ANP excels in handling complex interactions and dependencies between criteria and alternatives, whereas AHP assumes independence, making it less suitable for complex problems. ANP’s network structure offers greater flexibility and robustness for modelling real-world scenarios than AHP’s hierarchical structure and can accommodate feedback loops, a capability lacking in AHP. Hence, fuzzy ANP is used for the calculation of risk factor weightage in the proposed approach. The weightage of the risk factors obtained from the fuzzy ANP is further used in the calculation of the risk impact of the failure mode.
The fuzzy values are converted into crisp numbers according to the formula from Eastman (1987). Next, the weight of each risk factor is calculated using the fuzzy ANP approach. The procedure for evaluating the weights is adopted from Büyüközkan et al. (2004). The steps for the calculation of weights are as follows (a numerical sketch follows the steps):
Step 3.1. The risk factors are grouped under four categories (4M) (i.e. machine, man, material, mother nature), and their interdependencies are depicted in a network form (Figure 2). The main factors/causes must be studied, classified and prioritized, so that actions can be elaborated and focused on correcting the priority causes of the problem. The risk factors are categorized according to the 4Ms, as these verticals can encompass most of the associated risk factors in the context of the case study (Section 5) adopted in this paper. Manpower assesses individuals’ skills and contributions to industry processes. Machinery focuses on production equipment and tools, ensuring optimal performance. Material management oversees raw materials and consumables, emphasizing accurate specifications and appropriate utilization. The impact of environmental factors, such as humidity and contamination, on operational processes is included in the Mother Nature category.
Next, for the estimation of the weightage of the risk factors, vector W1 and matrices W2, W3 and W4 (as explained below) are initially estimated to provide a structured framework for understanding and analysing the relationships and influences between FMs, risk factor categories and individual risk factors. W1 is a vector that signifies the extent to which the failure mode aligns with the risk factor categories, W2 is a matrix that describes how factor categories correlate with the corresponding factors. W3 and W4 are the matrices that represent the inner dependencies of risk factor categories and factors, respectively.
Step 3.2. Calculation of W1: Based on the triangular fuzzy conversion scale (Esen et al., 2019), a pairwise comparison matrix for the risk factor categories with respect to the FM is constructed. It is assumed that there is no dependence among the factor categories, and the eigenvector for the same is obtained by performing the geometric mean methodology of fuzzy AHP (Huynh et al., 2018).
Step 3.3. Calculation of W2: It is assumed again that there is no dependency among the factors, and the pairwise comparison matrix is constructed with respect to each factor category resulting in the eigenvectors for each factor category. Here, the fuzzy AHP geometric mean methodology is applied, and the degree of relative importance of the factors for the remaining factor categories is calculated in a similar manner.
Step 3.4. Calculation of W3: The inner dependency among the factor categories is obtained by investigating the effect of each factor category on the other factor categories using pairwise comparisons. In the resulting eigenvectors obtained from the pairwise comparisons, zeros are allocated to the eigenvector weights of the factor categories that are independent.
Step 3.5. Calculation of W4: The dependency among the factors is calculated in this step. As previously discussed for factor categories, the inner dependencies are calculated and the required pairwise linguistic comparisons are carried out.
Step 3.6. Calculation of Wc: The interdependent importance of the factor categories as Wc = W3 × W1 are obtained in this step.
Step 3.7. Calculation of Wa: The interdependent importance of the factors is obtained as follows: Wa = W4 × W2.
Step 3.8. Calculation of the weights of the risk factors: Finally, the overall weights of the factors, incorporating the interdependencies, are obtained using WfuzzyANP = Wa × Wc.
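The Python sketch below illustrates Steps 3.1–3.8 numerically. It assumes triangular fuzzy pairwise comparisons, uses Buckley's geometric-mean method to extract defuzzified, normalized local weights, and then combines the partial results exactly as Wc = W3 × W1, Wa = W4 × W2 and WfuzzyANP = Wa × Wc. The pairwise judgments and the reduced two-category/four-factor structure are invented purely for illustration; the case study uses the 4M categories of Figure 2 and the factors of Table 4, and the defuzzification and eigenvector procedures of Eastman (1987) and Büyüközkan et al. (2004) may differ in detail from this simplified version.

```python
# Sketch of the fuzzy ANP weight calculation (Steps 3.1-3.8), assuming
# triangular fuzzy pairwise comparisons and Buckley's geometric-mean method.
# The judgments and the 2-category / 4-factor structure below are invented
# purely for illustration; the case study uses the 4M categories of Figure 2.

import numpy as np

def buckley_weights(fuzzy_matrix):
    """Crisp, normalized weights from an n x n x 3 matrix of (l, m, u) TFNs."""
    A = np.asarray(fuzzy_matrix, dtype=float)
    n = A.shape[0]
    # Fuzzy geometric mean of each row (componentwise on l, m, u).
    r = np.prod(A, axis=1) ** (1.0 / n)               # shape (n, 3)
    total = r.sum(axis=0)                              # (sum_l, sum_m, sum_u)
    # w_i = r_i (x) total^{-1}; the inverse of a TFN (l, m, u) is (1/u, 1/m, 1/l).
    w_fuzzy = np.column_stack([r[:, 0] / total[2],
                               r[:, 1] / total[1],
                               r[:, 2] / total[0]])
    crisp = w_fuzzy.mean(axis=1)                        # centroid defuzzification
    return crisp / crisp.sum()

# Illustrative fuzzy pairwise judgments.
EQ = (1, 1, 1)            # equally important
MOD = (1, 2, 3)           # moderately more important
MOD_INV = (1/3, 1/2, 1)   # reciprocal of MOD

# W1: importance of the risk-factor categories with respect to the failure mode.
W1 = buckley_weights([[EQ, MOD],
                      [MOD_INV, EQ]]).reshape(2, 1)

# W2: local weights of 4 factors under each category (zeros where a factor
# does not belong to that category).
w_cat1 = buckley_weights([[EQ, MOD], [MOD_INV, EQ]])   # factors 1 and 2
w_cat2 = buckley_weights([[EQ, MOD_INV], [MOD, EQ]])   # factors 3 and 4
W2 = np.array([[w_cat1[0], 0.0],
               [w_cat1[1], 0.0],
               [0.0, w_cat2[0]],
               [0.0, w_cat2[1]]])

# W3, W4: inner dependencies among categories and among factors (illustrative,
# column-stochastic).
W3 = np.array([[0.7, 0.4],
               [0.3, 0.6]])
W4 = np.array([[0.5, 0.2, 0.1, 0.1],
               [0.2, 0.5, 0.1, 0.1],
               [0.2, 0.2, 0.6, 0.2],
               [0.1, 0.1, 0.2, 0.6]])

# Steps 3.6-3.8: combine the partial results.
Wc = W3 @ W1                        # interdependent category importance
Wa = W4 @ W2                        # interdependent factor importance
W_fuzzy_anp = (Wa @ Wc).ravel()
W_fuzzy_anp /= W_fuzzy_anp.sum()    # final risk-factor weights
print("risk factor weights:", np.round(W_fuzzy_anp, 3))
```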
Step 4. The weights obtained from the previous step represent the relative contribution of each risk factor towards the FM. Next, the weighted risk impacts of the risk factors are aggregated to obtain the overall risk impact of the FM. Several aggregation techniques available in the literature, such as Bayesian aggregation, the weighted decision matrix and the exponential moving average, are investigated. However, among all the methods, the weighted sum is found to be a simple, methodologically valid approach that is straightforward to articulate and is characterized by its transparency, and it is hence used for the calculation (Janssen, 2001). The risk impact of each failure mode (RIFM) is determined by calculating the weighted sum of the risk impact of each risk factor that affects the specific failure mode:

RIFM = Σi (wi × Si × Oi) (1)

where wi, Si and Oi are the weightage, severity and occurrence of the ith risk factor. As the risk impact value increases, the potential hazard associated with the process failure mode also increases (Tian et al., 2019). Therefore, the feature of the failure mode with the maximum risk impact is considered the key quality characteristic (KQC) of the product and is used for the assessment of the manufacturing process’s yield.
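A minimal worked example of equation (1) is sketched below; the weights, severity and occurrence values are hypothetical and serve only to illustrate the weighted-sum aggregation.

```python
# Minimal sketch of equation (1): risk impact of a failure mode as the
# weighted sum of each contributing risk factor's severity x occurrence.
# All numbers are hypothetical; actual values come from Tables 2-4.

def risk_impact(weights, severities, occurrences):
    """RI_FM = sum_i w_i * S_i * O_i over the risk factors of one failure mode."""
    if not (len(weights) == len(severities) == len(occurrences)):
        raise ValueError("inputs must have one entry per risk factor")
    return sum(w * s * o for w, s, o in zip(weights, severities, occurrences))

# Example: three risk factors affecting one failure mode.
w = [0.5, 0.3, 0.2]   # fuzzy ANP weights (sum to 1)
S = [0.7, 0.5, 0.9]   # defuzzified severity ratings
O = [0.4, 0.6, 0.2]   # defuzzified occurrence ratings
print(f"RI_FM = {risk_impact(w, S, O):.3f}")   # 0.5*0.7*0.4 + ... = 0.266
```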
Phase 2: evaluation of manufacturing process quality considering risk impact
Risk can substantially impact process quality by introducing variability, defects and errors due to resource constraints and regulatory non-compliance. Complex processes and supplier-related issues can amplify the impact of risk and affect quality or reliability over time. Consequently, integrating risk management into process quality evaluation is essential for identifying uncertainties and implementing measures to proactively address these risk factors, thereby assuring a consistent and robust level of quality for the final product or service.
As per the literature presented in Section 2, formulations linking the performance value and the risk are found to be limited. In this context, Xu et al. (2020) assessed the performance of several types of e-waste management improvement strategies considering risk. Based on this, the expected process quality (EPQ) equation, considering the risk impact of the failure mode (RIFM), is proposed as:

EPQ = YT × (1 − RIFM) (2)

where YT is the targeted yield, representing the percentage of non-defective items required by the organization considering the KQC of the product.

Next, the calculated EPQ (expressed as a fraction) is transformed into the corresponding DPMO and sigma-level values (Kumar, 2006):

DPMO = (1 − EPQ) × 10^6 (3)

with the sigma level obtained from the DPMO through the standard normal conversion. This enables the conversion of the EPQ metric into Six Sigma metrics, which assess the process sigma level and defect rates within the given context.
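The short Python sketch below illustrates equations (2) and (3) under the assumptions stated above: EPQ is the targeted yield scaled down by the risk impact, DPMO follows from the defect fraction, and the sigma level is read off the standard normal distribution with the customary 1.5-sigma shift. The 1.5-sigma shift convention is an assumption of this sketch, not a claim about the exact conversion used in Kumar (2006).

```python
# Sketch of Phase 2 (equations (2) and (3)), assuming yields are expressed as
# fractions in [0, 1] and the conventional 1.5-sigma shift is applied when
# converting a long-term yield into a short-term sigma level.

from statistics import NormalDist

def expected_process_quality(targeted_yield: float, risk_impact: float) -> float:
    """Equation (2): EPQ = Y_T * (1 - RI_FM)."""
    return targeted_yield * (1.0 - risk_impact)

def dpmo(epq: float) -> float:
    """Equation (3): defects per million opportunities implied by the EPQ."""
    return (1.0 - epq) * 1_000_000

def sigma_level(epq: float, shift: float = 1.5) -> float:
    """Short-term sigma level for a given EPQ (long-term yield fraction)."""
    return NormalDist().inv_cdf(epq) + shift

if __name__ == "__main__":
    epq = expected_process_quality(targeted_yield=0.98, risk_impact=0.266)
    print(f"EPQ = {epq:.4f}, DPMO = {dpmo(epq):,.0f}, sigma ~ {sigma_level(epq):.2f}")
```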
Phase 3: visualizing and analysing risk impact on process quality through value–risk matrix
Gauging and ranking alternatives based on their perceived significance and the level of threat they pose to the desired process quality outcomes is extremely beneficial for providing risk mitigation strategies and improving process quality. Several tools are available in the literature for the categorization of alternatives, for instance, supplier scorecards (SSC) (Park et al., 2005), ABC analysis (Liu et al., 2016), MCDM methods (Awasthi et al., 2018; Nguyen et al., 2014) and the VRM (Caniëls and Gelderman, 2005). However, SSCs, ABC analysis and MCDM methods fail to identify and classify alternatives on the basis of risks. The VRM, also known as the Kraljic matrix or the supplier portfolio matrix, is a strategic tool used in supply chain management to assess and categorize suppliers based on their importance and the level of risk associated with them (as shown in Figure 3). It provides a framework for understanding the strategic positioning of suppliers and helps organizations develop appropriate strategies to manage their supplier relationships effectively (Caniëls and Gelderman, 2005).
In the context of this study, the VRM analysis is used in the organization to provide a systematic approach to visually assess and prioritize the alternatives according to the potential impact of risk on the quality of the process. In the framework of the VRM, “value” serves as a performance indicator, while “risk” represents the risk measures associated with different alternatives. Building upon this, the proposed approach uses expected process quality-risk impact of the FM matrix.
The matrix is divided into four quadrants along two dimensions, the risk impact and the EPQ value (the process quality value considering risk), as explained below and illustrated in the sketch following the list:
In the strategic quadrant, manufacturing processes with a high process quality value but a high level of risk impact are placed. These processes require continuous monitoring of risk factors to ensure reliable and efficient fabrication.
The leverage quadrant consists of processes that offer a good process quality value but pose a low level of risk impact. They require the least effort and supervision.
Manufacturing processes in the bottleneck quadrant provide a low process quality value and come with a high level of risk impact. It is crucial to develop contingency plans, explore alternative sources or seek risk mitigation strategies to ensure stable production.
Finally, the non-critical quadrant includes manufacturing processes that provide low process quality and pose a low level of risk impact. These processes do not require significant attention.
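As a small illustration of the four quadrants, the function below assigns a process to a VRM quadrant given its EPQ and risk-impact values. The threshold values separating “high” from “low” are assumptions made for this sketch (e.g. midpoints of the observed ranges); they are not prescribed by the matrix itself and should be chosen to suit the organization’s own data.

```python
# Illustrative VRM quadrant assignment. The thresholds separating "high" from
# "low" EPQ and risk impact are assumptions (e.g., midpoints of the observed
# ranges); choose them to suit the organization's own data.

def vrm_quadrant(epq: float, risk_impact: float,
                 epq_threshold: float = 0.5, risk_threshold: float = 0.5) -> str:
    high_value = epq >= epq_threshold
    high_risk = risk_impact >= risk_threshold
    if high_value and high_risk:
        return "strategic"      # monitor risk factors continuously
    if high_value and not high_risk:
        return "leverage"       # least effort and supervision
    if not high_value and high_risk:
        return "bottleneck"     # contingency plans / mitigation needed
    return "non-critical"       # minimal attention required

print(vrm_quadrant(epq=0.72, risk_impact=0.27))   # -> "leverage"
```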
5. Case study
The proposed methodology is demonstrated through a case study on the fused filament fabrication (FFF) process. An organization uses FFF systems in its five different workshops (i.e. WS1, WS2, WS3, WS4, WS5) for manufacturing adjustable wrenches. The FFF process is a widely used additive manufacturing technique that involves the deposition of molten thermoplastic material layer by layer to create three-dimensional objects. The organization aims to evaluate the process quality of the fabrication process of each workshop while considering potential risks to prioritize the workshops and implement appropriate measures for process improvement. The assessment will involve identifying and assessing the impact of risks on the manufacturing process and determining areas for enhancement. By prioritizing workshops based on their process quality and risk profiles, the company can focus on targeted improvements to optimize the overall manufacturing process and ensure customer satisfaction.
Risk identification and assessment through modified fuzzy process failure mode and effects analysis method
The primary purpose of PFMEA is to identify and analyse the potential FMs and their associated risks in the process. The conduct of the PFMEA is divided into three parts, namely, system details, process details and the PFMEA table, which collectively facilitate the execution of PFMEA for the FFF process. Each part provides valuable insights into the system components, the process intricacies and an excerpt of the PFMEA table, enabling a comprehensive assessment of the FMs and their potential impact on process quality.
System details.
In this section, a detailed analysis of the FFF system is presented. The entire system is divided into sub-systems, and then the sub-systems are further subdivided into sub-sub systems (as shown in Figure 4). The complete hierarchy is as described in detail below:
Controller board: This is considered as the brain of the system. It controls all the electronic functions and regulates temperature and motion.
Filament: It is the thermoplastic feedstock for FFF system. There are several categories of filaments available with distinct physical properties and thereby requiring different temperatures to print. It is widely available in the two standard diameters of 1.75 mm and 2.85 mm.
Frame: These are the support structure for the FFF system. It supports all the mechanical and electrical components that carry out the actual fabricating work.
Motion component: This component is responsible for the print head and print bed movement and positioning in the system. It achieves its operation with the help of the following parts:
Stepper motor: Stepper motors are the specific type of high-torque motor commonly used in the FFF system to control the moving parts.
Belts: The system uses belt drives for linear motion control.
Threaded rod: It is used to control the movement of the print head.
End stops: End stops indicate the home position of each axis of the printer. When moving the printer’s axes to their home position, each axis moves towards these end stops. The moment an axis reaches its end stop, its movement stops.
Power supply unit: It supplies current to the FFF system.
Print bed surface: This is the flat surface on which the fabricated product is built. It also has an adhesive property such that the extruded plastic forms temporary bonds during fabrication.
Feeder system: It is used to supply the filament into the FFF system for building the product.
Extruder: The extruder motor with the gear turns and slowly pushes filament into the nozzle. This can be classified into two categories: Bowden and Direct drive.
The extruder part consists of the following sub-components:
Gears (Hobbed gear and Idler gear): These are responsible for transferring the motion from one shaft to another.
Heat cartridge: The heater cartridge is considered as the source of heat for melting the filament.
Thermocouple: This is positioned inside the heater block and is used to read the temperature of the heated filament.
Nozzle: The essence of the nozzle is to heat and melt the filament to its semi-molten stage. It is available in various sizes, which range from 0.1mm to 2mm or beyond, depending on the requirement.
Cooling fan: The cooling fan is used to cool the heated filament material deposited on the print bed surface.
Having discussed the system and the sub-systems of FFF in this section, it is essential to understand the FFF process in detail to maintain the quality of the fabricated part.
Process details.
The FFF process involves several steps, as depicted in Figure 5. It begins with creating a computer-aided design (CAD) model and converting it to an .STL file. The model is then sliced into layers, and process parameters are set. The machine is calibrated for printing. During production, the filament is fed, liquefied and extruded from the nozzle to deposit molten material layer by layer. Motion control aids in spreading the material across the print bed. The deposited material is then cooled and bonded. After production, the fabricated piece is inspected, calibrated if needed and undergoes support removal, testing and finishing for market readiness. Having explained the details of FFF system and process of FFF system, the PFMEA table is presented subsequently.
Process failure mode and effects analysis table.
As discussed in Section 4, the PFMEA is used in the identification and assessment of risk to further calculate the EPQ. A detailed analysis of the FFF system, its sub-systems, components and process steps has been presented in the previous sub-sections. In this section, based on that analysis and the collected data, the PFMEA for the FFF process is developed. An extensive literature survey followed by brainstorming sessions, as explained in the methodology (Section 4), is carried out to identify the FMs and associated risk factors for each process step. The search is carried out in the Scopus database using keywords such as failure modes in FFF, risk factors in FFF, FFF process and FFF product failures. Only a selected set of important literature for the identification of risks is presented, owing to word constraints, in Table A1 of Appendix 1. After the brainstorming session, the finalized FMs and risk factors are presented in the PFMEA table. The severity and occurrence ratings are developed on a fuzzy scale, with details placed in Tables 2 and 3. The procedure for calculating the risk factors’ weights is summarized in Appendix 2. The risk impact of each failure mode is calculated using equation (1) (Section 4) and the weights obtained from Table 4. The risk impact of the FMs for the five workshops is presented in Table 6. A sample calculation for the risk impact of the failure mode with the maximum risk impact value in Workshop 1 is presented in Table 7. After conducting the PFMEA for all five workshops, “dimension is out of tolerance” is identified in each of them as the failure mode with the highest risk impact. Therefore, the feature of this failure mode, i.e. the dimension of the wrench component, is considered the KQC of the FFF process and is used for the yield calculation. Excerpts of the PFMEA table are presented for demonstration in Table 5.
Analysis of risk impact on process quality and risk mitigation strategy through value–risk matrix
The targeted manufacturing process quality, in terms of yield considering the KQC of the product, is tabulated for the five workshops in Table 8. Next, the EPQ values considering the impact of risk, calculated using equation (2) from Section 4, are also presented in Table 8. The VRM plot shown in Figure 6 provides valuable insights into the relationship between risk impact and process quality, allowing for a comprehensive interpretation of the assessed risks. The graph consists of two axes: the horizontal axis represents the impact of the assessed risks, while the vertical axis represents the EPQ for the five workshops. The graph allows for the prioritization of risks based on position. Workshop 4, in the upper-right (strategic) quadrant, exhibits high process quality accompanied by high risk impact, demanding immediate attention and mitigation strategies; these strategies are further explained in the results and discussion section. Conversely, Workshops 2 and 3, situated in the upper-left quadrant, have high process quality and relatively low risk impact, suggesting the need for proactive measures to prevent risk occurrence. Workshops 1 and 5, in the lower-right quadrant, have low expected process quality and high risk impact, indicating the importance of developing contingency plans or response strategies.
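A plotting sketch of this value–risk matrix (cf. Figure 6) is given below. The EPQ and risk-impact coordinates for WS1–WS5 are hypothetical placeholders chosen only to mimic the quadrant placement described in the text; the actual values appear in Table 8, and the quadrant boundaries are assumed thresholds.

```python
# Sketch of the value-risk matrix plot (cf. Figure 6). The workshop
# coordinates below are hypothetical placeholders that only mimic the
# quadrant placement described in the text; real values come from Table 8.

import matplotlib.pyplot as plt

workshops = {          # name: (risk impact, expected process quality)
    "WS1": (0.60, 0.40),
    "WS2": (0.20, 0.80),
    "WS3": (0.25, 0.75),
    "WS4": (0.70, 0.80),
    "WS5": (0.65, 0.35),
}
risk_threshold, epq_threshold = 0.5, 0.5   # assumed quadrant boundaries

fig, ax = plt.subplots()
for name, (risk, epq) in workshops.items():
    ax.scatter(risk, epq)
    ax.annotate(name, (risk, epq), textcoords="offset points", xytext=(5, 5))
ax.axvline(risk_threshold, linestyle="--")
ax.axhline(epq_threshold, linestyle="--")
ax.set_xlabel("Risk impact of failure mode")
ax.set_ylabel("Expected process quality (EPQ)")
ax.set_title("Value-risk matrix for the five workshops (illustrative)")
plt.show()
```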
6. Results and discussions
The three-dimensional (3D) plot shown in Figure 7 visualizes the relationship between risk impact, yield and expected process quality. The x-axis represents the risk impact, which indicates the potential negative consequences or risks associated with the manufacturing process; the y-axis represents the yield, which reflects the level of successful, defect-free outcomes achieved in the process; and the z-axis represents the expected process quality. Interpreting the graph, different patterns and trends can be observed. When the risk impact is low and the yield is high, the expected process quality is also high, indicating a favourable outcome with minimal risks. Conversely, as the risk impact increases or the yield decreases, the expected process quality decreases, indicating a higher likelihood of quality issues or failures in the manufacturing process. The graph allows one to identify the trade-off between risk and yield in relation to process quality and provides insights into the critical points where improvements are needed. For example, regions of the graph with low expected process quality indicate areas where risks need to be addressed or yield needs to be improved to achieve better overall process quality. Furthermore, the graph enables a visual comparison and evaluation of different combinations of risk impact and yield, allowing decision-makers to identify optimal ranges or thresholds for risk management and yield optimization and to locate the region where the expected process quality is maximized while balancing the associated risks. Overall, the 3D plot provides a comprehensive understanding of the relationship between risk impact, yield and expected process quality, enabling organizations to make informed decisions and take the necessary actions to improve their manufacturing processes and achieve higher levels of quality.
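The relationship described above can be reproduced with the short surface-plot sketch below, assuming the EPQ formulation of equation (2); it simply evaluates EPQ = yield × (1 − risk impact) over a grid and plots the resulting surface.

```python
# Sketch of the 3D relationship in Figure 7, assuming EPQ = yield * (1 - risk).
# Axes follow the description in the text (x: risk impact, y: yield, z: EPQ).

import numpy as np
import matplotlib.pyplot as plt

risk = np.linspace(0.0, 1.0, 50)       # x-axis: risk impact
yld = np.linspace(0.5, 1.0, 50)        # y-axis: targeted yield (fraction)
R, Y = np.meshgrid(risk, yld)
EPQ = Y * (1.0 - R)                    # z-axis: expected process quality

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(R, Y, EPQ, cmap="viridis")
ax.set_xlabel("Risk impact")
ax.set_ylabel("Yield")
ax.set_zlabel("Expected process quality")
plt.show()
```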
The graph shown in Figure 8 depicts the process quality with and without the consideration of risk impact. By comparing the two lines (coloured blue and orange) representing the two process quality scenarios, insights into the influence of risk on the overall process quality of the workshops are gained. The blue line in Figure 8, representing process quality without considering risk impact, provides a baseline measure of the inherent quality levels of the manufacturing processes for the workshops (namely, WS1, WS2, WS3, WS4 and WS5). This represents the ideal scenario in which risks are not taken into account and the processes operate optimally, showing how the process quality varies in adherence to specifications. In contrast, the orange line, representing process quality with risk impact, takes into consideration the potential negative consequences or risks associated with the process. This includes risk factors, such as layer thickness, contamination and humidity, that could impact the quality of the final product. By incorporating risk impact, the graph showcases a more realistic view of the process quality, accounting for potential deviations from the desired outcomes. Overall, the graph provides a visual representation of the relationship between process quality and risk impact, allowing organizations to assess the impact of risks on their manufacturing processes and to prioritize efforts to enhance overall quality by effectively managing and minimizing risks.
The graph shown in Figure 9 depicts the risk impact of the various risk factors in the five workshops. By examining the graph, insights can be obtained into the relative significance of each risk factor across the workshops and their potential impact on the manufacturing processes. The graph showcases the variation in risk impact values for each risk factor, with the workshops represented along the horizontal axis and the risk impact scale on the vertical axis. Interpreting the graph allows one to identify the risk factors that have the most significant impact on the manufacturing processes in each workshop. For example, as illustrated in Figure 6, WS4 (Workshop 4) needs immediate risk improvement strategies; as depicted in Figure 9, the risk factor w4 (designing skill), followed by w5 (operating skill) and w1 (layer thickness), has a high risk impact and should be given the necessary attention in this workshop. Overall, the graph provides a visual representation of the risk impact of different factors across multiple workshops. This information helps in prioritizing risk management efforts, allocating resources effectively and developing targeted strategies to mitigate the identified risks, enhancing the overall resilience and robustness of the manufacturing processes. The proposed approach is validated with the help of a sensitivity analysis in the following sub-section.
Sensitivity analysis
A sensitivity analysis is conducted to assess the robustness and validity of the proposed approach. The weights of the risk factors are varied and their effect on the EPQ of the workshops is studied. The approach introduced by Keshavarz Ghorabaee et al. (2018) is used to generate the weight sets for the risk factors. Following this approach, the weights of the risk factors are systematically altered between two extreme values within each set. The generated weights are documented in Table 9 and illustrated in Figure 10. The EPQ of each workshop is calculated for every set of weights and the workshops are ranked accordingly. The ranking results, reflecting the impact of the varying weight assignments, are presented in Figure 11. The analysis shows that the ranking outcome remains consistent under the varying weights, affirming the validity and reliability of the proposed approach.
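As a rough sketch of this procedure (not the authors' implementation), the snippet below recomputes the EPQ of the five workshops under two of the generated weight sets of Table 9 and ranks them. The severity–occurrence products are placeholder values, since the full PFMEA data of all five workshops are not reproduced here; only the targeted yields and the weight sets follow Tables 8 and 9.

```python
import numpy as np

# Targeted yields of the five workshops, from Table 8
targeted_yield = np.array([97.5, 96.0, 95.5, 99.0, 95.0])

# Crisp (severity x occurrence) products per workshop and risk factor.
# Placeholder values: the full PFMEA sheets are not reproduced in the paper.
so_product = np.random.default_rng(42).uniform(1, 80, size=(5, 9))

# Two of the generated weight sets from Table 9
weight_sets = [
    np.array([0.022, 0.044, 0.066, 0.088, 0.111, 0.133, 0.155, 0.177, 0.200]),
    np.array([0.044, 0.066, 0.088, 0.111, 0.133, 0.155, 0.177, 0.200, 0.022]),
]

for s, w in enumerate(weight_sets, start=1):
    rifm = so_product @ w                       # risk impact of each workshop's failure mode
    norm_rifm = rifm / rifm.sum()               # normalized RIFM across workshops
    epq = (1.0 - norm_rifm) * targeted_yield    # expected process quality (%)
    ranking = np.argsort(-epq) + 1              # workshop numbers, best EPQ first
    print(f"Set {s}: EPQ = {np.round(epq, 2)}, ranking = {ranking}")
```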
Theoretical implications
In this research work, the proposed methodology for risk-based Six Sigma quality assessment is simple to understand and apply, avoids unwarranted mathematical complexity and has not been attempted in the past. Efforts are made to identify the risk factors that have a direct bearing on the O and S of the process FMs while estimating the RPN. The relation between the identified risk factors and Six Sigma quality, expressed in terms of DPMO and sigma level, is then established to understand the impact of each factor on process quality. This work adds a new dimension to traditional process Six Sigma quality assessment, which otherwise reports an overestimated quality performance. It develops a practical concept in line with the quality principles advocated in ISO 9001:2015 and can be readily integrated with the concepts of Quality 4.0. The adoption of MCDM-based fuzzy PFMEA for estimating the factor weights provides a simple yet precise method that can be used in the absence of quantitative data, for instance during the design of new products and services.
Managerial implications
The proposed methodology has been demonstrated with a case study of the FFF process. Designers, manufacturers and process engineers can use the developed concept and methodology to identify and estimate the impacts of the various risk factors on the process FMs with greater clarity, broadening all stakeholders' understanding of the process. Further, they can use the weights of the risk factors to evaluate process quality through a metric that integrates risk with the conventional DPMO and sigma measures, as devised in the proposed methodology. This gives managers a more realistic quality metric rather than an overestimated one, enabling industry to allocate operational and other managerial resources in a more scientific manner. An improved risk-based Six Sigma quality metric also helps in anticipating future defects in the product, in assessing process capability during manufacturing and in making accurate decisions while formulating warranty policies.
7. Conclusion
This paper presents a modified approach that extracts the weights of risk factors from an integrated MCDM–PFMEA approach and proposes a methodology to assess the impact of risks on Six Sigma process quality evaluation. By overcoming the limitations of traditional PFMEA, this approach offers a novel way to analyse risks and establish a relationship between risk factors and process quality metrics. The proposed approach aligns with the requirement of Clause 6.1 of the ISO 9001:2015 standard for risk-based quality assessment and holds potential to work under a Quality 4.0 scenario. The case study on FFF serves as a practical demonstration of the proposed methodology's applicability.
The 3D plot presented in the results and discussion section provides a clear understanding of the relationship between risk impact, yield and EPQ, enabling organizations to make informed decisions and implement the appropriate actions to enhance manufacturing processes and attain a higher process sigma level. The comparative analysis shows that the identified risks have a significant negative influence on the overall quality of the process. The results indicate that the modified approach effectively identifies risks and facilitates quality improvement efforts, even in scenarios where numerical data are limited or unavailable in adequate amounts.
In this paper, the inputs obtained from the extensive literature review and from the experts are rigorous. The experts involved are specialists in the field with more than eight years of hands-on experience, which lends confidence to their judgements. In addition, fuzzy techniques have been used to accommodate the subjectivity in the experts' judgements. Further, the proposed approach objectively quantifies the relationship between risk and Six Sigma. By adopting this approach, organizations can enhance their risk identification and assessment processes, prioritize resource allocation and make informed decisions to mitigate risks and improve overall process quality. Furthermore, the integration of fuzzy MCDM with PFMEA enables a comprehensive evaluation of risks and their impact on process performance.
The proposed approach uses a fuzzy MCDM-based method for assessing the weights of the risk factors. The research community may explore advanced techniques for acquiring more precise risk factor weights in such qualitative analyses. This study considers a single decision period, potentially overlooking changes in criteria weights or preferences over time, and the assumption of uniform linguistic term sets may oversimplify decision-makers' preferences. Future research could collect data from multiple periods for dynamic MCDM and accommodate linguistic sets of varying granularity.
Future research can also focus on expanding the application of this approach to different industries and processes, exploring additional criteria for risk assessment and refining the weighting methodology to enhance accuracy and reliability.
Figure 1.Proposed framework
Figure 2.Network representation of the model
Figure 3.Value–risk matrix
Figure 4.Sub-division of the entire FFF system
Figure 5.Process flow chart of FFF process
Figure 6.Value–risk matrix for the selected case study
Figure 7.(x) risk impact vs (y) process yield vs (z) expected process quality
Figure 8.Comparison of process quality considering with and without risk impact
Figure 9.Impact of each risk factor in the five workshops
Figure 10.Graphical view of the generated weights
Figure 11.Ranking of alternatives’ EPQ with respect to the generated varying weights
Figure A1.Analytic network process diagram
Table 1.
Experts' details
| Sl. no. | Designation | Qualification | Experience (years) |
|---|---|---|---|
| 1. | Manufacturing engineer | Master's in engineering | 12 |
| 2. | Design engineer | Master's in engineering | 9 |
| 3. | Quality control inspector | Bachelor's degree in engineering | 8 |
| 4. | Production personnel | Diploma in engineering | 9 |
| 5. | Customer representative | Graduate | 10 |
Source:Authors’ own work
Table 2.
Fuzzy-based rating for severity
| Severity | Fuzzy no. |
|---|---|
| Frequent | (7,9,9) |
| Probable | (5,7,9) |
| Occasional | (3,5,7) |
| Remote | (1,3,5) |
| Extremely unlikely | (1,1,1) |
Source:Authors’ own work
Table 3.
Fuzzy-based rating for occurrence
| Occurrence | Probability of failure range | Fuzzy no. | DPMO range | Yield range (%) |
|---|---|---|---|---|
| Frequent | p >= 0.20 | (7,9,9) | > 200,000 | < 80 |
| Probable | 0.10 <= p < 0.20 | (5,7,9) | 100,000–199,999 | 80–90 |
| Occasional | 0.01 <= p < 0.10 | (3,5,7) | 10,000–99,999 | 90–99 |
| Remote | 0.001 <= p < 0.01 | (1,3,5) | 1,000–9,999 | 99–99.9 |
| Extremely unlikely | p < 0.001 | (1,1,1) | < 1,000 | > 99.9 |
Source:Authors’ own work
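To make the mapping of Table 3 operational, the hypothetical helper below classifies a probability of failure into its occurrence category, triangular fuzzy number and DPMO; the thresholds come directly from the table, while the function itself is an illustrative construct rather than part of the published methodology.

```python
# Occurrence classification per Table 3 (thresholds from the table;
# the helper itself is an illustrative construct, not the authors' code)
OCCURRENCE_SCALE = [
    (0.20,  "Frequent",           (7, 9, 9)),
    (0.10,  "Probable",           (5, 7, 9)),
    (0.01,  "Occasional",         (3, 5, 7)),
    (0.001, "Remote",             (1, 3, 5)),
    (0.0,   "Extremely unlikely", (1, 1, 1)),
]

def classify_occurrence(p_failure: float):
    """Return (category, triangular fuzzy number, DPMO) for a failure probability."""
    dpmo = p_failure * 1_000_000
    for threshold, label, tfn in OCCURRENCE_SCALE:
        if p_failure >= threshold:
            return label, tfn, dpmo

print(classify_occurrence(0.025))   # ('Occasional', (3, 5, 7), 25000.0)
```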
Table 4.
Weights obtained from fuzzy ANP
| Factors | Weights from fuzzy ANP |
|---|---|
| Layer thickness (w1) | 0.391 |
| Printing temperature (w2) | 0.130 |
| Flow rate (w3) | 0.047 |
| Designing skill (w4) | 0.154 |
| Operating skill (w5) | 0.178 |
| Filament quality (w6) | 0.023 |
| Filament thermal characteristics (w7) | 0.047 |
| Contamination (w8) | 0.011 |
| Humidity (w9) | 0.014 |
Source: Authors’ own work
Table 5.
PFMEA: the process step with highest risk impact estimate
| Sl. no. | Process step | Function | Failure mode | Causes of failure | Effect of failure |
|---|---|---|---|---|---|
| 1. | CAD model design and STL file conversion | Creation of the model by design software/scanner. The STL file format is used to encode the surface geometry of a three-dimensional object using a method called tessellation or triangular approximation | Dimension is out of tolerance | 1. Poor designing skill of the designer | 1. Part failure |
Source:Authors’ own work
Table 6.
Risk impact of key FMs of five workshops
| Workshops (WS) | Risk impact of the FMs |
|---|---|
| 1 | 33.42 |
| 2 | 21.17 |
| 3 | 21.12 |
| 4 | 31.66 |
| 5 | 38.35 |
Source: Authors’ own work
Table 7.
Sample calculation for the risk impact (RI) of the failure mode (Workshop 1)
| Risk factors | Severity | Occurrence | Crisp S | Crisp O | Factor weightage | RI of risk factors | RI of failure mode |
|---|---|---|---|---|---|---|---|
| w1 | 1,3,5 | 7,9,9 | 3 | 8.5 | 0.391 | 9.99 | 33.42 |
| w2 | 7,9,9 | 5,7,9 | 8.5 | 7 | 0.130 | 7.77 | |
| w3 | 5,7,9 | 3,5,7 | 7 | 5 | 0.047 | 1.66 | |
| w4 | 3,5,7 | 7,9,9 | 5 | 8.5 | 0.154 | 6.56 | |
| w5 | 7,9,9 | 1,3,5 | 8.5 | 3 | 0.178 | 4.54 | |
| w6 | 3,5,7 | 1,1,1 | 5 | 1 | 0.023 | 0.118 | |
| w7 | 7,9,9 | 3,5,7 | 8.5 | 5 | 0.047 | 2.019 | |
| w8 | 1,3,5 | 5,7,9 | 3 | 7 | 0.011 | 0.249 | |
| w9 | 3,5,7 | 5,7,9 | 5 | 7 | 0.014 | 0.498 |
Source:Authors’ own work
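The sample calculation of Table 7 can be reconstructed as follows. The crisp values in the table are consistent with the weighted-average defuzzification (a + 2b + c)/4 of the triangular fuzzy numbers, and each factor's risk impact with weight × crisp S × crisp O, summed to give the failure mode's risk impact; this is a reconstruction inferred from the tabulated numbers, not the authors' code, and small rounding differences remain.

```python
def defuzzify(tfn):
    """Weighted-average defuzzification (a + 2b + c)/4, consistent with the crisp values of Table 7."""
    a, b, c = tfn
    return (a + 2 * b + c) / 4.0

# (severity TFN, occurrence TFN, fuzzy-ANP weight) for w1..w9, Workshop 1 (Table 7)
factors = [
    ((1, 3, 5), (7, 9, 9), 0.391),   # w1 layer thickness
    ((7, 9, 9), (5, 7, 9), 0.130),   # w2 printing temperature
    ((5, 7, 9), (3, 5, 7), 0.047),   # w3 flow rate
    ((3, 5, 7), (7, 9, 9), 0.154),   # w4 designing skill
    ((7, 9, 9), (1, 3, 5), 0.178),   # w5 operating skill
    ((3, 5, 7), (1, 1, 1), 0.023),   # w6 filament quality
    ((7, 9, 9), (3, 5, 7), 0.047),   # w7 filament thermal characteristics
    ((1, 3, 5), (5, 7, 9), 0.011),   # w8 contamination
    ((3, 5, 7), (5, 7, 9), 0.014),   # w9 humidity
]

ri_factors = [w * defuzzify(s) * defuzzify(o) for s, o, w in factors]
ri_failure_mode = sum(ri_factors)

print([round(ri, 2) for ri in ri_factors])   # close to the per-factor RI column of Table 7
print(round(ri_failure_mode, 2))             # ~33.3, close to the 33.42 reported in Table 6
```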
Table 8.
Calculated EPQ value
| Workshops | Risk impact of failure modes (RIFM) | Normalized RIFM value | Targeted yield (%) (without risk) | Sigma level (without risk) | DPMO (without risk) | Expected process quality (EPQ, %) (with risk) | Expected DPMO (with risk) | Expected sigma level (with risk) |
|---|---|---|---|---|---|---|---|---|
| 1 | 33.42 | 0.229 | 97.5 | 3.46 | 25,000 | 75.17 | 248,300 | 2.17 |
| 2 | 21.17 | 0.145 | 96 | 3.25 | 40,000 | 82.08 | 179,200 | 2.42 |
| 3 | 21.12 | 0.144 | 95.5 | 3.20 | 45,000 | 81.74 | 182,600 | 2.40 |
| 4 | 31.66 | 0.217 | 99 | 3.82 | 10,000 | 77.51 | 224,900 | 2.25 |
| 5 | 38.35 | 0.263 | 95 | 3.15 | 50,000 | 70.01 | 299,900 | 2.00 |
Notes: Sample calculation of expected process quality (EPQ), expected DPMO and expected sigma level (Workshop 1):
EPQ = (1 − normalized RIFM) × YT = (1 − 0.229) × 97.5 = 75.17;
expected DPMO = 1,000,000 × (1 − 75.17/100) = 248,300
Source: Authors’ own work
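The note above can be followed end to end in a few lines. The sigma levels reported in Table 8 are consistent with the widely used Schmidt–Launsby approximation, sigma ≈ 0.8406 + √(29.37 − 2.221 ln DPMO); applying that approximation here is our assumption, as the paper does not state its exact conversion.

```python
import math

def sigma_level(dpmo: float) -> float:
    """Approximate long-term sigma level from DPMO (Schmidt-Launsby approximation)."""
    return 0.8406 + math.sqrt(29.37 - 2.221 * math.log(dpmo))

# Workshop 1 (Table 8): normalized RIFM = 0.229, targeted yield = 97.5 %
norm_rifm, targeted_yield = 0.229, 97.5

epq = (1 - norm_rifm) * targeted_yield          # expected process quality (%)
expected_dpmo = 1_000_000 * (1 - epq / 100)     # expected defects per million opportunities

print(round(epq, 2))                            # 75.17
print(round(expected_dpmo))                     # 248275 (Table 8 reports 248,300 from the rounded EPQ)
print(round(sigma_level(expected_dpmo), 2))     # ~2.17, the expected sigma level with risk
print(round(sigma_level(25_000), 2))            # ~3.46, the sigma level without risk
```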
Table 9.
Generated weights for ranking
| Set no. | w1 | w2 | w3 | w4 | w5 | w6 | w7 | w8 | w9 |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.022 | 0.044 | 0.066 | 0.088 | 0.111 | 0.133 | 0.155 | 0.177 | 0.2 |
| 2 | 0.044 | 0.066 | 0.088 | 0.111 | 0.133 | 0.155 | 0.177 | 0.2 | 0.022 |
| 3 | 0.066 | 0.088 | 0.111 | 0.133 | 0.155 | 0.177 | 0.2 | 0.022 | 0.044 |
| 4 | 0.088 | 0.111 | 0.133 | 0.155 | 0.177 | 0.2 | 0.022 | 0.044 | 0.066 |
| 5 | 0.111 | 0.133 | 0.155 | 0.177 | 0.2 | 0.022 | 0.044 | 0.066 | 0.088 |
| 6 | 0.133 | 0.155 | 0.177 | 0.2 | 0.022 | 0.044 | 0.066 | 0.088 | 0.111 |
| 7 | 0.155 | 0.177 | 0.2 | 0.022 | 0.044 | 0.066 | 0.088 | 0.111 | 0.133 |
| 8 | 0.177 | 0.2 | 0.022 | 0.044 | 0.066 | 0.088 | 0.111 | 0.133 | 0.155 |
| 9 | 0.2 | 0.022 | 0.044 | 0.066 | 0.088 | 0.111 | 0.133 | 0.155 | 0.177 |
Source:Authors’ own work
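The weight sets of Table 9 appear to be cyclic shifts of weights proportional to j/45 (j = 1, …, 9), in line with the generation scheme of Keshavarz Ghorabaee et al. (2018); the short sketch below regenerates them under that reading of the table, which is an assumption rather than a statement of the original procedure.

```python
import numpy as np

n = 9
base = np.arange(1, n + 1) / (n * (n + 1) / 2)   # 1/45, 2/45, ..., 9/45

# Each set is a cyclic shift of the base vector, as the rows of Table 9 suggest
weight_sets = np.array([np.roll(base, -k) for k in range(n)])

for k, w in enumerate(weight_sets, start=1):
    print(f"Set {k}:", np.round(w, 3))
```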
Table A1.
Excerpts of literature survey for the identification of FM and risk factors
| FM/risk factors | Sources |
|---|---|
| Dimension is out of tolerance (dimensional inaccuracy) – FM | Bähr and Westkämper (2018) |
| Layer thickness (w1) | Bähr and Westkämper (2018); Wichniarek et al. (2021) |
| Printing temperature (w2) | Bähr and Westkämper (2018); Bellini et al. (2004); Wichniarek et al. (2021) |
| Flow rate (w3) | Bellini et al. (2004) |
| Designing skill (w4) | Song and Telenko (2019) |
| Operating skill (w5) | Song and Telenko (2019) |
| Filament quality (w6) | Bähr and Westkämper (2018); Song and Telenko (2019) |
| Filament thermal characteristics (w7) | Bähr and Westkämper (2018); Bellini et al. (2004) |
| Contamination (w8) | Tang and Seeger (2022) |
| Humidity (w9) | Wichniarek et al. (2021) |
Source:Authors’ own work
Table A2.
Triangular fuzzy conversion scale
| Linguistic scale | Triangular fuzzy numbers | The inverse of triangular fuzzy numbers |
|---|---|---|
| Equal importance | (1,1,1) | (1,1,1) |
| Moderate importance | (1,3,5) | (1 / 5,1 / 3,1) |
| Strong importance | (3,5,7) | (1 / 7,1 / 5,1 / 3) |
| Very strong importance | (5,7,9) | (1 / 9,1 / 7,1 / 5) |
| Demonstrated importance | (7,9,9) | (1 / 9,1 / 9,1 / 7) |
Source:Authors’ own work
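When assembling the fuzzy pairwise comparison matrices (such as Table A3), the conversion scale of Table A2 can be encoded as a simple lookup; the dictionary below mirrors the table, while the reciprocal helper and the example judgement are illustrative.

```python
from fractions import Fraction

# Linguistic scale -> triangular fuzzy number (Table A2)
FUZZY_SCALE = {
    "Equal importance":        (1, 1, 1),
    "Moderate importance":     (1, 3, 5),
    "Strong importance":       (3, 5, 7),
    "Very strong importance":  (5, 7, 9),
    "Demonstrated importance": (7, 9, 9),
}

def reciprocal(tfn):
    """Reciprocal of a triangular fuzzy number: (a, b, c) -> (1/c, 1/b, 1/a)."""
    a, b, c = tfn
    return (Fraction(1, c), Fraction(1, b), Fraction(1, a))

# Example: 'Machine' judged of strong importance over 'Human' (first row of Table A3)
machine_vs_human = FUZZY_SCALE["Strong importance"]   # (3, 5, 7)
human_vs_machine = reciprocal(machine_vs_human)       # (1/7, 1/5, 1/3)
print(machine_vs_human, human_vs_machine)
```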
Table A3.
Pairwise comparison matrix for factor categories
| | Machine | Human | Material | Environment |
|---|---|---|---|---|
| Machine | (1,1,1) | (3,5,7) | (5,7,9) | (3,5,7) |
| Human | (1 / 7,1 / 5,1 / 3) | (1,1,1) | (1,3,5) | (3,5,7) |
| Material | (1 / 9,1 / 7,1 / 5) | (1 / 5,1 / 3,1) | (1,1,1) | (1,3,5) |
| Environment | (1 / 7,1 / 5,1 / 3) | (1 / 7,1 / 5,1 / 3) | (1 / 5,1 / 3,1) | (1,1,1) |
Source:Authors’ own work
Table A4.
Importance of factors with respect to the factor categories
| W2 | Machine | Human | Material | Environment |
|---|---|---|---|---|
| Layer thickness | 0.71 | 0 | 0 | 0 |
| Printing temperature | 0.19 | 0 | 0 | 0 |
| Flow rate | 0.092 | 0 | 0 | 0 |
| Designing skill | 0 | 0.5 | 0 | 0 |
| Operating skill | 0 | 0.5 | 0 | 0 |
| Filament quality | 0 | 0 | 0.292 | 0 |
| Filament thermal characteristics | 0 | 0 | 0.707 | 0 |
| Contamination | 0 | 0 | 0 | 0.89 |
| Humidity | 0 | 0 | 0 | 0.1 |
Source:Authors’ own work
Table A5.
The inner dependence matrix of the factor categories
| W3 | Machine | Human | Material | Environment |
|---|---|---|---|---|
| Machine | 0.59 | 0.7 | 0.64 | 0 |
| Human | 0.11 | 0.29 | 0 | 0 |
| Material | 0 | 0 | 0.31 | 0.29 |
| Environment | 0.28 | 0 | 0.04 | 0.71 |
Source:Authors’ own work
Table A6.
The inner dependence matrix of the factors
| W4 | Layer thickness | Printing temperature | Flow rate | Designing skill | Operating skill | Filament quality | Filament thermal characteristics | Contamination | Humidity |
|---|---|---|---|---|---|---|---|---|---|
| Layer thickness | 0.71 | 0 | 0 | 0 | 0.470 | 0 | 0 | 0 | 0 |
| Printing temperature | 0 | 0.71 | 0 | 0 | 0.303 | 0 | 0 | 0 | 0 |
| Flow rate | 0 | 0 | 0.71 | 0 | 0.112 | 0 | 0 | 0 | 0 |
| Designing skill | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Operating skill | 0.29 | 0.29 | 0.29 | 0 | 0.114 | 0 | 0 | 0 | 0 |
| Filament quality | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Filament thermal characteristics | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.71 |
| Contamination | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Humidity | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.29 |
Source:Authors’ own work
Table A7.
Factors’ weights obtained using fuzzy ANP
| Factors | Weights from fuzzy ANP |
|---|---|
| Layer thickness | 0.391 |
| Printing temperature | 0.130 |
| Flow rate | 0.047 |
| Designing skill | 0.154 |
| Operating skill | 0.178 |
| Filament quality | 0.023 |
| Filament thermal characteristics | 0.047 |
| Contamination | 0.011 |
| Humidity | 0.014 |
Source:Authors’ own work
