To differentiate it from “classical” FMEA, the result of the collaboration between AIAG (Automotive Industry Action Group) and VDA (Verband der Automobilindustrie) is called the “aligned” Failure Modes and Effects Analysis process. Using a seven-step approach, the aligned analysis incorporates significant work content that has typically been left on the periphery of FMEA training, though it is essential to effective analysis.

In this installment of the “FMEA” series, development of a Design FMEA is presented following the seven-step aligned process. Use of an aligned documentation format, the “Standard DFMEA Form Sheet,” is also demonstrated. In similar fashion to the classical DFMEA presentation of Vol. III, the content of each column of the form will be discussed in succession.

Review of classical FMEA is recommended prior to attempting the aligned process to ensure a baseline understanding of FMEA terminology. It will also make comparisons between the classical and aligned approaches more meaningful and, therefore, more helpful.

The format of the aligned FMEA form differs significantly from that of classical FMEA. It guides the user along the seven-step path, with the information required in each step organized in multiple columns. Color-coding is used to correlate related information in each step. The aligned “Standard DFMEA Form Sheet” (“Form A” in Appendix A of the AIAG/VDA FMEA Handbook) is reproduced in Exhibit 1. For ease of presentation, the form is shown in a stacked format. In use, however, corresponding information should be recorded in a single row.

As has been done in previous installments of the “FMEA” series, portions of the form will be shown in close-up, with reference bubbles correlating to discussion in the text. The goal is to facilitate learning the aligned process by maintaining the links between the seven-step approach, form sections, and individual columns where information is recorded.

Conducting an ‘Aligned’ Design FMEA

Failure Modes and Effects Analysis is conducted in three “stages” – System Analysis, Failure Analysis and Risk Mitigation, and Risk Communication. These three stages encompass the seven-step process mentioned previously. A graphical representation of the relationships between the three stages and seven steps is shown in Exhibit 2. For each step, brief reminders are provided of key information and activities required or suggested. Readers should reference this summary diagram as each step is discussed in this presentation and while conducting an FMEA.

System Analysis

1st Step – Planning & Preparation

Classical FMEA preparations have often been viewed as separate from analysis. Recognizing the criticality of effective planning and preparation, the aligned process from AIAG and VDA formally incorporates them in the seven-step approach to FMEA. This is merely a change in “status,” if you will, for preparatory activities. Thus, the discussion in “Vol. II: Preparing for Analysis” remains valid, though introduction of the tools is dispersed among steps in the aligned approach. Specifically, the appropriate contexts for FMEA remain the three use cases previously defined, briefly described as (1) new design, (2) modified design, or (3) new application. Within the applicable context, the FMEA team must understand the level of analysis required – system, subsystem, or component. The core and extended analysis team members are chosen in the same manner as before.
Whereas classical FMEA defines four customers of concern, the aligned process focuses on two: (1) assembly and manufacturing plants and (2) end users. Defining customers in advance facilitates efficient, thorough analysis, with all foreseeable Effects of Failure identified.

Stated generally, the inputs needed to conduct an effective FMEA have not changed. However, the aligned process provides a framework for organizing the accumulated information into a coherent project plan. Using the Five Ts structure introduced in “FMEA – Vol. V: Alignment,” project information is presented in a consistent manner. The scope of analysis, schedule, team members, documentation requirements, and more are recorded in a standardized format that facilitates FMEA development, reporting, and maintenance.

Key project information is recorded on the DFMEA form in the header section labeled “Planning & Preparation (Step 1),” shown in Exhibit 3. The labels are largely self-explanatory, though some minor differences exist between the aligned and classical forms. One such difference is the definition of the “Subject” (1) of analysis. On this line, identify the design analyzed by name, part number, and “nickname,” if commonly referred to by another identifier. For example, an aircraft flight data recorder is commonly known as a “black box”; some automobiles have similar devices that may use the same moniker. Key Date on the classical form has been replaced by “Start Date” (2) on the aligned form; both refer to the design freeze date. “Confidentiality Level” (3) has been added to the form. Three levels are suggested in the Handbook – “Business Use,” “Proprietary,” and “Confidential” – but no further guidance on their application is provided. Discussions should take place within an organization and with customers to ensure mutually agreeable use of these designations and information security. Refer to the “FMEA Form Header” section of “Vol. III: ‘Classical’ Design Failure Modes and Effects Analysis” for additional guidance on completing this section of the form.

Continuous Improvement

The first two columns in the body of the DFMEA form, shown in Exhibit 4, are not included in the discussion of the seven steps. The first column, “Issue #” (A), is used simply to number entries for easy reference. The purpose of column B, “History/Change Authorization,” is left to users’ interpretation. Presumably, it is used to record management approval of design changes or acceptance of existing risk, though the Handbook offers no guidance on this topic. In whatever fashion an organization chooses to use this portion of the form, its use should be documented in training to ensure consistency and minimize confusion.

2nd Step – Structure Analysis

In the Structure Analysis step, the design described in Step 1 is defined further by decomposing it into systems, subsystems, and components. To visualize the scope of analysis, block diagrams and structure trees are used. Only those elements over which the analysis team or FMEA owner (“Design Responsible”) exerts design control are within the scope of analysis, as identified by the boundaries of the diagrams. Interfaces may be in or out of scope, depending on the system in question. Design responsibility must be clearly defined to prevent oversight of important performance characteristics and controls.
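Where a structure tree is built in software or a spreadsheet, the decomposition and its transfer to the form can be automated. The sketch below is a minimal illustration in Python, using hypothetical element names; it flattens a tree into (Next Higher Level, Focus Element, Next Lower Level) triples, the three Structure Analysis columns discussed below.

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """A node in the DFMEA structure tree: a system, subsystem, or component."""
    name: str
    children: list["Element"] = field(default_factory=list)

def structure_rows(root: Element) -> list[tuple[str, str, str]]:
    """Flatten the tree into (Next Higher Level, Focus Element,
    Next Lower Level) rows -- columns C, D, and E of the form."""
    rows = []
    for focus in root.children:             # each element takes a turn as Focus Element
        for lower in focus.children:        # its children form the Next Lower Level
            rows.append((root.name, focus.name, lower.name))
        rows.extend(structure_rows(focus))  # recurse through deeper levels
    return rows

# Hypothetical decomposition (illustrative names only)
system = Element("Window Lifter System", [
    Element("Motor Assembly", [Element("Brush Card"), Element("Armature")]),
    Element("Drive Mechanism", [Element("Cable Drum"), Element("Guide Rail")]),
])
for row in structure_rows(system):
    print(row)
```

Each printed triple corresponds to one row of the Structure Analysis section, entered left to right, then down.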
Five “primary” interface types are delineated in the Handbook: “physical connection,” “material exchange,” “energy transfer,” “data exchange,” and “human-machine.” A sixth type, “physical clearance,” is mentioned separately, though it is of equal importance to the five primary types. In addition to the type, interface analysis must also define the strength and nature (e.g. positive/negative, advantageous/detrimental, etc.) of each interface, whether internal or external to the system. Proper analysis and development of adequate controls require a full complement of information.

Structure Analysis is recorded on the DFMEA form in the section displayed in Exhibit 5. In column C, the “highest level of integration” is identified as the “Next Higher Level.” This is the highest-level system within the scope of analysis, where the Effects of Failure will be noticed. The “Focus Element,” named in column D, is the item in the failure chain for which Failure Modes, the technical descriptions of failures, are identified. In column E, name the “Next Lower Level” of the structure hierarchy, where Causes of Failure will be found.

Creating a structure tree as part of the preparation for Structure Analysis is particularly helpful in populating the DFMEA form. As can be seen in the example in Exhibit 6, the organization of information lends itself to direct transfer to the relevant section of the standard form. Use of this tool can accelerate analysis and ensure a comprehensive assessment of components. Enter information in the DFMEA form proceeding across (left to right), then down, to accommodate multiple Failure Modes of a single Focus Element.

3rd Step – Function Analysis

Function Analysis is the next level in the FMEA information hierarchy; it closely parallels Structure Analysis. As seen in Exhibit 7, the Function Analysis columns in the DFMEA form are numbered, labeled, and color-coded to create visual links to those in the Structure Analysis section. In this section, the functions and requirements of the Focus Element and adjacent levels are recorded in the same pattern used in Step 2. A Focus Element may serve multiple functions and have multiple requirements; each should be considered separately and recorded in its own row on the form to facilitate thorough analysis.

Like the structure tree, the function tree allows information to be transferred directly to the DFMEA form. The function tree is constructed in the same manner as the structure tree in Step 2, replacing visual representations of components with descriptions of their contributions to the operation of the system. An example function tree and information transfer to the DFMEA form are shown in Exhibit 8.

Developing a parameter diagram can help organize analysis information and visualize influences on the system in operation. Inputs and outputs, noise and control factors, functions, and requirements can be concisely presented in this format. An example parameter diagram (P-Diagram) is shown in “Vol. II: Preparing for Analysis”; another, from the AIAG/VDA FMEA Handbook, is shown in Exhibit 9.

Failure Analysis and Risk Mitigation

4th Step – Failure Analysis

Failure Analysis is the heart of an FMEA, where a system’s failure network is established. The links between Failure Modes, Effects of Failure, and Causes of Failure, in multiple levels of analysis, are made clear in this step. The definitions of Failure Mode, Effect of Failure, and Cause of Failure remain, in practice, the same as in classical FMEA. Briefly, these are:
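- Failure Mode – the manner in which the Focus Element could fail to perform its intended function; the technical description of the failure.
- Effect of Failure – the consequence of a Failure Mode as perceived at the next higher level, by the end user, or in the assembly and manufacturing plants.
- Cause of Failure – the reason the Failure Mode could occur, typically found at the next lower level; an indication of a design weakness.

Each Effect–Mode–Cause triple constitutes a failure chain. To make the relationships concrete, the minimal Python sketch below, using hypothetical failure descriptions, models a failure chain and the level-to-level re-framing described in the next paragraph:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FailureChain:
    """One row of the Failure Analysis section: Effect <- Mode <- Cause."""
    effect: str  # Effect of Failure, perceived at the next higher level
    mode: str    # Failure Mode of the Focus Element
    cause: str   # Cause of Failure, found at the next lower level

def step_down(chain: FailureChain, lower_cause: str) -> FailureChain:
    """Re-frame a failure chain one level lower in the failure network:
    the Mode becomes the Effect, the Cause becomes the Mode, and a new
    Cause is sought at the next lower level."""
    return FailureChain(effect=chain.mode, mode=chain.cause, cause=lower_cause)

# Hypothetical system-level chain (illustrative names only)
system_chain = FailureChain(
    effect="Window does not close",
    mode="Motor assembly delivers no torque",
    cause="Open circuit in brush card",
)
# The same failure, re-framed for the subsystem-level analysis:
subsystem_chain = step_down(system_chain, lower_cause="Cracked solder joint")
print(subsystem_chain)
```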
Each element is not limited to a single failure chain. The occurrence of a Failure Mode may be perceived in several ways (Effects of Failure) or have multiple potential causes. A Cause of Failure may also result in several potential Failure Modes, and so on. Considering failure chains at multiple levels of analysis within a system defines the system’s failure network. A Failure Mode at the highest level of integration becomes an Effect of Failure when analyzing the next lower level. Similarly, a Cause of Failure is the Failure Mode in the next lower level of analysis. This transformation, presented in Exhibit 11, continues through all levels of subsystem analysis to the component-level FMEAs.

Each failure chain is documented in a single row of the DFMEA form, in the Failure Analysis section, shown in Exhibit 12. The form maintains the relationships within the failure chain, as shown in Exhibit 10 and discussed above. It also maintains the numbering and color-coding convention that links Failure Analysis to Function Analysis (Step 3) and Structure Analysis (Step 2), as shown in Exhibit 13. As was the case for the previous two steps, information can be transferred directly from the structure tree, if it was created in advance, to the DFMEA form. The Failure Analysis information transfer is shown in Exhibit 14.

To complete the Failure Analysis step, the Effect of Failure must be evaluated and assigned a Severity (S) score. To do this, consult the Severity criteria table, shown in Exhibit 15, and select the Severity score that corresponds to the applicable effect and criteria description. Enter this number on the DFMEA form in the column labeled “Severity (S) of FE.” If there is uncertainty or disagreement about the appropriate Severity score to assign an Effect of Failure (e.g. “Is it a 3 or a 4?”), select the highest score being considered to ensure that the issue receives sufficient attention as development proceeds. Confidence in evaluations and scores typically increases as a design matures; concordance often follows.

The last column of the “DFMEA Severity Criteria Table” (Exhibit 15) is left blank in the Handbook because it is not universally applicable. An organization can record its own examples of Effects of Failure to be used as comparative references when conducting a new FMEA or to aid in training. Examples cited for various Severity scores can provide great insight into an organization’s understanding, or lack thereof, of the customer perspective.

5th Step – Risk Analysis

The design team conducts Risk Analysis to identify the controls used to prevent or detect failures, evaluate their effectiveness, and prioritize improvement activities. Relevant information is recorded in the Risk Analysis section of the DFMEA form, shown in Exhibit 16. In column F, record the Design Prevention Controls incorporated to preclude activation of the failure chain. A vast array of prevention controls is available; however, it is likely that a very limited selection is applicable to any single Cause of Failure. Cite only those expected to prevent the specific Cause of Failure with which they are correlated (i.e. in the same row on the form). Though not an exhaustive list of Design Prevention Controls, some examples follow:
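- adherence to proven design standards and best practices
- reuse of designs validated in similar applications (carryover designs)
- material and component specifications, including derating
- analytical studies, such as simulation and tolerance analysis

Once prevention controls are recorded, evaluate the likelihood of each Cause of Failure, taking the prevention controls into account, and enter an Occurrence (O) score in column G. The qualitative DFMEA Occurrence criteria table, shown in Exhibit 17, guides this evaluation.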
If sufficient data is available to make reasonable predictions, a quantitative method of scoring can be used. For this, substitute the alternate DFMEA Occurrence criteria table, shown in Exhibit 18, for the table of Exhibit 17. The detailed criteria descriptions are unchanged; the “incidents per 1000” estimates simply replace the qualitative summary terms (high, low, etc.). If the frequency of occurrence of a Cause of Failure “falls between” two Occurrence scores, or there is disagreement about the correct frequency, select the higher Occurrence score. Additional review in subsequent development cycles is likely to resolve the matter more efficiently than extensive debate in early stages. Both DFMEA Occurrence tables have a column left blank for organization-specific examples to be recorded. These can be used as comparative references to facilitate future DFMEA development or as training aids.

In column H of the Risk Analysis section of the DFMEA form, identify the Design Detection Controls in place to warn of the existence of the Failure Mode or Cause of Failure before the design is released to production. Examples include endurance testing, interference analysis, designed experiments, proof testing, and electrical (e.g. “Hi-Pot”) testing. Assess the effectiveness of current detection controls according to the Detection criteria table, shown in Exhibit 19. Select the Detection (D) score that corresponds to the Detection Method Maturity and Opportunity for Detection descriptions that most accurately reflect the state of the controls and enter it in column J. If there is a discrepancy between the Detection Method Maturity and Opportunity for Detection scores, select the higher Detection score. As a design matures, its controls evolve; review in subsequent development cycles is more efficient than dwelling on the matter in early stages. Like the S and O tables, the D criteria table has a column left blank for organization-specific examples. Future DFMEA development and training may benefit from the experience captured here.

Prioritizing improvement activities in aligned FMEA is done very differently than in the classical method. For this purpose, the AIAG/VDA Handbook introduces Action Priority (AP), where a lookup table replaces the RPN calculation of classical FMEA. The AP Table, shown in Exhibit 20, is used to assign a priority to improvement activities for a failure chain. To do this, first locate the row containing the Severity score in the S column. Then, in the O column, find the sub-row containing the Occurrence score and, within it, the sub-row containing the Detection score in the D column. In this sub-row, in the Action Priority (AP) column, one of three priorities is assigned:
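- High (H) – highest priority; the team must identify an appropriate action to improve prevention and/or detection controls, or justify and document why current controls are adequate.
- Medium (M) – the team should identify appropriate actions to improve prevention and/or detection controls or, at the organization’s discretion, justify and document why controls are adequate.
- Low (L) – the team could identify actions to improve prevention or detection controls.

To make the lookup mechanics concrete, a minimal Python sketch follows. The threshold rules in the sketch are illustrative placeholders only, not a reproduction of the Handbook’s table; the authoritative S/O/D-to-AP assignments are those of Exhibit 20.

```python
def action_priority(s: int, o: int, d: int) -> str:
    """Assign Action Priority (H/M/L) from Severity, Occurrence, and
    Detection scores. The threshold rules below are placeholders for
    demonstration; the authoritative assignments are in the Handbook's
    AP table (Exhibit 20)."""
    for score in (s, o, d):
        if not 1 <= score <= 10:
            raise ValueError("S, O, and D must each be between 1 and 10")
    # Placeholder rules, illustrating that Severity dominates the lookup:
    if s >= 9 and (o >= 4 or d >= 7):
        return "H"
    if s >= 7 and o >= 6 and d >= 5:
        return "H"
    if s >= 4 and (o >= 4 or d >= 7):
        return "M"
    return "L"

# Contrast with classical FMEA, which would compute RPN = S * O * D
print(action_priority(s=9, o=5, d=3))  # -> "H" under these placeholder rules
```

Note the design intent the table embodies: unlike the classical RPN product, which weights the three factors equally, Action Priority gives Severity the dominant role, followed by Occurrence, then Detection.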
The final column of the AP Table, “Comments,” is left blank. Organization-specific protocols, historical projects, acceptance authority, or other information can be cited here to assist DFMEA teams in completing analyses.

Column L in Exhibit 16, “Filter Code,” is an optional entry that could be used in various ways, such as sorting, grouping, or flagging entries for review and reporting.
6th Step – Optimization

In the Optimization step, improvement activities are identified to reduce risk, assigned to individuals for implementation, monitored for progress, and evaluated for effectiveness. Relevant information is recorded in the Optimization section of the DFMEA form, shown in Exhibit 21. Preventive Actions (column M) are preferred to Detection Actions (column N); it is far better to eliminate an issue than to develop a better reaction to it. The Handbook asserts that the most effective sequence of implementation is as follows:
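1. Design modifications to eliminate or mitigate the Effect of Failure (reducing Severity)
2. Design modifications to reduce the Occurrence of the Cause of Failure
3. Improved detection controls, increasing the ability to detect the Cause of Failure or Failure Mode before release to production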
The Handbook suggests five possible statuses, to be recorded in column P, for the actions defined:
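- Open – no action has been defined.
- Decision pending (optional) – an action has been defined, but a decision on it has not yet been made.
- Implementation pending (optional) – an action has been decided upon but not yet implemented.
- Completed – the action has been implemented and its effectiveness has been demonstrated and documented.
- Not Implemented – a decision has been made not to implement the action; the associated risk is accepted.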
The remaining columns in the Optimization section of the DFMEA form are, essentially, carryovers from classical FMEA, with predicted AP replacing predicted RPN. Refer to Vol. III for further discussion of these entries.

Risk Communication

7th Step – Results Documentation

Much like Planning and Preparation, Results Documentation consists of tasks that have been performed for, but often considered separate from, classical FMEA. The aligned approach formalizes Risk Communication as an essential component of analysis by incorporating it in the seven-step approach. A significant portion of the content of an FMEA Report, as outlined in the Handbook, is contained in the DFMEA form. For example, the scope of analysis, high-risk failures, action prioritization, status of actions, and planned implementation dates can be culled from the form.

A comparison of the project plan, created in Step 1, with the execution and final status of the analysis may also be a useful component of the report. Future analysis teams may be able to apply lessons learned from deviations revealed in this assessment. AIAG/VDA also suggest that a commitment to review and revise the DFMEA be included in the FMEA Report. The discussion of classical DFMEA treated this as if it were a separate endeavor, providing further evidence of the expanse of AIAG/VDA’s efforts to create a thorough and consistent analysis process. For further discussion on the topic, see “Review and Maintenance of the DFMEA” in FMEA – Vol. III: “Classical” Design Failure Modes and Effects Analysis.

A Design FMEA is a valuable development and communication tool. It ensures that the impacts of a product design on customers and other stakeholders are given proper consideration. It ensures that legal and regulatory requirements are met. It also creates a record of development activity that can be used to refine a product, develop a derivative product, or train others to develop successful products. Practitioners are encouraged to extract maximum value from FMEA and, in the effort, possibly discover a competitive advantage for their organization.

For additional guidance or assistance with Operations challenges, feel free to leave a comment, contact JayWink Solutions, or schedule an appointment.

For a directory of “FMEA” volumes on “The Third Degree,” see Vol. I: Introduction to Failure Modes and Effects Analysis.

References

1. “Potential Failure Mode and Effects Analysis,” 4th ed. Automotive Industry Action Group, 2008.
2. “FMEA Handbook.” Automotive Industry Action Group and VDA QMC, 2019.

Jody W. Phelps, MSc, PMP®, MBA
Principal Consultant
JayWink Solutions, LLC
jody@jaywink.com