Balancing the Need for Speed Versus Quality in Product Development

While every organization is different, PLM can help find the balance.

In the engineering world, new product development refers to a holistic framework for bringing new products and associated services to market. This framework includes the use of an integrated network of enterprise platforms, with associated tools and methods, to develop, industrialize and commercialize new products. It typically follows a rigorous stage-gate process, from product ideation, planning and concept design to detailed engineering, testing and validation, manufacturing planning and execution, the start of production, commercialization, and maintenance and support.

Once things are in full speed motion, it is obviously harder to change them, and sometimes even monitor them. (Stock image.)

Product Lifecycle Management, or PLM, refers to the combined operating ecosystem and digital platform used to manage, share and reuse enterprise data along the product development lifecycle and beyond. As business leaders and entrepreneurs focus on delivering new products, they look for ways to improve delivery time, throughput and productivity, customer and employee experience, and perhaps above all, consistently delivering quality.

First and foremost, this means tracking compliance with legislative requirements and quality standards, as well as achieving and maintaining the required certification levels. This is perhaps not a new topic, but it is still a relevant one: how do companies balance the need for speed, to deliver products to the market quickly, with the need for product quality traceability? How do PLM platforms contribute to enabling product development and achieving this balance? Beyond the toolset, how does this materialize in terms of the business improvement roadmap?

PLM and the Need for Speed

Speed is critical for any manufacturing start-up: getting concepts out to capture product feedback, planning the next development phase, releasing the next funding tranches, seeking new shareholder investments, etc. For established organizations, the need for speed materializes in improved time to deliver products to the market—that is, how to get a new product out more effectively.

From a PLM perspective, improving time to market consists of a combination of:

  • Reducing physical prototyping, increasing the use of iterative virtual simulation and validation.
  • Leveraging digital mock-up practice to achieve early product validation.
  • Implementing integrated change management and data traceability across engineering change orders and manufacturing change orders.
  • Using systemic and optimized workflows, supporting effective approval, release, issue tracking and change processes.
  • Automating interfaces with ERP, MES and other enterprise platforms.
  • Improving data access and classification for increased re-use.
  • Optimizing data integrity and continuity across teams, fostering better collaboration internally and externally with suppliers.
  • Sharing and learning from product and project data transparency across functions and programs, hence leveraging data reuse and carry-over across products.
  • Focusing on end-to-end process integration and simplification versus more detailed functional capabilities.
  • Reducing and managing complexity, enabling product modularity and bill of material (BOM) configuration across enterprise platforms.
  • Generating data analytics and process-driven product insights to improve decision-making.
  • Leveraging rapid 3D prototyping to accelerate iterative product validation.
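Several of these levers—integrated change management, traceable approvals and release workflows—come down to modeling change objects with explicit states and an audit trail. The following is a minimal, hypothetical sketch of that idea; the class, state and field names are illustrative and not taken from any specific PLM system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ChangeState(Enum):
    DRAFT = "draft"
    IN_REVIEW = "in_review"
    APPROVED = "approved"
    RELEASED = "released"


# Allowed workflow transitions; anything else is rejected.
_TRANSITIONS = {
    ChangeState.DRAFT: {ChangeState.IN_REVIEW},
    ChangeState.IN_REVIEW: {ChangeState.APPROVED, ChangeState.DRAFT},
    ChangeState.APPROVED: {ChangeState.RELEASED},
    ChangeState.RELEASED: set(),
}


@dataclass
class ChangeOrder:
    """An engineering or manufacturing change order with a built-in audit trail."""
    number: str
    affected_parts: list[str]
    state: ChangeState = ChangeState.DRAFT
    # Each entry records (timestamp, actor, new state) for traceability.
    history: list[tuple[str, str, str]] = field(default_factory=list)

    def move_to(self, new_state: ChangeState, actor: str) -> None:
        if new_state not in _TRANSITIONS[self.state]:
            raise ValueError(f"{self.state.value} -> {new_state.value} is not allowed")
        self.state = new_state
        self.history.append(
            (datetime.now(timezone.utc).isoformat(), actor, new_state.value)
        )


eco = ChangeOrder(number="ECO-0042", affected_parts=["PN-1001", "PN-2003"])
eco.move_to(ChangeState.IN_REVIEW, actor="j.doe")
eco.move_to(ChangeState.APPROVED, actor="quality.board")
eco.move_to(ChangeState.RELEASED, actor="release.mgr")
print(eco.state.value)  # released
```

The point of the sketch is that every state change is both validated against the workflow and logged, which is what makes downstream traceability and analytics possible without extra controls.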

When it comes to accelerating product development, horizontal integration includes the need for better inter-functional and enterprise-wide collaboration across the same level of the value chain, beyond the engineering or manufacturing teams. 

Clearly, PLM, ERP and MES platform integrations, with robust upstream-downstream master data definition, contribute to improving the time to deliver products to the market. In industries that rely on mass production, this also includes the need for robust CRM integration. This can be referred to as vertical integration for connectivity across the value chain: ensuring data continuity across the entire lifecycle and into in-service operations.

Both improved horizontal and vertical integration within the product development ecosystem contribute to improved time to market. This does not have to translate into more controls or tighter processes, but rather more data accessibility and transparency, powered by holistic (cross-functional) analytics to better inform decision-making. This implies keeping processes simple and looking at ways to use out-of-the-box tools to achieve this, while remaining flexible and open to change.

PLM and the Need for Quality

Product quality and associated quality management expectations are on everyone’s agenda. For start-up organizations, product quality translates into setting the relevant standards and aligning the right control mechanisms to comply with market legislation and certification requirements. As businesses mature, quality requirements widen to more enterprise operations.

From a PLM perspective, quality management is not bound to a single business function. It is about data consistency, from quality planning and assurance to quality control and improvement, across the data creation process:

  • Setting the right quality targets, policies, ISO and other standards.
  • Tracking and change-managing product requirements throughout, including customer requirements.
  • Tightening the change process and controls as the product matures.
  • Tracking risks, issues and defects across the product lifecycle, including rework and duplication.
  • Managing complaints and full audit trails as part of the same ecosystem.
  • Connecting holistic data sets across engineering, manufacturing and service operations.
  • Tracking non-conformance reports (NCRs) from both internal and external teams.
  • Managing corrective action plans, corrective and preventive actions (CAPAs) and supplier corrective action requests (SCARs).
  • Performing design and process failure mode and effects analyses (FMEAs) and tracking identified risks and actions for improvement.
  • Aligning working practices and knowledge across functions and teams, leveraging incentives to improve training and certification.
  • Defining cross-functional standards and managing adherence to standards and ongoing compliance governance toward “right-first-time” outputs.
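One of these items, FMEA, reduces to a simple worked computation: each failure mode is scored for severity, occurrence and detection, and the product of the three—the risk priority number (RPN)—ranks where to act first. A minimal illustrative sketch follows; the failure modes and scores are invented for the example:

```python
from dataclasses import dataclass


@dataclass
class FailureMode:
    """One row of a design or process FMEA worksheet."""
    description: str
    severity: int      # 1 (no effect) .. 10 (hazardous)
    occurrence: int    # 1 (remote) .. 10 (very high)
    detection: int     # 1 (almost certain to detect) .. 10 (undetectable)

    def rpn(self) -> int:
        # Risk priority number: the classic FMEA ranking metric.
        return self.severity * self.occurrence * self.detection


modes = [
    FailureMode("Connector corrosion", severity=7, occurrence=4, detection=6),
    FailureMode("Seal leakage", severity=8, occurrence=3, detection=3),
    FailureMode("Label misprint", severity=2, occurrence=5, detection=2),
]

# Rank failure modes so the highest-risk items are addressed first.
for m in sorted(modes, key=lambda m: m.rpn(), reverse=True):
    print(f"{m.description}: RPN={m.rpn()}")
```

Keeping these scores as structured data inside the PLM ecosystem, rather than in stand-alone spreadsheets, is what allows identified risks and actions to be tracked alongside the affected parts and processes.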

Each industry has its quality standards and verification processes based on regulatory and competitive requirements. Long gone are the days when quality management systems were stand-alone; quality is now part of all operations and embedded into enterprise platforms. Integrating and making visible quality-related information at every stage of the product lifecycle can help identify and resolve issues earlier and ensure that the relevant countermeasures are implemented in the product and process specifications to avoid further occurrences (feedback control loops).

From a PLM standpoint (the discipline, not only the tools), integrating data quality and process quality directly into the product innovation ecosystem means making quality the systemic responsibility of everyone in a comprehensive way. This can manifest, for example, in how an organization manages supplier deliverables and data, including quality control and issue resolution within the integrated product and each relevant component—also referred to as part of model-based systems engineering (MBSE) methodologies.

Balancing Speed and Quality: Improving Enterprise Data Governance

Fundamentally, balancing speed and quality is a common dilemma that links back to cost. Speed is about development, input/output, supply chain and production, and getting product out to customers on time, whereas quality is about know-how, capabilities, standards, processes, continuous improvement, and delivering to consistent specifications and legislative requirements.

Such balance will depend on the industry, the organization, the maturity of its operations and its products, its operative model, etc. Hence, businesses will always look to improve speed and quality at once to reduce cost and improve customer (and in turn employee) satisfaction. Driving down the cost of quality and time to market includes improving data governance and traceability.

Switching enterprise capabilities on or off can be “made easy” for small and medium enterprises embarking on the SaaS ladder, though speed to value realization and quality management clearly go beyond PLM system capabilities.

What are your thoughts? Leave your comments below.

Written by

Lionel Grealou

Lionel Grealou, a.k.a. Lio, helps original equipment manufacturers transform, develop, and implement their digital transformation strategies—driving organizational change, data continuity and process improvement, managing the lifecycle of things across enterprise platforms, from PDM to PLM, ERP, MES, PIM, CRM, or BIM. Beyond consulting roles, Lio held leadership positions across industries, with both established OEMs and start-ups, covering the extended innovation lifecycle scope, from research and development, to engineering, discrete and process manufacturing, procurement, finance, supply chain, operations, program management, quality, compliance, marketing, etc.

Lio is an author of the virtual+digital blog (www.virtual-digital.com), sharing insights about the lifecycle of things and all things digital since 2015.