Enterprise Data Deep Dive: Making Sense of Engineering and Production Operations

"Data is the new oil," so does this mean that "data deep dives" are the new oil rigs?

“Data is the new oil,” so does this mean that “data deep dives” are the new oil rigs? Clearly not. There is a limit to such ready-made marketer jargon. (Stock photo.)

Every organization thrives on data and analytics to drive operational efficiency and strategic alignment. This includes product design, production data, and product maintenance data, as well as how resources are utilized and optimized across functions. Making sense of enterprise data is making sense of organizational health, meaning: 

  • How well it runs
  • What decisions are made, by whom, when and for what purpose
  • How actions are driven
  • How effectively decisions are implemented
  • How efficiently business processes convert goals into outcomes, etc. 

Data deep dives are typically one-off, focused data explorations, brainstorming sessions, or in-depth analyses of the behavior of a given system, ecosystem, or function. They are contextual and practical exercises, not theoretical ones. They help teams understand and evaluate new products, operating options, and prototypes. Deep dives often require an external perspective to bring new ideas and best practices to light.

Data deep dives serve multiple purposes:

  • Investigating specific operational issues, either to narrow down root causes or to provide big-picture perspectives of the business
  • Interrogating existing data and process patterns to derive new improvements and optimizations, above and beyond business-as-usual analytics (e.g., focusing on adherence to process, capacity, deviation, trends, etc.)
  • Assessing product or service data quality (e.g., before data migration or new interface cut-over, or to respond to new requirements and aiming at a step-change)
  • Contributing to creating new services or product features to serve customers better or improve employee experiences 
Data deep dives can be heavy-duty sports as several digital enterprise solutions can be fairly static “data monsters.” These data monsters have been organically growing unattended for years, lacking proper integration and continuous improvement. (Stock image.)

Data deep dives are not just about enterprise auditing or data migration preparation. There are different types of “deep dives” based on data types and formats. For example, deep dives can follow different approaches for structured or unstructured data sets, in single integrated or dispersed systems, used by single or multiple teams and functions, and based on the organizational operating model. This includes internal collaboration as well as enterprise-wide collaboration with partners, suppliers, and customers. Hence the need to define clear “boundaries” for data deep dives, to avoid boiling the ocean or drowning in detail.

Manufacturers leverage enterprise and product data to improve competitive advantage, accessing new opportunities for improvement and value creation throughout the product lifecycle. There are many data aspects to consider across multiple systems and operational teams, various data types and formats, data hubs, and other interfaces. Here are two specific examples (among others) that are typical of product innovation and engineering:

  1. Cross-BOM master data clarity: how multiple views of the enterprise meta-data backbone inter-connect and serve different purposes. This also connects to PLM-ERP-MES integration and data traceability: how data flows across the enterprise as well as how master data architecture and structures are set to govern alignment, throughput variability, and change traceability. 
  2. Multi-CAD / software data quality: how mechanical and electrical CAD, and more holistically CAD and simulation data is managed and aligned to software data and BOM data.
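As a minimal illustration of the first example, a cross-BOM consistency check can be sketched in a few lines of Python. The part numbers, BOM structures, and the `cross_bom_gaps` helper are all hypothetical; in practice such a comparison would run against real PLM and ERP extracts:

```python
# Hypothetical sketch: comparing an engineering BOM (EBOM) against a
# manufacturing BOM (MBOM) to flag parts that exist in one view but not
# the other. Part numbers and structures are illustrative only.

EBOM = {"P-100": ["P-110", "P-120", "P-130"]}  # design view
MBOM = {"P-100": ["P-110", "P-120", "P-140"]}  # production view

def cross_bom_gaps(ebom: dict, mbom: dict, root: str) -> dict:
    """Return parts missing from each BOM view for a given top-level item."""
    e, m = set(ebom.get(root, [])), set(mbom.get(root, []))
    return {"missing_in_mbom": sorted(e - m),
            "missing_in_ebom": sorted(m - e)}

print(cross_bom_gaps(EBOM, MBOM, "P-100"))
# {'missing_in_mbom': ['P-130'], 'missing_in_ebom': ['P-140']}
```

A real deep dive would extend this to quantities, effectivities, and substitute parts, but even this simple set comparison often surfaces alignment gaps between design and production views.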

Data deep dive: cross-BOM master data clarity

To keep it simple, let’s say that BOMs have multiple purposes and perspectives. BOMs are essential data holders across the product realization cycle, making a product a reality by linking virtual definition and physical execution. Looking at one view in isolation might not accurately describe the entire operation. Examining the BOM logical data model, its structure, attributes, and the lifecycle of relevant objects, can identify opportunities for improvement. This helps users and their management confirm how BOM information is constructed but, more importantly, understand how effectively it is used. 

Most enterprise software vendors align the data model of their platforms and tools with ISO STEP and related standards. Nevertheless, to understand how this translates into usability and flexibility in operations, consider the following questions:

  • How are BOM objects inter-related? Are data sets managed in an optimal way? 
  • Is the enterprise data architecture sound and fit-for-purpose?
  • How does BOM data link (and drive) data alignment, modularity and flexibility with other data sets?
  • Are BOM structures used as designed, or do processes allow users to work out of context, out of sight, or simply tick boxes? 
  • Are structured data sets (databases) and unstructured data sets (files, documents, process analytics and other “dark data” content) managed accordingly and appropriately?
  • Are BOM solutions aligned with compliance and certification requirements?
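The first question, how BOM objects inter-relate, can be probed with simple structural checks. The sketch below, with an invented parent-child data set and hypothetical helper names, flags orphan parts (defined but never referenced, and not a root) and circular references:

```python
# Hypothetical sketch: structural health checks on a BOM stored as
# parent -> children links. The data set is invented for illustration.

bom = {
    "A-1": ["B-1", "B-2"],
    "B-1": ["C-1"],
    "B-2": [],
    "C-1": [],
    "X-9": [],  # orphan: defined but never referenced, and not a root
}

def find_orphans(bom: dict, roots: set) -> list:
    """Parts that are neither referenced as a child nor declared as a root."""
    referenced = {c for children in bom.values() for c in children}
    return sorted(p for p in bom if p not in referenced and p not in roots)

def has_cycle(bom: dict) -> bool:
    """Depth-first search for circular parent-child references."""
    state = {}  # 0 = visiting, 1 = done
    def visit(node):
        if state.get(node) == 0:
            return True          # revisited while still on the stack: cycle
        if state.get(node) == 1:
            return False
        state[node] = 0
        if any(visit(c) for c in bom.get(node, [])):
            return True
        state[node] = 1
        return False
    return any(visit(n) for n in bom)

print(find_orphans(bom, roots={"A-1"}))  # ['X-9']
print(has_cycle(bom))                    # False
```

Checks like these say nothing about business meaning on their own, but they quickly reveal whether the BOM backbone is structurally sound before deeper process questions are asked.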

Data deep dive: multi-CAD / software data quality

When it comes to product development, from engineering and design to manufacturing and commercialization, PDM is often referred to as the core engine for managing CAD data sets. It covers basic modularity, configuration, variants, change management, and integration with enterprise data. CAD obviously involves mechanical and electrical data in multiple formats from multiple sources. Nowadays, CAD data is rarely decoupled from software data; it is impossible to speak of one without linking it to the other in any product lifecycle context.

Diving into CAD modularity includes examining standards and the associated governance for quality management, issue identification and resolution, deviation tracking and acceptance, conformance, and other countermeasures to deliver quality data, and ultimately quality products. With new advances in design technology, CAD has grown to cover more complex data sets such as meshing predictions, simulation parameters and outputs, generative design, rapid prototyping, operating systems, firmware, and other component-specific software. Managing such complex data relationships must be done in the context of data inter-connectivity and evolving quality standards. 

Thus, CAD data deep dives can help answer several operational questions across disciplines, like:

  • How are geometrical (hardware) and non-geometrical (materials, documents and software) data sets inter-connected in the product development context?
  • What CAD and software data quality standards are enforced and verified? How are they maintained as standards evolve?
  • How does one track data alignment over time? How is it re-aligned or re-validated as part of change management, carry-over, and other data-reuse activities?
  • How are data formats, native or neutral, released and communicated to the relevant stakeholders?
  • How are these data sets used once they reach a supplier or the shop floor?
  • How are data sets consumed in documents or across other data sets downstream, such as electronic work instructions, assembly manuals, technical publications, etc.?
  • How are production or assembly requirements and changes fed back (feedback loops) into the product development process?
  • How are machine learning and other optimizations combined or aligned with the more “traditional” data sets and their managing platforms?
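One concrete form the alignment-tracking question can take is a revision-consistency audit between released CAD files and the BOM. The sketch below is a minimal, hypothetical example; the part numbers, file names, and `revision_mismatches` helper are assumptions, not a vendor API:

```python
# Hypothetical sketch: verifying that released CAD file revisions stay
# aligned with the revisions recorded on their BOM items, and flagging
# BOM items with no released CAD file at all. Data is illustrative only.

bom_revisions = {"P-110": "C", "P-120": "B", "P-130": "A"}
cad_files = [
    {"part": "P-110", "file": "P-110_revC.step", "revision": "C"},
    {"part": "P-120", "file": "P-120_revA.step", "revision": "A"},  # stale
]

def revision_mismatches(bom: dict, files: list) -> list:
    """Return (part, issue) pairs for stale revisions and missing files."""
    issues, seen = [], set()
    for f in files:
        seen.add(f["part"])
        if bom.get(f["part"]) != f["revision"]:
            issues.append((f["part"], "revision mismatch"))
    issues += [(p, "no CAD file") for p in sorted(set(bom) - seen)]
    return issues

print(revision_mismatches(bom_revisions, cad_files))
# [('P-120', 'revision mismatch'), ('P-130', 'no CAD file')]
```

In a real audit, the same pattern would extend to neutral-format derivatives (e.g., STEP exports of native models) and to software baselines, but the principle, comparing the revision of record against the revision released, stays the same.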

Data deep dives are operationally driven by nature. They focus on the users and the bridges (system-driven or manually run) between users. Fixing the data maturity gaps across enterprise platforms requires a holistic view of the enterprise architecture. It is not only about multi-function, multi-phase or multi-disciplinary, but also multi-thread and multi-platform.

Gathering information from data deep dives can sometimes be the easy part; converting that information into actionable insights is always harder. It requires combining credible data-architecture knowledge, analysis skills, and a business mindset to highlight value opportunities and define next-step operational improvements. 

What are your thoughts?

Written by

Lionel Grealou

Lionel Grealou, a.k.a. Lio, helps original equipment manufacturers transform, develop, and implement their digital transformation strategies—driving organizational change, data continuity and process improvement, managing the lifecycle of things across enterprise platforms, from PDM to PLM, ERP, MES, PIM, CRM, or BIM. Beyond consulting roles, Lio held leadership positions across industries, with both established OEMs and start-ups, covering the extended innovation lifecycle scope, from research and development, to engineering, discrete and process manufacturing, procurement, finance, supply chain, operations, program management, quality, compliance, marketing, etc.

Lio is an author of the virtual+digital blog (www.virtual-digital.com), sharing insights about the lifecycle of things and all things digital since 2015.