A single build in additive manufacturing can generate hundreds of gigabytes, even terabytes, of melt pool data. And that’s just a fraction of what can be collected across the entire production chain: chamber conditions, powder batch histories, post-processing operations, and more. Over time, one thing has become clear: capturing and quantifying data is often the easy part. The real challenge lies in turning it into value. Insights from Melissa Jech, Responsible for Planning, Tools, and Data Analytics at BMW Group’s Additive Manufacturing Campus, and NIST researchers (Yan Lu, Milica Perisic, and Albert T. Jones) highlight what the industry expects and which approaches could be explored to implement a data management strategy.

While data management is a key topic across all manufacturing processes, AM introduces unique layers of complexity due to its digital and iterative nature. This article does not aim to compare AM with conventional manufacturing, but rather to highlight the specific aspects that make AM-related data management a discipline of its own.

With that in mind, we have identified five reasons why AM users face data management challenges:

  • Volume and complexity of data: AM processes generate massive amounts of data, from design files to process monitoring and post-processing logs. Since these sources come with different formats and protocols, they often don’t communicate seamlessly, which makes it difficult to integrate data into a unified system.
  • Traceability requirements: Because AM is often used to manufacture critical parts, manufacturers must maintain a full digital thread, linking design, build, inspection, and certification data for every part.  Each parameter must be traceable to ensure repeatability and compliance with industry standards.
  • Custom manufacturing complexity: AM’s core strength, its ability to produce highly customized, intricate designs, also introduces operational complexity. The more sophisticated the geometry, the more likely engineers are to start the design process from scratch. While this approach helps meet immediate needs, it can limit the accumulation of technical knowledge and design best practices that would otherwise enhance efficiency over time.
  • Process variability: In AM, small fluctuations in powder quality, machine calibration, or environmental conditions can influence part quality. Effective data management is essential to detect correlations, establish repeatability, and comply with standards.   
  • Lack of standardized data models: While standardization is an issue across several areas of the AM field in general, in data management the absence of universal standards for data formats and interoperability makes integration across platforms and machines particularly difficult.
Melissa Jech

Of all these pain points, Melissa Jech believes that integration challenges are the most critical ones to address in the industry today. She explains: “Connecting machines often stalls due to the diversity of systems and the lack of existing standards, making collaboration and monitoring more difficult. Many machine makers claim to offer solutions, but oftentimes this is only true for a limited and closed ecosystem, which doesn’t help when working with other systems or competitors. At the same time, incomplete data along the process chain means information is lost or fragmented, from design through simulation to production, hindering scalable integration. Therefore, open, end-to-end data pipelines that seamlessly connect design, simulation, and production data would be my most critical wish.”

Connecting machines and systems so that data flows consistently is probably the most critical wish shared by most AM users across the industry, not just at BMW Group. From Jech’s statement, we understand that addressing this challenge would enable companies to implement strategies that prioritize the right information, link it into digital threads, and ensure clear governance and interoperability. In essence, such a strategy could have a domino effect across other areas of the value chain and, potentially, help overcome additional data management challenges.

While approaches to building the foundation for connectivity are often holistic, they typically involve the adoption of built-in interfaces, the implementation of middleware to harmonize proprietary formats, and the modernization of legacy systems via IoT gateways.
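To make the middleware idea concrete, here is a minimal Python sketch of what such a harmonization layer might do; the two vendor payloads and all machine and field names are invented for illustration, not taken from any real machine interface:

```python
# Minimal sketch of a normalization layer: two hypothetical vendor
# payloads (field names invented for illustration) are mapped onto
# one common schema before entering a shared data pipeline.

def normalize_vendor_a(payload: dict) -> dict:
    # Vendor A reports laser power in watts and temperature in Celsius.
    return {
        "machine_id": payload["machineId"],
        "laser_power_w": payload["laserPower"],
        "chamber_temp_c": payload["chamberTemp"],
    }

def normalize_vendor_b(payload: dict) -> dict:
    # Vendor B nests readings and reports temperature in Kelvin.
    readings = payload["readings"]
    return {
        "machine_id": payload["id"],
        "laser_power_w": readings["power_w"],
        "chamber_temp_c": readings["temp_k"] - 273.15,
    }

if __name__ == "__main__":
    a = {"machineId": "LPBF-01", "laserPower": 370.0, "chamberTemp": 34.2}
    b = {"id": "LPBF-02", "readings": {"power_w": 365.0, "temp_k": 307.4}}
    for record in (normalize_vendor_a(a), normalize_vendor_b(b)):
        print(record)
```

Once every source speaks the same schema, downstream consumers (dashboards, data lakes, analytics) no longer need to know which vendor produced a reading.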

The priority at BMW Group’s Additive Manufacturing Campus

Pellets for 3D printers
Credit: BMW Group

If you’re a regular reader of 3D ADEPT Media, you probably already know that at BMW Group, additive manufacturing is not only a key enabler for prototyping, but is also increasingly used for small-series production parts. Additionally, it is enhancing the production system by delivering quickly available, lightweight, and locally manufactured solutions.

When asked about the top priority for BMW Group’s additive manufacturing operations, Jech points to standardization and interoperability across the entire shopfloor. “This is essential to generate insights that improve the efficiency of the entire value chain,” she explains.

Although she does not highlight a single concrete project, Jech emphasizes the company’s approach to smarter data management, focusing on data quality and transparency. “We aim to see standardized models and APIs feeding live into our central data lake,” she says. “Where we have a live connection, we observe faster validation, clearer transparency, and scalable data pipelines.”

Separately, she notes some of the measurable benefits that such practices can bring to AM operations. “The availability of relevant metadata increases significantly while the error rate decreases,” Jech observes. “This enables more precise data analyses that provide actionable insights to improve process efficiency. We’ve used it to reduce delivery times and to enhance both machine utilization and workers’ capabilities.”

Exploring a “divide and conquer” approach

Antenna lab
Credit: NIST

Recent research from the U.S. National Institute of Standards and Technology (NIST) highlights a “divide and conquer” approach to data integration. Built around seven steps, this framework could be used alongside current industry standards.

These steps are:

  • Defining dataset/data source
  • Collecting data
  • Queueing data
  • Archiving data
  • Downgrading data amount
  • Building decision models
  • Using decision models

1. Define the data

Yan Lu, Milica Perisic, and Albert T. Jones, lead experts behind this approach, stress that a clear definition of a data source and its content is critical for both the Data Provider and the System Maintainer. It ensures that both parties share the same understanding and forms a joint agreement on what the data represents.

Our take: Without standardization, undefined data can create misunderstandings across the entire lifecycle, from design to production to analysis.
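One lightweight way to make such an agreement explicit is a shared, machine-readable data contract. The Python sketch below illustrates the idea; the DataSourceDefinition class and all of its fields are invented for illustration and are not part of the NIST framework:

```python
from dataclasses import dataclass, field

# A minimal, machine-readable "data contract" that a Data Provider and a
# System Maintainer could agree on; all field names are illustrative.

@dataclass
class DataSourceDefinition:
    source_id: str                 # unique identifier of the sensor/system
    description: str               # what the data represents
    sample_unit: str               # physical unit of each value
    schema: dict = field(default_factory=dict)  # field name -> type

melt_pool_source = DataSourceDefinition(
    source_id="melt-pool-cam-01",
    description="Co-axial melt pool camera, per-layer intensity statistics",
    sample_unit="a.u.",
    schema={"layer": "int", "mean_intensity": "float", "timestamp": "iso8601"},
)

print(melt_pool_source)
```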

2. Decide how data is collected

Once defined, it’s important to determine how the data will be collected. In AM and other manufacturing processes, data can be captured in three ways:

  • Sample – at regular intervals
  • On event – triggered by a defined event
  • On condition – when a specific condition is met

Data collection can follow push or pull approaches, whether streaming in real time or processing in batches.

Our take: Choosing the right collection strategy is the foundation for reliable, actionable insights later.
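To make the three modes concrete, here is a minimal Python sketch; read_sensor(), the readings, and the thresholds are invented stand-ins for a real machine interface:

```python
import random
import time

# Illustrative sketch of the three collection modes; read_sensor() is a
# stand-in for a real machine interface.

def read_sensor() -> float:
    return 950.0 + random.uniform(-80.0, 80.0)  # fake melt pool temperature

def collect_sampled(interval_s: float, n: int) -> list:
    """Sample mode: capture a value at regular intervals."""
    values = []
    for _ in range(n):
        values.append(read_sensor())
        time.sleep(interval_s)
    return values

def collect_on_condition(threshold: float, n: int) -> list:
    """On-condition mode: keep values only while a condition holds."""
    return [v for v in (read_sensor() for _ in range(n)) if v > threshold]

def on_layer_complete(layer: int) -> dict:
    """On-event mode: triggered by a defined event, e.g. layer completion."""
    return {"layer": layer, "temp": read_sensor()}

print(collect_sampled(0.01, 5))
print(collect_on_condition(960.0, 20))
print(on_layer_complete(layer=42))
```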

3. Queueing data

Processing each data instance requires system resources. To avoid overload, a message queue can temporarily store data until it is processed. Multiple queues may be used: for example, one for raw, unprocessed data and another for processed data waiting to be stored.

Our take: Queues prevent bottlenecks and make sure no critical information is lost in the process.
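A minimal producer/consumer sketch in Python illustrates the pattern; the queue sizes, the sentinel convention, and the trivial processing step are illustrative choices, not a prescribed design:

```python
import queue
import threading

# Minimal producer/consumer sketch: a raw queue buffers incoming readings
# so the processing step never blocks acquisition. Sizes are illustrative.

raw_queue = queue.Queue(maxsize=1000)        # unprocessed sensor data
processed_queue = queue.Queue(maxsize=1000)  # results awaiting storage

def producer():
    for layer in range(5):
        raw_queue.put({"layer": layer, "mean_intensity": 0.8 + 0.01 * layer})
    raw_queue.put(None)  # sentinel: no more data

def consumer():
    while True:
        item = raw_queue.get()
        if item is None:
            break
        item["flagged"] = item["mean_intensity"] > 0.82  # trivial processing
        processed_queue.put(item)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start(); t1.join(); t2.join()
while not processed_queue.empty():
    print(processed_queue.get())
```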

4. Archiving data

Selecting the right persistent storage technology is key. Typically, metadata and image data are stored separately:

  • Images in a file system
  • Metadata in a searchable, queryable database

The metadata often includes a pointer to the image location for easy retrieval. This separation ensures efficiency and stability while allowing rapid access to information when needed.
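The sketch below illustrates this split in Python, with SQLite standing in for whatever searchable database is actually used; the table layout and file naming are invented for illustration:

```python
import sqlite3
from pathlib import Path

# Sketch of the split described above: image bytes go to the file system,
# searchable metadata (with a pointer to the file) goes to a database.

image_dir = Path("images")
image_dir.mkdir(exist_ok=True)
db = sqlite3.connect("am_metadata.db")
db.execute("""CREATE TABLE IF NOT EXISTS layer_images (
    build_id TEXT, layer INTEGER, mean_intensity REAL, image_path TEXT)""")

def archive(build_id: str, layer: int, mean_intensity: float, image: bytes):
    path = image_dir / f"{build_id}_layer{layer:04d}.png"
    path.write_bytes(image)  # image stored in the file system
    db.execute("INSERT INTO layer_images VALUES (?, ?, ?, ?)",
               (build_id, layer, mean_intensity, str(path)))  # metadata + pointer
    db.commit()

archive("B-2025-001", 42, 0.83, b"\x89PNG...")  # fake, truncated image bytes
row = db.execute(
    "SELECT image_path FROM layer_images WHERE layer = 42").fetchone()
print("image stored at:", row[0])
```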

5. Downgrading data amount

To remain sustainable, systems should manage data volume and store only what’s necessary. Policies can include:

  • Deleting old data
  • Aggregating data
  • Removing duplicates
  • Reducing data quality if acceptable

Efficient data reduction saves storage costs and improves system performance without losing critical insights.
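As a simple illustration, the following Python sketch implements two of these policies, age-based deletion and per-layer aggregation, on invented records and with an arbitrary 90-day retention window:

```python
from datetime import datetime, timedelta

# Sketch of two of the policies listed above: dropping records older than a
# retention window and aggregating per-point readings into per-layer stats.

RETENTION = timedelta(days=90)

def apply_retention(records: list, now: datetime) -> list:
    """Delete old data: keep only records inside the retention window."""
    return [r for r in records if now - r["ts"] <= RETENTION]

def aggregate_by_layer(records: list) -> dict:
    """Aggregate data: collapse raw readings into one mean value per layer."""
    per_layer = {}
    for r in records:
        per_layer.setdefault(r["layer"], []).append(r["value"])
    return {layer: sum(v) / len(v) for layer, v in per_layer.items()}

now = datetime(2025, 11, 1)
records = [
    {"ts": now - timedelta(days=10), "layer": 1, "value": 0.81},
    {"ts": now - timedelta(days=10), "layer": 1, "value": 0.83},
    {"ts": now - timedelta(days=200), "layer": 1, "value": 0.90},  # expired
]
print(aggregate_by_layer(apply_retention(records, now)))  # -> {1: 0.82}
```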

6. Building decision models

Once the data is organized, value can be extracted through decision-making models:

  • AI-based models:  predictive (require historical labeled data) or clustering (group similar items to support expert decisions)
  • Rule-based expert systems: if-then rules defined by domain experts

In practice, combining AI and rule-based approaches often yields the most reliable, actionable results.
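The toy Python sketch below combines the two families: a threshold derived from a handful of labeled historical readings stands in for a trained AI model, while an if-then rule encodes expert knowledge. All values and field names are invented:

```python
import statistics

# Toy sketch combining both model families: a "learned" threshold derived
# from historical labeled data stands in for an AI model, and an expert
# if-then rule encodes domain knowledge. All numbers are illustrative.

history = [(0.80, "ok"), (0.81, "ok"), (0.90, "defect"), (0.92, "defect")]

# Simplistic stand-in for training: threshold halfway between class means.
ok_mean = statistics.mean(v for v, label in history if label == "ok")
bad_mean = statistics.mean(v for v, label in history if label == "defect")
learned_threshold = (ok_mean + bad_mean) / 2

def expert_rule(reading: dict) -> bool:
    """If-then rule from a domain expert: oxygen spikes are always flagged."""
    return reading["oxygen_ppm"] > 1000

def classify(reading: dict) -> str:
    if reading["mean_intensity"] > learned_threshold or expert_rule(reading):
        return "anomalous"
    return "nominal"

print(classify({"mean_intensity": 0.91, "oxygen_ppm": 400}))   # anomalous
print(classify({"mean_intensity": 0.80, "oxygen_ppm": 1500}))  # anomalous
print(classify({"mean_intensity": 0.80, "oxygen_ppm": 400}))   # nominal
```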

7. Using decision models in practice

Finally, decision models are applied to real-world scenarios. In AM:

  • Predicting a critical event triggers alerts for staff
  • Detecting in-process anomalies can adjust process parameters or stop the build

This is where the data strategy translates directly into operational value, improving efficiency, quality, and responsiveness.
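As a final illustration, this Python sketch wires a model verdict to the two actions above, alerting staff and stopping the build or adjusting a parameter; notify_staff() and the machine dictionary are illustrative stand-ins for real notification and machine interfaces:

```python
# Sketch of wiring a decision model to the two actions named above:
# alerting staff on a predicted critical event and stopping the build (or
# adjusting a parameter) on an in-process anomaly. notify_staff() and the
# machine dictionary are illustrative stand-ins.

def notify_staff(message: str):
    print(f"[ALERT] {message}")  # stand-in for email/SMS/MES notification

def handle_reading(machine: dict, layer: int, verdict: str):
    if verdict == "anomalous":
        notify_staff(f"Anomaly detected at layer {layer}")
        if layer > machine["critical_layer"]:
            machine["state"] = "stopped"      # stop the build
        else:
            machine["laser_power_w"] *= 0.95  # or adjust a process parameter

machine = {"state": "printing", "laser_power_w": 370.0, "critical_layer": 100}
handle_reading(machine, layer=120, verdict="anomalous")
print(machine["state"])  # -> stopped
```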

Concluding notes

On paper, the NIST “divide and conquer” framework provides a solid foundation for AM data management. Yan Lu, Milica Perisic, and Albert T. Jones explain that this framework represents a new application of the five-layer ISA-95 architecture:

“Layer 0 involves the functions and standards associated with the physical production process. Layer 1 involves the functions and standards associated with sensing and manipulating that physical process. Layer 2 involves the functions and standards associated with automatically monitoring and controlling that process. Layer 3 has manufacturing operation functions, and Layer 4 covers enterprise functions.”

The framework offers a clear roadmap for structuring data across the entire AM workflow, from sensors and machines to operational and enterprise systems. However, it was developed with laser powder bed fusion (LPBF) processes in mind, and there is no guarantee that it will translate seamlessly to other AM technologies.

For companies like BMW Group, thoughtfully adapting this framework to their specific processes, machines, and operational context could help ensure that AM data is reliable, traceable, and actionable, supporting better decision-making and process efficiency across the shopfloor and beyond.

This is a topic we will continue to monitor so that we can share relevant insights that could help AM users better structure and implement their data management approaches.

This dossier was first shared in the 2025 November/December edition of 3D ADEPT Mag. Discover our 2025 year in review series here.