How Data, Standards, and Automation Are Reshaping Environmental Product Declarations

Key Takeaways from the Digital EPD Session at eClad Conference

1. The EPD Market Is Scaling Fast, but the Foundation Is Still Fragmented

EPDs are growing rapidly across industries, driven by regulatory pressure, customer demand, and procurement requirements. But the underlying systems have not kept pace. As Robert highlighted, the EPD ecosystem has evolved organically over time. Different regions, standards, tools, and workflows have developed independently.

The result is a fragmented landscape where:

    • Data formats are inconsistent
    • Processes vary by region and program operator
    • Digital workflows are not fully standardized
    • Scalability remains limited

This creates a fundamental challenge. The industry is trying to scale outputs without first standardizing the data infrastructure.

2. The Real Bottleneck Is Not EPD Creation – It’s the Data

Across every EPD workflow, the same bottlenecks appear:

    • Data collection
    • Data transformation
    • Data completeness and consistency
    • LCA modeling
    • EPD & LCA verification
    • Non-harmonized calculation rules

These challenges are not new. But they become exponentially more complex as companies try to scale across hundreds or thousands of products. The takeaway is clear: EPD challenges are not primarily about reporting. They are about data architecture.

3. Digital EPDs Are the Path Forward – But Only if Done Correctly

Digital EPDs have the potential to solve many of these challenges.

They enable:

    • Automated data validation
    • Structured, machine-readable datasets
    • Faster integration into downstream systems
    • Scalable lifecycle assessments

However, the current reality is more complicated. In many cases today, the process is still reversed. Teams generate a PDF first, then manually transfer data into digital formats. This introduces errors, inconsistencies, and inefficiencies.

The future state is the opposite. A digital dataset should be the single source of truth. From that, any human-readable format, including PDFs, can be generated.
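To make the digital-first idea concrete, here is a minimal sketch in Python: a structured dataset acts as the single source of truth, validation runs against the data itself, and a human-readable summary (standing in for a PDF) is generated from it. The field names and checks are illustrative assumptions, not taken from any specific EPD standard.

```python
# Illustrative sketch of a digital-first EPD workflow. The dataset
# fields below are hypothetical, not from a real EPD program.

epd = {
    "product": "Precast concrete panel",
    "declared_unit": "1 m2",
    "gwp_total_kg_co2e": 54.2,
    "modules": {"A1-A3": 48.1, "A4": 3.3, "C3": 2.8},
}

REQUIRED = {"product", "declared_unit", "gwp_total_kg_co2e", "modules"}

def validate(dataset: dict) -> list[str]:
    """Automated validation runs on the dataset, not on a PDF."""
    errors = [f"missing field: {f}" for f in REQUIRED - dataset.keys()]
    module_sum = sum(dataset.get("modules", {}).values())
    if abs(module_sum - dataset.get("gwp_total_kg_co2e", 0)) > 0.01:
        errors.append("module results do not sum to declared total")
    return errors

def render_summary(dataset: dict) -> str:
    """Human-readable output derived from the dataset; a PDF would be
    generated the same way, from the same single source of truth."""
    lines = [
        f"EPD: {dataset['product']} (per {dataset['declared_unit']})",
        f"GWP total: {dataset['gwp_total_kg_co2e']} kg CO2e",
    ]
    lines += [f"  {m}: {v} kg CO2e" for m, v in dataset["modules"].items()]
    return "\n".join(lines)

assert validate(epd) == []
print(render_summary(epd))
```

The direction of flow is the point: checks and documents both derive from the dataset, so the PDF can never drift out of sync with the data.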

4. Verification Must Evolve to Support Automation and Scale

As EPD volumes grow, traditional verification approaches become a bottleneck.

The current verification guidelines are often:

    • Tool-specific instead of tool-agnostic
    • Lacking detailed requirements
    • Not designed for automated workflows

To address this, new approaches are emerging that focus on:

    • Tool-based verification frameworks
    • Logging and traceability of data and mapping
    • Scalable validation processes
    • Integration of AI-assisted verification

The goal is not just faster verification. It is more consistent and reliable verification at scale.

5. The Shift to Digital Enables Interoperability and Global Alignment

One of the biggest barriers to scaling EPDs today is lack of harmonization. Different program operators, regions, and standards require different formats and calculations. This creates duplication, inefficiency, and inconsistencies.

Digital EPD initiatives aim to solve this by:

    • Standardizing machine-readable formats
    • Enabling interoperability across systems
    • Reducing reliance on region-specific PDF formats
    • Supporting global comparability

This is a foundational shift. It moves EPDs from static documents to interoperable data assets.

6. EPDs Are Not the End Goal. Decision-Making Is.

One of the most important points from the session was simple but critical. Companies are not creating EPDs just to have EPDs. They are creating them to enable better decisions.

Whether at the product level, building level, or portfolio level, EPD data should support:

    • Material selection decisions
    • Product design improvements
    • Procurement strategies
    • Regulatory compliance

Without this connection to decision-making, EPDs remain a reporting exercise rather than a business capability.

The Core Problem: EPD Workflows Are Not Built for Scale

Across industries, the challenge is consistent. Organizations are trying to scale EPDs using processes that were never designed for volume, speed, or interoperability.

This leads to:

    • Manual, time-intensive data collection
    • Inconsistent and non-harmonized datasets
    • Duplicated effort across regions and standards
    • Limited ability to reuse or integrate data
    • Slow and costly verification processes

The result is a system that struggles to keep up with growing demand.

The Solution: From EPD Documents to Product Lifecycle Intelligence

The path forward is not just digitization. It is transformation. Instead of creating EPDs one by one, the model shifts to:

    • Ingest all available product and supply chain data
    • Structure it into a unified, digital data model
    • Create digital twins of products
    • Apply logic to generate lifecycle insights
    • Output results across multiple use cases

This approach enables:

    • Automated EPD generation
    • Substance compliance analysis
    • Lifecycle impact modeling
    • Continuous data improvement

All from the same underlying data foundation. This is what Makersite defines as Product Lifecycle Intelligence.

From Data to Decisions: Why This Matters Now

For manufacturers and construction stakeholders, this shift is critical. The market is demanding:

    • More EPDs
    • More specific EPDs
    • Faster turnaround
    • Higher data quality
    • Greater transparency

At the same time, products are becoming more complex and configurable. This creates a new requirement: the ability to generate accurate, scalable, and decision-ready environmental data.

Companies that can do this gain a significant advantage:

    • Faster compliance and reporting
    • Improved product design decisions
    • Reduced operational effort
    • Stronger, more credible sustainability claims

What the Market Is Moving Toward

The conversation is changing. Organizations are no longer asking: “Can we create EPDs?”

They are asking:

    • Can we scale EPDs across entire product portfolios?
    • Can we trust and verify the data consistently?
    • Can we integrate EPDs into digital workflows and systems?
    • Can we use EPD data to drive real decisions?

This reflects a broader shift:

    • From static documents to dynamic data
    • From manual workflows to automated systems
    • From reporting outputs to decision intelligence

Final Thought

The biggest takeaway from the session: EPDs are evolving from documents into infrastructure. Digital EPDs, standardized data models, and automated workflows are not just improving reporting. They are enabling a new foundation for environmental decision-making.

By moving toward connected, digital, and scalable data systems, organizations can turn EPDs from a compliance requirement into a strategic capability.

Want to Scale EPDs Without Scaling Manual Effort?

If your team is working to:

    • Automate EPD generation
    • Improve data quality and consistency
    • Reduce verification bottlenecks
    • Connect EPDs to product and design decisions

See how Makersite enables digital EPD workflows, lifecycle intelligence, and scalable sustainability insights across your product portfolio.

Download Makersite’s EPD ebook 

9 AI-Powered PLM Software Solutions for Manufacturers in 2026

What is AI-Powered PLM?

AI-powered PLM refers to Product Lifecycle Management systems enhanced with artificial intelligence to improve how manufacturers manage, analyze and act on product data across the lifecycle. Traditional PLM systems are systems of record. They store CAD files, manage engineering change orders, track part structures and maintain BOM integrity. AI-powered PLM systems go further. They transform structured product data into decision intelligence.

In practice, AI in PLM can mean:

  • Automatically classifying and cleansing part data
  • Predicting the impact of engineering changes
  • Optimizing simulation models
  • Mapping multi-tier suppliers
  • Filling gaps in material or process data
  • Enriching BOMs with cost, risk, carbon or compliance signals
  • Enabling real-time trade-off analysis across engineering and procurement

For enterprise manufacturers managing thousands of components across global supply chains, AI-powered PLM becomes less about automation and more about infrastructure. It connects engineering, procurement, compliance and sustainability inside the digital thread.

However, not all AI in PLM is equal.

Some vendors embed AI directly into engineering workflows. Some apply AI primarily to simulation and digital twins. Some use AI to harmonize enterprise data across ERP and PLM. Others focus on sustainability intelligence and supplier risk modeling. For global enterprise manufacturers operating complex, configurable BOMs, the critical question is not whether AI exists inside the platform.

The critical question is: Does the AI operate at BOM level and influence real product decisions across engineering, sourcing and compliance?

Below are nine AI-powered PLM software solutions shaping enterprise manufacturing in 2026.

1. Makersite

Makersite is a granular, AI-powered Product Lifecycle Intelligence platform purpose-built for complex manufacturing sectors, with a strong presence in electronics, automotive, industrial machinery, construction, chemicals and industrial goods.

Makersite tackles the core issue of enterprise PLM environments: structured product data exists, but cross-functional intelligence does not. BOMs sit in PLM. Supplier data sits in ERP. Environmental data lives in separate tools. Critical decisions are made without a unified intelligence layer. Rather than replacing PLM systems, Makersite connects to them and enriches structured product data using deeply specialized AI.

How AI is used:

  • Context-rich gap filling: Dedicated, industry-trained AI agents infer missing supplier, material and process data by analyzing BOM structure, manufacturing context and sourcing patterns across multi-tier supply chains.
  • Automated background database matching: AI automatically maps BOM inputs to environmental datasets, risk databases and compliance indicators, reducing manual mapping effort dramatically.
  • What-if scenario modeling: AI enables real-time trade-off analysis across carbon, cost, supplier risk and regulatory exposure at configuration level.
  • Multi-tier supplier mapping: AI reconciles inconsistent supplier naming and identifies relationships across complex global networks.

Differentiator:

Makersite’s differentiator is its combination of a large structured manufacturing data foundation with highly specialized AI agents trained on industrial context. Its AI understands manufacturing logic, making it highly accurate for complex, configurable BOMs. Best for enterprise manufacturers managing complex BOMs who need accurate environmental, cost and compliance modeling integrated into engineering workflows.

2. Siemens Teamcenter with AI Capabilities

Siemens Teamcenter is a leading enterprise PLM system with embedded AI focused on engineering optimization and digital twin enablement. Teamcenter addresses the need for structured product data governance at global scale. Its AI capabilities enhance internal engineering processes rather than external supplier intelligence.

How AI is used:

  • Intelligent part classification to reduce manual categorization
  • Change management automation through predictive impact analysis
  • Digital twin optimization using simulation driven AI
  • Knowledge reuse across engineering programs

Differentiator:

Teamcenter’s differentiator is the depth of AI embedded directly inside core engineering workflows and digital twin environments. The AI operates within the system of record rather than as an external layer. Best for large global manufacturers with mature PLM environments focused on engineering performance and simulation optimization.

3. PTC Windchill

PTC Windchill combines PLM with IoT data through its broader ecosystem, using AI to enhance lifecycle visibility and configuration management. Windchill addresses the need to connect product data with real world performance signals.

How AI is used:

  • Predictive analytics on product performance
  • Configuration optimization across variants
  • Closed loop lifecycle insights from connected product data
  • Automated impact analysis across engineering changes

Differentiator:

Windchill’s differentiator is its integration of PLM with IoT and service data, allowing AI to inform decisions using real world performance feedback. Best for industrial machinery and heavy equipment manufacturers managing connected assets and configurable products.

4. Dassault Systèmes 3DEXPERIENCE

Dassault’s 3DEXPERIENCE platform embeds AI primarily within simulation and advanced modeling workflows. The platform addresses the need for design optimization and performance simulation in highly engineered environments.

How AI is used:

  • Simulation driven optimization of materials and structures
  • Predictive modeling of performance scenarios
  • AI-assisted design exploration
  • Digital twin refinement

Differentiator:

Dassault’s differentiator lies in simulation depth. AI enhances computational modeling rather than multi tier supplier intelligence. Best for aerospace and automotive manufacturers with heavy reliance on simulation and advanced materials engineering.

5. SAP PLM with AI

SAP integrates PLM functionality into its ERP backbone, using AI for data harmonization and predictive enterprise analytics. SAP addresses enterprise wide data consistency and financial integration.

How AI is used:

  • Master data harmonization across systems
  • Predictive supply chain insights
  • Demand forecasting and risk identification
  • Intelligent workflow automation

Differentiator:

SAP’s differentiator is enterprise integration. AI connects lifecycle data with financial and procurement systems at scale. Best for global enterprises prioritizing unified ERP and lifecycle data governance.

6. Aras Innovator

Aras Innovator is a flexible PLM platform that supports AI extensions through configurable architecture. Aras addresses manufacturers that require adaptable lifecycle workflows across diverse product portfolios.

How AI is used:

  • Custom analytics and reporting extensions
  • AI powered document search and knowledge retrieval
  • Configurable workflow automation

Differentiator:

Aras differentiates through architectural flexibility. AI capabilities are shaped by implementation rather than delivered as fixed modules. Best for manufacturers seeking customizable PLM infrastructure with tailored AI workflows.

7. Oracle Agile PLM

Oracle Agile remains strong in compliance driven PLM environments, particularly in electronics and high tech sectors. Agile addresses structured documentation, regulatory management and controlled product record environments.

How AI is used:

  • Automated classification and search
  • Compliance analytics through Oracle Cloud services
  • Risk monitoring across supplier documentation

Differentiator:

Oracle Agile differentiates through compliance centric PLM strength, with AI augmenting documentation and regulatory tracking. Best for electronics manufacturers managing strict compliance and documentation requirements.

8. Propel PLM

Propel is a cloud native PLM built on Salesforce infrastructure, targeting modern manufacturing companies. Propel addresses collaboration and lifecycle visibility in cloud first environments.

How AI is used:

  • CRM integrated product insights
  • Workflow automation through Salesforce AI
  • Analytics across customer and product lifecycle data

Differentiator:

Propel differentiates through tight CRM and PLM integration, bringing AI insights across customer and product domains. Best for growth oriented manufacturers aligning product management with customer intelligence.

9. Sustainability Platforms Adjacent to PLM

Platforms such as Sphera focus on compliance databases and environmental risk monitoring that operate adjacent to PLM systems. These platforms address regulatory intelligence rather than engineering integrated intelligence.

How AI is used:

  • Automated regulatory tracking
  • Risk signal monitoring
  • Data normalization for reporting

Differentiator:

These platforms differentiate through regulatory database breadth and compliance depth rather than embedded product-level intelligence. Best for compliance-focused sustainability programs that operate in parallel to engineering workflows.


When evaluating AI-powered PLM software solutions, enterprise manufacturers should ask the following questions.

1. Is the platform a system of record or an intelligence layer?

Some platforms replace or serve as core PLM systems. Others operate as AI intelligence layers that integrate with existing PLM and ERP environments.

If your organization already runs Siemens Teamcenter, PTC Windchill or Dassault, replacing PLM may not be realistic. In that case, an AI enrichment layer may be more strategic.

Clarify whether you are modernizing infrastructure or augmenting it.

2. Does AI operate at BOM-level depth?

High-level dashboards are not enough for complex manufacturing.

Ask:

• Can the platform ingest multi-level BOMs?
• Can it analyze configuration variants?
• Does AI enrich individual line items?
• Can it model trade-offs at component level?

For manufacturers managing thousands of components per product, BOM-level intelligence is critical.
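As a rough illustration of what BOM-level intelligence can mean in practice, the sketch below rolls cost and carbon up a multi-level BOM and compares a component swap. All parts, quantities and factors are invented for the example; no vendor's actual model is implied.

```python
# Hypothetical sketch: per-line-item roll-up of cost and carbon
# across a multi-level BOM, enabling component-level what-if analysis.
from dataclasses import dataclass, field

@dataclass
class Part:
    name: str
    qty: float                 # quantity used in the parent assembly
    unit_cost: float           # currency per unit (invented figures)
    unit_co2e: float           # kg CO2e per unit (invented figures)
    children: list["Part"] = field(default_factory=list)

def rollup(part: Part, qty: float = 1.0) -> tuple[float, float]:
    """Return (total cost, total kg CO2e) for `qty` units of `part`."""
    cost = qty * part.qty * part.unit_cost
    co2e = qty * part.qty * part.unit_co2e
    for child in part.children:
        c, e = rollup(child, qty * part.qty)
        cost += c
        co2e += e
    return cost, co2e

steel_bracket = Part("steel bracket", 4, unit_cost=1.2, unit_co2e=0.9)
alu_bracket   = Part("aluminium bracket", 4, unit_cost=1.8, unit_co2e=0.5)
housing       = Part("housing", 1, 3.0, 2.1, children=[steel_bracket])

print(rollup(housing))       # baseline with steel brackets
housing.children = [alu_bracket]
print(rollup(housing))       # what-if: swap to aluminium brackets
```

Even this toy version shows the trade-off surface: the aluminium variant costs more but carries less embodied carbon, and the comparison happens at the individual line item, not at a portfolio average.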


3. How does the platform handle missing supplier or material data?

Incomplete data is the norm, not the exception.

Evaluate:

• Does the system rely solely on declared supplier data?
• Does it use context-aware AI to infer missing attributes?
• Are modeling assumptions transparent and traceable?
• Can estimated values be replaced with primary data later?

The ability to manage uncertainty intelligently often determines scalability.
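One simple pattern that makes estimates transparent and replaceable is to tag every attribute with its provenance. The sketch below is a hypothetical data model for illustration only, not any platform's actual implementation.

```python
# Illustrative provenance-tagged attribute: AI-inferred estimates stay
# visible and can later be overridden by primary data without losing
# the trail. Structure and tag names are invented for this example.
from dataclasses import dataclass

@dataclass
class Attribute:
    value: float
    source: str        # e.g. "primary" | "ai_estimate" | "industry_average"
    note: str = ""

def upgrade(attr: Attribute, primary_value: float, note: str) -> Attribute:
    """Replace an estimate with declared primary data, keeping a trail."""
    return Attribute(primary_value, "primary",
                     f"replaced {attr.source} ({attr.value}); {note}")

recycled_content = Attribute(0.30, "ai_estimate", "inferred from category")
recycled_content = upgrade(recycled_content, 0.42,
                           "supplier declaration received")
print(recycled_content.source, recycled_content.value)
```

Because the old source and value are preserved in the note, modeling assumptions remain traceable even after primary data arrives.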

4. How well does it integrate with existing enterprise systems?

AI-powered PLM should not create new silos.

Assess:

• API depth with PLM and ERP systems
• Compatibility with supplier portals
• Ability to export structured outputs for reporting
• Security and data governance controls

Enterprise adoption depends on seamless integration into current workflows.

5. Does it support cross functional decision making?

PLM historically served engineering.

Modern AI-powered PLM must also serve:

• Procurement teams evaluating supplier risk
• Sustainability teams modeling Scope 3 impact
• Compliance teams tracking regulatory exposure
• Finance teams analyzing cost exposure

Ask whether the platform enables concurrent evaluation of carbon, cost and compliance trade offs.

6. Can it scale across global, multi-tier supply chains?

Enterprise manufacturers operate across regions, currencies and regulatory regimes.

Evaluate:

• Multi-tier supplier mapping capabilities
• Localization for regulatory frameworks
• Ability to support digital product passport requirements
• Performance at enterprise data volumes

Scalability is not just about user count. It is about data complexity.

7. Does it influence decisions before design freeze?

Many tools accelerate reporting. Fewer influence product design.

The most strategic AI-powered PLM solutions:

• Integrate directly into early design workflows
• Enable what-if scenario modeling
• Provide insights during sourcing decisions
• Support engineering trade-off analysis in real time

If intelligence only appears after the product is finalized, the strategic value is limited.

Final Thought: The Future of PLM Is Decision Intelligence

PLM modernization is no longer a technology upgrade.

It is a strategic shift in how manufacturers make product decisions.

As supply chains become more complex and regulatory expectations intensify, intelligence cannot remain siloed in reporting tools or disconnected systems. AI-powered PLM must operate inside the digital thread, linking engineering structure with supplier visibility, cost dynamics and sustainability impact in real time.

The competitive advantage will not come from managing more product data.

It will come from transforming product data into actionable intelligence at the exact moment decisions are made.

Still Have Questions? Let’s Dig Deeper

What makes PLM software “AI-powered” versus traditional PLM systems?

Traditional PLM systems act as systems of record. They manage CAD files, BOM structures, engineering change orders and product documentation. Intelligence typically comes from human analysis layered on top of structured data.

AI-powered PLM introduces machine learning, semantic mapping and predictive modeling directly into the lifecycle workflow. Instead of simply storing product data, AI-enriched systems classify components automatically, infer missing attributes, predict the impact of engineering changes, map suppliers across inconsistent naming structures and generate scenario-based insights in real time.

The key difference is that AI-powered PLM transforms product data into decision intelligence rather than static documentation.

How does AI in PLM handle incomplete or inconsistent BOM data?

Incomplete BOM data is one of the biggest constraints in enterprise manufacturing. Supplier declarations may be missing. Material compositions may be partially defined. Multi-tier sourcing data is rarely transparent.

AI-powered PLM platforms address this through context-aware modeling. Instead of relying solely on declared attributes, AI analyzes the component’s category, application, manufacturing context and known supplier patterns to infer likely material compositions or process assumptions.

More advanced platforms also reconcile duplicate supplier records, normalize inconsistent naming conventions and map parts to standardized datasets automatically. This reduces manual cleansing and accelerates time to insight without compromising engineering governance.
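As a hedged illustration of one small piece of this, normalizing inconsistent supplier names before matching, consider the Python sketch below. Real platforms use far richer signals (company identifiers, addresses, trained ML models); this only shows the basic idea using standard-library fuzzy matching.

```python
# Toy supplier-name reconciliation: normalize away punctuation and
# common legal suffixes, then fuzzy-match the remainder. The suffix
# list and threshold are illustrative assumptions.
import difflib
import re

def normalize(name: str) -> str:
    """Lowercase, strip punctuation and common legal-form suffixes."""
    name = re.sub(r"[^a-z0-9 ]", " ", name.lower())
    name = re.sub(r"\b(gmbh|inc|ltd|llc|co|corp|ag)\b", " ", name)
    return " ".join(name.split())

def same_supplier(a: str, b: str, threshold: float = 0.85) -> bool:
    """Treat two records as the same supplier above a similarity cutoff."""
    ratio = difflib.SequenceMatcher(None, normalize(a), normalize(b)).ratio()
    return ratio >= threshold

print(same_supplier("ACME Steel GmbH", "Acme Steel, Inc."))   # True
print(same_supplier("ACME Steel GmbH", "Zenith Polymers Ltd"))  # False
```

Production systems would layer deduplication, identifier matching and human review on top, but the normalization step above is where most naming noise is removed.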

Can AI powered PLM replace sustainability or compliance tools?

In most enterprise architectures, AI-powered PLM does not replace sustainability or compliance platforms. It complements them.

PLM remains the system of record for structured product data. Sustainability tools manage regulatory reporting frameworks. Compliance systems track substance declarations and documentation.

AI-powered PLM acts as a connective layer. It enriches product data with environmental, cost and risk intelligence before reporting begins. Instead of exporting static BOMs to downstream tools, manufacturers can integrate intelligence upstream in the product development lifecycle.

This shifts sustainability from retrospective reporting to proactive design decision support.

How accurate are AI generated environmental or supplier estimates?

Accuracy depends heavily on the platform’s underlying data foundation and modeling methodology.

Some tools rely primarily on spend based emissions or generalized industry averages. Others use contextual AI trained on manufacturing datasets to infer missing attributes more precisely.

For exploratory, portfolio-level analysis, estimated modeling may be sufficient. For regulatory reporting, digital product passports or configuration-level carbon footprints, manufacturers typically require platforms grounded in verified engineering logic and structured lifecycle datasets.

AI should enhance data quality, not obscure it.

When should AI-powered PLM be used in the product development lifecycle?

Historically, lifecycle analysis and risk assessments were conducted after product design was largely finalized. This limited the ability to influence outcomes.

AI-powered PLM shifts intelligence earlier, into R&D and sourcing workflows. Because AI can instantly evaluate alternative materials, suppliers or configurations, engineering and procurement teams can compare carbon, cost and compliance trade-offs before tooling or production begins.

The greatest value of AI in PLM is realized when intelligence informs decisions before design freeze, not after product launch.

Is AI-powered PLM relevant for companies with mature PLM systems?

Yes. In fact, mature PLM environments benefit the most.

Enterprise manufacturers using systems such as Teamcenter, Windchill or 3DEXPERIENCE already have structured product data. What is often missing is cross-functional intelligence layered across cost, supplier risk and sustainability dimensions.

AI-powered PLM does not replace core engineering systems. It supplements them by enriching structured data and connecting it to broader enterprise objectives.

For organizations with global supply chains, AI becomes an infrastructure enhancement rather than a system replacement.

Forrester study offers crucial insight into competitive advantages of sustainable product development

A note from Makersite CEO Neil D’Souza

Makersite is proud to announce the launch of our co-authored study with Forrester, titled “Transform Product Sustainability into Performance Initiatives with Product Lifecycle Intelligence”. Not only is it a pivotal moment in Makersite’s journey and growth as a company, but it is also a confirmation of our vision and a validation of the underlying goals and objectives that serve as the foundation on which our software has been built.


Make it better, not make it faster

As a society, we have reached a critical turning point. More and more companies are producing high-profile, single-use products, which in turn has created a new normal: a continuing acceptance that ever-increasing consumption and rapid disposal are fine. And that is before considering that some 80% of a product’s ecological impact is determined in the design phase alone.

We exist in a place and a time where “take, make and waste” has become the norm. But there is a different future, one based around “make it better” rather than “make it faster.”

In order to make that vision a reality, however, we have to give those leading the charge the right platform to succeed. That is the purpose of Makersite. I see a future where those people who make and manufacture products – our product engineers and product designers – drive strategic transformation underpinned by a shared, compelling vision, financial support based on more than just commercial imperatives, and a dynamic ecosystem that is agile, efficient and geared toward ethical, criteria-driven innovation.

The power of data

Empowered with the right tools and best practices to make better products faster, engineers can collaborate and take the actions that will make a difference. Products can be more sustainable, more efficient and more cost-effective while still making money and ensuring a profitable, healthy business. However, we must first give engineers a foundation to work from.

If we present engineers with the data they need, they will use it – and use it well. No one wants to make a ‘bad’ product, but ‘good’ products can only be made with the right decisions informed by the right data.

Some may argue that this is wishful thinking or not worth the effort. However, a Bain & Company study found that while only 40% of businesses are on track to meet their sustainability goals, companies have an increasingly conscious and proactive base of consumers, willing to pay 11% more for sustainable products, and employees ready to help.

A recent IBM report also noted that organizations that embed sustainability in their product design processes experience a 16% higher rate of revenue growth. They are 52% more likely to outperform their peers on profitability, and twice as likely to attribute significant improvements in operating costs to their sustainability efforts.

It’s not just blue-sky thinking for a greener future either. The most significant driver for companies to do anything has always been growing revenue. A 2022 report – the Sustainable Market Share Index – by NYU Stern’s Center for Sustainable Business examined what actually happened in the last decade and found that the share of CPG products marketed as being sustainable grew twice as fast as conventional products and accounted for one-third of the total revenue growth in the industry. Customers paid 27% more for those products.

Now is the time

With a massive demographic shift bringing more environmentally conscious buyers into the market already well underway, there has never been a better time to build better products. This is what our study with Forrester reiterates. We have seen it, we have talked about it, and we have translated ideas into actions with some of our biggest customers – Microsoft, Barco, Cummins, Schaeffler. It is fantastic to see that validated here.

We are amidst an unprecedented revolution that is changing not just the products we make, but how we make them. Companies that are set up for rapid change are becoming the new leaders of tomorrow and we’re already witnessing this evolution. Unfortunately, companies have only a piece of the data needed to make decisions quickly – the internal view. Most of the data needed comes from the world outside – supply chains, costs, regulations, impacts and others. This is the external view. The ability to combine the two instantaneously to drive better decisions and tradeoffs will avoid costly mistakes, shorten time-to-market and create more sustainable and successful products.

Study findings

The study, which includes insights from 493 product design and sourcing decision-makers in manufacturing, found that surging customer expectations alongside stringent regulations are pushing leaders to revolutionize their approaches to product lifecycle management and supply chain operations.

These leaders are increasingly championing sustainability within their organizations. However, they face significant hurdles; they’re often without the backing of upper management to enact meaningful strategic transformation, and they lack the necessary tools to provide access to precise, detailed data on materials, components and suppliers. This scenario underscores a critical juncture for decision-makers in product design and sourcing, urging a radical shift towards integrated, sustainable practices that meet the demands of a dynamic global market.

Ultimately, the study offers compelling evidence for the hypothesis that we have long been promoting as a company: that harmonized access to a breadth of reliable data forms the foundation of efficient product design and sourcing, and that organizations that are able to leverage granular product lifecycle intelligence in product development will enjoy competitive advantage against their peers with a faster time to market and more successful and profitable products.

There is plenty more to uncover in the study itself – from a wealth of new data to in-depth insight on the importance of data quality to a selection of actionable key takeaways that organizations can apply to their own business operations. Take your time to read our findings in full here.

Thanks,

Neil