
Key Takeaways: AI-Driven Digital Transformation in EHS & Sustainability

The Core Problem: Complexity Has Outpaced the Tools

The complexity of modern product portfolios and multi-tier supply chains has outpaced what traditional EHS and sustainability tools can handle. Companies are now being asked product-specific, substance-level questions by customers, regulators, and investors. Most lack the integrated data infrastructure to answer them.

This is not a niche problem. Enterprise manufacturers managing millions of products and tens of thousands of chemical substances cannot generate reliable life cycle data, Scope 3 emissions figures, or audit-ready compliance records using spreadsheets, disconnected ERP systems, or manual research. The volume and precision required make human-scale processes structurally unworkable.

AI is entering this space not because it is fashionable, but because there is no other path to scale.

What Is Actually Working vs. What Is Still Hype

The panel was direct on this. AI in EHS and sustainability is generating real value today in specific, well-scoped use cases, and falling short where the underlying data foundation is missing.

What’s working today:

  • Automated LCA and PCF generation at product and configuration level, where AI processes full material declarations, maps substance-level data to background databases, and generates traceable life cycle inventories without manual modeling effort.
  • AI-assisted chemical data modeling for substances where no emission factors or LCI datasets exist, using synthesis and pathway data to fill gaps rather than defaulting to averages or proxies.
  • Continuous compliance monitoring against expanding regulatory frameworks, where AI matches BOM-level data to substance watchlists in real time.
  • Scope 3 supply chain mapping across multi-tier supplier networks, surfacing hotspots and prioritizing data collection where it matters most.

Still more promise than reality:

  • Fully autonomous sustainability decision-making without expert validation. AI cannot produce ISO-compliant outputs without human oversight of methodology and data quality.
  • Generic large language model deployments without deep sustainability domain training. The specificity of EHS methodology, LCA system boundaries, and substance-level compliance cannot be approximated by general-purpose models.
  • AI layered on top of structurally broken data processes. Fragmented, siloed, unvalidated inputs produce unreliable outputs regardless of the model.

The consistent finding: AI works when it is applied to specific, high-impact use cases on a structured data foundation. It does not work as a substitute for that foundation.

The Real Bottleneck Is Data Readiness, Not Model Capability

One of the most technically substantive discussions in the keynote focused on where enterprise organizations actually get stuck. Not in AI capability, but in data readiness.

Consider what it takes to generate a product carbon footprint for a manufacturer with a complex chemical portfolio. Measured LCI datasets and emission factors exist for only a fraction of the substances involved. The remainder must be modeled from synthesis pathways, process data, or representative chemical categories. For a single product, dozens of custom LCA datasets may need to be generated from hundreds of candidate substances. Across a portfolio of millions of products, this is a multi-year data engineering challenge.

The approach that works is incremental and methodologically rigorous; a simplified sketch of the triage logic follows the list.

  • Map existing coverage first. Identify what background database coverage already exists and where manufacturer-provided LCA data can be matched exactly. This scopes the true gap before any modeling begins.
  • Prioritize by impact. Focus custom dataset generation on substances with the greatest frequency and material contribution across the portfolio. Starting with the most-used materials delivers meaningful coverage without attempting to solve everything at once.
  • Model the long tail by category. Remaining substances can be grouped into chemical categories, such as solvent classes or inorganic groups, and represented by datasets with defined variance, min/max ranges, and documented error margins. This is scalable and auditable.
  • Handle marginal contributors appropriately. Substances that contribute negligible quantities to the final product can be represented using high-level grouped data, such as average organic or inorganic chemical classifications, without materially affecting output accuracy.
  • Align on methodology before scaling. ISO compliance requirements, error margin conventions, and how averages are applied in reporting must be agreed between technical and sustainability teams before outputs are used for external disclosure.
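
As a rough illustration of that triage, the Python sketch below buckets a handful of substances into the four groups described above: already covered, priority custom datasets, category-level modeling, and marginal contributors. The record fields, thresholds, and substance names are invented for the example and do not reflect Makersite's actual data model or cut-off rules.

```python
from collections import defaultdict

# Each record: (substance, chemical category, portfolio frequency,
# mass share in a typical product, whether a background LCI dataset exists)
substances = [
    ("ethylene glycol",     "organic solvent", 1200, 0.12,  True),
    ("proprietary resin A", "polymer",          800, 0.30,  False),
    ("sodium silicate",     "inorganic",        120, 0.05,  False),
    ("trace additive X",    "organic",           30, 0.001, False),
]

covered, custom_priority, long_tail, marginal = [], [], [], []

for name, category, frequency, mass_share, has_dataset in substances:
    if has_dataset:
        covered.append(name)                    # step 1: map existing coverage
    elif frequency * mass_share >= 10:          # step 2: prioritize by impact
        custom_priority.append(name)
    elif mass_share < 0.01:                     # step 4: marginal contributors
        marginal.append((name, category))
    else:                                       # step 3: model the long tail by category
        long_tail.append((name, category))

grouped = defaultdict(list)
for name, category in long_tail + marginal:
    grouped[category].append(name)              # represented by grouped datasets with error margins

print("already covered:        ", covered)
print("custom datasets first:  ", custom_priority)
print("category-level modeling:", dict(grouped))
```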

This is not a “plug in AI and get answers” workflow. It is a structured, expert-guided process in which AI dramatically accelerates each step. The methodological rigor is still required. Makersite is built to support exactly this kind of layered, scalable approach, from data ingestion and substance-level mapping through to audit-ready LCA and PCF outputs.

How the Sustainability Leader Role Changes in 3 to 5 Years

The panel’s view here was grounded rather than speculative.

The shift is not from human judgment to automated decision-making. It is from reactive reporting to real-time insight generation. Sustainability leaders who today spend significant time on data collection, supplier follow-up, and manual LCA modeling will increasingly function as analysts and strategists, interpreting AI-generated outputs, setting data quality standards, and embedding sustainability criteria directly into product design and procurement decisions.

The implication for organizations is clear: the value of the sustainability function is increasingly determined by the quality of its data infrastructure, not its headcount. Teams that build structured, auditable data pipelines now will have a structural advantage in regulatory readiness and decision speed within the 3 to 5 year window.

Where Scope 3 Is Hardest

Scope 3 emissions, particularly Category 1 (purchased goods and services), remain the most difficult area, and the panel was specific about why.

The problem is not the emissions calculation. It is the absence of primary data at the supplier level. Most Scope 3 analyses rely on spend-based or industry-average approaches because supplier-specific, product-level emissions data does not exist in structured, accessible form. AI can model gaps with defined uncertainty, but it cannot compensate for missing primary data.
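
The data-quality gap can be made concrete with a toy comparison of the two approaches for a single purchased component. The emission factors, quantities, and uncertainty bands below are illustrative assumptions, not reference values.

```python
def spend_based_estimate(spend_eur: float, eeio_factor_kg_per_eur: float,
                         uncertainty: float = 0.5) -> tuple[float, float, float]:
    """Spend multiplied by an economy-wide factor, with a wide (assumed +/-50%) band."""
    central = spend_eur * eeio_factor_kg_per_eur
    return central, central * (1 - uncertainty), central * (1 + uncertainty)


def primary_data_figure(units: int, supplier_pcf_kg_per_unit: float,
                        uncertainty: float = 0.1) -> tuple[float, float, float]:
    """Units multiplied by a supplier-reported PCF, with a narrower (assumed +/-10%) band."""
    central = units * supplier_pcf_kg_per_unit
    return central, central * (1 - uncertainty), central * (1 + uncertainty)


# Same purchase, two data qualities (all numbers are made up)
print("spend-based  :", spend_based_estimate(spend_eur=50_000, eeio_factor_kg_per_eur=0.45))
print("primary data :", primary_data_figure(units=10_000, supplier_pcf_kg_per_unit=2.1))
```

The point is not the arithmetic but the width of the band: spend-based figures track spend rather than engineering reality, which is why primary data collection matters.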

Organizations making the most progress on Scope 3 share three characteristics. They have built structured supplier data collection processes (full material declarations, BOM-level inputs) that feed directly into LCA and PCF workflows. They have invested in component-level modeling that can be reused across product families rather than rebuilt product by product. And they have established methodology alignment across sustainability, engineering, and commercial teams so that AI-generated outputs are trusted and acted upon.

The approach that consistently does not work: attempting to resolve Scope 3 at the portfolio level with aggregate methods while continuing to operate disconnected, product-level data systems.

What an AI-Enabled Sustainability Decision Looks Like at the Design Level

The most forward-looking discussion centered on product design, where the greatest leverage exists.

In the next three to five years, engineers making component selection decisions will have real-time access to sustainability impact data at the substance and configuration level. Selecting a different supplier or substituting a material will immediately surface its LCA, compliance, and Scope 3 implications before the decision is finalized, not weeks later during a sustainability review.

This is technically achievable today for organizations that have built the necessary data infrastructure. The constraint is not AI capability. It is the availability of structured, substance-level product and supplier data in a form that AI can use.

The leadership implication is significant: Sustainability decisions in the next decade will increasingly be made by product engineers, not sustainability teams in isolation. The function of the sustainability team shifts to building and maintaining the data systems, methodological standards, and AI tooling that make those decisions possible at scale.

One non-negotiable: AI-generated sustainability outputs require full audit trails. ISO-aligned PCFs and LCAs need traceable, validated data lineages. Explainability is a technical requirement, not an optional feature.

Watch-Outs as AI Gets Embedded in EHS Workflows

The panel closed with a frank assessment of where AI adoption fails in EHS and sustainability.

Treating AI as a substitute for data quality is the most common mistake. AI can model gaps and generate datasets for missing substances, but it cannot produce defensible outputs from structurally flawed inputs. Organizations that skip data foundation work before deploying AI will generate results that fail audit, regulatory, or customer scrutiny.

Neglecting methodology alignment is the second failure pattern. Different LCA system boundary definitions and allocation approaches can produce materially different results from the same underlying data. If sustainability, engineering, and commercial teams are not aligned on methodology before AI outputs are generated, those outputs will be contested internally before they reach any external stakeholder.

Underestimating the supplier engagement requirement is the third. Scaling sustainability data across complex supply chains is not purely a technology problem. Thousands of suppliers must participate in structured data collection for AI-generated outputs to reflect primary data rather than estimates. That requires change management and supplier enablement, not just software.

And finally: confusing speed with accuracy. AI generates outputs faster. Faster outputs with unquantified uncertainty are not more useful than slower outputs with defined error margins. Speed and methodological precision must be calibrated together.

The One Thing Leaders Should Understand Right Now

Organizations that will use AI effectively for sustainability in 3 to 5 years are the ones building structured data foundations today.

The technology is ready. The bottleneck is data: specifically, the absence of product-level, substance-level, supplier-validated data organized in a way that AI can work with. Progress comes from practical, incremental steps: mapping what data exists, identifying the highest-priority gaps, and systematically closing them through supplier engagement, AI-assisted modeling, and expert validation.

As Manuel noted ahead of NAEM OPEX/TECH26, “Progress tends to come from practical steps that build confidence, not from trying to solve everything at once. It’s an evolution, not a revolution.”

That remains the right starting point.

In Practice: How Lenovo ThinkPad Solved This at Scale

The challenge described throughout this keynote is not theoretical. Lenovo’s ThinkPad team worked through it directly with Makersite.

ThinkPad faced increasing pressure in enterprise procurement bids requiring configuration-specific, ISO-aligned Product Carbon Footprints. A single model-level PCF cannot represent variation across customer configurations. Without configuration-level visibility, ThinkPad could not demonstrate how component choices influenced the final footprint. This was a measurable gap in competitive tenders.

The approach Makersite and Lenovo took maps precisely to the methodology described in this article. Supplier Full Material Declarations were ingested and automatically converted into substance-level LCA models; more than 2.5 million FMDs were processed through Makersite. Rather than modeling every product variant independently, ThinkPad shifted to a shared-component approach: SSDs, displays, memory, and chassis were validated once and reused across product families. Makersite then generated highly granular substance-level LCAs for the parts and assemblies behind each shared component, work that would have taken years manually.

The outcome: configuration-level, ISO-aligned PCFs generated at scale, certified, and usable in enterprise sales conversations. ThinkPad sellers can now demonstrate how specific component choices move the carbon footprint up or down, using traceable data rather than estimates.

Internally, sustainability, engineering, and commercial teams now work from the same data. That alignment between people, methodology, and system is what makes the outputs usable, not just accurate.

Next Steps

If your organization is navigating product-level sustainability data challenges, whether for LCA, Scope 3, PCF, or compliance, the starting point is understanding where your data stands and where the highest-impact gaps exist. Makersite works with enterprise manufacturers to build and scale that foundation.

Book a conversation with Makersite >

9 AI-Powered PLM Software Solutions for Manufacturers in 2026

What Is AI-Powered PLM?

AI-powered PLM refers to Product Lifecycle Management systems enhanced with artificial intelligence to improve how manufacturers manage, analyze and act on product data across the lifecycle. Traditional PLM systems are systems of record. They store CAD files, manage engineering change orders, track part structures and maintain BOM integrity. AI-powered PLM systems go further. They transform structured product data into decision intelligence.

In practice, AI in PLM can mean:

  • Automatically classifying and cleansing part data
  • Predicting the impact of engineering changes
  • Optimizing simulation models
  • Mapping multi-tier suppliers
  • Filling gaps in material or process data
  • Enriching BOMs with cost, risk, carbon or compliance signals
  • Enabling real time trade off analysis across engineering and procurement

For enterprise manufacturers managing thousands of components across global supply chains, AI-powered PLM becomes less about automation and more about infrastructure. It connects engineering, procurement, compliance and sustainability inside the digital thread.

However, not all AI in PLM is equal.

Some vendors embed AI directly into engineering workflows. Some apply AI primarily to simulation and digital twins. Some use AI to harmonize enterprise data across ERP and PLM. Others focus on sustainability intelligence and supplier risk modeling. For global enterprise manufacturers operating complex, configurable BOMs, the critical question is not whether AI exists inside the platform.

The critical question is: Does the AI operate at BOM level and influence real product decisions across engineering, sourcing and compliance?

Below are nine AI-powered PLM software solutions shaping enterprise manufacturing in 2026.

1. Makersite

Makersite is a granular, AI-powered Product Lifecycle Intelligence platform purpose built for complex manufacturing sectors, with a strong presence in electronics, automotive, industrial machinery, construction, chemicals and industrial goods.

Makersite tackles the core issue of enterprise PLM environments: structured product data exists, but cross functional intelligence does not. BOMs sit in PLM. Supplier data sits in ERP. Environmental data lives in separate tools. Critical decisions are made without a unified intelligence layer. Rather than replacing PLM systems, Makersite connects to them and enriches structured product data using deeply specialized AI.

How AI is used:

  • Context rich gap filling: Dedicated industry trained AI agents infer missing supplier, material and process data by analyzing BOM structure, manufacturing context and sourcing patterns across multi tier supply chains.
  • Automated background database matching: AI automatically maps BOM inputs to environmental datasets, risk databases and compliance indicators, reducing manual mapping effort dramatically.
  • What if scenario modeling: AI enables real time trade off analysis across carbon, cost, supplier risk and regulatory exposure at configuration level.
  • Multi tier supplier mapping: AI reconciles inconsistent supplier naming and identifies relationships across complex global networks.

Differentiator:

Makersite’s differentiator is its combination of a large structured manufacturing data foundation with highly specialized AI agents trained on industrial context. Its AI understands manufacturing logic, making it highly accurate for complex, configurable BOMs. Best for enterprise manufacturers managing complex BOMs who need accurate environmental, cost and compliance modeling integrated into engineering workflows.

2. Siemens Teamcenter with AI Capabilities

Siemens Teamcenter is a leading enterprise PLM system with embedded AI focused on engineering optimization and digital twin enablement. Teamcenter addresses the need for structured product data governance at global scale. Its AI capabilities enhance internal engineering processes rather than external supplier intelligence.

How AI is used:

  • Intelligent part classification to reduce manual categorization
  • Change management automation through predictive impact analysis
  • Digital twin optimization using simulation driven AI
  • Knowledge reuse across engineering programs

Differentiator:

Teamcenter’s differentiator is the depth of AI embedded directly inside core engineering workflows and digital twin environments. The AI operates within the system of record rather than as an external layer. Best for large global manufacturers with mature PLM environments focused on engineering performance and simulation optimization.

3. PTC Windchill

PTC Windchill combines PLM with IoT data through its broader ecosystem, using AI to enhance lifecycle visibility and configuration management. Windchill addresses the need to connect product data with real world performance signals.

How AI is used:

  • Predictive analytics on product performance
  • Configuration optimization across variants
  • Closed loop lifecycle insights from connected product data
  • Automated impact analysis across engineering changes

Differentiator:

Windchill’s differentiator is its integration of PLM with IoT and service data, allowing AI to inform decisions using real world performance feedback. Best for industrial machinery and heavy equipment manufacturers managing connected assets and configurable products.

4. Dassault Systèmes 3DEXPERIENCE

Dassault’s 3DEXPERIENCE platform embeds AI primarily within simulation and advanced modeling workflows. The platform addresses the need for design optimization and performance simulation in highly engineered environments.

How AI is used:

  • Simulation driven optimization of materials and structures
  • Predictive modeling of performance scenarios
  • AI-assisted design exploration
  • Digital twin refinement

Differentiator:

Dassault’s differentiator lies in simulation depth. AI enhances computational modeling rather than multi tier supplier intelligence. Best for aerospace and automotive manufacturers with heavy reliance on simulation and advanced materials engineering.

5. SAP PLM with AI

SAP integrates PLM functionality into its ERP backbone, using AI for data harmonization and predictive enterprise analytics. SAP addresses enterprise wide data consistency and financial integration.

How AI is used:

  • Master data harmonization across systems
  • Predictive supply chain insights
  • Demand forecasting and risk identification
  • Intelligent workflow automation

Differentiator:

SAP’s differentiator is enterprise integration. AI connects lifecycle data with financial and procurement systems at scale. Best for global enterprises prioritizing unified ERP and lifecycle data governance.

6. Aras Innovator

Aras Innovator is a flexible PLM platform that supports AI extensions through configurable architecture. Aras addresses manufacturers that require adaptable lifecycle workflows across diverse product portfolios.

How AI is used:

  • Custom analytics and reporting extensions
  • AI powered document search and knowledge retrieval
  • Configurable workflow automation

Differentiator:

Aras differentiates through architectural flexibility. AI capabilities are shaped by implementation rather than delivered as fixed modules. Best for manufacturers seeking customizable PLM infrastructure with tailored AI workflows.

7. Oracle Agile PLM

Oracle Agile remains strong in compliance driven PLM environments, particularly in electronics and high tech sectors. Agile addresses structured documentation, regulatory management and controlled product record environments.

How AI is used:

  • Automated classification and search
  • Compliance analytics through Oracle Cloud services
  • Risk monitoring across supplier documentation

Differentiator:

Oracle Agile differentiates through compliance centric PLM strength, with AI augmenting documentation and regulatory tracking. Best for electronics manufacturers managing strict compliance and documentation requirements.

8. Propel PLM

Propel is a cloud native PLM built on Salesforce infrastructure, targeting modern manufacturing companies. Propel addresses collaboration and lifecycle visibility in cloud first environments.

How AI is used:

  • CRM integrated product insights
  • Workflow automation through Salesforce AI
  • Analytics across customer and product lifecycle data

Differentiator:

Propel differentiates through tight CRM and PLM integration, bringing AI insights across customer and product domains. Best for growth oriented manufacturers aligning product management with customer intelligence.

9. Sustainability Platforms Adjacent to PLM

Platforms such as Sphera focus on compliance databases and environmental risk monitoring that operate adjacent to PLM systems. These platforms address regulatory intelligence rather than engineering integrated intelligence.

How AI is used:

  • Automated regulatory tracking
  • Risk signal monitoring
  • Data normalization for reporting

Differentiator:

These platforms differentiate through regulatory database breadth and compliance depth rather than embedded product level intelligence. Best for compliance-focused sustainability programs that run parallel to engineering workflows.

 

When evaluating AI-powered PLM Software Solutions, enterprise manufacturers should ask these questions

1. Is the platform a system of record or an intelligence layer?

Some platforms replace or serve as core PLM systems. Others operate as AI intelligence layers that integrate with existing PLM and ERP environments.

If your organization already runs Siemens Teamcenter, PTC Windchill or Dassault, replacing PLM may not be realistic. In that case, an AI enrichment layer may be more strategic.

Clarify whether you are modernizing infrastructure or augmenting it.

2. Does AI operate at BOM level depth?

High level dashboards are not enough for complex manufacturing.

Ask:

• Can the platform ingest multi level BOMs?
• Can it analyze configuration variants?
• Does AI enrich individual line items?
• Can it model trade offs at component level?

For manufacturers managing thousands of components per product, BOM level intelligence is critical.
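
For a concrete picture of what "BOM-level" means, the minimal sketch below rolls carbon and cost up a two-level assembly tree so that trade-offs can be inspected per line item. The structure and values are hypothetical and far simpler than a real multi-level BOM.

```python
from dataclasses import dataclass, field

@dataclass
class BomNode:
    name: str
    quantity: int = 1                       # quantity per parent assembly
    carbon_kg: float = 0.0                  # embodied carbon per unit (leaf level)
    cost_eur: float = 0.0                   # cost per unit (leaf level)
    children: list["BomNode"] = field(default_factory=list)

    def rollup(self) -> tuple[float, float]:
        """Aggregate carbon and cost from this node and everything below it."""
        carbon, cost = self.carbon_kg, self.cost_eur
        for child in self.children:
            child_carbon, child_cost = child.rollup()
            carbon += child_carbon * child.quantity
            cost += child_cost * child.quantity
        return carbon, cost

laptop = BomNode("laptop", children=[
    BomNode("mainboard", children=[
        BomNode("SoC", carbon_kg=18.0, cost_eur=95.0),
        BomNode("DRAM module", quantity=2, carbon_kg=6.5, cost_eur=22.0),
    ]),
    BomNode("chassis", carbon_kg=4.2, cost_eur=12.0),
])

total_carbon, total_cost = laptop.rollup()
print(f"{total_carbon:.1f} kg CO2e, {total_cost:.2f} EUR for this configuration")
```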


3. How does the platform handle missing supplier or material data?

Incomplete data is the norm, not the exception.

Evaluate:

• Does the system rely solely on declared supplier data?
• Does it use context aware AI to infer missing attributes?
• Are modeling assumptions transparent and traceable?
• Can estimated values be replaced with primary data later?

The ability to manage uncertainty intelligently often determines scalability.

4. How well does it integrate with existing enterprise systems?

AI-powered PLM should not create new silos.

Assess:

• API depth with PLM and ERP systems
• Compatibility with supplier portals
• Ability to export structured outputs for reporting
• Security and data governance controls

Enterprise adoption depends on seamless integration into current workflows.

5. Does it support cross functional decision making?

PLM historically served engineering.

Modern AI-powered PLM must also serve:

• Procurement teams evaluating supplier risk
• Sustainability teams modeling Scope 3 impact
• Compliance teams tracking regulatory exposure
• Finance teams analyzing cost exposure

Ask whether the platform enables concurrent evaluation of carbon, cost and compliance trade offs.

6. Can it scale across global, multi-tier supply chains?

Enterprise manufacturers operate across regions, currencies and regulatory regimes.

Evaluate:

• Multi tier supplier mapping capabilities
• Localization for regulatory frameworks
• Ability to support digital product passport requirements
• Performance at enterprise data volumes

Scalability is not just about user count. It is about data complexity.

7. Does it influence decisions before design freeze?

Many tools accelerate reporting. Fewer influence product design.

The most strategic AI-powered PLM solutions:

• Integrate directly into early design workflows
• Enable what if scenario modeling
• Provide insights during sourcing decisions
• Support engineering trade off analysis in real time

If intelligence only appears after the product is finalized, the strategic value is limited.

Final Thought: The Future of PLM Is Decision Intelligence

PLM modernization is no longer a technology upgrade.

It is a strategic shift in how manufacturers make product decisions.

As supply chains become more complex and regulatory expectations intensify, intelligence cannot remain siloed in reporting tools or disconnected systems. AI-powered PLM must operate inside the digital thread, linking engineering structure with supplier visibility, cost dynamics and sustainability impact in real time.

The competitive advantage will not come from managing more product data.

It will come from transforming product data into actionable intelligence at the exact moment decisions are made.

Still Have Questions? Let’s Dig Deeper

What makes PLM software “AI-powered” versus traditional PLM systems?

Traditional PLM systems act as systems of record. They manage CAD files, BOM structures, engineering change orders and product documentation. Intelligence typically comes from human analysis layered on top of structured data.

AI-powered PLM introduces machine learning, semantic mapping and predictive modeling directly into the lifecycle workflow. Instead of simply storing product data, AI enriched systems classify components automatically, infer missing attributes, predict the impact of engineering changes, map suppliers across inconsistent naming structures and generate scenario based insights in real time.

The key difference is that AI-powered PLM transforms product data into decision intelligence rather than static documentation.

How does AI in PLM handle incomplete or inconsistent BOM data?

Incomplete BOM data is one of the biggest constraints in enterprise manufacturing. Supplier declarations may be missing. Material compositions may be partially defined. Multi tier sourcing data is rarely transparent.

AI-powered PLM platforms address this through context aware modeling. Instead of relying solely on declared attributes, AI analyzes the component’s category, application, manufacturing context and known supplier patterns to infer likely material compositions or process assumptions.

More advanced platforms also reconcile duplicate supplier records, normalize inconsistent naming conventions and map parts to standardized datasets automatically. This reduces manual cleansing and accelerates time to insight without compromising engineering governance.
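
A deliberately simplified sketch of that context-aware fallback is shown below: declared supplier data is used when present; otherwise a reference composition for the component's category is substituted and flagged as estimated. The categories, compositions, and field names are assumptions for illustration, not any vendor's actual logic.

```python
REFERENCE_COMPOSITIONS = {
    # component category -> assumed material shares by mass (illustrative)
    "aluminium housing":     {"aluminium": 0.97, "coating": 0.03},
    "printed circuit board": {"FR-4 laminate": 0.60, "copper": 0.30, "solder": 0.10},
}

def resolve_composition(component: dict) -> dict:
    """Prefer declared supplier data; otherwise infer from the component's category."""
    if component.get("declared_composition"):
        return {"composition": component["declared_composition"],
                "source": "supplier declaration"}
    inferred = REFERENCE_COMPOSITIONS.get(component.get("category", ""))
    if inferred is None:
        return {"composition": None, "source": "unresolved, needs manual review"}
    return {"composition": inferred, "source": "estimated from category"}

print(resolve_composition({"name": "rear housing",
                           "category": "aluminium housing",
                           "declared_composition": None}))
```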

Can AI powered PLM replace sustainability or compliance tools?

In most enterprise architectures, AI-powered PLM does not replace sustainability or compliance platforms. It complements them.

PLM remains the system of record for structured product data. Sustainability tools manage regulatory reporting frameworks. Compliance systems track substance declarations and documentation.

AI-powered PLM acts as a connective layer. It enriches product data with environmental, cost and risk intelligence before reporting begins. Instead of exporting static BOMs to downstream tools, manufacturers can integrate intelligence upstream in the product development lifecycle.

This shifts sustainability from retrospective reporting to proactive design decision support.

How accurate are AI generated environmental or supplier estimates?

Accuracy depends heavily on the platform’s underlying data foundation and modeling methodology.

Some tools rely primarily on spend based emissions or generalized industry averages. Others use contextual AI trained on manufacturing datasets to infer missing attributes more precisely.

For exploratory portfolio level analysis, estimated modeling may be sufficient. For regulatory reporting, digital product passports or configuration level carbon footprints, manufacturers typically require platforms grounded in verified engineering logic and structured lifecycle datasets.

AI should enhance data quality, not obscure it.

When should AI-powered PLM be used in the product development lifecycle?

Historically, lifecycle analysis and risk assessments were conducted after product design was largely finalized. This limited the ability to influence outcomes.

AI-powered PLM shifts intelligence earlier into R&D and sourcing workflows. Because AI can instantly evaluate alternative materials, suppliers or configurations, engineering and procurement teams can compare carbon, cost and compliance trade offs before tooling or production begins.

The greatest value of AI in PLM is realized when intelligence informs decisions before design freeze, not after product launch.

Is AI-powered PLM relevant for companies with mature PLM systems?

Yes. In fact, mature PLM environments benefit the most.

Enterprise manufacturers using systems such as Teamcenter, Windchill or 3DEXPERIENCE already have structured product data. What is often missing is cross functional intelligence layered across cost, supplier risk and sustainability dimensions.

AI-powered PLM does not replace core engineering systems. It supplements them by enriching structured data and connecting it to broader enterprise objectives.

For organizations with global supply chains, AI becomes an infrastructure enhancement rather than a system replacement.

7 Product Compliance Software Solutions for Manufacturers in 2026

What Is Product Compliance Software?

Product compliance software helps manufacturers prove that materials, substances, and chemicals used in their products are not restricted or banned in the regions where they sell.

Modern products contain thousands of components and tens of thousands of materials and chemical substances. Regulations such as the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH), the Restriction of Hazardous Substances Directive (RoHS), the Toxic Substances Control Act (TSCA), the Substances of Concern In Products database (SCIP), and expanding per- and polyfluoroalkyl substances (PFAS) restrictions continue to grow in scope.

For many manufacturers, the challenge is not understanding the rules. It is proving, at component level, that substances inside products comply across every market where they are sold.

Compliance teams are often:

  • Chasing suppliers for material declarations
  • Reformatting substance data for different reports
  • Reassessing entire portfolios when regulations update
  • Paying external service providers for recurring compliance reports

The real issue is data structure and ownership. If material and substance data are not connected to structured bills of materials (BOMs), compliance becomes manual, slow, and expensive.
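
To show what "connected to structured BOMs" buys you, here is a minimal screening sketch: each substance on each BOM line is checked against a restricted-substance list with concentration thresholds. The CAS numbers, thresholds, and BOM contents are illustrative only and should not be read as regulatory guidance.

```python
RESTRICTED = {
    # CAS number -> (rule label, threshold as mass fraction of the article)
    "7439-92-1": ("RoHS (lead)", 0.001),       # example 0.1% threshold
    "335-67-1":  ("PFAS (PFOA)", 0.000025),
}

bom = [
    {"component": "solder joint", "substances": [{"cas": "7439-92-1", "mass_fraction": 0.004}]},
    {"component": "gasket",       "substances": [{"cas": "9003-07-0", "mass_fraction": 0.980}]},
]

def screen(bom: list, watchlist: dict) -> list:
    """Return one finding per substance that exceeds its threshold."""
    findings = []
    for item in bom:
        for substance in item["substances"]:
            rule = watchlist.get(substance["cas"])
            if rule and substance["mass_fraction"] > rule[1]:
                findings.append(f"{item['component']}: {rule[0]} exceeded "
                                f"({substance['mass_fraction']:.3%} > {rule[1]:.3%})")
    return findings

print(screen(bom, RESTRICTED))
# Re-running screen() with an updated watchlist is the portfolio reassessment step:
# the structured BOM data stays in place, only the substance list changes.
```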

Below are seven product compliance software solutions manufacturers evaluate when modernizing their product, material, and chemical compliance programs.

Makersite

Software-first product compliance embedded into structured digital product models. The platform is designed to assess compliance across entire product portfolios rather than generating one-off reports for individual products.

Key Compliance Capability

  • Bill of materials screening against REACH, RoHS, PFAS, TSCA, and SCIP
  • Dynamic restricted substance list management
  • Component-level substance mapping
  • Integration of full material declaration (FMD) and IPC-1752 data into product models
  • Portfolio-wide reassessments when regulations update

Positioning
Compliance is calculated inside the product model, not as a downstream reporting layer or outsourced service.

Best For
Large manufacturers seeking in-house control and full portfolio visibility at component level.

Assent

Supplier-driven material and substance compliance with regulatory expertise. Assent also maintains a large supplier engagement network, which can reduce duplicate declaration requests across shared suppliers.

Key Compliance Capability

  • Supplier outreach and declaration collection
  • Regulatory content covering REACH, RoHS, TSCA, PFAS, and SCIP
  • Documentation management and reporting workflows

Positioning
Emphasizes regulatory experts and a shared supplier data network.

Best For
Organizations whose compliance workload is centered on supplier declaration management.

iPoint

Global product and chemical compliance with automotive integration. iPoint is widely used where International Material Data System (IMDS) reporting is mandatory.

Key Compliance Capability

  • Screening against REACH and RoHS
  • SCIP submission support
  • Integration with the International Material Data System (IMDS)
  • Support for automotive compliance and sustainability workflows

Positioning
Strong footprint in automotive and industries where IMDS reporting is required.

Best For
Automotive and complex manufacturers with established compliance infrastructure.

Source Intelligence

Software platform combined with managed compliance services. The company brings these together with regulatory expertise to help manage ongoing compliance program execution.

Key Compliance Capability

  • Automated supplier declaration workflows
  • Regulatory reporting for REACH, RoHS, and TSCA
  • Program management and compliance oversight

Positioning
Flexible SaaS and managed service delivery model.

Best For
Manufacturers seeking structured supplier engagement with service support.

GreenSoft Technology

Material and substance compliance management with service orientation. GreenSoft has a strong footprint in electronics manufacturing, where detailed material declarations are frequently required.

Key Compliance Capability

  • Material data collection and validation
  • Substance screening for major global regulations
  • Documentation and reporting management

Positioning
High-touch supplier engagement, particularly within electronics supply chains.

Best For
Manufacturers managing large electronic component portfolios with recurring reporting needs.

SAP Green Token

Material traceability within SAP enterprise environments. GreenToken is typically deployed as part of broader SAP sustainability and supply chain programs rather than as a standalone substance screening engine.

Key Compliance Capability

  • Traceability of certified and regulated materials
  • Integration with SAP Enterprise Resource Planning (ERP) systems
  • Documentation of material provenance

Positioning
Enterprise-native traceability solution for SAP-centric organizations.

Best For
Companies operating heavily within SAP ecosystems requiring material transparency.

Sphera BOMcheck

Sphera BOMcheck operates as a shared declaration platform that enables suppliers to submit standardized data to multiple customers through a single interface.

Key Compliance Capability

  • Collection and standardization of supplier material declarations
  • Screening against REACH and RoHS substance lists
  • Support for SCIP workflows

Positioning
Structured declaration exchange network rather than component-level modeling engine.

Best For
Organizations focused on standardized supplier declaration collection across global supply chains.

 

How to Choose Product Compliance Software

1. Where does your compliance complexity actually sit?

If your primary challenge is supplier declaration collection and documentation management, you need a platform built for structured supplier engagement. If your challenge is screening large, complex bills of materials at component level across multiple product lines, you need software that embeds substance compliance directly into structured product data.

Choosing the wrong category leads to ongoing manual work and service dependency.

2. Do you need compliance embedded in engineering systems, or managed externally?

Some platforms integrate directly into Product Lifecycle Management (PLM) and Enterprise Resource Planning (ERP) systems, enabling compliance checks during product design and sourcing. Others focus primarily on supplier outreach and regulatory documentation workflows.

If compliance decisions must happen early in the design cycle, integration depth matters. If the burden sits in supplier documentation, workflow tooling may be sufficient.

3. How will the system handle regulatory updates at portfolio scale?

Regulations such as REACH and PFAS restrictions evolve frequently. The key question is whether the platform can automatically reassess your full product portfolio when substance lists change, or whether updates require manual rework or external service support.

Scalability and dynamic restricted substance list management are critical for long-term cost control.

 

 

| Vendor | Core Focus | Key Compliance Capability | Best For |
| --- | --- | --- | --- |
| Makersite | Software-first product compliance | Component-level substance screening and portfolio reassessment | Manufacturers embedding substance compliance into product design workflows |
| Assent | Supplier-driven compliance | Material declaration collection and regulatory services | Supplier-heavy compliance programs |
| iPoint | Automotive and global compliance | REACH, RoHS screening and IMDS integration | Automotive and global manufacturers |
| Source Intelligence | SaaS and managed compliance | Supplier outreach and regulatory reporting | Structured supplier programs |
| GreenSoft Technology | Service-led material compliance | Data collection and substance screening | Electronics-heavy portfolios |
| SAP Green Token | SAP-based traceability | Certified material tracking and ERP integration | SAP-centric enterprises |
| Sphera BOMcheck | Declaration exchange platform | Standardized supplier declaration screening | Global supplier networks |

Still Have Questions? Let’s Dig Deeper

Can one platform handle all stages of product compliance?

No. Most organizations use multiple tools because compliance spans distinct activities: regulatory research, early design analysis, supplier data collection, and ongoing change monitoring. Each stage has different data inputs, users, and workflow requirements. Platforms that excel at supplier declaration management, for example, are not typically designed for early-stage BOM screening or regulatory horizon scanning. Mature compliance programs layer tools based on where they need intelligence applied.

When should compliance screening happen in product development?

The earlier the better. Identifying restricted substances or regulatory gaps during early design avoids costly late-stage redesigns, supplier changes, or market access delays. However, many organizations still treat compliance as a final validation checkpoint because their tools only work with finalized BOMs. AI-powered platforms that can analyze incomplete or inconsistently formatted data enable compliance screening earlier when design changes are still feasible.

What is the difference between regulatory change monitoring and horizon scanning?

Regulatory change monitoring tracks updates to existing regulations—amendments, new substance additions, threshold changes. Horizon scanning goes further by identifying emerging regulations, policy signals, and legislative trends before they become formal requirements. Change monitoring is reactive (what changed today); horizon scanning is predictive (what’s likely coming). Organizations use both: change monitoring for operational compliance, horizon scanning for strategic product planning.

8 AI-Powered LCA Software Solutions for Manufacturers in 2026

What Is AI-Powered LCA Software?

AI-powered Life Cycle Assessment (LCA) software uses machine learning (ML), large language models (LLMs), and predictive algorithms to quantify the environmental impacts of a product across its full value chain. Traditional LCA is notoriously labor-intensive, requiring months of manual data collection and consulting hours. AI disrupts this by automating the most painful friction points: filling data gaps when supplier data is missing, matching complex bills of materials (BOMs) to background databases, and running what-if scenario models at scale.
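
As a small example of the what-if pattern, the sketch below models the same part with two candidate materials and compares the resulting footprints. The emission factors are hard-coded placeholders; real platforms resolve them from background databases such as Ecoinvent.

```python
EMISSION_FACTORS = {   # kg CO2e per kg of material, illustrative values only
    "virgin aluminium":   16.0,
    "recycled aluminium":  2.5,
    "ABS plastic":         3.4,
}

def part_footprint(material: str, mass_kg: float, process_kg_co2e: float) -> float:
    """Material impact plus a fixed manufacturing-process contribution."""
    return EMISSION_FACTORS[material] * mass_kg + process_kg_co2e

baseline = part_footprint("virgin aluminium",   mass_kg=0.8, process_kg_co2e=1.2)
scenario = part_footprint("recycled aluminium", mass_kg=0.8, process_kg_co2e=1.2)

print(f"baseline {baseline:.1f} kg CO2e, scenario {scenario:.1f} kg CO2e, "
      f"saving {baseline - scenario:.1f} kg CO2e per part")
```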

In 2026, regulatory pressure has accelerated the demand for these tools. However, not all AI is built the same. The market is currently split between platforms built on robust, industry-specific data foundations that use AI to enrich existing data, and newer platforms relying heavily on “synthetic” LLM-generated models to estimate impacts rapidly.

Quick Summary

  • Makersite: Manufacturing-focused LCA platform that uses proprietary industry AI agents for context-rich gap filling and automated BOM-to-database matching.
  • Sphera: Enterprise, service-led LCA platform using AI to automate matching to its proprietary GaBi database, embedded within a broader EHS ecosystem.
  • One Click LCA: Construction-focused platform using AI to automatically map Building Information Modeling (BIM) elements to LCA datasets and EPDs.
  • Minviro: Mining and battery materials platform utilizing automated, data-driven parameterization to instantly update complex geological LCA models.
  • Muir AI: Rapid assessment platform relying heavily on LLMs to create “synthetic” supply chain models and deconstruct products without primary supplier data.
  • CarbonCloud: Food and beverage LCA tool using an AI-driven classification engine to map products to representative agricultural supply chains.
  • Watershed: Enterprise carbon platform utilizing AI to deconstruct purchased goods (Scope 3.1) into sub-materials and production processes.
  • Terrascope: GHG and decarbonization platform using ML for missing data imputation and automated emission factor matching.

Makersite

Makersite is a granular, AI-powered LCA platform purpose-built for complex manufacturing sectors, with a strong presence in electronics, automotive, industrial machinery/construction, consumer goods and chemicals.

Makersite tackles the core issue of manufacturing LCAs: modeling products with thousands of components when primary supply chain data is missing. Rather than relying on generic estimates, it ingests structured product data (BOMs) and enriches it using deeply specialized AI.

How AI is used:

  • Context-rich gap filling: Uses dedicated, industry-level proprietary AI agents to infer missing material or process data. The AI analyzes the context of the product (materials, components, and likely manufacturing processes) to accurately fill gaps.
  • Automated background database matching: AI automatically maps BOM inputs to the most accurate LCA datasets and emission factors (e.g., Ecoinvent) across any impact category, reducing mapping time from months to minutes.
  • What-If Scenario Modeling: AI powers real-time recommendations for material and supplier substitutions, allowing engineering and procurement teams to compare environmental, cost, and compliance trade-offs concurrently.

Differentiator:
Makersite’s differentiator is its combination of a large data foundation with highly specialized, industry-trained AI agents. Unlike generic AI tools, its AI understands manufacturing context, making it highly accurate for complex, multi-tier supply chains.
Best for: Manufacturers managing complex BOMs who need highly accurate environmental, cost, and compliance modeling.

Sphera

Sphera is an enterprise-grade, service-led LCA provider that combines purpose-built software solutions with its legacy GaBi database to automate specific areas of the LCA process for large organizations.

How AI is used:

  • Automated background matching: Uses AI algorithms to automatically match client activity data to its proprietary Managed LCA Content (GaBi) database, which contains over 20,000 verified datasets.
  • Predictive EHS insights: Through “Sphera AI”, the platform leverages machine learning to embed predictive insights into broader Environmental, Health, and Safety (EHS) and operational risk workflows, linking product sustainability to operational safety.

Differentiator:
Sphera’s main strength is its deep integration into enterprise EHS ecosystems and its proprietary GaBi database. It is a service-led offering designed to reduce manual modeling for multinational corporations rather than a pure self-serve software play.
Best for: Large enterprises looking for a service-led approach combined with EHS infrastructure.

One Click LCA

One Click LCA is a construction-focused platform that utilizes AI to automate carbon assessments for the highly fragmented built environment.

How AI is used:

  • Automated material matching: Uses AI to read Building Information Modeling (BIM) files and Bills of Quantities (BOQs), automatically matching architectural design elements to an extensive database of verified LCA datasets and EPDs.
  • Early-stage conceptual modeling: AI-driven tools (like Carbon Designer 3D) help users model the carbon impact of different structural layouts and material choices before finalizing designs.

Differentiator:
Vertical depth. AI in construction LCA is highly specific, requiring the ability to understand architectural plans and regional building codes. One Click LCA’s AI eliminates the manual translation of building designs into LCA models.
Best for: Architects, engineers, and construction firms needing automated EPD matching and green building compliance.

Minviro

Minviro operates in a highly complex niche: the energy transition. It focuses on the cradle-to-gate LCA of mining operations, electric vehicles (EVs), and battery materials.

How AI is used:

  • Data-driven parameterization: While the exact ML architecture is proprietary, Minviro uses automated, data-driven parameterization to manage complex geological variables (ore grade, local energy mix, processing routes).
  • Real-time model updating: Automates LCA recalculations instantly when upstream mining or supplier data changes, ensuring battery compliance models reflect “live” operational realities rather than static industry averages.

Differentiator:
Sector specificity. General-purpose LCA AI cannot account for how a specific mining site’s ore grade impacts total Global Warming Potential (GWP). Minviro provides defensible, site-specific environmental data crucial for EV OEMs.
Best for: Mining companies, battery manufacturers, and EV supply chain teams.

Muir AI

Muir AI is a rapid assessment platform. It takes a fundamentally different approach to LCA, prioritizing speed and portfolio-wide coverage by relying heavily on Large Language Models (LLMs) to generate “synthetic” data.

How AI is used:

  • AI-driven deconstruction: Uses LLMs to break down simple procurement data or generic product descriptions into assumed material components and manufacturing processes.
  • Synthetic supply chain mapping: Employs AI to estimate the likely flow of materials across sourcing countries and assigns synthetic emission models when primary data is entirely absent.

Differentiator:
Speed at the expense of primary data foundations. Because Muir AI relies almost entirely on LLMs to build synthetic LCAs, it can instantly assess entire product portfolios. However, this approach lacks the contextual accuracy and data foundation of tools like Makersite, making it better for high-level hotspotting than precise engineering trade-offs.
Best for: Consumer goods and apparel companies needing rapid, high-level portfolio assessments where primary supplier data is completely unavailable.

CarbonCloud

CarbonCloud is an AI-enhanced LCA platform built specifically to map the immense variability of agricultural and food supply chains.

How AI is used:

  • AI Category Tree Mapping: Uses an AI-driven classification engine to categorize complex food products based on their properties and automatically map them to representative agricultural supply chains.
  • Automated Modeling Engine: Uses predictive mapping to generate climate footprints for large food portfolios in a matter of days by filling ingredient data gaps with verified agricultural metrics.

Differentiator:
CarbonCloud excels at creating automated “digital twins” of food products, providing F&B brands with a consistent baseline for entire product portfolios, even when upstream farm data is missing.
Best for: Food and beverage brands looking to scale carbon footprinting across massive product lines.

Watershed

While traditionally known as an enterprise carbon accounting platform, Watershed has developed specific AI LCA capabilities to tackle Scope 3.1 (Purchased Goods and Services).

How AI is used:

  • Product deconstruction: AI models deconstruct purchased items—from basic office supplies to industrial chemicals—into their sub-materials and likely production processes based purely on spend and procurement descriptions.
  • Automated regional mapping: The AI automatically applies regional emission factors and manufacturing assumptions to these deconstructed components to build rapid Product Carbon Footprints (PCFs).

Differentiator:
Watershed uses AI not for deep product engineering, but for procurement intelligence. It is designed to give enterprise sustainability teams a fast, AI-generated LCA of the things they buy, rather than the things they make.
Best for: Corporate sustainability and procurement teams needing to estimate the footprint of large volumes of purchased goods.

Terrascope

Terrascope focuses on using machine learning to improve the efficiency, accuracy, and scalability of enterprise greenhouse gas accounting and product footprinting.

How AI is used:

  • Missing data imputation: Uses ML models to automatically check for data quality, identify anomalies, and impute (estimate) missing values in bulk supplier data.
  • Intelligent emission factor matching: An AI engine matches company activities and materials with the most appropriate emission factors in minutes, assigning confidence scores and flagging low-confidence matches for human review.

Differentiator:
Terrascope is built for scale and ease of use, utilizing AI to clean up messy corporate data and democratize the emission factor matching process for non-sustainability experts.
Best for: Large enterprises needing scalable ML solutions to clean data and automate GHG/PCF accounting.

How to Choose: Key Questions

  1. Are you engineering complex products, or doing rapid portfolio estimates? If you are a manufacturer designing complex, multi-tier products and need high accuracy for engineering trade-offs, Makersite offers the necessary industry-specific AI and strict data foundation. If you just need a fast, high-level estimate across a consumer portfolio and are comfortable with LLM-generated “synthetic” data, Muir AI provides rapid speed.
  2. What industry are you in? AI in LCA works best when it understands your specific sector. One Click LCA is unmatched for construction and BIM integrations. Minviro is the only logical choice for the geological complexities of battery and EV mining. If you are in food and agriculture, CarbonCloud and HowGood hold the specialized AI engines for crop and ingredient mapping.
  3. What is the end goal of the assessment? If the goal is product design, cost optimization, and supply chain substitution, Makersite connects those workflows natively. If you need to satisfy enterprise Scope 3 reporting and EHS compliance, Sphera or Terrascope are ideal. If you are trying to map the footprint of the products you buy rather than make, Watershed is built specifically for procurement deconstruction.

 

 

| Vendor | Core Focus | Key AI Capability | Best For |
| --- | --- | --- | --- |
| Makersite | Manufacturing, BOM-level PCF, supply chain LCA | Industry-specific AI gap filling; semantic DB matching; AI scenario modeling | Manufacturers managing complex, multi-tier supply chains (Electronics, Auto, Industrial) |
| Sphera | Enterprise LCA and EHS integration | Automated matching to GaBi database; predictive EHS risk insights | Large enterprises wanting a service-led approach with EHS infrastructure |
| One Click LCA | Construction and built environment | AI matching of BIM/BOQ files to EPDs; early-stage conceptual modeling | Architects, engineers, and construction firms |
| Minviro | Mining, EVs, and battery materials | Automated data-driven parameterization; real-time model updating | Mining companies, battery makers, EV supply chain teams |
| Muir AI | Rapid supply chain assessment | LLM-driven product deconstruction; synthetic supply chain modeling | Consumer goods needing fast, high-level estimates without primary data |
| CarbonCloud | Food and beverage portfolios | AI category tree classification; automated agricultural supply chain mapping | Food & beverage brands mapping large product portfolios |
| Watershed | Enterprise Scope 3.1 (Purchased Goods) | AI deconstruction of procured items; automated regional mapping | Corporate procurement teams measuring supply chain emissions |
| Terrascope | Enterprise GHG and PCF automation | ML data imputation; intelligent emission factor matching engine | Enterprises needing scalable data cleansing and automated GHG accounting |

Still Have Questions? Let’s Dig Deeper

What makes LCA software “AI-powered” versus traditional lifecycle assessment tools?

Traditional LCA software relies on manual data entry, extensive supplier surveys, and human experts spending weeks mapping components to background databases (like Ecoinvent or GaBi). “AI-powered” platforms automate these bottlenecks. They use machine learning and semantic algorithms to automatically match complex Bills of Materials (BOMs) to the correct emission factors, use predictive models to fill in data gaps, and enable real-time “what-if” scenario modeling without requiring a sustainability consultant to recalculate the entire assessment.

How do AI LCA tools handle incomplete or missing primary supplier data?

Missing data is the biggest hurdle in traditional LCA, but it’s exactly where AI excels. Instead of stalling an assessment, AI platforms use context to bridge the gaps. For example, tools built for manufacturing (like Makersite) use industry-specific AI agents to infer the likely materials and manufacturing processes based on the component’s context. Other platforms use machine learning to impute missing values from corporate spend data, or rely on LLMs to generate “synthetic” supply chain estimates to keep the assessment moving.
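
One common pattern behind these approaches is confidence-scored matching: a free-text BOM description is matched to candidate emission factors, and low-confidence matches are routed to a human. The sketch below fakes this with simple string similarity; the factor library, values, and threshold are invented, and production systems use semantic models rather than difflib.

```python
from difflib import SequenceMatcher

FACTOR_LIBRARY = {  # dataset name -> kg CO2e per kg, invented values
    "aluminium sheet, primary": 16.0,
    "copper wire":               4.5,
    "polycarbonate granulate":   7.7,
}

def match_factor(description: str, review_threshold: float = 0.6) -> dict:
    """Pick the closest dataset name and flag low-confidence matches for review."""
    scored = [(SequenceMatcher(None, description.lower(), name.lower()).ratio(), name)
              for name in FACTOR_LIBRARY]
    confidence, best = max(scored)
    return {
        "match": best,
        "factor_kg_co2e_per_kg": FACTOR_LIBRARY[best],
        "confidence": round(confidence, 2),
        "needs_review": confidence < review_threshold,
    }

print(match_factor("Aluminium sheet 2mm"))
print(match_factor("unknown coating compound"))
```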

Are AI-generated or “synthetic” emission estimates accurate enough for regulatory reporting?

It depends heavily on the platform’s data foundation and your end goal. If you are doing rapid, portfolio-wide hotspotting to see where your biggest emissions are, “synthetic” models (relying heavily on LLMs and spend data) are incredibly useful. However, for strict regulatory compliance (like the EU Battery Regulation or CSRD) and precise engineering trade-offs, you need platforms that use AI to enrich a rigid, scientifically verified data foundation (like Makersite, Sphera, or Minviro) rather than relying entirely on AI-generated estimates.

When should AI-powered LCA be used in the product development lifecycle?

Historically, LCA was a retrospective exercise—done after a product was manufactured to create a report. AI-powered LCA shifts this entirely to the left, straight into the R&D and design phases. Because AI can instantly map impacts and run “what-if” scenarios, engineering and procurement teams can use these tools during the early design phase to instantly compare the carbon, cost, and compliance trade-offs of switching a material or supplier before the product is ever built.

From Manual LCAs to Cloud-Scale Measurement: Microsoft’s CHEM Methodology

The challenge: You can’t decarbonize what you can’t measure

For hyperscalers and data center operators, embodied carbon in ICT hardware represents a major share of Scope 3 emissions. In a recent whitepaper, Microsoft notes that reducing this impact requires reliable and granular measurement across a rapidly evolving hardware landscape and a deeply layered global supply chain.

While life cycle assessment (LCA) is a well-established methodology for quantifying environmental impacts, Microsoft states that traditional approaches are difficult to apply consistently at cloud scale. Manual steps such as reconstructing complex BOMs and mapping materials to life cycle inventory datasets can take more than 100 hours per server, which makes it difficult to scale process-based LCA across thousands of hardware configurations without significant effort.

The shift: From manual modeling to scalable measurement

To overcome these limitations, Microsoft developed the Cloud Hardware Emissions Methodology, or CHEM. CHEM is an LCA-based methodology designed to automate and scale embodied carbon measurement across Azure hardware, while preserving the level of detail needed to identify emissions hotspots and evaluate decarbonization interventions.

How CHEM is built

CHEM was developed using Azure data services alongside cloud-based automated LCA software, including Makersite, which Microsoft uses to implement and scale process-based LCA models across complex hardware configurations. This is combined with proxy mapping tooling and state-of-the-art semiconductor life cycle inventory data from the imec Sustainable Semiconductor Technologies and Systems program.

Integrating product data
To reduce manual effort and improve consistency, CHEM integrates directly with Microsoft’s internal product data management systems and full material declarations. This allows complex BOM hierarchies to be transferred automatically into the LCA modeling environment, helping assessments stay aligned as hardware designs evolve.
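As a rough illustration of what such a transfer involves (not Microsoft’s implementation), the sketch below flattens a hierarchical BOM export into total quantities per leaf part, which is the shape an LCA modeling environment typically consumes. Field names and part numbers are assumptions for the example.

```python
# Minimal sketch of automated BOM transfer: a hierarchical product structure
# from a PDM/PLM export is rolled up into (part, total quantity) pairs that an
# LCA modeling environment can consume. Field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class BomNode:
    part_number: str
    quantity: float  # quantity per parent assembly
    children: list["BomNode"] = field(default_factory=list)

def flatten(node, multiplier=1.0, out=None):
    """Roll up a multi-level BOM into total quantities per leaf part number."""
    out = {} if out is None else out
    total = multiplier * node.quantity
    if not node.children:  # leaf part -> needs an LCI dataset
        out[node.part_number] = out.get(node.part_number, 0.0) + total
    for child in node.children:
        flatten(child, total, out)
    return out

server = BomNode("server-assy", 1, [
    BomNode("mainboard", 1, [BomNode("dram-module", 8), BomNode("cpu", 2)]),
    BomNode("psu", 2, [BomNode("capacitor-bank", 4)]),
])
print(flatten(server))  # {'dram-module': 8.0, 'cpu': 2.0, 'capacitor-bank': 8.0}
```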

Automating material to inventory mapping
CHEM automates the mapping of material compositions to representative life cycle inventory datasets from third-party sources such as ecoinvent. By reducing manual modeling work, this approach allows practitioners to focus on data quality, supplier-specific inputs, and interpretation rather than data entry.

Modeling semiconductors at higher resolution
Microsoft identifies semiconductor components as the primary drivers of embodied carbon in datacenter hardware. To improve accuracy, CHEM incorporates detailed manufacturing data from the imec Sustainable Semiconductor Technologies and Systems program.

Microsoft integrates this data into custom LCA models and uses its automated LCA software environment, including Makersite, to run and scale those models across large numbers of hardware configurations.

Why this matters

By applying CHEM across its cloud hardware fleet, Microsoft describes several practical outcomes:

  • More robust Scope 3 reporting
    Process-based data replaces high-level financial proxies, supporting disclosures that are more consistent, auditable, and repeatable at scale.
  • Clearer supply chain hotspot identification 
    Granular modeling makes it possible to trace embodied carbon impacts multiple tiers deep and evaluate where targeted interventions could have the greatest effect.
  • Carbon informed hardware design
    CHEM data can be used by system architects to consider embodied carbon alongside power, performance, and cost during hardware design decisions.
  • More precise carbon roadmapping
    Aggregated results across parts, assemblies, and configurations support carbon reduction roadmaps that reflect real manufacturing processes rather than estimates.

A signal for the industry

Microsoft presents CHEM as part of a broader shift toward more scalable, data-driven approaches to understanding and reducing the embodied carbon impact of cloud hardware. The company also highlights ongoing collaboration with industry groups such as the Open Compute Project and the Semiconductor Climate Consortium to help improve consistency and standardization in LCA-based carbon accounting.

Together, these efforts point toward a future where embodied carbon data is not just reported but operationalized. For organizations managing complex hardware fleets, the CHEM approach illustrates what is required to move from high level estimates towards measurement that can support real supply chain, design, and roadmapping decisions.

This blog is an interpretive summary of Microsoft’s whitepaper ‘How Microsoft is advancing embodied carbon measurement at scale for Azure hardware’, published in 2026. 


Still Have Questions? Let’s Dig Deeper

How does Microsoft measure embodied carbon for Azure hardware?

Microsoft measures embodied carbon for Azure hardware using the Cloud Hardware Emissions Methodology (CHEM), a process-based life cycle assessment methodology. CHEM integrates internal product and supply chain data with environmental life cycle inventory data to quantify emissions across the full hardware lifecycle.

What is the difference between spend-based and process-based LCA for data centers?

Spend-based methods estimate emissions using financial proxies, which can obscure the true drivers of embodied carbon. Process-based LCA, as used in CHEM, models emissions based on physical manufacturing processes and material flows, enabling more granular and actionable insights into where emissions originate.

How does Microsoft handle the complexity of semiconductor emissions?

Recognizing that semiconductors are a major contributor to embodied carbon, Microsoft incorporates detailed semiconductor life cycle inventory data into CHEM. This includes the use of advanced “virtual fab” models developed with data from the imec Sustainable Semiconductor Technologies and Systems program to represent specific manufacturing process steps rather than generic averages.

Can Life Cycle Assessment (LCA) be automated for hyperscale hardware?

Microsoft’s CHEM methodology demonstrates that significant parts of process-based LCA can be automated when product data systems are connected to cloud-based LCA modeling tools. This reduces the manual effort required to reconstruct BOMs and map materials to life cycle inventory datasets at hyperscale.

What role does Makersite play in the CHEM methodology?

Microsoft uses Makersite as part of the CHEM implementation to support automated LCA modeling across complex hardware configurations. Makersite is used to map product structures and materials to environmental datasets, enabling scalable, process-based emission modeling.

Quantifying Circularity: A Data-Driven Approach to Chip Lifecycle Emissions

Turning Vision into Action: Advancing Circular Manufacturing

To open this masterclass, Gruber and Dillman presented a bold perspective on circular economy strategies, using a case study that compared the environmental and economic impacts of reusable and linear semiconductor chip designs. With sustainability leaders from companies like Amazon, IKEA, and Cisco in attendance, the discussion emphasized integrated, data-driven decision-making as a critical enabler for meeting today’s sustainability standards.

Contrasting scenarios included:

  • A linear model, where the chip is manufactured, used, and discarded.
  • A circular model, where the chip is recovered, re-balled, and reused.

The circular model showed slightly higher emissions for the end-of-life step itself (2.36 kg CO₂e for recovery and re-balling vs. 1.94 kg CO₂e for linear disposal), but the picture changes once the extended lifetime is taken into account: where the circular chip is reused, the linear scenario must instead be covered by a new chip (1.94 x 2 = 3.88 kg CO₂e), and the benefits of the circular approach become clear. By eliminating the need to manufacture new chips for future production cycles, the circular process reduces total, system-wide emissions while also drastically minimizing raw material extraction, water usage, and land use. The short calculation below works through the quoted figures.
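How the 1.94 kg CO₂e figure splits between manufacturing and disposal was not broken out in the session, so the variable names in this sketch are an interpretation of how the numbers were presented, not additional data.

```python
# Worked comparison using the figures quoted in the session (interpretive labels).
LINEAR_STEP_KG_CO2E = 1.94        # quoted for the linear route; doubled when a
                                  # replacement chip is needed (1.94 x 2 = 3.88)
CIRCULAR_REPROCESS_KG_CO2E = 2.36 # quoted for recovering and re-balling the chip

linear_total = 2 * LINEAR_STEP_KG_CO2E       # 3.88 kg CO2e over the extended horizon
circular_total = CIRCULAR_REPROCESS_KG_CO2E  # 2.36 kg CO2e for the same horizon

print(f"Linear route:   {linear_total:.2f} kg CO2e")
print(f"Circular route: {circular_total:.2f} kg CO2e")
print(f"Difference:     {linear_total - circular_total:.2f} kg CO2e in favour of circular")
```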

Circular manufacturing offers a transformative solution for reducing environmental impact and building long-term economic resilience. Forward-thinking companies like Jabil are already operationalizing these principles, turning what was once considered waste into valuable resources through systematic recovery and reuse programs that can also deliver significant cost savings.

Gruber and Dillman’s data-driven example underscores how this model can cut resource consumption, support compliance with evolving sustainability regulations, and drive progress toward a fully circular economy. Businesses adopting these strategies position themselves as sustainability leaders, strengthening their operations against resource scarcity and climate challenges. By embracing circular innovation, companies unlock a powerful pathway to sustainable growth and competitive advantage.

“Circularity isn’t just about recycling—it’s about smarter design, sourcing, and evaluating trade-offs,” said Dillman. “To close the loop, we must assess impacts beyond carbon.”

Designing for Circularity: Key Insights from the Session

Key takeaways included:
  • Sustainability Requires System Thinking: Achieving a circular economy demands cross-functional collaboration across design, procurement, logistics, and recovery. A unified data foundation is critical to driving these efforts effectively.
  • Data-Driven Decisions Over Assumptions: The circular chip example scenario underscores the importance of high-fidelity modeling in evaluating circular strategies. Circular initiatives often lack granular emissions and cost data, making it difficult to assess trade-offs or justify actions internally. Digital tools that enable engineers and sustainability teams to quantify carbon impacts and material costs at the component level provide the analytical rigor needed to support data-backed circularity decisions.
  • Leadership Focuses on Actionable Insights: The strong participation of executives and senior managers in the session underscores growing C-level commitment to sustainable innovation and responsibly driven business models.
  • Scalable Platforms Are the New Standard: Fragmented tools fall short in today’s complex landscape, creating new data silos and preventing transparency. Forward-thinking sustainability leaders are turning to scalable platforms and digital tools to seamlessly integrate sustainability, cost efficiency, and product compliance into their operations.

Driving Circularity with Actionable Product Intelligence

As manufacturers push toward circular economy goals, decision-makers are increasingly turning to digital tools that provide high-resolution insights across the product lifecycle. These platforms are enabling sustainability, procurement, and design teams to move beyond assumptions by modeling the environmental and economic implications of circular strategies in real time.

By bringing together lifecycle data, cost metrics, and supply chain considerations, these tools support:

  • Comparative analysis of linear vs. circular models
  • Identification of trade-offs across environmental categories
  • Alignment across teams through shared, data-driven insight.

In a rapidly shifting regulatory and market landscape, the ability to simulate design choices at scale — grounded in real-world data — is essential. Organizations that invest in this type of intelligence aren’t just improving products; they’re reshaping how sustainability is operationalized across the enterprise.

Turning Circular Strategies into Scalable Impact

For manufacturers, achieving sustainability success requires integrating data-driven insights and lifecycle thinking into design and procurement processes. This approach empowers teams to scale effective strategies such as reducing product carbon footprints, ensuring regulatory compliance, and driving operational efficiencies. With data and cross-functional alignment at the core, circularity evolves from a lofty goal to a measurable competitive advantage, positioning businesses as leaders in innovation and sustainability.

How EPDs at scale help you win tenders and drive sustainable product innovation

Ebook: Your EPDs Are Holding You Back

DOWNLOAD EBOOK

The EPD landscape has shifted. It’s no longer optional – it’s a competitive battleground where speed and verifiable proof now dictate market access and credibility. Are your EPDs truly enabling sustainable choices, or just checking a box?

For many manufacturers, the reality is still months-long verification bottlenecks, crippling costs per EPD, and—let’s be frank—a reliance on generic ‘family’ averages often chosen less for true efficiency and more to conveniently obscure performance variability across sites or products.

This ebook cuts through the noise. Your EPDs Are Holding You Back exposes the limitations of outdated EPD practices and shows how leveraging EPDs at scale with automation transforms this liability into your competitive advantage.

Download the ebook to uncover how to:

✅ Win more tenders: Instantly verified, product-specific EPDs ready on demand.

✅  Stop hiding behind product families: Differentiate with credible, granular LCA data and expose the limitations of blended data.

✅ Unlock the real power of EPDs: Go beyond marketing to use specific EPD data as an engine for genuine internal innovation, R&D, procurement optimization, and robust decarbonization planning.

✅ Slash crippling cost, time, and verification bottlenecks: Leave manual LCA modeling and never-ending verification loops behind.

Key learnings: Navigating Material & Substance Compliance


Masterclass Key Takeaways

Manufacturers today are navigating an increasingly challenging compliance landscape. Global regulations are evolving faster than ever, supply chains are more complex, and regulatory expectations demand far more than just ticking boxes. Modern product compliance now requires robust data management, seamless supplier collaboration, and continuous process optimization to keep pace.

Recognizing these challenges, Makersite’s material & substance compliance experts took a deep dive in our most recent online masterclass, walking through proven strategies that help North American manufacturers not only stay compliant, but also scale their compliance operations efficiently, strengthen supplier engagement, and protect product availability.

Here’s what you need to know to build a scalable, resilient product compliance approach, and turn regulatory complexity into a competitive advantage.

The Evolving Compliance Landscape

Regulatory requirements are accelerating at an unprecedented pace, creating new challenges and complexities for manufacturers across every industry. Staying compliant is no longer just about keeping up; it’s about staying ahead.

Here’s a look at the biggest hurdles North American companies are facing right now.

Key Challenges for Manufacturers

  • Complex and Expanding Regulations: Regulations like REACH, TSCA’s PFAS reporting rules, and RoHS exemptions are adding thousands of new substances to watch, often at an accelerating pace.
  • Disjointed and Isolated Data Systems: Traditional tools like spreadsheets, ERP, and PLM platforms often operate in silos, making it difficult to establish seamless communication between systems. The result is disjointed, unstructured data that is hard to integrate, analyze, and act on, leading to inefficiencies, errors, and missed opportunities for growth and innovation.
  • Fragmented Supplier Communication: Relying on emails and forms, without a centralized platform for managing supplier responses, approvals, and escalations, leads to confusion, delays, and errors. On top of that, suppliers are overwhelmed with requests from hundreds of different customer portals, making engagement and data collection even harder to scale.
  • Compliance Addressed Too Late: Reactive compliance approaches don’t just risk shipment delays, costly redesigns, and regulatory fines. They also limit strategic options. Staying ahead of evolving legislation, like monitoring the SVHC Candidate List, enables companies to substitute risky materials early. New regulations like PFAS reporting in the US require companies to trace product data backwards, in some cases as far back as January 2011.

The consequences of non-compliance are becoming more severe, and increasingly business critical. Without robust processes in place, manufacturers risk facing shipment holds, financial penalties, loss of customer trust, and even market bans. In some cases, a single missing declaration or outdated material can block product access to entire regions, leading to lost revenue, disrupted supply chains, and strained customer relationships.

The Exploding Regulatory Horizon

The challenge isn’t static; it’s expanding. Manufacturers must keep pace with key regulatory deadlines such as:

  • California & New York PFAS Bans: Taking effect in 2025. These bans have significant implications for industries like Automotive, where PFAS are commonly used in coatings, upholstery, and other vehicle parts. Additionally, New Mexico’s HB 212, signed into law on April 8, 2025, makes it the third U.S. state, following Maine and Minnesota, to enact a broad PFAS ban.
  • REACH Updates: Universal PFAS restrictions are currently under review, but what makes this regulation unique is that it doesn’t target specific substances, but an entire group of chemicals. This presents a particular challenge for industries like medical devices, where certain products can’t currently be manufactured without PFAS.
  • Current discussions at ECHA indicate two possible directions: first, industry may continue to use fluoropolymers only where no alternatives exist, meaning that if a competitor can produce a similar product without PFAS, you may be required to do the same; second, consumer uses of fluoropolymers are still being considered for a complete ban.
  • RoHS Lead Exemption Phaseouts: Changes expected in the next 12–18 months. The EU’s Restriction of Hazardous Substances (RoHS) directive has historically allowed certain exemptions for the use of lead in specific applications, particularly in complex electronics and medical devices where no viable alternatives existed. However, many of these exemptions are now under review and expected to be phased out in the coming 12–18 months. This presents a significant challenge for manufacturers, especially in sectors like electronics, automotive, and industrial equipment, where lead has been critical for soldering and high-reliability components. Companies relying on these exemptions need to act now to identify alternative materials, redesign components, or prepare for requalification processes, all of which can be costly and time-consuming if left too late.

The overlaps in these regulations—such as varying thresholds and contradictory rules between federal and state mandates (e.g., TSCA vs. California PFAS disclosures)—add further complexity.

Pro Tip

To remain competitive and compliant, manufacturers need scalable systems that enable centralized compliance tracking, cross-functional regulatory reviews, and ongoing horizon scans.

Supplier Engagement & Data Collection

Effective compliance starts with obtaining the right input data from suppliers. Without this, meeting regulatory requirements becomes an uphill battle. Leading organizations are overcoming this challenge by leveraging a centralized supplier portal, a single source of truth that not only streamlines data collection but also provides built-in escalation paths and approval workflows.

By equipping suppliers with a central portal that offers escalation and approval functionalities, companies can ensure faster response times, better data accuracy, and improved collaboration. This approach reduces confusion, minimizes back-and-forth emails, and provides full traceability across supplier communications, a critical advantage when managing complex global supply chains.

Minimum Data Requirements

Ensure seamless and comprehensive compliance by securing access to:

  • Bills of Materials (BOMs): A detailed breakdown of all materials and components used in your products, essential for accurate regulatory reporting.
  • Supplier-Provided Files: Full Material Declarations (FMDs) and Certificates of Compliance (CoCs) to ensure traceability and adherence to standards.
  • SCIP and Regulatory IDs: Streamline automated submissions and maintain efficiency in meeting regulatory demands.

FMDs vs. CoCs: Understanding the Difference

  • FMDs provide complete transparency, offering a robust framework for long-term compliance that evolves with regulatory advancements.
  • CoCs, while suitable for immediate needs, require frequent updates to align with changing regulations—making them less sustainable for future-proof compliance strategies.

Pro Tip

Revolutionize your compliance approach with a focus on innovation, efficiency, and sustainability. By leveraging advanced data strategies, your business can stay ahead of regulatory demands while building a foundation for long-term success.

Simplify Supplier Collaboration

Simplifying supplier collaboration isn’t just about sending standardized forms. It requires the right technology to scale effectively. Equip your suppliers with intuitive, standardized formats like IPC 1752 to prevent fatigue and reduce friction. But to truly streamline the process, companies need a software solution that enables automated workflows for collecting, validating, and managing supplier data at scale.

Automation not only saves time for everyone involved but also reduces error rates and ensures data consistency, something manual processes simply can’t deliver when dealing with complex supply chains and evolving regulatory demands.
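To give a sense of what such automated validation can look like, here is a minimal sketch that checks an incoming material declaration for mass balance and well-formed CAS numbers before any human review. The record layout and tolerance are assumptions for illustration, not the IPC 1752 schema itself.

```python
# Minimal sketch of automated checks a supplier-data workflow can run on an
# incoming Full Material Declaration. Record layout and 2% tolerance are
# illustrative assumptions.
import re

CAS_PATTERN = re.compile(r"^\d{2,7}-\d{2}-\d$")

def validate_fmd(component_mass_g: float, substances: list[dict]) -> list[str]:
    """Return a list of validation issues for one declared component."""
    issues = []
    declared = sum(s.get("mass_g", 0.0) for s in substances)
    # Mass balance: declared substances should account for (nearly) all of the part.
    if abs(declared - component_mass_g) > 0.02 * component_mass_g:
        issues.append(f"mass balance off: declared {declared:.2f} g vs part {component_mass_g:.2f} g")
    for s in substances:
        if not CAS_PATTERN.match(s.get("cas", "")):
            issues.append(f"invalid or missing CAS number for '{s.get('name', '?')}'")
    return issues

print(validate_fmd(10.0, [
    {"name": "copper", "cas": "7440-50-8", "mass_g": 9.5},
    {"name": "unknown coating", "cas": "", "mass_g": 0.2},
]))
```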

Automating Internal & External Compliance Reporting

Compliance demands transparency at every level. Here’s how automation transforms reporting processes.

  • Drill into the details: Analyze BOMs at a granular level to pinpoint components and assess compliance risks with precision (a minimal example of such a check follows this list).
  • Big-picture monitoring: Gain complete visibility across your portfolio with real-time dashboards tracking product status, supplier responsiveness, and key compliance metrics.
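As a minimal example of the kind of granular check this enables, the sketch below screens a single BOM component against a substance watchlist and a 0.1% w/w threshold (the familiar REACH SVHC article threshold). The watchlist entries and part data are placeholders, not a real regulatory list.

```python
# Toy illustration of BOM-level compliance screening: each declared substance is
# checked against a watchlist and a concentration threshold. Entries are placeholders.
WATCHLIST = {"7439-92-1": "lead", "335-67-1": "PFOA"}  # CAS -> name, illustrative
THRESHOLD_W_W = 0.001                                   # 0.1% by weight

def screen_component(component: dict) -> list[str]:
    """Flag watchlisted substances above threshold in one BOM component."""
    findings = []
    part_mass = component["mass_g"]
    for s in component["substances"]:
        cas = s["cas"]
        if cas in WATCHLIST and s["mass_g"] / part_mass > THRESHOLD_W_W:
            findings.append(
                f"{component['part']}: {WATCHLIST[cas]} ({cas}) at "
                f"{100 * s['mass_g'] / part_mass:.2f}% w/w"
            )
    return findings

solder_joint = {
    "part": "PCB-ASSY-042",
    "mass_g": 50.0,
    "substances": [{"cas": "7439-92-1", "name": "lead", "mass_g": 0.4}],
}
print(screen_component(solder_joint))  # lead at 0.80% w/w -> flagged
```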

External Stakeholder Reporting

Streamline compliance management with automation that eliminates manual processes, delivering:

  • Ready-to-submit regulatory documents (e.g., SCIP or ECHA submissions).
  • Customizable dossiers tailored to meet customer and market-specific requirements.

Manufacturing enterprises need a centralized platform that seamlessly integrates with ERP and PLM systems, ensuring stakeholders always have access to accurate, up-to-date compliance data.

Scaling Compliance Efforts: Why It Matters

With growing product lines and expanding global markets, manual compliance efforts no longer cut it. They fail to keep up with evolving regulations, hamper market readiness, and increase operational costs.

Next-Generation Solutions for Scalable Compliance

  • Leverage Automation: Automate workflows and data flows to reduce manual errors and accelerate compliance efforts.
  • Adopt Standardization: Use globally accepted data formats (e.g., IPC), enabling smoother communication across teams.
  • Adapt to Change: Implement systems that not only flex with new regulatory requirements but also enable companies to proactively identify and substitute substances or materials, even before new regulations come into force. This future-proofing approach helps avoid costly redesigns, reduce risk, and accelerate market entry.

By investing in digital tools, companies can significantly reduce time-to-market while managing the growing complexity of product compliance. You can accelerate data processing, automate regulatory checks, and identify potential product compliance risks early, even across large, fragmented supply chains. This not only speeds up supplier data validation but also enables smarter decision-making when it comes to material substitutions, regulatory reporting, and risk mitigation.

Looking Beyond Compliance

Compliance isn’t just a legal mandate; it’s a strategic advantage and an untapped opportunity to drive sustainability and innovation.

Product Compliance Managers sit on a gold mine of product and material data, often without realizing its full potential. The detailed supplier, material, and substance information collected for compliance purposes forms the perfect foundation for conducting Product Carbon Footprints (PCFs) and Life Cycle Assessments (LCAs) at scale.

This creates a unique opportunity to break down organizational silos between product compliance and product sustainability teams. By leveraging compliance data more strategically, companies can accelerate sustainability initiatives, reduce Scope 3 emissions, and design greener products — all without starting data collection from scratch.

Driving Sustainability Through Innovation

Enhancing BOM data with material insights empowers manufacturers to:

  • Conduct precise Life Cycle Assessments (LCA) and calculate accurate Product Carbon Footprints (PCF).
  • Monitor and report Scope 3 emissions for comprehensive corporate sustainability strategies.
  • Implement Eco-design Scenarios to replace non-compliant materials with greener, cost-efficient alternatives.

Strategic Recommendations

Adopt a proactive, scalable compliance strategy designed to drive efficiency and ensure sustainability.

  1. Leverage Supplier Data: Analyze existing data to map compliance gaps and address deficiencies with targeted outreach.
  2. Minimize Supplier Fatigue: Implement long-term data solutions like FMDs to reduce repetitive requests and build stronger, collaborative supplier relationships.
  3. Bring Compliance In-House: Enhance transparency, reduce reliance on external consultants, and stay agile in adapting to regulatory changes.
  4. Automate Reporting Processes: Deliver precise, real-time reports that integrate seamlessly with external systems, ensuring compliance with ease.
  5. Future-Proof Your Strategy: Build scalable systems that adapt to evolving regulations, emerging markets, and sustainability requirements, keeping your business ahead of the curve.

With these steps, you can transform compliance from a challenge into a strategic advantage, driving innovation and fostering sustainable growth.

What to Do Tomorrow — Whether You Have a System in Place or Not

If you already have a system in place:

  • Grade your existing BOMs for compliance gaps and missing data points. This helps prioritize where action is needed most.
  • Set up dashboards to provide live updates to stakeholders on product compliance status, supplier responsiveness, and upcoming regulatory risks.
  • Evaluate supplier alternatives early to avoid costly, last-minute substitutions, especially for materials flagged by upcoming regulations like PFAS or RoHS.

If you don’t:

  • Start by mapping what data you have today, often in spreadsheets, ERP, or PLM tools, and identify gaps.
  • Engage with suppliers to begin collecting material declarations in standardized formats like IPC 1752.
  • Explore solutions like Makersite to centralize your compliance data and automate reporting, laying the foundation for scalable, future-ready compliance processes.

Compliance doesn’t have to be a burden. With the right tools and approach, it becomes a competitive advantage, helping you enter new markets faster, reduce operational risk, and design more sustainable, innovative products.

How to write an RFP for sustainability solutions

The RFP template

DOWNLOAD MAKERSITE’S RFP TEMPLATE

Writing an effective RFP can be a challenge. Your company will have specific goals and objectives, and you’ll need to be able to put together a logical yet thorough question set that enables you to identify the best vendor for the project. With that in mind, we’ve worked to build out what we consider to be the ideal RFP template – and, given that we’ve seen a fair few, we like to think we know what we’re talking about. 

Our rationale was as follows:

  • The RFP template should be simple in format and execution, but adequately thorough
  • It should help your company to ask qualified questions that unlock the right answers, rather than relying on generic question sets that more often than not produce inadequate answers
  • The question set should always consider the fact that the customer has a specific goal in mind
  • Wherever possible, questions should be open rather than closed, allowing for more detail. If closed questions are deemed necessary, they should be at the bottom of the list
  • We’ll use our expertise as a company to accurately reflect what top tier manufacturers are asking. The RFP template is designed to replicate the RFP processes of leaders in their respective fields
  • The template should negate the need for external consultants, who don’t necessarily know what the company needs or how to articulate those needs

There are 123 questions across 13 separate sections that represent an ideal baseline. The template does not have to be set in stone. Certain manufacturers or users will have other questions they may wish to add, or will find questions in this template that they don’t consider necessary. However, its purpose is to help anyone struggling with putting together an RFP to ask the necessary questions rather than relying on a generic question set that any supplier can fulfil. 

This is what the best in their fields seek to create when it comes to building an RFP. In creating this template, our aim was straightforward: make it simple, and help users to design their solution based on what leaders are doing. 

You can view the template for reference in the embed at the bottom of this page, or you can download the Excel version to use and edit for yourself at the link at the top.

What makes a ‘best in class’ RFP?

What does the ideal RFP (Request for Proposal) look like? It’s a common question, but one without a definitive answer. No two are the same and many manufacturers remain unsure of what they need, relying on basic templates to communicate often complex needs and requirements. Often, unfortunately, those templates are not up to scratch. But that’s not to say that the perfect example doesn’t exist.

RFPs, without doubt, remain an important part of the manufacturing process – an essential tool when it comes to completing a project that you need outside help with. An RFP done correctly not only enables your organization to find the best solution to the task (given that different companies might have different ideas or ways to tackle it), but also helps you to compare the costs of different providers and find the right option for your budget.   

RFPs also help to negate an element of risk early on in the manufacturing process, allowing you to be sure that the company you choose to do the work knows what they’re doing and can deliver what you need. 

For many, the process of putting together an RFP can be a challenge – a drain on both time and resources. From a lack of initial clarity (meaning that proposals may come back incomplete or unaligned with company needs) to scope creep (where the scope of the project changes during the RFP process due to a lack of forward thinking and due diligence), initial hurdles can make the desired outcome significantly harder to achieve for all concerned. 

Companies also struggle to find the right balance of information. Too much detail can overwhelm potential bidders, while too little may leave vendors guessing. From writing the requirements to reviewing proposals, building out an RFP is a time-consuming process, one often further hindered by vendor management issues (where keeping track of questions, updates, and proposal submissions requires careful organization), budgetary concerns (where companies may have a hard time estimating the right budget for the project, and sometimes don’t include budget information in the RFP), and a lack of defined evaluation criteria (where companies may not have a structured approach for comparing different aspects like price, experience, and quality).

When these challenges are made clear and are proactively addressed, companies can begin to streamline the RFP process and increase the chances of selecting the best partner for their project. 

“An RFP done correctly not only enables your organization to find the best solution to the task (given that different companies might have different ideas or ways to tackle it), but also helps you to compare the costs of different providers and find the right option for your budget.”

Why getting it right matters

A good RFP template helps to tick a number of boxes. It saves time, facilitating a faster turnaround in creating and distributing RFPs, reducing delays in project timelines. It ensures consistency, making it easier for vendors to understand what’s required, regardless of the project. Done correctly, it reduces the risk of missing critical information, ensuring vendors have all the details needed to create a thorough proposal. It helps to avoid miscommunication or confusion, leading to proposals that better align with the company’s needs, whilst also simplifying the evaluation process, as the company can quickly compare key factors like costs, timelines, and experience side by side.  

Furthermore, it helps to prevent scope changes and misunderstandings that could arise during the project and has the added benefit of making sure that vendors know exactly what to address in their proposals, reducing back-and-forth and ensuring more complete responses. 

Ultimately, the organization issuing the RFP is seeking help because they need expertise or resources they don’t have internally. The RFP process allows them to gather multiple solutions, ensure fairness, manage costs, and reduce risks, thereby helping them choose the best provider to achieve their project goals. 

The End of the Entrepreneur: The Steps We Must Take to Build a Better Future

What does the ‘Age of the Engineer’ – the term I use to describe our need to empower better product design and manufacturing – look like in reality, and how do we make it possible? I see three first steps:

Getting engineers back into the boardroom

I’ve talked before about entrepreneurs as the ‘villain’ of this narrative because it simplifies the framing. However, it might be better in this instance to specify that I’m talking about a non-founding CEO. As a company grows the need for a generalist – a safe pair of hands – arises. There are many benefits to that approach, of course, but too often the spark is lost – the company stops building great things, and the focus shifts to managing what it does well. We’re in need of something else now – we need to rebuild our products and the infrastructure we use to make and utilize them. We need builders. The founders and early engineers of some of our greatest companies were – and still are – engineers by trade and I think it’s time we put them back in the boardroom. For those starting out, my recommendation is to give their technology leader a seat on the board.

Why? My contention is that businesses that want to succeed in a future likely to be defined by seismic change need to spend more time on innovation-led growth than most large enterprises do now. This requires a different mindset towards risk and reward, and one can only achieve that through a voice being present at the highest levels of decision-making. An overarching vision of what is possible, combined with the technical understanding of how to achieve it, results in speed of execution, and that combination is often found in engineering leaders. In business, speed is everything, and businesses that do this will innovate out of their current situation faster and more successfully.

Adding sustainability as a core metric to product design

A company is its products. If we want to build more successful companies of the future, we’ll need them to have great products that are sustainable. That is not possible unless we embed sustainability into design, just as we do performance, risk and cost.

Every company is different and even within a company, different product lines may cater to different market segments with different preferences. There are no perfect products because there are no perfect customers or infrastructure to build or use these products, so there will always be trade-offs. I do believe, however, that unless these trade-offs are made consciously, products will continue to diverge from sustainability. This will create a widening gap to market requirements.

I’m already seeing advanced organizations who are most of the way there. They’re what we might call ‘mature’ in their approach, set apart from the ‘novices’ because they have made sustainability a design parameter. For them it is another metric, defined by a series of non-negotiable targets that must be hit in order to unlock the rewards – from growth in new markets to better productivity and efficiency to how people are compensated.

Integrating data and enriching operational systems with it

When it comes to engineering, we don’t need to be doing the same things faster. We need to be doing them better. And to do things better, we don’t need more data – we need smarter data.

Our observations show that up to 90% of the data required to understand how to make and sell products doesn’t sit within a company’s systems. The reason is that most products are increasingly becoming “assemblies”, with large portions being built in complex upstream supply chains. An average car, for example, has 70% of its components built in this way. Use and end-of-life data also typically do not sit in company systems. How could one understand the cost, risk or sustainability impacts from these stages? The solution is to collect and combine this “value chain” information from external sources with company data about the product and operations to allow for a full-life-cycle view of the implications of design across all the key design criteria. I call this product lifecycle intelligence.

But it doesn’t stop there. This enriched information needs to be available not in data lakes, expert systems and BI tools, but in operational systems like CAD, PLM and ERP so that engineers can use this information in trade-off analysis, within their existing workflows. This “shifting left” of data and insight, to have it available early on and at every stage of the development process, has long been known to reduce development time and avoid costly mistakes. Technology now allows for this.

Conclusion

The ‘Age of the Engineer’ signifies a pivotal transformation in how we approach innovation and sustainability in business. By reinstating engineers into the boardroom, we leverage their unique expertise to drive not just technological advancements but strategic decisions that prioritize long-term value over short-term gains. By integrating comprehensive data into operational systems to enhance decision-making and efficiency, we will empower businesses to build smarter, more sustainable products that meet the demands of a rapidly changing world.

The ‘Age of the Engineer’ is not just an ideal; it is an imperative, charting a course towards a future where technological prowess and sustainability go hand in hand. By giving engineers the spotlight, and by doubling down on sustainable practices, we’re no longer dreaming about a better tomorrow – we’re actively creating it.

This article first appeared on Forbes.com.