
Key Takeaways: AI-Driven Digital Transformation in EHS & Sustainability

The Core Problem: Complexity Has Outpaced the Tools

The complexity of modern product portfolios and multi-tier supply chains has outpaced what traditional EHS and sustainability tools can handle. Companies are now being asked product-specific, substance-level questions by customers, regulators, and investors. Most lack the integrated data infrastructure to answer them.

This is not a niche problem. Enterprise manufacturers managing millions of products and tens of thousands of chemical substances cannot generate reliable life cycle data, Scope 3 emissions figures, or audit-ready compliance records using spreadsheets, disconnected ERP systems, or manual research. The volume and precision required make human-scale processes structurally unworkable.

AI is entering this space not because it is fashionable, but because there is no other path to scale.

What Is Actually Working vs. What Is Still Hype

The panel was direct on this. AI in EHS and sustainability is generating real value today in specific, well-scoped use cases, and falling short where the underlying data foundation is missing.

What’s working today:

  • Automated LCA and PCF generation at product and configuration level, where AI processes full material declarations, maps substance-level data to background databases, and generates traceable life cycle inventories without manual modeling effort.
  • AI-assisted chemical data modeling for substances where no emission factors or LCI datasets exist, using synthesis and pathway data to fill gaps rather than defaulting to averages or proxies.
  • Continuous compliance monitoring against expanding regulatory frameworks, where AI matches BOM-level data to substance watchlists in real time.
  • Scope 3 supply chain mapping across multi-tier supplier networks, surfacing hotspots and prioritizing data collection where it matters most.
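To make the compliance-monitoring use case concrete, here is a minimal sketch of matching BOM line items against a substance watchlist by CAS number. This is an illustrative toy, not Makersite's implementation; every substance, threshold, part name, and data structure below is hypothetical.

```python
# Illustrative sketch: flag BOM line items whose substances appear on a
# regulatory watchlist, keyed by CAS number. All thresholds, parts, and
# concentrations are invented example data.

WATCHLIST = {
    "7439-92-1": {"name": "Lead", "threshold_ppm": 1000},
    "117-81-7": {"name": "DEHP", "threshold_ppm": 1000},
}

bom = [
    {"part": "solder-alloy", "cas": "7439-92-1", "concentration_ppm": 35000},
    {"part": "cable-sheath", "cas": "117-81-7", "concentration_ppm": 120},
    {"part": "housing", "cas": "9003-07-0", "concentration_ppm": 990000},
]

def flag_bom(bom, watchlist):
    """Return BOM lines whose substance exceeds a watchlist threshold."""
    hits = []
    for line in bom:
        entry = watchlist.get(line["cas"])
        if entry and line["concentration_ppm"] > entry["threshold_ppm"]:
            hits.append({**line, "substance": entry["name"]})
    return hits

print(flag_bom(bom, WATCHLIST))  # only the solder alloy exceeds its threshold
```

In a real system the watchlist would be a continuously updated regulatory feed and the matching would run across every BOM revision, but the core operation is this lookup.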

Still more promise than reality:

  • Fully autonomous sustainability decision-making without expert validation. AI cannot produce ISO-compliant outputs without human oversight of methodology and data quality.
  • Generic large language model deployments without deep sustainability domain training. The specificity of EHS methodology, LCA system boundaries, and substance-level compliance cannot be approximated by general-purpose models.
  • AI layered on top of structurally broken data processes. Fragmented, siloed, unvalidated inputs produce unreliable outputs regardless of the model.

The consistent finding: AI works when it is applied to specific, high-impact use cases on a structured data foundation. It does not work as a substitute for that foundation.

The Real Bottleneck Is Data Readiness, Not Model Capability

One of the most technically substantive discussions in the keynote focused on where enterprise organizations actually get stuck. Not in AI capability, but in data readiness.

Consider what it takes to generate a product carbon footprint for a manufacturer with a complex chemical portfolio. Measured LCI datasets and emission factors exist for only a fraction of the substances involved. The remainder must be modeled from synthesis pathways, process data, or representative chemical categories. For a single product, dozens of custom LCA datasets may need to be generated from hundreds of candidate substances. Across a portfolio of millions of products, this is a multi-year data engineering challenge.

The approach that works is incremental and methodologically rigorous.

  • Map existing coverage first. Identify what background database coverage already exists and where manufacturer-provided LCA data can be matched exactly. This scopes the true gap before any modeling begins.
  • Prioritize by impact. Focus custom dataset generation on substances with the greatest frequency and material contribution across the portfolio. Starting with the most-used materials delivers meaningful coverage without attempting to solve everything at once.
  • Model the long tail by category. Remaining substances can be grouped into chemical categories, solvent classes, inorganic groups, and represented by datasets with defined variance, min/max ranges, and documented error margins. This is scalable and auditable.
  • Handle marginal contributors appropriately. Substances that contribute negligible quantities to the final product can be represented using high-level grouped data, such as average organic or inorganic chemical classifications, without materially affecting output accuracy.
  • Align on methodology before scaling. ISO compliance requirements, error margin conventions, and how averages are applied in reporting must be agreed between technical and sustainability teams before outputs are used for external disclosure.
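The "prioritize by impact" step above can be sketched as a simple ranking: score each substance without dataset coverage by how often it appears across the portfolio times its average contribution. The substance names, counts, and mass fractions here are made up for illustration.

```python
# Illustrative sketch of impact-based prioritization: rank uncovered
# substances by frequency x average mass contribution across the portfolio.
# All names and figures are hypothetical.

portfolio_usage = {
    # substance: (number of products using it, avg mass fraction)
    "toluene": (1200, 0.05),
    "rare-solvent-x": (3, 0.40),
    "ethanol": (900, 0.10),
}
covered = {"ethanol"}  # substances already matched to a background dataset

def prioritize(usage, covered):
    """Rank uncovered substances by frequency * avg contribution."""
    gaps = {s: n * frac for s, (n, frac) in usage.items() if s not in covered}
    return sorted(gaps, key=gaps.get, reverse=True)

print(prioritize(portfolio_usage, covered))  # toluene first: 1200*0.05 = 60 > 3*0.40 = 1.2
```

A frequently used commodity solvent outranks a rarer specialty chemical even though the latter has a higher per-product contribution, which is exactly why starting with the most-used materials delivers coverage fastest.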

This is not a “plug in AI and get answers” workflow. It is a structured, expert-guided process in which AI dramatically accelerates each step. The methodological rigor is still required. Makersite is built to support exactly this kind of layered, scalable approach, from data ingestion and substance-level mapping through to audit-ready LCA and PCF outputs.

How the Sustainability Leader Role Changes in 3 to 5 Years

The panel’s view here was grounded rather than speculative.

The shift is not from human judgment to automated decision-making. It is from reactive reporting to real-time insight generation. Sustainability leaders who today spend significant time on data collection, supplier follow-up, and manual LCA modeling will increasingly function as analysts and strategists, interpreting AI-generated outputs, setting data quality standards, and embedding sustainability criteria directly into product design and procurement decisions.

The implication for organizations is clear: the value of the sustainability function is increasingly determined by the quality of its data infrastructure, not its headcount. Teams that build structured, auditable data pipelines now will have a structural advantage in regulatory readiness and decision speed within the 3 to 5 year window.

Where Scope 3 Is Hardest

Scope 3 emissions, particularly Category 1 (purchased goods and services), remain the most difficult area, and the panel was specific about why.

The problem is not the emissions calculation. It is the absence of primary data at the supplier level. Most Scope 3 analyses rely on spend-based or industry-average approaches because supplier-specific, product-level emissions data does not exist in structured, accessible form. AI can model gaps with defined uncertainty, but it cannot compensate for missing primary data.

Organizations making the most progress on Scope 3 share three characteristics. They have built structured supplier data collection processes (full material declarations, BOM-level inputs) that feed directly into LCA and PCF workflows. They have invested in component-level modeling that can be reused across product families rather than rebuilt product by product. And they have established methodology alignment across sustainability, engineering, and commercial teams so that AI-generated outputs are trusted and acted upon.
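The component-reuse idea can be illustrated in a few lines: each shared component's footprint is validated once, then configuration-level footprints are assembled by lookup instead of being remodeled per variant. Component names and kgCO2e values below are invented.

```python
# Illustrative sketch: model each shared component once, then assemble
# configuration-level footprints by reuse. All kgCO2e values are invented
# example numbers, not measured data.

component_pcf = {  # validated once, reused across product families
    "ssd-512gb": 28.0,
    "display-14in": 55.0,
    "memory-16gb": 18.0,
    "chassis-al": 40.0,
}

def configuration_footprint(components):
    """Sum reusable component footprints for one product configuration."""
    return sum(component_pcf[c] for c in components)

base = configuration_footprint(
    ["ssd-512gb", "display-14in", "memory-16gb", "chassis-al"]
)
print(base)  # 141.0
```

The modeling effort scales with the number of distinct components, not the (much larger) number of sellable configurations.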

The approach that consistently does not work: attempting to resolve Scope 3 at the portfolio level with aggregate methods while continuing to operate disconnected, product-level data systems.

What an AI-Enabled Sustainability Decision Looks Like at the Design Level

The most forward-looking discussion centered on product design, where the greatest leverage exists.

In the next three to five years, engineers making component selection decisions will have real-time access to sustainability impact data at the substance and configuration level. Selecting a different supplier or substituting a material will immediately surface its LCA, compliance, and Scope 3 implications before the decision is finalized, not weeks later during a sustainability review.

This is technically achievable today for organizations that have built the necessary data infrastructure. The constraint is not AI capability. It is the availability of structured, substance-level product and supplier data in a form that AI can use.

The leadership implication is significant: Sustainability decisions in the next decade will increasingly be made by product engineers, not sustainability teams in isolation. The function of the sustainability team shifts to building and maintaining the data systems, methodological standards, and AI tooling that make those decisions possible at scale.

One non-negotiable: AI-generated sustainability outputs require full audit trails. ISO-aligned PCFs and LCAs need traceable, validated data lineages. Explainability is a technical requirement, not an optional feature.

Watch-Outs as AI Gets Embedded in EHS Workflows

The panel closed with a frank assessment of where AI adoption fails in EHS and sustainability.

Treating AI as a substitute for data quality is the most common mistake. AI can model gaps and generate datasets for missing substances, but it cannot produce defensible outputs from structurally flawed inputs. Organizations that skip data foundation work before deploying AI will generate results that fail audit, regulatory, or customer scrutiny.

Neglecting methodology alignment is the second failure pattern. Different LCA system boundary definitions and allocation approaches can produce materially different results from the same underlying data. If sustainability, engineering, and commercial teams are not aligned on methodology before AI outputs are generated, those outputs will be contested internally before they reach any external stakeholder.

Underestimating the supplier engagement requirement is the third. Scaling sustainability data across complex supply chains is not purely a technology problem. Thousands of suppliers must participate in structured data collection for AI-generated outputs to reflect primary data rather than estimates. That requires change management and supplier enablement, not just software.

And finally: confusing speed with accuracy. AI generates outputs faster. Faster outputs with unquantified uncertainty are not more useful than slower outputs with defined error margins. Speed and methodological precision must be calibrated together.

The One Thing Leaders Should Understand Right Now

Organizations that will use AI effectively for sustainability in 3 to 5 years are the ones building structured data foundations today.

The technology is ready. The bottleneck is data: specifically, the absence of product-level, substance-level, supplier-validated data organized in a way that AI can work with. Progress comes from practical, incremental steps: mapping what data exists, identifying the highest-priority gaps, and systematically closing them through supplier engagement, AI-assisted modeling, and expert validation.

As Manuel noted ahead of NAEM OPEX/TECH26, “Progress tends to come from practical steps that build confidence, not from trying to solve everything at once. It’s an evolution, not a revolution.”

That remains the right starting point.

In Practice: How Lenovo ThinkPad Solved This at Scale

The challenge described throughout this keynote is not theoretical. Lenovo’s ThinkPad team worked through it directly with Makersite.

ThinkPad faced increasing pressure in enterprise procurement bids requiring configuration-specific, ISO-aligned Product Carbon Footprints. A single model-level PCF cannot represent variation across customer configurations. Without configuration-level visibility, ThinkPad could not demonstrate how component choices influenced the final footprint. This was a measurable gap in competitive tenders.

The approach Makersite and Lenovo took maps precisely to the methodology described in this article. Supplier Full Material Declarations were ingested and automatically converted into substance-level LCA models; more than 2.5 million FMDs have been processed through Makersite. Rather than modeling every product variant independently, ThinkPad shifted to a shared-component approach: SSDs, displays, memory, and chassis were validated once and reused across product families. Makersite then generated the highly granular substance-level LCAs for the parts and assemblies behind each shared component, work that would have taken years manually.

The outcome: configuration-level, ISO-aligned PCFs generated at scale, certified, and usable in enterprise sales conversations. ThinkPad sellers can now demonstrate how specific component choices move the carbon footprint up or down, using traceable data rather than estimates.

Internally, sustainability, engineering, and commercial teams now work from the same data. That alignment between people, methodology, and system is what makes the outputs usable, not just accurate.

Next Steps

If your organization is navigating product-level sustainability data challenges, whether for LCA, Scope 3, PCF, or compliance, the starting point is understanding where your data stands and where the highest-impact gaps exist. Makersite works with enterprise manufacturers to build and scale that foundation.

Book a conversation with Makersite >

How Data, Standards, and Automation Are Reshaping Environmental Product Declarations

Key Takeaways from the Digital EPD Session at eClad Conference

1. The EPD Market Is Scaling Fast, but the Foundation Is Still Fragmented

EPDs are growing rapidly across industries, driven by regulatory pressure, customer demand, and procurement requirements. But the underlying systems have not kept pace. As Robert highlighted, the EPD ecosystem has evolved organically over time. Different regions, standards, tools, and workflows have developed independently.

The result is a fragmented landscape where:

    • Data formats are inconsistent
    • Processes vary by region and program operator
    • Digital workflows are not fully standardized
    • Scalability remains limited

This creates a fundamental challenge. The industry is trying to scale outputs without first standardizing the data infrastructure.

2. The Real Bottleneck Is Not EPD Creation – It’s the Data

Across every EPD workflow, the same bottlenecks appear:

    • Data collection
    • Data transformation
    • Data completeness and consistency
    • LCA modeling
    • EPD & LCA verification
    • Non-harmonized calculation rules

These challenges are not new. But they become exponentially more complex as companies try to scale across hundreds or thousands of products. The takeaway is clear: EPD challenges are not primarily about reporting. They are about data architecture.

3. Digital EPDs Are the Path Forward, but Only if Done Correctly

Digital EPDs have the potential to solve many of these challenges.

They enable:

    • Automated data validation
    • Structured, machine-readable datasets
    • Faster integration into downstream systems
    • Scalable lifecycle assessments

However, the current reality is more complicated. In many cases today, the process is still reversed. Teams generate a PDF first, then manually transfer data into digital formats. This introduces errors, inconsistencies, and inefficiencies.

The future state is the opposite. A digital dataset should be the single source of truth. From that, any human-readable format, including PDFs, can be generated.
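That digital-first direction can be sketched simply: a structured, machine-readable dataset is the single source of truth, and any human-readable view is rendered from it. The schema below is a made-up example for illustration, not a real ILCD/EPD format.

```python
# Illustrative sketch of a digital-first EPD workflow: the structured
# dataset is authoritative, and human-readable output is generated from it.
# The schema and indicator values are hypothetical examples.

epd = {
    "product": "Insulation Board X",
    "declared_unit": "1 m2",
    "indicators": {"GWP-total": 4.2, "ODP": 1.1e-7},  # per declared unit
}

def render_summary(epd):
    """Generate a human-readable summary from the digital dataset."""
    lines = [f"EPD: {epd['product']} (per {epd['declared_unit']})"]
    for name, value in epd["indicators"].items():
        lines.append(f"  {name}: {value:g}")
    return "\n".join(lines)

print(render_summary(epd))
```

Because the rendering is derived, a PDF, a web page, and a machine-to-machine exchange can all be generated from the same record without manual re-entry, which is where today's PDF-first process introduces errors.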

4. Verification Must Evolve to Support Automation and Scale

As EPD volumes grow, traditional verification approaches become a bottleneck.

The current verification guidelines are often:

    • Tool-specific instead of tool-agnostic
    • Lacking detailed requirements
    • Not designed for automated workflows

To address this, new approaches are emerging that focus on:

    • Tool-based verification frameworks
    • Logging and traceability of data and mapping
    • Scalable validation processes
    • Integration of AI-assisted verification

The goal is not just faster verification. It is more consistent and reliable verification at scale.

5. The Shift to Digital Enables Interoperability and Global Alignment

One of the biggest barriers to scaling EPDs today is lack of harmonization. Different program operators, regions, and standards require different formats and calculations. This creates duplication, inefficiency, and inconsistencies.

Digital EPD initiatives aim to solve this by:

    • Standardizing machine-readable formats
    • Enabling interoperability across systems
    • Reducing reliance on region-specific PDF formats
    • Supporting global comparability

This is a foundational shift. It moves EPDs from static documents to interoperable data assets.

6. EPDs Are Not the End Goal. Decision-Making Is.

One of the most important points from the session was simple but critical. Companies are not creating EPDs just to have EPDs. They are creating them to enable better decisions.

Whether at the product level, building level, or portfolio level, EPD data should support:

    • Material selection decisions
    • Product design improvements
    • Procurement strategies
    • Regulatory compliance

Without this connection to decision-making, EPDs remain a reporting exercise rather than a business capability.

The Core Problem: EPD Workflows Are Not Built for Scale

Across industries, the challenge is consistent. Organizations are trying to scale EPDs using processes that were never designed for volume, speed, or interoperability.

This leads to:

    • Manual, time-intensive data collection
    • Inconsistent and non-harmonized datasets
    • Duplicated effort across regions and standards
    • Limited ability to reuse or integrate data
    • Slow and costly verification processes

The result is a system that struggles to keep up with growing demand.

The Solution: From EPD Documents to Product Lifecycle Intelligence

The path forward is not just digitization. It is transformation. Instead of creating EPDs one by one, the model shifts to:

    • Ingest all available product and supply chain data
    • Structure it into a unified, digital data model
    • Create digital twins of products
    • Apply logic to generate lifecycle insights
    • Output results across multiple use cases

This approach enables:

    • Automated EPD generation
    • Substance compliance analysis
    • Lifecycle impact modeling
    • Continuous data improvement

All from the same underlying data foundation. This is what Makersite defines as Product Lifecycle Intelligence.
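The "one data foundation, many outputs" idea can be sketched as a single structured product model feeding both an impact calculation and a compliance check. Everything here is a hypothetical toy: the material names, emission factors, and restriction list are invented.

```python
# Illustrative sketch: one structured product model drives multiple
# outputs (impact result and compliance flags). All figures, factor
# names, and the restriction list are invented examples.

product = {
    "name": "Panel A",
    "materials": [
        {"name": "steel", "mass_kg": 2.0},
        {"name": "pvc", "mass_kg": 0.5},
    ],
}
emission_factors = {"steel": 1.9, "pvc": 2.4}   # kgCO2e per kg (invented)
restricted = {"pvc"}                            # invented restriction list

def impact(product):
    """Lifecycle impact from the shared product model."""
    return sum(m["mass_kg"] * emission_factors[m["name"]]
               for m in product["materials"])

def compliance_flags(product):
    """Restricted substances found in the same product model."""
    return [m["name"] for m in product["materials"] if m["name"] in restricted]

print(impact(product))            # 2.0*1.9 + 0.5*2.4 = 5.0
print(compliance_flags(product))  # ['pvc']
```

The point is structural: EPD generation, compliance analysis, and impact modeling are different functions over the same underlying data, so improving the data improves every output at once.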

From Data to Decisions: Why This Matters Now

For manufacturers and construction stakeholders, this shift is critical. The market is demanding:

    • More EPDs
    • More specific EPDs
    • Faster turnaround
    • Higher data quality
    • Greater transparency

At the same time, products are becoming more complex and configurable. This creates a new requirement: the ability to generate accurate, scalable, and decision-ready environmental data.

Companies that can do this gain a significant advantage:

    • Faster compliance and reporting
    • Improved product design decisions
    • Reduced operational effort
    • Stronger, more credible sustainability claims

What the Market Is Moving Toward

The conversation is changing. Organizations are no longer asking: “Can we create EPDs?”

They are asking:

    • Can we scale EPDs across entire product portfolios?
    • Can we trust and verify the data consistently?
    • Can we integrate EPDs into digital workflows and systems?
    • Can we use EPD data to drive real decisions?

This reflects a broader shift:

    • From static documents to dynamic data
    • From manual workflows to automated systems
    • From reporting outputs to decision intelligence

Final Thought

The biggest takeaway from the session: EPDs are evolving from documents into infrastructure. Digital EPDs, standardized data models, and automated workflows are not just improving reporting. They are enabling a new foundation for environmental decision-making.

By moving toward connected, digital, and scalable data systems, organizations can turn EPDs from a compliance requirement into a strategic capability.

Want to Scale EPDs Without Scaling Manual Effort?

If your team is working to:

    • Automate EPD generation
    • Improve data quality and consistency
    • Reduce verification bottlenecks
    • Connect EPDs to product and design decisions

See how Makersite enables digital EPD workflows, lifecycle intelligence, and scalable sustainability insights across your product portfolio.

Download Makersite’s EPD ebook 

8 AI-Powered LCA Software Solutions for Manufacturers in 2026

What Is AI-Powered LCA Software?

AI-powered Life Cycle Assessment (LCA) software uses machine learning (ML), large language models (LLMs), and predictive algorithms to quantify the environmental impacts of a product across its full value chain. Traditional LCA is notoriously labor-intensive, requiring months of manual data collection and consulting hours. AI disrupts this by automating the most painful friction points: filling data gaps when supplier data is missing, matching complex bills of materials (BOMs) to background databases, and running what-if scenario models at scale.

In 2026, regulatory pressure has accelerated the demand for these tools. However, not all AI is built the same. The market is currently split between platforms built on robust, industry-specific data foundations that use AI to enrich existing data, and newer platforms relying heavily on “synthetic” LLM-generated models to estimate impacts rapidly.

Quick Summary

  • Makersite: Manufacturing-focused LCA platform that uses proprietary industry AI agents for context-rich gap filling and automated BOM-to-database matching.
  • Sphera: Enterprise, service-led LCA platform using AI to automate matching to its proprietary GaBi database, embedded within a broader EHS ecosystem.
  • One Click LCA: Construction-focused platform using AI to automatically map Building Information Modeling (BIM) elements to LCA datasets and EPDs.
  • Minviro: Mining and battery materials platform utilizing automated, data-driven parameterization to instantly update complex geological LCA models.
  • Muir AI: Rapid assessment platform relying heavily on LLMs to create “synthetic” supply chain models and deconstruct products without primary supplier data.
  • CarbonCloud: Food and beverage LCA tool using an AI-driven classification engine to map products to representative agricultural supply chains.
  • Watershed: Enterprise carbon platform utilizing AI to deconstruct purchased goods (Scope 3.1) into sub-materials and production processes.
  • Terrascope: GHG and decarbonization platform using ML for missing data imputation and automated emission factor matching.

Makersite

Makersite is a granular, AI-powered LCA platform purpose-built for complex manufacturing sectors, with a strong presence in electronics, automotive, industrial machinery/construction, consumer goods and chemicals.

Makersite tackles the core issue of manufacturing LCAs: modeling products with thousands of components when primary supply chain data is missing. Rather than relying on generic estimates, it ingests structured product data (BOMs) and enriches it using deeply specialized AI.

How AI is used:

  • Context-rich gap filling: Uses dedicated, industry-level proprietary AI agents to infer missing material or process data. The AI analyzes the context of the product (materials, components, and likely manufacturing processes) to accurately fill gaps.
  • Automated background database matching: AI automatically maps BOM inputs to the most accurate LCA datasets and emission factors (e.g., Ecoinvent) across any impact category, reducing mapping time from months to minutes.
  • What-If Scenario Modeling: AI powers real-time recommendations for material and supplier substitutions, allowing engineering and procurement teams to compare environmental, cost, and compliance trade-offs concurrently.

Differentiator:
Makersite’s differentiator is its combination of a large data foundation with highly specialized, industry-trained AI agents. Unlike generic AI tools, its AI understands manufacturing context, making it highly accurate for complex, multi-tier supply chains.
Best for: Manufacturers managing complex BOMs who need highly accurate environmental, cost, and compliance modeling.

Sphera

Sphera is an enterprise-grade, service-led LCA provider that combines purpose-built software solutions with its legacy GaBi database to automate specific areas of the LCA process for large organizations.

How AI is used:

  • Automated background matching: Uses AI algorithms to automatically match client activity data to its proprietary Managed LCA Content (GaBi) database, which contains over 20,000 verified datasets.
  • Predictive EHS insights: Through “Sphera AI”, the platform leverages machine learning to embed predictive insights into broader Environmental, Health, and Safety (EHS) and operational risk workflows, linking product sustainability to operational safety.

Differentiator:
Sphera’s main strength is its deep integration into enterprise EHS ecosystems and its proprietary GaBi database. It is a service-led offering designed to reduce manual modeling for multinational corporations rather than a pure self-serve software play.
Best for: Large enterprises looking for a service-led approach combined with EHS infrastructure.

One Click LCA

One Click LCA is a construction-focused platform that utilizes AI to automate carbon assessments for the highly fragmented built environment.

How AI is used:

  • Automated material matching: Uses AI to read Building Information Modeling (BIM) files and Bills of Quantities (BOQs), automatically matching architectural design elements to an extensive database of verified LCA datasets and EPDs.
  • Early-stage conceptual modeling: AI-driven tools (like Carbon Designer 3D) help users model the carbon impact of different structural layouts and material choices before finalizing designs.

Differentiator:
Vertical depth. AI in construction LCA is highly specific, requiring the ability to understand architectural plans and regional building codes. One Click LCA’s AI eliminates the manual translation of building designs into LCA models.
Best for: Architects, engineers, and construction firms needing automated EPD matching and green building compliance.

Minviro

Minviro operates in a highly complex niche: the energy transition. It focuses on the cradle-to-gate LCA of mining operations, electric vehicles (EVs), and battery materials.

How AI is used:

  • Data-driven parameterization: While the exact ML architecture is proprietary, Minviro uses automated, data-driven parameterization to manage complex geological variables (ore grade, local energy mix, processing routes).
  • Real-time model updating: Automates LCA recalculations instantly when upstream mining or supplier data changes, ensuring battery compliance models reflect “live” operational realities rather than static industry averages.

Differentiator:
Sector specificity. General-purpose LCA AI cannot account for how a specific mining site’s ore grade impacts total Global Warming Potential (GWP). Minviro provides defensible, site-specific environmental data crucial for EV OEMs.
Best for: Mining companies, battery manufacturers, and EV supply chain teams.

Muir AI

Muir AI is a rapid assessment platform. It takes a fundamentally different approach to LCA, prioritizing speed and portfolio-wide coverage by relying heavily on Large Language Models (LLMs) to generate “synthetic” data.

How AI is used:

  • AI-driven deconstruction: Uses LLMs to break down simple procurement data or generic product descriptions into assumed material components and manufacturing processes.
  • Synthetic supply chain mapping: Employs AI to estimate the likely flow of materials across sourcing countries and assigns synthetic emission models when primary data is entirely absent.

Differentiator:
Speed at the expense of primary data foundations. Because Muir AI relies almost entirely on LLMs to build synthetic LCAs, it can instantly assess entire product portfolios. However, this approach lacks the contextual accuracy and data foundation of tools like Makersite, making it better for high-level hotspotting than precise engineering trade-offs.
Best for: Consumer goods and apparel companies needing rapid, high-level portfolio assessments where primary supplier data is completely unavailable.

CarbonCloud

CarbonCloud is an AI-enhanced LCA platform built specifically to map the immense variability of agricultural and food supply chains.

How AI is used:

  • AI Category Tree Mapping: Uses an AI-driven classification engine to categorize complex food products based on their properties and automatically map them to representative agricultural supply chains.
  • Automated Modeling Engine: Uses predictive mapping to generate climate footprints for large food portfolios in a matter of days by filling ingredient data gaps with verified agricultural metrics.

Differentiator:
CarbonCloud excels at creating automated “digital twins” of food products, providing F&B brands with a consistent baseline for entire product portfolios, even when upstream farm data is missing.
Best for: Food and beverage brands looking to scale carbon footprinting across massive product lines.

Watershed

While traditionally known as an enterprise carbon accounting platform, Watershed has developed specific AI LCA capabilities to tackle Scope 3.1 (Purchased Goods and Services).

How AI is used:

  • Product deconstruction: AI models deconstruct purchased items—from basic office supplies to industrial chemicals—into their sub-materials and likely production processes based purely on spend and procurement descriptions.
  • Automated regional mapping: The AI automatically applies regional emission factors and manufacturing assumptions to these deconstructed components to build rapid Product Carbon Footprints (PCFs).

Differentiator:
Watershed uses AI not for deep product engineering, but for procurement intelligence. It is designed to give enterprise sustainability teams a fast, AI-generated LCA of the things they buy, rather than the things they make.
Best for: Corporate sustainability and procurement teams needing to estimate the footprint of large volumes of purchased goods.

Terrascope

Terrascope focuses on using machine learning to improve the efficiency, accuracy, and scalability of enterprise greenhouse gas accounting and product footprinting.

How AI is used:

  • Missing data imputation: Uses ML models to automatically check for data quality, identify anomalies, and impute (estimate) missing values in bulk supplier data.
  • Intelligent emission factor matching: An AI engine matches company activities and materials with the most appropriate emission factors in minutes, assigning confidence scores and flagging low-confidence matches for human review.

Differentiator:
Terrascope is built for scale and ease of use, utilizing AI to clean up messy corporate data and democratize the emission factor matching process for non-sustainability experts.
Best for: Large enterprises needing scalable ML solutions to clean data and automate GHG/PCF accounting.
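The confidence-scored matching pattern described for Terrascope can be illustrated generically. The sketch below uses the standard library's difflib as a stand-in for a real ML matcher; the factor names, values, and the 0.6 review threshold are all invented for the example.

```python
# Illustrative sketch of confidence-scored emission factor matching:
# fuzzy-match an activity description to candidate factor names and flag
# low-confidence matches for human review. difflib stands in for a real
# ML matcher; all names, values, and the threshold are hypothetical.

import difflib

FACTORS = {
    "electricity, grid mix, DE": 0.38,
    "diesel, combusted in machinery": 2.68,
    "steel, hot rolled": 1.9,
}

def match_factor(activity, threshold=0.6):
    """Return (factor_name, confidence, needs_review) for an activity."""
    scored = [
        (name, difflib.SequenceMatcher(None, activity.lower(), name.lower()).ratio())
        for name in FACTORS
    ]
    name, score = max(scored, key=lambda x: x[1])
    return name, round(score, 2), score < threshold

print(match_factor("hot rolled steel coil"))
```

Even when the best candidate is correct, a low similarity score routes the match to human review, which is the mechanism that keeps automated matching auditable at scale.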

How to Choose: Key Questions

  1. Are you engineering complex products, or doing rapid portfolio estimates? If you are a manufacturer designing complex, multi-tier products and need high accuracy for engineering trade-offs, Makersite offers the necessary industry-specific AI and strict data foundation. If you just need a fast, high-level estimate across a consumer portfolio and are comfortable with LLM-generated “synthetic” data, Muir AI provides that speed.
  2. What industry are you in? AI in LCA works best when it understands your specific sector. One Click LCA is unmatched for construction and BIM integrations. Minviro is the only logical choice for the geological complexities of battery and EV mining. If you are in food and agriculture, CarbonCloud holds the specialized AI engine for crop and ingredient mapping.
  3. What is the end goal of the assessment? If the goal is product design, cost optimization, and supply chain substitution, Makersite connects those workflows natively. If you need to satisfy enterprise Scope 3 reporting and EHS compliance, Sphera or Terrascope are ideal. If you are trying to map the footprint of the products you buy rather than make, Watershed is built specifically for procurement deconstruction.

| Vendor | Core Focus | Key AI Capability | Best For |
| --- | --- | --- | --- |
| Makersite | Manufacturing, BOM-level PCF, supply chain LCA | Industry-specific AI gap filling; semantic DB matching; AI scenario modeling | Manufacturers managing complex, multi-tier supply chains (Electronics, Auto, Industrial) |
| Sphera | Enterprise LCA and EHS integration | Automated matching to GaBi database; predictive EHS risk insights | Large enterprises wanting a service-led approach with EHS infrastructure |
| One Click LCA | Construction and built environment | AI matching of BIM/BOQ files to EPDs; early-stage conceptual modeling | Architects, engineers, and construction firms |
| Minviro | Mining, EVs, and battery materials | Automated data-driven parameterization; real-time model updating | Mining companies, battery makers, EV supply chain teams |
| Muir AI | Rapid supply chain assessment | LLM-driven product deconstruction; synthetic supply chain modeling | Consumer goods needing fast, high-level estimates without primary data |
| CarbonCloud | Food and beverage portfolios | AI category tree classification; automated agricultural supply chain mapping | Food & beverage brands mapping large product portfolios |
| Watershed | Enterprise Scope 3.1 (Purchased Goods) | AI deconstruction of procured items; automated regional mapping | Corporate procurement teams measuring supply chain emissions |
| Terrascope | Enterprise GHG and PCF automation | ML data imputation; intelligent emission factor matching engine | Enterprises needing scalable data cleansing and automated GHG accounting |

Still Have Questions? Let’s Dig Deeper

What makes LCA software “AI-powered” versus traditional lifecycle assessment tools?

Traditional LCA software relies on manual data entry, extensive supplier surveys, and human experts spending weeks mapping components to background databases (like Ecoinvent or GaBi). “AI-powered” platforms automate these bottlenecks. They use machine learning and semantic algorithms to automatically match complex Bills of Materials (BOMs) to the correct emission factors, use predictive models to fill in data gaps, and enable real-time “what-if” scenario modeling without requiring a sustainability consultant to recalculate the entire assessment.
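As a rough illustration of this kind of automated matching (not any vendor's actual algorithm; all factor names and values below are hypothetical, not from Ecoinvent or GaBi), a simple string-similarity matcher with a confidence threshold might look like:

```python
from difflib import SequenceMatcher

# Hypothetical emission factors (kg CO2e per kg), for illustration only.
EMISSION_FACTORS = {
    "aluminium sheet, primary": 16.5,
    "copper wire, drawn": 4.2,
    "polycarbonate granulate": 7.6,
}

def match_factor(bom_line: str):
    """Return (best factor name, factor value, confidence 0..1)."""
    best_name, best_score = None, -1.0
    for name in EMISSION_FACTORS:
        score = SequenceMatcher(None, bom_line.lower(), name).ratio()
        if score > best_score:
            best_name, best_score = name, score
    return best_name, EMISSION_FACTORS[best_name], best_score

name, factor, confidence = match_factor("Aluminium sheet (primary)")
needs_review = confidence < 0.6   # low-confidence matches go to a human
```

Real platforms replace the string similarity with semantic models trained on domain data, but the overall shape, match, score, and route low-confidence results to expert review, is the same.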

How do AI LCA tools handle incomplete or missing primary supplier data?

Missing data is the biggest hurdle in traditional LCA, but it’s exactly where AI excels. Instead of stalling an assessment, AI platforms use context to bridge the gaps. For example, tools built for manufacturing (like Makersite) use industry-specific AI agents to infer the likely materials and manufacturing processes based on the component’s context. Other platforms use machine learning to impute missing values from corporate spend data, or rely on LLMs to generate “synthetic” supply chain estimates to keep the assessment moving.

Are AI-generated or “synthetic” emission estimates accurate enough for regulatory reporting?

It depends heavily on the platform’s data foundation and your end goal. If you are doing rapid, portfolio-wide hotspotting to see where your biggest emissions are, “synthetic” models (relying heavily on LLMs and spend data) are incredibly useful. However, for strict regulatory compliance (like the EU Battery Regulation or CSRD) and precise engineering trade-offs, you need platforms that use AI to enrich a rigid, scientifically verified data foundation (like Makersite, Sphera, or Minviro) rather than relying entirely on AI-generated estimates.

When should AI-powered LCA be used in the product development lifecycle?

Historically, LCA was a retrospective exercise—done after a product was manufactured to create a report. AI-powered LCA shifts this entirely to the left, straight into the R&D and design phases. Because AI can instantly map impacts and run “what-if” scenarios, engineering and procurement teams can use these tools during the early design phase to instantly compare the carbon, cost, and compliance trade-offs of switching a material or supplier before the product is ever built.

From Manual LCAs to Cloud-Scale Measurement: Microsoft’s CHEM Methodology

The challenge: You can’t decarbonize what you can’t measure

For hyperscalers and data center operators, embodied carbon in ICT hardware represents a major share of Scope 3 emissions. In a recent whitepaper, Microsoft notes that reducing this impact requires reliable and granular measurement across a rapidly evolving hardware landscape and a deeply layered global supply chain.

While life cycle assessment (LCA) is a well established methodology for quantifying environmental impacts, Microsoft states that traditional approaches are difficult to apply consistently at cloud scale. Manual steps such as reconstructing complex BOMs and mapping materials to life cycle inventory datasets can take more than 100 hours per server, which makes it difficult to scale process-based LCA across thousands of hardware configurations without significant effort.

The shift: From manual modeling to scalable measurement

To overcome these limitations, Microsoft developed the Cloud Hardware Emissions Methodology, or CHEM. CHEM is an LCA-based methodology designed to automate and scale embodied carbon measurement across Azure hardware, while preserving the level of detail needed to identify emissions hotspots and evaluate decarbonization interventions.

How CHEM is built

CHEM was developed using Azure data services alongside cloud-based automated LCA software, including Makersite, which Microsoft uses to implement and scale process-based LCA models across complex hardware configurations. This is combined with proxy mapping tooling and state-of-the-art semiconductor life cycle inventory data from the imec Sustainable Semiconductor Technologies and Systems program.

Integrating product data
To reduce manual effort and improve consistency, CHEM integrates directly with Microsoft’s internal product data management systems and full material declarations. This allows complex BOM hierarchies to be transferred automatically into the LCA modeling environment, helping assessments stay aligned as hardware designs evolve.

Automating material to inventory mapping
CHEM automates the mapping of material compositions to representative life cycle inventory datasets from third party sources such as ecoinvent. By reducing manual modeling work, this approach allows practitioners to focus on data quality, supplier specific inputs, and interpretation rather than data entry.

Modeling semiconductors at higher resolution
Microsoft identifies semiconductor components as the primary drivers of embodied carbon in datacenter hardware. To improve accuracy, CHEM incorporates detailed manufacturing data from the imec Sustainable Semiconductor Technologies and Systems program.

Microsoft integrates this data into custom LCA models and uses its automated LCA software environment, including Makersite, to run and scale those models across large numbers of hardware configurations.

Why this matters

By applying CHEM across its cloud hardware fleet, Microsoft describes several practical outcomes:

  • More robust Scope 3 reporting
    Process-based data replaces high-level financial proxies, supporting disclosures that are more consistent, auditable, and repeatable at scale.
  • Clearer supply chain hotspot identification 
    Granular modeling makes it possible to trace embodied carbon impacts multiple tiers deep and evaluate where targeted interventions could have the greatest effect.
  • Carbon-informed hardware design
    CHEM data can be used by system architects to consider embodied carbon alongside power, performance, and cost during hardware design decisions.
  • More precise carbon roadmapping
    Aggregated results across parts, assemblies, and configurations support carbon reduction roadmaps that reflect real manufacturing processes rather than estimates.

A signal for the industry

Microsoft presents CHEM as part of a broader shift toward more scalable, data-driven approaches to understanding and reducing the embodied carbon impact of cloud hardware. The company also highlights ongoing collaboration with industry groups such as the Open Compute Project and the Semiconductor Climate Consortium to help improve consistency and standardization in LCA-based carbon accounting.

Together, these efforts point toward a future where embodied carbon data is not just reported but operationalized. For organizations managing complex hardware fleets, the CHEM approach illustrates what is required to move from high level estimates towards measurement that can support real supply chain, design, and roadmapping decisions.

This blog is an interpretive summary of Microsoft’s whitepaper ‘How Microsoft is advancing embodied carbon measurement at scale for Azure hardware’, published in 2026. 


Still Have Questions? Let’s Dig Deeper

How does Microsoft measure embodied carbon for Azure hardware?

Microsoft measures embodied carbon for Azure hardware using the Cloud Hardware Emissions Methodology (CHEM), a process-based life cycle assessment methodology. CHEM integrates internal product and supply chain data with environmental life cycle inventory data to quantify emissions across the full hardware lifecycle.

What is the difference between spend-based and process-based LCA for data centers?

Spend-based methods estimate emissions using financial proxies, which can obscure the true drivers of embodied carbon. Process-based LCA, as used in CHEM, models emissions based on physical manufacturing processes and material flows, enabling more granular and actionable insights into where emissions originate.
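A toy comparison, with every number invented for illustration, shows why the two methods tell different stories:

```python
# Hypothetical figures for one server assembly; illustration only.

# Spend-based: emissions = spend * EEIO factor for the purchase category.
spend_usd = 10_000
eeio_factor = 0.35                             # kg CO2e per USD, hypothetical
spend_based = spend_usd * eeio_factor          # one opaque number

# Process-based: sum over physical material and process flows.
flows = [
    # (flow description, amount, kg CO2e per unit) -- all hypothetical
    ("server chassis steel, kg", 40.0, 2.0),
    ("logic ICs, cm2 die area", 120.0, 18.0),
    ("assembly electricity, kWh", 90.0, 0.4),
]
process_based = sum(amount * factor for _, amount, factor in flows)

# Only the process-based view reveals which flow dominates.
ic_share = (120.0 * 18.0) / process_based
```

The spend-based total gives a single figure tied to price, while the process-based breakdown exposes that (in this made-up example) semiconductors drive almost all of the footprint, which is the kind of insight CHEM is designed to surface.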

How does Microsoft handle the complexity of semiconductor emissions?

Recognizing that semiconductors are a major contributor to embodied carbon, Microsoft incorporates detailed semiconductor life cycle inventory data into CHEM. This includes the use of advanced “virtual fab” models developed with data from the imec Sustainable Semiconductor Technologies and Systems program to represent specific manufacturing process steps rather than generic averages.

Can Life Cycle Assessment (LCA) be automated for hyperscale hardware?

Microsoft’s CHEM methodology demonstrates that significant parts of process-based LCA can be automated when product data systems are connected to cloud-based LCA modeling tools. This reduces the manual effort required to reconstruct BOMs and map materials to life cycle inventory datasets at hyperscale.

What role does Makersite play in the CHEM methodology?

Microsoft uses Makersite as part of the CHEM implementation to support automated LCA modeling across complex hardware configurations. Makersite is used to map product structures and materials to environmental datasets, enabling scalable, process-based emission modeling.

All You Need To Do About The New EU Batteries Regulation

Overview

After 18 February 2027, every EV, LMT, and industrial battery in the EU will be dead on arrival without a digital battery passport. That’s the main change brought by the new EU Batteries Regulation, which entered into force on 17 August 2023. Battery producers who fail to comply might see a revenue drop. While there’s still time to avoid this financial risk, you’ve got to act right now. Doing it on your own could be overwhelming, so we’ve outlined a series of steps you should take to ensure compliance.

1.   Drop the Last-Minute Compliance Mindset

Get your digital battery passport ready sooner rather than later. This is not a mere compliance exercise: early adoption will give you a competitive edge. According to a study conducted by the Battery Pass consortium, a digital passport could reduce the need for technical tests, driving down procurement costs for independent operators by up to 10%. Thanks to an improved recycling rate (up to 2%), the battery e-passport could also cut pre-processing and post-treatment expenses for recyclers by up to 20%. Better recycling also means fewer primary materials used, which translates into up to 1,300 kt of CO2eq that could be saved annually by 2045.

On the other hand, a compliance delay could seriously hurt your business. That’s what happened to Meta, which was fined €1.2 billion by the Irish Data Protection Commission for transferring EU users’ personal data to the US without meeting GDPR requirements.

2.   Engage Your Suppliers

Let’s put it bluntly. If you don’t have supply chain data, your battery passport won’t get stamped. That’s why it’s paramount to liaise with your suppliers. Reach out to them and shed light on how this regulation will affect their business. Explain to them the benefits of sharing data as this will increase trust. Building a partnership around transparency is also critical to draft the due diligence policies for the sourcing of critical raw materials which are required by the regulation.

3.   Implement An Efficient Data Management System

Creating and managing a battery digital passport means collecting an enormous amount of data from multiple sources. And that’s the hardest job nobody is prepared for. Nobody, except specialised companies like Makersite. We leverage an AI-enabled technology that lets you automatically consolidate all your information into a centralized platform in no time.

Our system also allows you to perform a life cycle assessment (LCA) to obtain an accurate estimate of your battery’s carbon footprint, which is one of the passport’s key attributes. However, don’t expect an accurate LCA if you don’t feed accurate data to the system. That means you should gather detailed information on raw materials, energy and water consumption, and waste production associated with each stage of the product’s life cycle.

Moreover, make sure to tap into a supply chain mapping feature to assess your supplier practice and spot inefficiencies. That’s what Makersite has done for an international cosmetic company. Combining an automated multi-tier mapping of their value chain with AI-powered suggestions, we allowed them to achieve a more sustainable and cost-effective sourcing in less time. Our case study shows how value chain diligence can flip from a cost center into a margin protector.

4.   Governance

AI won’t solve all your problems. You need to invest in human resources as well. For instance, it might be wise to rely on software experts who can help you understand your data to make more informed decisions.

Another useful strategy would be to appoint internal compliance personnel or create cross-functional taskforces (R&D, supply chain, legal) if not already in place. Additionally, it would be helpful to implement robust supply chain audit and reporting processes to promote transparency.

Is Your Battery Affected By The Regulation?

This rule applies to the following types of batteries:

  • portable batteries;
  • starting, lighting and ignition batteries (SLI batteries);
  • light means of transport batteries (LMT batteries);
  • electric vehicle (EV) batteries and industrial batteries;
  • batteries that are incorporated or designed to be incorporated into or added to products.

The Battery Passport

The regulation’s key implication is the attachment of an electronic record, a.k.a. battery passport, to each LMT battery, each industrial battery with a capacity greater than 2 kWh and each EV battery. Manufacturers should have the battery passport ready by 18 February 2027.

What Information Should The Passport Contain?

As reported in the Appendix, the information contained in the battery digital passport is classified according to who should access it.

Technical & Operational Obligations For Manufacturers

Besides collecting a lot of credible data and storing it for 10 years, when compiling a battery passport manufacturers will also have to meet a series of technical and operational requirements.

  • Due diligence: Manufacturers must adopt and communicate due diligence policies for the supply of critical raw materials (i.e., cobalt, natural graphite, lithium, nickel) and its associated risks. These policies should be in line with internationally recognized standards such as the OECD Due Diligence Guidelines and the UN Guiding Principles on Business and Human Rights.
  • QR Code: This is the unique identifier that manufacturers will have to attribute to each of their batteries to ensure access to the digital passport. The QR code should respect the guidelines of ISO/IEC Standard 18004:2015.
  • Interoperability: The battery passport will have to be interoperable with other digital passports required by the EU (e.g., the Digital Product Passport (DPP) introduced by the Ecodesign for Sustainable Products Regulation (ESPR)). All information should be based on open standards and transferable through an open interoperable data exchange network.
  • Data Access: Manufacturers should have a data system allowing them to attribute different types of access rights as per the Regulation (see also Appendix).

Regulation Uncertainties

After going through the EU Batteries Regulation, some grey areas remain. First of all, access rights are still unclear. The regulation leaves it to the EU Commission to elucidate this aspect by issuing implementing acts. These secondary acts should clarify which stakeholders (e.g., persons with a legitimate interest, third-party organisations) are entitled to access which datasets.

The Commission will have to publish other acts to give further guidance on the following too:

  • methodologies for calculating carbon footprint;
  • minimum values for electrochemical performance;
  • share of recycled content for non-critical battery materials;
  • harmonised formats for data reporting (e.g., real-time or static information?) and labelling.

Finally, while the regulation says that the digital passport should be interoperable with other similar EU frameworks (e.g., DPP), it doesn’t mention anything on non-EU systems. This could pose a double reporting issue for global manufacturers.

What Should You Do Now To Prepare?

Aside from familiarising themselves with the EU Batteries Regulation, manufacturers can go beyond that by checking out the DIN-DKE SPEC 99100. Based on the Battery Passport Content Guidance, this document provides battery producers with more details and recommendations on each data attribute to embed in the digital passport.

Conclusions

Complying with the EU Batteries Regulation will require way more than plugging any data into any AI tool. To begin with, gathering good data about your battery implies engaging suppliers and finding out as many details as possible on critical and other raw materials. Moreover, data interpretation can be a time-consuming task without expert oversight. Although AI can help speed things up, you still need human validation to assure data integrity. Manufacturers who master high-quality product data now will cut costs, avoid double reporting, and be first to market. That’s the advantage Makersite is building for its customers.

Appendix

Still Some Doubts? Let’s Clarify Them

What Happens If My Company Isn’t Ready For The EU Battery Passport By February 2027?

Without a valid passport, affected batteries can’t be sold in the EU. That means blocked market access, potential fines, and losing out to competitors who prepared early. The biggest bottleneck is neither the QR code nor the reporting templates — it’s the supply chain and lifecycle data that companies often don’t have in usable form.

👉 This is where Makersite helps: consolidating supplier and product data into a live, compliant passport.

Isn’t The Battery Passport Just A Compliance Exercise?

No. The regulation is designed to push transparency and sustainability across the value chain. Companies that treat it only as a compliance task will incur costs; instead, companies that use the passport as a data backbone can reduce procurement costs, cut CO₂, and strengthen supplier resilience.

👉 Makersite turns compliance data into a business advantage by linking it to cost, carbon, and risk models.

How Is The Battery Passport Different From The Digital Product Passport (DPP)?

The battery passport is the first sector-specific implementation of the EU’s broader push for DPPs. Both share the same goal: live, interoperable product data. If you solve for the battery passport correctly, you’ll be positioned for future DPP rollouts across other product categories.

👉 Makersite’s platform is built for both — future-proofing your compliance investments.

What Type Of Data Is Hardest To Gather For The Passport?

Supply chain data on critical raw materials, recycled content, and carbon footprint. Most manufacturers don’t have visibility past Tier 1 suppliers, making this the single biggest challenge.

👉 Makersite uses AI + supply chain mapping to fill gaps, validate supplier inputs, and model missing data.

Why Can’t We Just Wait Until The EU Clarifies The Regulation’s Grey Areas?

Because building the data infrastructure and supplier collaboration needed for a passport takes years. If you wait for the EU’s secondary acts, you’ll stay behind.

👉 Makersite allows you to start building compliant passports today, while staying adaptable to regulatory updates.

How Can AI Be Used To Map Supply Chains Or Enrich Missing Data For The Battery Passport?

AI can accelerate the mapping of multi-tier supply chains by consolidating information from public databases, supplier disclosures, and proprietary datasets. It can also enrich missing attributes (such as carbon footprint or recycled content) by benchmarking against process- and material-specific models.

But AI is only as good as the data and expertise behind it. For compliance-grade outputs, AI must be trained on high-quality supply chain data and guided by experts who understand materials, processes, and regulatory standards. Large Language Models (LLMs) aren’t suitable here — this is about structured, verifiable data that stands up to audits.

👉 Makersite combines AI-driven automation with expert-validated data and robust databases, ensuring your battery passport is accurate, compliant, and defensible.

From Data to Impact: Scaling Sustainability Across Manufacturing Enterprises


The Hard Truth: Reporting Doesn’t Change Anything

Most manufacturers are drowning in spreadsheets and annual reports, but emissions don’t drop from reporting alone. The real bottleneck isn’t ambition. It’s data that’s too coarse to change real decisions. In our most recent webinar with NAEM, “From Data to Impact: Scaling Sustainability Across Manufacturing Enterprises,” our experts stressed a simple reality: spend-based estimates might check a compliance box, but they will never redesign a product, reshape a supply chain, or win you a tender.

Key Takeaways

  • Activity-based data is essential. Our experts emphasized that bill of materials (BOM)-level data enables manufacturers to move beyond spend-based estimates and into actionable Scope 3.1 reporting.
  • A laddered approach to data quality. Start where you are: BOM-based modeling when available, weight or average factors when necessary, and spend-based methods only as a last resort.
  • Collaboration is non-negotiable. Procurement, engineering, and sustainability functions must be aligned if product sustainability data is to influence real business decisions.
  • Digital twins enable scale. By linking product, supply chain, and impact data, organizations can automate LCAs, close supplier data gaps, and create portfolio-wide transparency.

The Data Ladder: How to Climb Out of Spend-Based Guesswork

Most organizations are not constrained by a lack of data but by the wrong type of data. Scope 3.1 requires product-specific, activity-based data to enable meaningful action. Without it, sustainability reporting risks becoming an exercise in compliance rather than a driver of competitive advantage. Our experts underscored that to create impact, sustainability insights must flow into design and sourcing decisions, not remain trapped in reporting cycles.

A Practical Data Ladder for Scaling Sustainability Data

  1. BOM-based (preferred): Map the bill of materials, normalize material and process categories, and apply life cycle inventory factors. This is the most actionable level for design and sourcing decisions.
  2. Average/mass-based (backup): When full BOM data is unavailable, use product weight and representative averages to approximate impacts.
  3. Spend-based (fallback): Leverage spend data multiplied by EEIO factors only when no other information exists. This approach should be replaced progressively with activity-based data.

This ladder allows organizations to begin modeling with the data at hand and gradually refine accuracy through supplier engagement and primary data collection.
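The three rungs can be sketched as a single fallback function; all factor values below are hypothetical placeholders, not real emission factors:

```python
# Sketch of the data ladder: use the most accurate method available
# for each product and fall back gracefully. Factors are hypothetical.
def product_footprint(product):
    """Return (kg CO2e, method used) for one product record."""
    if product.get("bom"):                       # rung 1: BOM-based
        kg = sum(line["kg"] * line["factor"] for line in product["bom"])
        return kg, "bom"
    if product.get("weight_kg"):                 # rung 2: mass/average-based
        return product["weight_kg"] * 3.1, "mass-average"
    # rung 3: spend-based fallback (spend * EEIO factor)
    return product["spend_usd"] * 0.35, "spend"

p1 = {"bom": [{"kg": 2.0, "factor": 8.0}, {"kg": 0.5, "factor": 4.0}]}
p2 = {"weight_kg": 10.0}                         # no BOM available
p3 = {"spend_usd": 100.0}                        # spend data only
```

Tagging each result with the method used, as the tuple return does here, makes it easy to report what share of the portfolio has climbed off the spend-based rung over time.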

The point? Start now, climb steadily. Don’t let “perfect data” be the excuse for doing nothing.

The Barriers Manufacturers Face

Scaling sustainability isn’t straightforward. Common challenges include:

  • Data silos across PLM, ERP, and compliance systems.
  • Incomplete product records, such as missing weights or coatings.
  • Rapid change in engineering and sourcing, which often outpaces traditional LCA cycles.
  • Competing priorities across teams, with procurement focused on cost, engineering on manufacturability, and sustainability on reporting deadlines.

Breaking the Silos: From Sustainability Reports to Product Decisions

ERP and PLM systems were built to optimize cost and risk—not sustainability. The result? Fragmented records, incomplete supplier info, and teams working in isolation. The solution is clear: scaling sustainability demands a single source of product truth that procurement, engineering, and sustainability teams can all access and act on.

To overcome these barriers, our experts recommend a repeatable operating model:

  1. Normalizing product records by harmonizing BOMs, applying default assumptions, and defining a single source of truth for material attributes.
  2. Building an LCA-at-scale service via digital twins that connect cost, compliance, and footprint data, keeping models current as designs change.
  3. Prioritizing supplier engagement based on material impact, focusing requests for primary data where it matters most.

  4.   Embedding sustainability into workflows so that footprints are considered alongside cost and lead time in procurement events, design reviews, and customer disclosures.

Data Quality Isn’t the Excuse

Yes, your data is messy. Everyone’s data is messy. But the myth that you must “fix data first” before scaling sustainability is paralyzing progress. As our experts highlighted, you can achieve more than you think—even with imperfect data. The maturity curve proves it: novices wrestle spreadsheets, intermediates integrate flows, and advanced players run centralized master data governance. The winners don’t wait—they build maturity as they go.

What Success Looks Like

Signals that sustainability has scaled include:

  • Broad coverage: up to 80% of revenue/SKUs measured using activity-based methods.
  • Automated impact updates tied to BOM or supplier changes.
  • Sustainability metrics integrated into sourcing and design decision gates.
  • Clear supplier mix shifts toward lower-carbon options, informed by quantified trade-offs.

Real-World Win: Microsoft – 28% Footprint Reduction on Surface Pro

Microsoft’s Surface team discovered that manual LCA processes were outdated, slow, and riddled with generic data.

By automating through Makersite, they:

  • Cut LCA effort from months to minutes.
  • Increased accuracy from 20% primary data to 70%.
  • Freed 80% of resources to focus on reductions instead of reporting.
  • Achieved a 28% footprint reduction on the Surface Pro.

The lesson: Automation doesn’t just accelerate reporting—it creates the space to design real carbon reductions.

Read the Full Case Study

Real-World Win: FLS – Tackling Massive Complexity

FLS, a global mining equipment leader, faced customer demand for timely LCA data that far outpaced their manual capacity. Their products involve thousands of BOM lines and hundreds of tons of material.

With Makersite, they:

  • Scaled LCAs across complex, customized portfolios.
  • Embedded carbon transparency into tenders and sales.
  • Gained actionable insights to drive supplier and material decisions.

FLS turned sustainability into a competitive edge—not a reporting chore.

Read the Full Case Study

So, What Should You Do?

  • Run a pilot, not a POC. Prove scale, accuracy, and speed on real SKUs or sites—not toy examples.
  • Get cross-functional buy-in. Procurement, sales, and engineering must see the business value, or sustainability stays underfunded.
  • Pick a partner you trust. The sustainability software space is still the Wild West. Don’t just buy tools—find people who deliver.

Addressing Common Pushbacks

  • “We don’t have the data.” Use the ladder—begin with what is available and improve over time. Hybrid approaches are both recognized and effective.
  • “LCAs take too long.” With a connected digital twin, models update automatically, reducing time to insight.
  • “Scope 3 is just reporting.” When tied to product and sourcing decisions, Scope 3 becomes a lever for both emissions reduction and margin growth.

Still Skeptical? Let’s Address the Hard Questions

  • “We already have a sustainability team handling this.”
    Good—but if their insights never reach procurement or design, you’re leaving value on the table. Scaling means wiring their work directly into product and supplier decisions, not confining it to reports.
  • “We’re not ready for a new tool or vendor.”
    You don’t need another silo—you need a connected digital twin that feeds your existing PLM, ERP, and sourcing systems. The right partner integrates with what you have and accelerates ROI.
  • “We don’t have good enough data to act.”
    No one starts with perfect data. The key is the ladder approach: use what you have, improve as you go, and replace assumptions with primary data over time.
  • “This all sounds too complex.”
    Automating LCAs across thousands of BOM lines for heavy mining equipment is complex, but it can be done. Complexity is exactly why scalable automation exists.
  • “Scope 3 is just reporting.”
    Reporting alone doesn’t change outcomes. But when Scope 3 metrics drive sourcing, design, and tender decisions, they become a lever for cost savings, margin growth, and differentiation.

Closing Thought

Our experts’ message was clear: sustainability at scale isn’t about “more reports.” It’s about hardwiring footprint into product and supplier decisions. That’s the difference between reporting carbon and actually reducing it.

Product Sustainability Isn’t a Cost. It’s the Fastest Path to Margin Growth


What You’re Missing From The Decarbonisation Picture

During a masterclass called “Decarbonize by Design: How Product Sustainability Fuels Business Growth”, Neil D’Souza, CEO of Makersite, interviewed David Linich, Sustainability Principal at PwC US. Drawing on PwC’s “State of Decarbonization” study, this white paper debunks the myth that sustainability is just a costly and painful exercise.

Contrary to media reports, firms are making real progress in reducing carbon emissions, and they are profiting from it. Moreover, product data is playing a key role in turning decarbonisation from a risk into a business opportunity.

The Unexpected Truth: What the Data Actually Shows

  • Sustainability is shifting from a reporting obligation to a core driver of business value;
  • Scope 3 emissions are more valuable than your CFO thinks;
  • Product data is worth up to 25% revenue upside.

Everyone Thinks Sustainability Is Dead. 4,000+ Companies Say Otherwise

News headlines often report companies pulling back on sustainability commitments or net zero targets. Yet, PwC tells a different story. In its “State of Decarbonization” study, the firm surveyed 4,163 companies to assess their climate commitments and progress towards them. 

Contrary to the common narrative, PwC found that 37% of surveyed companies are increasing their climate ambitions. Only 16% of respondents are dialing their commitments back, and half of those are merely recalibrating the timelines for their carbon reduction targets, which reflects a more realistic approach rather than diminished climate responsibility. Furthermore, PwC reported a ninefold increase over the last five years in the number of companies that have set decarbonisation goals.

Another major finding was that products marketed with sustainability attributes delivered up to a 25% revenue upside. This could stem from several factors, such as a price premium on low-carbon goods over carbon-intensive products, sales increases driven by consumer trust, or new selling schemes (refurbishment, take-back).

“So this moves from, hey let’s report because there’s a lot of scrutiny on this topic to okay, what can we do to drive business, and I think that’s a fundamental change.” Neil D’Souza, CEO of Makersite

The Hidden Revenue Engine Lurking in Your Scope 3

Scope 3 emissions are the elephant in the decarbonisation room for any company, and reporting them can feel overwhelming. According to PwC, only 54% of companies are on track to reduce their Scope 3 carbon footprint. That figure would likely change if the remaining 46% of surveyed firms appreciated the financial advantages of tackling their value chain emissions: Scope 3 emissions include energy, fuel, waste, and other elements that carry an attached cost, so driving them down unlocks margin opportunities.

Among companies pursuing Scope 3 targets, PwC observed a trend towards disclosing more categories, indicating an improvement in carbon accounting capabilities. For instance, enterprises are learning how to measure use-phase emissions (category 11). Besides being hard to decarbonise, this category significantly contributes to the climate footprint of organisations across different industry sectors. On the other hand, for another carbon-heavy category, purchased goods and services, PwC’s research shows that most companies engage suppliers only at a very basic level. This lack of deeper supplier engagement is another sign that companies are missing out on Scope 3-related business opportunities. To capitalise on these, organisations should invest in supplier segmentation, co-innovation, incentive schemes, and similar measures.

“I take that one of the big drivers for Scope 3 decarbonization is counterintuitively not saving the planet but revenue and margin growth. And I think this is a good thing because when I started in this career my boss told me our job is to make the most sustainable products in the world that can be made.” Neil D’Souza, CEO of Makersite

Why Product Data Is Your Newest Profit Lever

Scope 3 emissions have long been seen as an accounting nightmare because they lie beyond an organisation’s direct control. Companies do, however, control their product design, which determines the raw materials used to make a product and therefore who supplies those materials. In other words, there is a direct link between product design and Scope 3 carbon emissions.

Harnessing intelligent tools, firms can identify carbon hotspots and inefficiencies along their product value chain. This translates into lower emissions and costs. On top of that, gathering accurate product data allows firms to build powerful assets such as life cycle assessments (LCAs) and environmental product declarations (EPDs). Accordingly, organisations can support their climate-friendly claims without incurring any greenwashing risk, thus retaining their customer base and attracting new clients.

“As we looked at the levers that companies were pulling to drive down scope 3 emissions, I thought we were going to see the most predominant one being something like supplier collaboration. But it turns out the number one lever is product sustainability.” David Linich, Sustainability Principal at PwC US

Your Data Isn’t Broken – Your Org Is

Missing decarbonisation targets is not the only headache for sustainability leaders. Limiting margin compression from tariffs is probably the main focus of any organisation at the moment, which inevitably affects priorities in supplier choice. Add to that a series of conflicting regulations to comply with, such as extended producer responsibility (EPR), REACH, RoHS, and so on. Being able to factor all of these in is paramount to making sustainability profitable. As PwC suggests, the only way to achieve this is a strong tech and data foundation.

A key issue raised by PwC is that many companies take a skunkworks approach to product sustainability: product data lives in silos across engineering, procurement, compliance, and sustainability teams, forcing decisions to be made in the dark. Fixing this requires centralising product and supply chain data and enabling real-time collaboration between functions. To overcome these challenges, firms can harness a platform like Makersite that merges all data into one place. This AI-enabled tool also lets companies fully understand their data, supporting an optimal decision-making process.

“A typical company that I talk to is telling me the majority of their top customers are reaching out to them and asking for sustainability-related data and are encouraging them to set targets and to make more progress.” David Linich, Sustainability Principal at PwC US

Why Manual LCA Will Kill Your Climate Strategy

As mentioned earlier, conducting an LCA can add business value to your decarbonisation strategy. Nonetheless, if it’s done manually, this exercise can be time-consuming as it involves the collection of an enormous amount of data. This is particularly true for large enterprises managing thousands of stock keeping units (SKUs). That’s where a digital twin can make a huge difference. Tapping into this technology, organisations can perform real-time analysis on multiple product versions in minutes vs months, thus overcoming the LCA’s scalability bottleneck.

Barco case study

Barco, a global technology company, was spending significant time and money reporting SKU-level environmental data because the data was siloed and scattered across its supply chain. To address this challenge, Barco tapped into Makersite’s automated Life Cycle Assessments (LCAs) and Product Environmental Footprints (PEFs). These tools let Barco consolidate its data and fill information gaps. Besides complying with EU taxonomy reporting requirements, the company implemented more targeted eco-design principles across its product portfolio, culminating in a third-party-validated carbon-neutral label.

“To calculate products’ carbon footprint without full material declarations, you could create parametric models. But unfortunately there’s nobody else that’s doing this besides Makersite, so it’s a very hard thing to do and we build specific AI models to be able to do this.” Neil D’Souza, CEO of Makersite

What To Do On Monday: Your Action Plan

Here’s a roadmap you can refer to for reaping the benefits of decarbonisation.

  1. Build your climate governance.
  2. Change your capital allocation:
    1. embed an internal cost of carbon into your budgeting;
    2. ring-fence CAPEX for the initiatives needed to achieve your net-zero targets.
  3. Engage your stakeholders more effectively.
  4. Embrace product-level sustainability: as outlined above, this is the profitable frontier of decarbonisation. Fixing your product data isn’t just good practice. It’s an easy way to leverage new revenue streams, boost value chain resilience, and future-proof operations.

“You need to have the business case, you need to evangelize it, and then you need to repeat the process as you continue. If you don’t have your business case in place and you don’t have a strategy of how you’re going to implement it, nothing ever happens.” Neil D’Souza, CEO of Makersite.

Still Skeptical? Let’s Address the Hard Questions

How Do You Calculate Product Carbon Footprint Without Full Material Declarations?

It’s unlikely you’ll get all the data you need from your suppliers. However, starting from the data you do have, you can leverage AI-enhanced digital twin models. For products like cement and metals, these will give you an accurate estimate of their carbon footprint.

Is There A Set Of Rules To Follow When Dealing With Ecodesign Checklists And Standards?

There are a lot of choices and assumptions to make when conducting an LCA, therefore the top rule is to be consistent in the way you measure your product’s environmental impact across your portfolio.

How Can You Move From Words To Facts?

Rather than pursuing skunkworks projects, connect your teams and align their efforts with your customers’ needs and expectations.

On-Demand Decarbonize by Design: How Product Sustainability Fuels Business Growth


Masterclass Key Takeaways

Despite headlines hinting at a “sustainability slowdown,” the data tells a very different, and far more encouraging, story. In our recent masterclass with PwC’s David Linich and Makersite CEO Neil D’Souza, we dug into the real business levers behind climate action and decarbonization. Spoiler alert: they’re not just about ESG scores. They’re about growth, resilience, and bottom-line performance.

Sustainability isn’t dead. It’s maturing

PwC’s latest State of Decarbonization study found that more companies are increasing their climate ambitions than pulling back—37% of companies are stepping up their goals, compared to just 16% dialing them down. Many of those lowering targets are simply recalibrating early, overly ambitious goals to reflect more realistic roadmaps. In just five years, there’s been a 9x increase in companies setting emissions reduction targets.


Scope 3 is the new frontier—and the biggest opportunity

Progress on Scope 1 and 2 is real, but Scope 3 remains the largest untapped lever. Why? It’s tough to tackle, but it’s also where most value chain emissions, and business value, live. Most enterprises still rank low in supplier engagement maturity. That’s where tools like product carbon footprinting and collaborative design come in.

Product sustainability drives growth. Literally. 

Sustainable products deliver measurable ROI. PwC found that companies marketing products with sustainability attributes are seeing a 6–25% revenue uplift through price premiums, increased purchase intent, and entirely new revenue streams like circular business models. Double claims, such as “durable + PFAS-free”, are particularly powerful, boosting purchase intent by up to 30%.

The business case isn’t one-and-done. It’s a drumbeat

One of the biggest blockers? Failing to make (and maintain) the business case. Leaders need to advocate for decarbonization like they would any core investment, with a repeatable story, clear ROI, and alignment to strategic priorities. Capital allocation is key: leading companies are ring-fencing budgets for decarbonization or applying internal carbon prices to support longer-term investments.

Digital product twins are redefining what’s possible

Legacy LCA methods don’t scale, but scalable, AI-driven platforms can. With digital twins, manufacturers can simulate cost, carbon, risk, and compliance trade-offs in real time across entire product portfolios, not just pilot SKUs. The shift: from once-a-year compliance reports to daily design decisions.

Circularity is suddenly more economical

Thanks to tariffs, raw material volatility, and shifting customer preferences, circular business models that didn’t pencil out before are now financially viable. But unlocking that value requires scenario planning and data orchestration at scale. From reuse to take-back programs, sustainability and margin growth are finally aligning.

Why it Matters

Product sustainability is becoming a top priority for both sales teams and engineers, and for good reason. According to PwC, by 2030, over one-third of global company revenues will come from climate-focused solutions—think lightweight products, alternative fuels, and circular business models for B2B and consumers. It’s clear: meaningful climate action and business growth now go hand in hand.

What You Can Do on Monday

If you’re not ready to overhaul your entire product sustainability strategy (yet), start here:

  • Assess your product sustainability maturity: Take stock of your cross-functional coordination, LCA capabilities, and supplier engagement efforts. Understand your current baseline and identify where to improve. 
  • Build (and sustain) the business case: Clarify how product sustainability directly supports revenue, margin, and compliance goals. Revisit it often to maintain leadership support and investment. 
  • Explore design levers for Scope 3: Pinpoint where your product and sourcing choices influence emissions and cost. Focus efforts where they’ll yield both carbon reduction and commercial value. 
  • Equip Sales & Marketing with the right tools: Provide clear, credible messaging and ROI calculators to help teams communicate sustainability claims effectively, especially in B2B contexts. 
  • Pilot scalable digital tools: Trial digital twins or rapid LCA platforms on a small product set to evaluate speed, cost, and business insight potential before scaling up. 

Take Action

Watch the full masterclass and download PwC’s State of Decarbonization report for sector insights, value chain strategies, and practical playbooks.

Need help scaling your product sustainability efforts? Makersite’s experts are ready to help.

Quantifying Circularity: A Data-Driven Approach to Chip Lifecycle Emissions

Turning Vision into Action: Advancing Circular Manufacturing

To open this masterclass, Gruber and Dillman presented a bold perspective on circular economy strategies, using a case study that compared the environmental and economic impacts of reusable and linear semiconductor chip designs. With sustainability leaders from companies like Amazon, IKEA, and Cisco in attendance, the discussion emphasized integrated, data-driven decision-making as a critical enabler for meeting today’s sustainability standards.

Contrasting scenarios included:

  • A linear model, where the chip is manufactured, used, and discarded.
  • A circular model, where the chip is recovered, re-balled, and reused.

The circular model showed slightly higher emissions for the reprocessing step (2.36 kg CO₂e, vs. 1.94 kg CO₂e for linear disposal). But extending the lifetime of the initial chip changes the picture: in the linear model, the chip would now have to be replaced by a newly manufactured one (1.94 × 2 = 3.88 kg CO₂e over two cycles), so over repeated use cycles the circular approach comes out ahead. By eliminating the need to manufacture new chips for future production cycles, the circular process reduces total, system-wide emissions while also drastically cutting raw material extraction, water usage, and land use.
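The trade-off in the case study boils down to a simple break-even calculation: circularity pays off whenever reprocessing a chip emits less than manufacturing a replacement. The sketch below is a minimal illustration of that logic, not the presenters’ actual model; the parameter values are hypothetical stand-ins, since the summary above does not fully break down how manufacturing, disposal, and reprocessing figures combine.

```python
# Hypothetical break-even sketch for linear vs. circular chip lifecycles.
# NEW_CHIP and REPROCESS are illustrative values, not the case-study figures.

NEW_CHIP = 10.0   # kg CO2e to manufacture (and dispose of) one chip, hypothetical
REPROCESS = 2.36  # kg CO2e to recover, re-ball, and reuse a chip, hypothetical

def linear(cycles: int) -> float:
    """Linear model: every use cycle consumes a newly manufactured chip."""
    return NEW_CHIP * cycles

def circular(cycles: int) -> float:
    """Circular model: one chip is made, then reprocessed for each later cycle."""
    return NEW_CHIP + REPROCESS * (cycles - 1)

for n in (1, 2, 3):
    saved = linear(n) - circular(n)
    print(f"{n} cycle(s): linear {linear(n):.2f} kg, "
          f"circular {circular(n):.2f} kg, saved {saved:.2f} kg CO2e")
```

With these assumed numbers, the circular path breaks even after the first reuse and the savings grow with every additional cycle; the same structure extends to water, land use, or cost by swapping in the relevant per-step figures.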

Circular manufacturing offers a transformative solution for reducing environmental impact and building long-term economic resilience. Forward-thinking companies like Jabil are already operationalizing these principles, turning what was once considered waste into valuable resources through systematic recovery and reuse programs that can also deliver significant cost savings.

Gruber and Dillman’s data-driven example underscores how this model can cut resource consumption, support compliance with evolving sustainability regulations, and drive progress toward a fully circular economy. Businesses adopting these strategies position themselves as sustainability leaders, strengthening their operations against resource scarcity and climate challenges. By embracing circular innovation, companies unlock a powerful pathway to sustainable growth and competitive advantage.

“Circularity isn’t just about recycling—it’s about smarter design, sourcing, and evaluating trade-offs,” said Dillman. “To close the loop, we must assess impacts beyond carbon.”

Designing for Circularity: Key Insights from the Session

Key takeaways included:
  • Sustainability Requires System Thinking: Achieving a circular economy demands cross-functional collaboration across design, procurement, logistics, and recovery. A unified data foundation is critical to driving these efforts effectively.
  • Data-Driven Decisions Over Assumptions: The circular chip example scenario underscores the importance of high-fidelity modeling in evaluating circular strategies. Circular initiatives often lack granular emissions and cost data, making it difficult to assess trade-offs or justify actions internally. Digital tools that enable engineers and sustainability teams to quantify carbon impacts and material costs at the component level provide the analytical rigor needed to support data-backed circularity decisions.
  • Leadership Focuses on Actionable Insights: The strong participation of executives and senior managers in the session underscores growing C-level commitment to sustainable innovation and responsible business models.
  • Scalable Platforms Are the New Standard: Fragmented tools fall short in today’s complex landscape, creating new data silos and preventing transparency. Forward-thinking sustainability leaders are turning to scalable platforms and digital tools to seamlessly integrate sustainability, cost efficiency, and product compliance into their operations.

Driving Circularity with Actionable Product Intelligence

As manufacturers push toward circular economy goals, decision-makers are increasingly turning to digital tools that provide high-resolution insights across the product lifecycle. These platforms are enabling sustainability, procurement, and design teams to move beyond assumptions by modeling the environmental and economic implications of circular strategies in real time.

By bringing together lifecycle data, cost metrics, and supply chain considerations, these tools support:

  • Comparative analysis of linear vs. circular models
  • Identification of trade-offs across environmental categories
  • Alignment across teams through shared, data-driven insights

In a rapidly shifting regulatory and market landscape, the ability to simulate design choices at scale — grounded in real-world data — is essential. Organizations that invest in this type of intelligence aren’t just improving products; they’re reshaping how sustainability is operationalized across the enterprise.

Turning Circular Strategies into Scalable Impact

For manufacturers, achieving sustainability success requires integrating data-driven insights and lifecycle thinking into design and procurement processes. This approach empowers teams to scale effective strategies such as reducing product carbon footprints, ensuring regulatory compliance, and driving operational efficiencies. With data and cross-functional alignment at the core, circularity evolves from a lofty goal to a measurable competitive advantage, positioning businesses as leaders in innovation and sustainability.

How EPDs at scale help you win tenders and drive sustainable product innovation

Ebook: Your EPDs Are Holding You Back

DOWNLOAD EBOOK

The EPD landscape has shifted. It’s no longer optional – it’s a competitive battleground where speed and verifiable proof now dictate market access and credibility. Are your EPDs truly enabling sustainable choices, or just checking a box?

The old approach is plagued by months-long verification bottlenecks, crippling costs per EPD, and, let’s be frank, a reliance on generic ‘family’ averages often chosen less for true efficiency and more to conveniently obscure performance variability across sites or products.

This ebook cuts through the noise. Your EPDs Are Holding You Back exposes the limitations of outdated EPD practices and shows how leveraging EPDs at scale with automation transforms this liability into your competitive advantage.

Download the ebook to uncover how to:

✅ Win more tenders: Instantly verified, product-specific EPDs ready on demand.

✅ Stop hiding behind product families: Differentiate with credible, granular LCA data and expose the limitations of blended data.

✅ Unlock the real power of EPDs: Go beyond marketing to use specific EPD data as an engine for genuine internal innovation, R&D, procurement optimization, and robust decarbonization planning.

✅ Slash crippling cost, time, and verification bottlenecks: Leave manual LCA modeling and never-ending verification loops behind.