Enabling Cross-Sector Interoperability with The World Avatar


Why this talk mattered

Digital twins don’t create value in isolation; they create value when they connect—across tools, teams, sectors, and even countries. In his Gemini Call Live presentation, Amit Bhave (CEO, CMCL) showed how The World Avatar weaves domain models, real-world data, and AI into a connected ecosystem so insights—and actions—can travel wherever they’re needed. The payoff: faster decisions, lower costs, and solutions that scale from a single asset to city- and system-level questions.

“The fastest way to scale digital twins is to connect them—technically and socially.”

From one use case to an interoperability mindset

Amit opened with a pragmatic challenge: emissions compliance testing for an engine platform used in applications ranging from excavators to gensets. Physical testing across all operating points is slow and expensive. CMCL built a thermodynamic digital twin that blends physics-based and stochastic models with data-driven/ML workflows to replace a substantial portion of measurements, cutting testing effort by roughly a third and delivering meaningful daily cost savings.

That project seeded a bigger lesson: when domain-specific models are connected with digital/knowledge-management technologies, the impact compounds. The real problem isn’t a lack of models—it’s a lack of interoperability across organisations, sectors, data formats, and tools.

The World Avatar: combining “old AI” and “new AI”

To tackle fragmentation, CMCL leans on two complementary toolsets:

  • “Old AI”: web/semantic technologies (ontologies, knowledge graphs, SPARQL) that give shared meaning to data and models.
  • “New AI”: ML, deep learning, and foundation models to learn from data and automate analysis.

The World Avatar (TWA) fuses these into an ecosystem where data, software, and tools connect semantically and syntactically, can be queried across federated sources, and surface via mobile apps, web apps, APIs, chatbots, and dashboards. It even supports parallel worlds for scenario planning and for optimising the real-world systems being operated.
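To make the "query across federated sources" idea concrete, here is a minimal, illustrative sketch (not TWA code; the `twa:` and `plant:` terms are invented for the example). It shows the knowledge-graph principle in miniature: once data from different sources is tagged with shared ontology terms, one pattern query spans all of it, SPARQL-style.

```python
# Illustrative only: a tiny in-memory triple store demonstrating how
# semantically tagged data from separate sources can be answered by
# one pattern query once they share vocabulary.

def match(triples, pattern):
    """Return all (s, p, o) triples matching a pattern; None is a wildcard."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Two "federated" sources: an asset register and a weather feed,
# unified by shared (hypothetical) ontology terms.
asset_graph = [
    ("plant:boiler1", "rdf:type", "twa:HeatSource"),
    ("plant:pv1", "rdf:type", "twa:RenewableSource"),
]
weather_graph = [
    ("plant:pv1", "twa:dependsOn", "weather:solarIrradiance"),
]
federated = asset_graph + weather_graph

# One query now spans both sources.
renewables = match(federated, (None, "rdf:type", "twa:RenewableSource"))
print(renewables)
```

In a real deployment this role is played by RDF stores and SPARQL endpoints; the point here is only that shared meaning, not shared storage, is what makes the cross-source question answerable.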

“Old AI gives things meaning; new AI gives them momentum. Together, they travel.”

Three energy-flavoured stories

1) Engines & emissions: measure less, learn more
The thermodynamic twin predicted gas-phase and particulate emissions across fuels (diesel, hydrogen, biofuels). Shifting from exhaustive test-cell campaigns to model-based assessment reduced measurements by ~one-third, accelerating development while maintaining confidence.

2) District heating you can actually interrogate
For a district heating plant near the Franco-German border, TWA mapped assets (conventional and renewable), ingested two years of operational data, and blended external feeds (market energy prices, CO₂ pricing, weather, demand). When the operator asked about impacts on nearby buildings and residents, the same knowledge graph absorbed building-stock data and wrapped third-party air-quality software to assess local effects—showcasing how cross-domain questions become tractable once the data and models are connected.

3) Building- and city-scale planning
At the urban scale, TWA combines BIM, BMS, and GIS with wrapped tools (e.g., City Energy Analyst) to estimate building-energy profiles and renewables potential (like rooftop PV). Users can drill from whole-stock views into a single asset and explore scenarios interactively.

Beyond energy: from ships to… parliamentary debates?

Because the ecosystem approach is domain-agnostic, CMCL has tackled maritime emissions during port calls, self-driving labs, and even parliamentary debates. In the latter, combining knowledge graphs with retrieval-augmented generation (RAG) aims to reduce LLM hallucinations and improve answer reliability—evidence that “old AI + new AI” travels well beyond engineering plants.
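The graph-RAG pattern mentioned above can be sketched in a few lines. This is a hedged illustration, not CMCL's implementation: the facts, the keyword-overlap retrieval, and the prompt wording are all invented for the example. The principle is that the LLM is asked to answer only from facts retrieved out of a knowledge graph, rather than from its parametric memory, which is what reduces hallucination.

```python
# Minimal graph-RAG sketch (illustrative): retrieve matching facts
# from a toy knowledge graph, then ground the LLM prompt in them.

facts = [
    ("Debate 42", "topic", "district heating"),
    ("Debate 42", "speaker", "Member A"),
    ("Debate 17", "topic", "maritime emissions"),
]

def retrieve(question: str):
    """Naive retrieval: keep facts whose words overlap the question."""
    words = set(question.lower().split())
    return [f for f in facts
            if words & set(" ".join(f).lower().split())]

def grounded_prompt(question: str) -> str:
    """Build a prompt whose context is only the retrieved graph facts."""
    context = "\n".join(f"{s} | {p} | {o}" for s, p, o in retrieve(question))
    return f"Answer using ONLY these facts:\n{context}\n\nQ: {question}"

print(grounded_prompt("district heating debates"))
```

Production systems replace the keyword overlap with embedding- or SPARQL-based retrieval, but the grounding step is the same: the graph supplies the evidence, the model supplies the language.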

Bringing people with us: three practical tips

  1. Communicate and let people interact. Give end users hands-on access to twins so they can try interventions and see consequences.
  2. Quantify value with uncertainty. Even rough forecasts—honest about confidence bounds—help stakeholders reason about cost/benefit.
  3. Adopt ontologies—but don’t make them the conversation. Practitioners should do the mapping; users shouldn’t have to learn vocabulary wars to get value.
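Tip 2 can be made concrete with a short sketch. The figures below are made up for illustration; the point is the shape of the output: a forecast reported as a distribution with honest confidence bounds, not a single number, using a simple Monte Carlo over uncertain inputs.

```python
# Illustrative sketch of "quantify value with uncertainty":
# propagate uncertain inputs through a cost model and report a
# 90% interval instead of a point estimate.
import random
import statistics

random.seed(0)  # reproducible demo

def simulate_annual_saving():
    # Hypothetical inputs, each uncertain, so the saving is a
    # distribution rather than a point.
    tests_avoided = random.gauss(300, 40)      # ~1/3 of a 900-test campaign
    cost_per_test = random.gauss(1500.0, 250)  # currency units per test
    return tests_avoided * cost_per_test

samples = sorted(simulate_annual_saving() for _ in range(10_000))
lo, mid, hi = samples[500], statistics.median(samples), samples[9500]
print(f"Estimated saving: {mid:,.0f} (90% interval {lo:,.0f}-{hi:,.0f})")
```

Even this crude a model gives stakeholders something defensible to reason about: the width of the interval is itself useful information when weighing cost against benefit.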

Why this resonates with digital and data practitioners

  • Interoperability first. Practitioners consistently cite integration as the biggest blocker. TWA’s approach shows how to make federated, multi-vendor ecosystems useful in practice.
  • From pilots to platforms. Each use case becomes a reusable building block—so progress accrues rather than resets project by project.
  • Human-centred evidence. By exposing uncertainty, enabling interaction, and hiding plumbing (ontologies, schemas), the work builds trust, not just technology.

Key takeaways (at a glance)

  • A thermodynamic digital twin can meaningfully cut physical testing and costs for multi-application engines.
  • Knowledge-graph-driven twins let operators blend operations, markets, weather, and policy signals to answer cross-domain questions (e.g., local air quality).
  • TWA operationalises semantic integration + AI and surfaces it via approachable apps and dashboards.
  • The same pattern applies beyond energy, from ports to parliament—where graph-RAG can improve LLM trustworthiness.
