Join us to celebrate the launch of the Infrastructure Client Group Annual Digital Benchmarking Report 2021 on 15 June 2022 at 9:00 BST by REGISTERING HERE.
The ICG Report, powered by the Smart Infrastructure Index, surveys asset owners and operators who jointly represent over £385bn worth of capital assets and over 40% of the national infrastructure and construction pipeline.
After Mark Enzer, former Head of the National Digital Twin Programme, Centre for Digital Built Britain, introduces the report, Andy Moulds and Anna Bowskill, Mott MacDonald, will uncover the results of the latest research into the state of the nation for digital adoption and maturity.
This will be followed by a panel of industry thought leaders and practitioners, chaired by Melissa Zanocco, Co-Chair DTHub Community Council, sharing their views and best practice case studies from the ICG Digital Transformation Task Group and Project 13 Adopters including:
Karen Alford, Environment Agency – skills
Matt Edwards, Anglian Water – digital twins
Sarah Hayes, CReDo – Climate Resilience Demonstrator digital twin
Neil Picthall, Sellafield – common data environments
Matt Webb, UK Power Networks – digital operating models
Will Varah, Infrastructure & Projects Authority – Transforming Infrastructure Performance: Roadmap to 2030
REGISTER to find out how much progress has been made at a time when digital transformation is a critical enabler for solving the global, systemic challenges facing the planet.
For any questions or issues, please contact Melissa Zanocco: firstname.lastname@example.org
Please note: We plan to make a recording of the event available. Third parties, including other delegates, may also take pictures or record video and audio and process them in a variety of ways, including by posting content across the web and social media platforms.
The bigger and more complicated the engineering problem, the more likely it is to have a digital twin. Firms that build rockets, planes and ships, for example, have been creating digital twins since the early 2000s, seeing significant operational efficiencies and cost savings as a result. To date, however, few firms have been able to realise the full potential of this technology by using it to develop new value-added services for their customers. We have developed a framework designed to help scale the value of digital twins beyond operational efficiency towards new revenue streams.
In spite of the hype surrounding digital twins, there is little guidance to help executives make sense of the business opportunities the technology presents beyond cost savings and operational efficiencies. Many businesses are keen to get a greater return on their digital twin investment by capitalising on the innovation and revenue-generating opportunities that may arise from a deeper understanding of how customers use their products. However, because very few firms are making significant progress in this regard, there is no blueprint to follow. New business models are evolving, but the business opportunities for suppliers, technology partners and end-users are yet to be fully documented.
Most businesses will be familiar with the business model canvas as a tool to identify current and future business model opportunities. Our ‘Four Values’ (4Vs) framework for digital twins is a more concise version of the tool, developed to help executives better understand potential new business models. It was designed from a literature review and validated and modified through industry interviews. The 4Vs framework covers: the value proposition for the product or service being offered; the value architecture, the infrastructure that the firm creates and maintains in order to generate sustainable revenues; the value network, representing the firm’s infrastructure and network of partners needed to create value and maintain good customer relationships; and value finance, such as cost and revenue structures.
The value proposition describes how an organisation creates value for itself, its customers and other stakeholders such as supply chain partners. It defines the products and services offered, customer value (both for customers and other businesses) as well as the ownership structure. Examples of digital twin-based services include condition monitoring, visualisation, analytics, data selling, training, data aggregation and lifespan extension. Examples of customer value in this context might include: decision support, personalisation, process optimisation and transparency, customer/operator experience and training.
The value architecture describes how the business model is structured. It has five elements:
1. Value control is the approach an organisation takes to control value in the ecosystem. For example, does it exist solely within its own ecosystem of digital twin services or does it intersect with other ecosystems?
2. Value delivery describes how the digital twins are delivered: are they centralised, decentralised or hybrid? It also seeks to understand any barriers that may prevent the delivery of digital twins to customers.
3. Interactions refers to the method of customer interaction with the digital twin. Common examples include desktop or mobile app, virtual reality and augmented reality interactions.
4. Data collection underlies the digital twin value proposition and can be a combination of sensor-based and supplied/purchased data.
5. Boundary resources are the resources made available to enhance network effects and scale digital twin services, typically comprising APIs, hackathons, software development toolkits and forums.
The value network is the understanding of interorganisational connections and collaborations between a network of parties, organisations and stakeholders. In the context of digital twin services, this is a given as the delivery mechanism relies on multiple organisations, technological infrastructure and stakeholders.
Value finance defines how organisations approach costing, pricing methods and revenue structure for digital twins. Digital twin revenue models most commonly refer to outcome-based revenue streams and data-driven revenue models. Digital twin pricing models include, for example, freemium and premium, subscription, value-based and outcome-based pricing. From extensive interviews with middle and top management on the services offered by digital twins, we identified four different types of business models and applied our 4Vs approach to understand how those models are configured and how they generate value.
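As a minimal sketch, the four values above can be captured as a structured record. The field names and example values here are illustrative assumptions, not part of the published framework:

```python
from dataclasses import dataclass

# Hypothetical schema: the 4Vs framework does not prescribe field names,
# so the attributes below are illustrative only.

@dataclass
class BusinessModel4V:
    """One digital twin business model described via the 4Vs framework."""
    value_proposition: list[str]   # services offered, e.g. condition monitoring
    value_architecture: dict       # control, delivery, interactions, data, boundary resources
    value_network: list[str]       # partners and stakeholders needed to create value
    value_finance: dict            # pricing method and revenue structure

# Example instance loosely based on the asset-maintenance model described below
asset_performance = BusinessModel4V(
    value_proposition=["predictive maintenance", "what-if analysis"],
    value_architecture={
        "value_control": "open and closed ecosystems",
        "value_delivery": "hybrid",
        "interactions": ["desktop", "mobile app"],
        "data_collection": ["sensor-based"],
        "boundary_resources": ["APIs"],
    },
    value_network=["process modelling partner", "3D visualisation partner"],
    value_finance={"pricing": "subscription", "revenue": "recurring"},
)

print(asset_performance.value_architecture["value_delivery"])  # hybrid
```

A record like this makes it easy to compare how differently configured business models fill in each of the four values.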
These were all found in information, data and system services industries. Their value proposition is to provide a data marketplace that orchestrates the different players in the ecosystem and provides anonymised performance data from, for example, vehicle engines or heating systems for buildings. Value Finance consists of recurring monthly revenues levied through a platform which itself takes a fee and allocates the rest according to the partnership arrangements.
This business model is prevalent in the world of complex assets, such as chemical processing plants and buildings. Its value proposition lies in providing additional insights to the customer on the maintenance of their assets to provide just-in-time services. What-if analysis and scenario planning are used to augment the services provided with the physical asset that is sold. Its Value Architecture is both open and closed, as these firms play in ecosystems but also create their own. They control the supply chain, how they design the asset, how they test it and deliver it. Its Value Network consists of strategic partners in process modelling, 3D visualisation, CAD, infrastructure and telecommunications. Value Finance includes software and services which provide a good margin within a subscription model. Clients are more likely to take add-on services that show significant cost savings.
This business model tends to be found in the transport sector, where it’s important to maximise the uptime of the aircraft, train or vehicle. The value proposition centres on keeping these vehicles operational, either through predictive maintenance for vehicle/aircraft fleet management or, in the case of HGVs, route optimisation. Its Value Architecture is transitioning from closed to open ecosystems. There are fewer lock-in solutions as customers increasingly want an ecosystem approach. Typically, it is distributors, head offices and workshops that interact with the digital twin rather than the end customer. The Value Network is open at the design and assembly lifecycle stages but becomes closed during sustainment phases. For direct customers, digital twins are built in-house and are therefore less reliant on third-party solutions. Its Value Finance is focused on customers paying a fee to maximise the uptime of the vehicle or aircraft, guaranteeing, for example, access five days a week between certain hours.
This business model focuses on delivering the necessary outcome to the customer. It tends to be found with government clients in the defence and aerospace sector. Value propositions are centred around improving the efficacy of support and maintenance/operator insight and guaranteeing mission success or completion. These business models suffer from a complex landscape of ownership for integrators of systems, as much of the data does not make it to sustainment stages. Its Value Architecture is designed to deliver a series of digital threads in a decentralised manner. Immersive technologies are used for training purposes or improved operator experience. Its Value Network is more closed than open, as these industries focus on critical missions of highly secure assets. Therefore, service providers are more security-minded and wary of relying on third-party platforms for digital twin services. Semi-open architecture is used to connect to different hierarchies of digital twins/digital threads. Value Finance revealed that existing pricing models, contracts and commercial models are not yet mature enough to transition into platform-based revenue models. Insights as a service is a future direction but challenging at the moment, with the market not yet mature for outcome-based pricing.
For B2B service-providers who are looking to generate new revenue from their digital twins, it is important to consider how the business model should be configured and identify major barriers to their success. Our research found that the barriers most often cited were cost, cybersecurity, cultural acceptance of the technology, commercial or market needs and, perhaps most significantly, a lack of buy-in from business leaders. Our 4Vs framework has been designed to help those leaders arrive at a better understanding of the business opportunities digital twin services can provide. We hope this will drive innovation and help digital twins realise their full business potential.
Now for a small request to readers who have reached this far: we are looking to scale these research findings through a mass survey across industry on the business models of digital twins. If your organisation is planning to implement digital twins, or has already started its transformation journey, please support our study by participating in our survey. The survey is fully anonymised, and all our findings will be shared with the DTHub community in an executive summary by the end of the year.
Link to participate in the survey study https://cambridge.eu.qualtrics.com/jfe/form/SV_0PXRkrDsXwtCnXg
Transforming an entire industry is, at its core, a call to action for all industry stakeholders to collaborate and change. The National Digital Twin programme (NDTp) aims to do just that, enabling a national, sector-spanning ecosystem of connected digital twins to support people, the economy, and the natural environment for generations to come.
But to achieve these ambitious impacts, a great deal of change needs to occur. So, to provide clear rationale for why potential activities or interventions should be undertaken and why they are expected to work, Mott MacDonald has worked with CDBB to develop a Theory of Change (ToC) and a Benefits Realisation Framework (BRF) to represent the logical flow from change instigators (i.e., levers) to overall benefits and impacts. The ToC and BRF are expected to provide future leaders and policymakers with a clear understanding of the drivers of change and the actors involved to create an ecosystem of connected digital twins.
Components of the Theory of Change
Within the ToC, we outline several key components: actors, levers, outputs, outcomes, impacts, and interconnected enablers. As a national programme uniting the built environment through a complex system of systems, it is essential that multiple actors collaborate, including asset owners and operators, businesses, government, academia, regulators, financial entities, and civil society. These actors need to share a common determination to move the needle towards better information management by utilising a combination of interconnected levers to kickstart the change: financial incentives, mandates and legislation, as well as innovation.
We see that pulling these three levers is likely to trigger tangible change pathways (i.e., the routes in which change takes place), manifested through the ToC outputs and intermediate outcomes, leading to the creation of institutional and behavioural changes, including organisations taking steps to improve their information management maturity and exploring cross-sector, connected digital twins. Ultimately, we consider these change pathways to lead to the long-term intended impact of the NDTp, achieving benefits for society, the economy, businesses, and the environment.
Underpinning and supporting the levers and change pathways are the enablers. We see these as positive market conditions or initiatives that are key to implementing and accelerating the change. They span having a unifying NDTp strategy, vision and roadmap, empowering leadership and governance, leveraging communication and communities, building industry capacity, and adopting a socio-technical approach to change.
The five levels of the Theory of Change
We intend for the ToC to outline how change can occur over five distinct levels: individual, organisational, sectoral, national, and international. The individual level involves training and upskilling of individuals from school students to experienced professionals, so that individuals can be active in organisations to drive and own the change. Our previous work with CDBB focused on the Skills and Competency Framework to raise awareness of the skills and roles needed to deliver a National Digital Twin in alignment with the Information Management Framework (IMF).
At the core of establishing the National Digital Twin is the organisational level, within which it is essential for change to occur so that organisations understand the value of information management and begin to enhance business processes. Broadening out from these two levels sits the sectoral level, where the development of better policies, regulations and governance can further support the change across all levels. Similarly, change at the national level will guide strategic engagement and should encourage further public support.
Ultimately, change at these four levels should achieve change at an international level, where the full potential of connected digital twins can be realised. Through the encouragement of international knowledge sharing and by creating interconnected ecosystems, challenges that exist on a global scale such as climate change can be tackled together.
Benefits Realisation Framework
Monitoring and evaluation have been fundamental to the assessment of public sector policy and programme interventions for many years. The potential benefits of the NDTp are significant and far reaching, and we have also developed guidance on how to establish a benefits realisation framework, based on UK best practice including HM Treasury’s Magenta Book, to drive the effective monitoring and evaluation of NDTp benefits across society, the economy, businesses, and the environment. We intend for this to provide high-level guidance to measure and report programme benefits (i.e., results) and track programme progress to the NDTp objectives outlined in the Theory of Change.
The Gemini Papers
Our work in developing the Theory of Change for the National Digital Twin programme has informed one of the recently published Gemini Papers. The Gemini Papers comprise three papers addressing what connected digital twins are, why they are needed, and how to enable an ecosystem of connected digital twins, within which the Theory of Change sits.
Together, we can facilitate the change required to build resilience, break down sectoral silos and create better outcomes for all.
Several terms, such as Digital Ecosystem, Digital Life, Digital World and Digital Earth, have been used to describe the growth in technology. Digital twins are contributing to this progress and will play a major role in the coming decades. More digital creatures will be added to our environments to ease our lives and to reduce harm and danger. But can we trust those things? Please join the Gemini call on the 29th of March. A reliability ontology was developed to model hardware faults, software errors, autonomy/operation mistakes, and inaccuracy in control. These different types of problems are mapped onto different failure modes. The purpose of the reliability ontology is to predict, detect, and diagnose problems, then make recommendations or provide explanations to the human-in-the-loop. I will discuss these topics and describe how ontology and digital twins are used as tools to increase trust in robots.
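The core idea of mapping problem types onto failure modes can be sketched as a simple lookup. This is purely illustrative: the mode names, rules and recommendations below are invented, not taken from the ontology itself:

```python
# Illustrative sketch only. The four problem categories follow the talk's
# description (hardware faults, software errors, operation mistakes, control
# inaccuracy); the failure-mode names and recommendations are hypothetical.

FAILURE_MODES = {
    "hardware_fault": "component_degradation",
    "software_error": "unexpected_halt",
    "operation_mistake": "unsafe_command",
    "control_inaccuracy": "trajectory_drift",
}

def diagnose(problem_type: str) -> dict:
    """Map an observed problem onto a failure mode plus a recommendation."""
    mode = FAILURE_MODES.get(problem_type)
    if mode is None:
        # Unknown problems are escalated to the human-in-the-loop
        return {"failure_mode": "unknown", "recommendation": "escalate to operator"}
    return {"failure_mode": mode,
            "recommendation": f"inspect subsystem linked to {mode}"}

print(diagnose("hardware_fault")["failure_mode"])  # component_degradation
```

A real reliability ontology would encode far richer relationships than a flat dictionary, but the predict/diagnose/recommend flow is the same.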
Trust in the reliability and resilience of autonomous systems is paramount to their continued growth, as well as their safe and effective utilisation. A recent global review of aviation regulation for BVLOS (Beyond Visual Line of Sight) operations with UAVs (Unmanned Aerial Vehicles) by the United States Congressional Research Office highlighted that run-time safety and reliability is a key obstacle to BVLOS missions in all twelve of the European Union countries reviewed. A more recent study of 1,500 commercial UAV operators also highlighted that better solutions for reliability and certification remain a priority within unmanned aerial systems. Within the aviation and automotive markets there has been significant investment in diagnostics and prognostics for intelligent health management, supporting improvements in safety and enabling capability for autonomous functions, e.g. autopilots and engine health management.
The safety record in aviation has significantly improved over the last two decades thanks to advancements in the health management of these critical systems. In comparison, although the automotive sector has decades of data from the design, road testing and commercial usage of its products, it still has not addressed significant safety concerns after an investment of over $100 billion in autonomous vehicle research. Autonomous robotics faces similar, and also distinct, challenges to these sectors. For example, there is a significant market for deploying robots into harsh and dynamic environments, e.g. subsea, nuclear and space, which present significant risks along with the added complexity of the more typical commercial and operational constraints, in terms of cost, power and communication, which also apply. In comparison, traditional commercial electronic products in the EEA (European Economic Area) carry a CE (Conformité Européenne) marking, a certification mark that indicates conformity with health, safety, and environmental protection standards for products sold within the EEA. At present, there is no similar means of certification for autonomous systems.
Due to this need, standards are being created to support the future requirements of verification and validation of robotic systems. For example, standards from the BSI committee on Robots and Robotic Devices and the IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems (including the P7009 standard) are being developed to support safety and trust in robotic systems. However, autonomous systems require a new form of certification due to their independent operation in dynamic environments. This is vital to ensure successful and safe interactions with people, infrastructure and other systems. In a perfect world, industrial robots would be all-knowing: with sensors, communication systems and computing power, a robot could predict every hazard and avoid all risks. However, until a wholly omniscient autonomous platform is a reality, there will be one burning question for autonomous system developers, regulators and the public: how safe is safe enough? Certification implies that a product or system complies with relevant legal regulations, which differs slightly in nature from technical or scientific testing. The former involves external review, typically carried out by regulators who provide guidance on proving compliance, while the latter usually refers to the reliability of the system. Once a system is certified, that does not guarantee it is safe; it just guarantees that, legally, it can be considered “safe enough” and that the risk is considered acceptable.
There are many standards that might be deemed relevant by regulators for robotic systems, from general safety standards such as IEC 61508, through domain-specific standards such as ISO 10218 (industrial robots), ISO/TS 15066 (collaborative robots) and RTCA DO-178B/C (aerospace), to ethical aspects (BS 8611). However, none of those standards addresses autonomy, particularly full autonomy, wherein systems take crucial, often safety-critical, decisions on their own. Therefore, based on the aforementioned challenges and state of the art, there is a clear need for advanced data analysis methods and a system-level approach that enables self-certification for semi- or fully autonomous systems, encompassing their advanced software and hardware components and their interactions with the surrounding environment. In the context of certification, there is a technical and regulatory need to be able to verify the run-time safety and certification of autonomous systems. To achieve this in dynamic real-time operations, we propose an approach utilising a novel modelling paradigm to support run-time diagnosis and prognosis of autonomous systems, based on a powerful representational formalism that is extendable to include more semantics to model different components, infrastructure and environmental parameters.
To evaluate the performance of this approach and the new modelling paradigm, we integrated our system with the Robot Operating System (ROS) running on Husky (a robot platform from Clearpath) and other ROS components such as SLAM (Simultaneous Localization and Mapping) and ROSPlan with PDDL (Planning Domain Definition Language). The system was then demonstrated within an industry-informed confined space mission for an offshore substation. In addition, a digital twin was utilised to communicate with the system and to analyse the system’s outcomes.
Intelligent infrastructure is a new trend that aims to create a network of connected physical and digital objects in industrial domains via a complex digital architecture utilising different advanced technologies. A core element of this is the intelligent and autonomous component. Two-tiers intelligence is a new concept for coupling machine learning algorithms with knowledge bases. The lack of availability of prior knowledge in dynamic scenarios is without doubt a major barrier to scalable machine intelligence. The interaction between the two tiers is based on the concept that when knowledge is not readily available at the top tier, the knowledge base tier, more knowledge can be extracted from the bottom tier, which has access to trained models from machine learning algorithms.
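The interaction between the two tiers can be sketched as a fallback: consult the knowledge base first, and only query the learned model when no prior knowledge exists. This is a minimal illustration, with a stand-in function in place of a real trained model:

```python
# Minimal sketch of the two-tiers idea. The "model" is a stand-in callable,
# not a real machine learning algorithm, and all names are hypothetical.

class TwoTierAgent:
    def __init__(self, knowledge_base, learned_model):
        self.kb = knowledge_base      # top tier: explicit, symbolic knowledge
        self.model = learned_model    # bottom tier: learned behaviour

    def answer(self, query):
        if query in self.kb:          # knowledge readily available at the top tier
            return self.kb[query]
        prediction = self.model(query)  # extract knowledge from the bottom tier
        self.kb[query] = prediction     # promote it into the knowledge base
        return prediction

agent = TwoTierAgent({"motor_overheat": "reduce duty cycle"},
                     lambda q: "no known mitigation")
print(agent.answer("motor_overheat"))   # answered from the knowledge base
print(agent.answer("sensor_dropout"))   # falls back to the learned model
```

After the fallback, the learned answer is cached in the top tier, so the knowledge base grows as the system encounters new situations.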
It has been reported that intelligent autonomous systems based on AI and ML, operating in real-world conditions, need to radically improve their resilience and capability to recover from damage. The view has been expressed that AI and ML offer the prospect of solving many of those problems. It has also been claimed that a balanced view of intelligent systems, understanding both their positive and negative merits, will have an impact on the way they are deployed, applied, and regulated in real-world environments. Here, a modelling paradigm for online diagnostics and prognostics for autonomous systems is presented. A model of the autonomous system being diagnosed is designed using a logic-based formalism, the symbolic approach. The model supports the run-time ability to verify that the autonomous system is safe and reliable for operation within a dynamic environment. However, during this work we identified some areas where knowledge for the purpose of safety and reliability is not readily available. This has been the main motive for integrating ML algorithms with the ontology.
After decades of significant research, two approaches to modelling cognition and intelligence have been investigated and studied: Networks (or Connectionism) and Symbolic Systems. The two approaches attempt to mimic the human brain (neuroscience) and mind (logic, language, and philosophy) respectively. While the Connectionist approach considers learning the main cognitive activity, Symbolic Systems are broader: they also look at reasoning (for problem solving and decision making) as a main cognitive activity besides learning. Although learning is not the focus of Symbolic Systems, powerful but limited methods have been applied, such as ID3 and its different variations and versions. Furthermore, the Connectionist approach is concerned with data, while Symbolic Systems are concerned with knowledge.
Psychologists have developed non-computational theories of learning that have been a source of inspiration for both approaches. Psychologists have also differentiated between different types of learning (such as learning from experience, learning by examples, or a combination of both). In addition, since it is difficult to test intelligence in non-human creatures, psychologists have produced methods to test specifically human intelligence. Mathematicians have also contributed statistical methods and probabilistic models to predict behaviour or to rank a trend. Machine Learning (ML) is the umbrella term for the algorithms used to mine data in the hope that we can learn something useful from it; such data is usually distributed, structured or unstructured, and of significant size. Although there are several articles on the differences and similarities between Artificial Intelligence and Machine Learning, and on the importance of the two schools, no real or practical attempts to combine the two approaches have been reported in the literature. Therefore, this is an attempt to settle the ongoing conflict between the two existing schools of thought for modelling cognition and intelligence. We argue that two-tiers intelligence is a mandate for machine intelligence, as it is for human intelligence. Animals, on the other hand, have one-tier intelligence: the intrinsic and static know-how. The harmony between the two tiers can be viewed from different angles; they complement each other, and both are mandatory for human intelligence and hence machine intelligence.
The lack of availability of prior knowledge in dynamic complex systems is without doubt a major barrier to scalable machine intelligence. Several advanced technologies are used to control, manipulate, and utilise all parts, whether software, hardware, mobile assets such as robots, or infrastructure assets such as wind turbines. The two-tiers intelligence approach will enhance the learning and knowledge-sharing process in a setup that relies heavily on symbiotic relationships between its parts and the human operator.
A digital twin is a digital representation of something that exists in the physical world (be it a building, a factory, a power plant, or a city) and, in addition, can be dynamically linked to the real thing through the use of sensors that collect real-time data. This dynamic link to the real thing differentiates digital twins from the digital models created by BIM software—enhancing those models with live operational data.
Since a digital twin is a dynamic digital reflection of its physical self, it possesses operational and behavioral awareness. This enables the digital twin to be used in countless ways, such as tracking construction progress, monitoring operations, diagnosing problems, simulating performance, and optimizing processes.
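As a minimal sketch of that dynamic link, a twin object can ingest real-time sensor readings, maintain the current operational state, and raise alerts when a reading crosses a threshold. The asset, field names and threshold below are invented for illustration:

```python
# Hedged sketch: a digital twin object kept in sync with live sensor data.
# "PumpTwin", the reading fields and the temperature limit are all hypothetical.

class PumpTwin:
    def __init__(self, asset_id, max_temp_c=80.0):
        self.asset_id = asset_id
        self.max_temp_c = max_temp_c
        self.latest = {}      # current operational state, mirroring the real pump
        self.alerts = []      # diagnosed problems for the operator

    def ingest(self, reading):
        """Update the twin from a real-time sensor reading."""
        self.latest.update(reading)
        temp = reading.get("temp_c")
        if temp is not None and temp > self.max_temp_c:
            self.alerts.append(f"{self.asset_id}: over-temperature {temp}C")

twin = PumpTwin("pump-07")
twin.ingest({"temp_c": 65.2, "flow_lps": 12.1})   # normal operation
twin.ingest({"temp_c": 83.9})                      # triggers a diagnosis alert
print(twin.alerts)
```

The same pattern extends naturally from monitoring and diagnosis to simulation and optimisation: the twin holds the live state that those analyses run against.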
Structured data requirements from the investor are crucial for the development of a digital twin. Currently, project teams spend a lot of time putting data into files that unfortunately isn’t useful during project development or ultimately to the owner; sometimes it is wrong, sometimes too little, and in other cases an overload of unnecessary data. At the handover phase, unstructured data can leave owner/operators with siloed data and systems, inaccurate information, and poor insight into the performance of a facility. Data standards such as ISO 19650 directly target this problem; at a simple level, they require an appreciation of the asset data lifecycle, which starts with defining the need in order to allow for correct data preparation.
Implementing a project CDE helps ensure that the prepared data and information is managed and flows easily between various teams and project phases, through to completion and handover. An integrated connected data environment can subsequently leverage this approved project data alongside other asset information sources to deliver the foundation of a valuable useable digital twin.
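One small, concrete piece of "defining the need" is checking handover data against owner-required fields. The schema below is an invented example of owner-defined requirements, not something ISO 19650 prescribes:

```python
# Illustrative only: ISO 19650 does not define this schema. The required
# fields are an invented example of owner-defined asset data requirements.

REQUIRED_FIELDS = {"asset_id", "location", "install_date", "manufacturer"}

def validate_handover(records):
    """Flag asset records that are missing owner-required fields."""
    problems = []
    for record in records:
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            problems.append((record.get("asset_id", "<unknown>"), sorted(missing)))
    return problems

records = [
    {"asset_id": "AHU-01", "location": "roof", "install_date": "2021-03-01",
     "manufacturer": "ACME"},
    {"asset_id": "AHU-02", "location": "plant room"},   # incomplete record
]
print(validate_handover(records))
```

Running checks like this inside the CDE, rather than at handover, is what keeps siloed or incomplete data from reaching the owner in the first place.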
To develop this connected digital twin, investors and their supply chains can appear to be presented with two choices: an off-the-shelf proprietary solution tied to one vendor, or the prospect of building a one-off solution with the risk of long-term support and maintenance challenges. However, this binary perspective is not the reality if industry platforms and readily available existing integrations are leveraged to create a flexible custom digital twin.
Autodesk has provided its customer base with solutions to develop custom data integrations over many years, commencing with a reliable common data environment solution. Many of these project CDEs have subsequently migrated to become functional and beneficial digital twins because of a structured data foundation. Using industry standards, open APIs and a plethora of partner integrations, Autodesk’s Forge Platform, Construction Cloud and, most recently, Tandem enable customers to build the digital twin they need without fear of near-term obsolescence or over-commitment to one technology approach. Furthermore, partnerships with key technology providers such as ESRI and Archibus extend solution options as well as enhancing long-term confidence in any developed digital twin.
The promises of digital twins are certainly alluring. Data-rich digital twins have the potential to transform asset management and operations, providing owners new insights to inform their decision-making and planning. Although digital twin technologies and industry practice are still in their infancy, it is clear that the ultimate success of digital twins relies on connected, common, and structured data sources based on current information management standards, coupled with the adoption of flexible technology platforms that permit modification, enhancement or component exchange as the digital twin evolves, instead of committing up front to one data standard or solution strategy.
The Strategic Pipeline Alliance (SPA) was established to deliver a major part of Anglian Water’s ‘Water Resources Management Plan’ to safeguard against the potential future impacts of water scarcity, climate change and growth, whilst protecting and enhancing the environment. The SPA was established to deliver up to 500km of large diameter interconnecting transmission pipelines, associated assets and a Digital Twin.
Digital transformation was identified early in the programme as a core foundational requirement for the alliance to run its ‘business’ effectively and efficiently. It will take Anglian Water through a digital transformation in the creation of a smart water system, using a geospatial information system as a core component of the common data environment (CDE), enabling collaboration and visualisation in this Project 13 Enterprise.
The geospatial information system (GIS) described here is just one part of a wider digital transformation approach that SPA has been developing. It represents a step change in the way that Anglian Water uses spatial data to collaborate and make key decisions, with net savings of £1m identified.
When the newly formed SPA went from an office-based organisation to a home-based organisation overnight due to COVID-19, standing up an effective central GIS was critical to maintaining the ability to work efficiently, by providing a common window onto the complex data environment for all users. With 500km of land parcels and around 5,000 stakeholders to liaise with, the GIS provided the central data repository as well as landowner and stakeholder relationship management. The mobile device applications, land management system, ground investigation solution and ecology mapping processes all enabled SPA to hit its key consenting and EIA (Environmental Impact Assessment) application dates.
We got the Alliance in place and fully operative within six months and the SPA GIS has helped fast-track a key SPA goal of increasing automation throughout the project lifecycle; automation tools such as FME (Feature Manipulation Engine), Python and Model Builder have been widely adopted, driving efficiencies.
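As a hedged illustration of the kind of automation described above (the schema and file contents here are hypothetical, not SPA's actual tooling), a short Python script can batch-check GeoJSON land parcel exports for the attributes a consenting workflow needs:

```python
import json

REQUIRED_PROPS = {"parcel_id", "owner_ref", "status"}  # hypothetical schema

def validate_parcels(geojson_text):
    """Return (feature index, problem) tuples for a GeoJSON FeatureCollection."""
    problems = []
    collection = json.loads(geojson_text)
    for i, feature in enumerate(collection.get("features", [])):
        # Flag attributes the downstream consenting process would need.
        missing = REQUIRED_PROPS - set(feature.get("properties", {}))
        if missing:
            problems.append((i, f"missing properties: {sorted(missing)}"))
        # Land parcels should be areas, not points or lines.
        geom = feature.get("geometry") or {}
        if geom.get("type") not in {"Polygon", "MultiPolygon"}:
            problems.append((i, f"unexpected geometry type: {geom.get('type')}"))
    return problems

sample = json.dumps({
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature",
         "properties": {"parcel_id": "P001", "owner_ref": "O42", "status": "consented"},
         "geometry": {"type": "Polygon", "coordinates": [[[0, 0], [1, 0], [1, 1], [0, 0]]]}},
        {"type": "Feature",
         "properties": {"parcel_id": "P002"},
         "geometry": {"type": "Point", "coordinates": [0, 0]}},
    ],
})

print(validate_parcels(sample))
```

In practice this kind of check would run inside FME or a scheduled pipeline, but the principle is the same: validate once, centrally, before data reaches the shared repository.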
The SPA GIS analyses and visually displays geographically referenced information. It uses data that is attached to a unique location and enables users to collaborate and visualise near real time information. Digital optimisation will provide enormous value and efficiencies in engineering, production, and operational costs of the smart water system. Having a single repository of up-to-date core project geospatial deliverables and information has reduced risk and enabled domain experts and our supply chain to interact with data efficiently.
Spending quality time up front in developing an enterprise architecture and data model allowed us to develop a CDE based around GIS. A cost model was approved for the full five years, and the platform was successfully rolled out.
The Enterprise Architecture model was created in a repository linked to Anglian Water’s enterprise. This included mapping out the technology and data integration requirements, as well as the full end-to-end business processes. The result was a consistent, interoperable solution stack that could be used by all alliance partners, avoiding costly duplication. GIS was identified as a key method of integrating data from a wide range of different sources, helping to improve access across the alliance to single version of the truth and improving confidence in data quality. In addition, a fully attributed spatial data model was developed representing the physical assets. This will help support future operations and maintenance use cases that monitor asset performance.
The use of our GIS system is enabling SPA to meet its obligations around planning applications and obtaining landowner consent to survey, inspect and construct the strategic pipeline. Hundreds of Gb of data had to be collected, analysed, and managed to create our submissions.
The SPA GIS provides secure, consistent, and rapid access to large volumes of geospatial data in a single repository. Using a common ‘web-centric’ application, the solution enables teams to cooperate on location-based data, ensuring its 700+ users can access current and accurate information. The intuitive interface, combined with unlimited user access, has enabled the Alliance to rapidly scale without restriction. We have also enabled the functionality for desktop software (ESRI ArcPro, QGIS, FME, AutoDesk CAD and Civil3D) to connect to the geodatabase to allow specialist users to work with the data in the managed, controlled environment, including our supply chain partners.
The integration of SPA Land Management and SPA GIS in one platform has brought advantages to stakeholder relationship management by enabling engagement to be reviewed spatially.
SPA’s integrated geospatial digital system has been the go-to resource for its diverse and complex teams. The GIS has been used extensively to engage with the wider Anglian Water operational teams, enabling greater collaboration and understanding of the complex system. The GIS has, in part, enabled SPA to remove the need to construct over 100km of pipeline, instead re-using existing assets identified in the GIS solution, contributing to the 63% reduction in forecast capital carbon compared to the baseline.
The SPA Land Management solution incorporates four core areas: land ownership, land access, survey management and stakeholder relationship management (developed by SPA), putting stakeholder and customer engagement at its heart. Traditionally, with 300 unique land access users, these areas would be looked after by separate teams, with separate systems that struggle to share data. With the digital tool, land and engagement data can be shared across SPA, creating a single source of truth and mitigating risk across the whole infrastructure programme. This has benefitted our customers, as engagement with them is managed much more effectively: our customer sentiment surveys show 98% are satisfied with how we are communicating with them.
The Enterprise Architecture solution allows capabilities to be transferred into Anglian Water’s enterprise, and there has been careful consideration around ensuring the value of data collected during the project is retained. SPA is developing blueprints as part of the outputs to enable future Alliances to align with best practices and with data, cyber and technology policies. SPA is also focussing on developing the cultural and behavioural aspects with Anglian Water to enable it to accept the technological changes as part of this digital transformation. This is a substantial benefit and enables Anglian Water to continue to work towards its operator of the future ambitions, where digital technologies and human interfaces will deliver higher levels of operational excellence.
Article written by Ilnaz Ashayeri and Jack Goulding, University of Wolverhampton.
STELLAR provides new tools and business models to deliver affordable homes across the UK at the point of housing need. This concept centralises and optimises complex design, frame manufacturing and certification within a central 'hub', where 'spoke' factories engage their expertise through the SME-driven supply chain. The approach originated in the airline industry in the 1950s, where its rationale was to optimise process and logistics distribution. STELLAR takes this one step further by creating a bespoke offsite ‘hub and spoke’ model which is purposefully designed to deliver maximum efficiency savings and end-product value. This arrangement is demonstrated through three central tenets: 1) a 3D 'digital twin' factory planning tool; 2) a parametric modelling tool (to optimise house design); and 3) an OSM Should-Cost model (designed specifically for the offsite market).
The energy industry has made impressive strides along the path to net-zero, while undergoing the transition to digitisation. Our next, shared step can be to capitalise on the potential of a more dynamic, joined-up and intelligent view of our entire energy system.
Great Britain’s energy system is experiencing two fundamental transitions in parallel.
First, the shift to net zero – something we’ve already made significant strides in. The decarbonisation of our sector is well underway, as is the planning for the changing demands on the sector as other industries also undergo this change in their own efforts to reach net zero.
And second, digitisation. New technology and the prevalence of real-time data have already transformed many aspects of the energy industry, and there are a multitude of commercial projects that bring to life the concept of digital twins of specific IT systems.
An opportunity to come together
We now have an opportunity ahead of us; to bring these parallel transitions together to create something incalculably more powerful, that has the potential to help us take even greater strides towards net zero.
This is why we’re launching an industry-wide programme to develop the Virtual Energy System – a digital twin of Great Britain’s entire energy system.
We recognise it’s an ambitious goal.
But we also recognise that it could be a central tool, bringing together every individual element of our system to create a collective view which will give us more dynamic intelligence around all aspects of the energy industry.
The Virtual Energy System will also provide us all with a virtual environment to test, model and make more accurate forecasts – supporting commercial decision-making, while enabling us to understand the consumer impact of changes before we make them.
This ambition is not out of reach - many elements of the energy industry are already using individual digital twins. The next step on this journey is to work together to find a way to take these digital twins forward, in unison. A way in which we can connect these assets and encourage future development across the entire energy spectrum.
A tool created by our industry, for everyone
The key to the Virtual Energy System will be collaboration - this won’t be the ESO’s tool, but a tool available to our entire industry - a tool that we will all be able to tap into and derive learnings from, that will support future innovation and problem solving.
But we need to start somewhere. We are sharing the concept and throwing down the gauntlet. It will only become a reality if it is collaboratively designed and developed by the whole energy industry.
The ESO has set out its initial thinking on what a roadmap could look like, but we need our best and brightest minds to feed into this to shape its future. We know we won’t always reach a perfect consensus every time, but only through engagement and open collaboration will the full benefits be unlocked.
In December we brought the energy industry together with Ofgem and BEIS for a one-day conference. It was an opportunity to explore the proposed programme, and kickstart our feedback and engagement period. From this, we plan to form working groups to begin a deeper dive into the key areas of development that will underpin the entire development journey. To watch back the conference, contribute to our initial stakeholder feedback and view a brief outline on the suggested structure visit our website.
Get Involved and Hear More
Join us on Thursday 10th February 1pm-2pm for a brief introduction to our Common Framework Benchmarking Report ahead of its public release, followed by a workshop around the key socio-technical factors which could make up the common framework of the Virtual Energy System. There will be lots of opportunity to discuss and ask questions; it will be an informal session where we can collaborate around the latest ideas.
Register to attend
You can also join us on the Gemini Call on 8th February for a short introduction before the full session.
We are facing a growing challenge in the UK in managing the assets we construct. New structures will need future maintenance, and much of our existing infrastructure is ageing and performing beyond its design life and intended capacity. In order to get more out of our existing assets with minimum use of limited resources, we need to better understand how they are performing. The climate crisis and extreme weather events bring additional strain to the condition and structural health of assets, making assessing their condition increasingly important. There are logistical challenges too – visually inspecting remote and hard-to-access assets can be expensive and hazardous.
Many people don’t realise that the Earth’s surface is being continuously scanned by different satellite sensors, in different directions, day and night. While the proliferation of sensors and satellite technology has fuelled a revolution in the way we can monitor assets, the ideal approach is to use the different tools in the engineer’s toolbelt to find the right solution for each case.
We’re used to the idea of Google Earth, and many people in our sector are learning about the usefulness of maps and geographical information systems (GIS), with many open datasets provided by organisations like Ordnance Survey in the UK. What you see as satellite images on Google Earth is optical data: like taking pictures of the Earth’s surface over time and using our eyes to see the changes (or perhaps automating change detection through machine learning – but that’s another point). What many people working in the built environment do not realise is that there is a whole spectrum of other sensors that can show us beyond what our eyes can see.
Did you know that radar satellites continuously scan the earth, emitting and receiving back radar waves? These satellites do not rely on daylight to image and so we can collect radar measurements day and night, and even through clouds. Using different processing techniques, this data can be used to create 3D digital elevation models, map floods and measure millimetres of movement at the Earth’s surface – all from hundreds of kilometres up in space. And did you know there is free data available to track pollutants, monitor ground changes and track vegetation?
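As a simple illustration of how repeat acquisitions become measurements, change detection at its crudest is a per-cell comparison of two scenes; the tiny grids and threshold below are invented for illustration:

```python
def change_mask(before, after, threshold):
    """Flag cells whose value changed by more than `threshold`
    between two acquisitions of the same area."""
    return [
        [abs(b - a) > threshold for b, a in zip(row_b, row_a)]
        for row_b, row_a in zip(before, after)
    ]

# Two 2x2 grids of illustrative pixel values (e.g. backscatter or reflectance).
before = [[0.10, 0.12],
          [0.40, 0.11]]
after  = [[0.11, 0.55],   # one cell jumps: e.g. flooding or new construction
          [0.38, 0.12]]

print(change_mask(before, after, threshold=0.2))
```

Real workflows add co-registration, calibration and speckle filtering before any comparison, but the core idea of differencing repeat scenes is the same.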
There is. In huge volumes. Petabytes of data are held in archives which allow us to look backwards in time as well as forwards. With all this opportunity, it can seem daunting to know where to get started.
I have worked in the design, construction and maintenance sectors for over a decade, and I came back to academia to learn about the opportunities of satellite data from the German Aerospace Center and the Satellite Applications Catapult. I spent a PhD’s worth of time retraining in data analysis so that I could combine the latest techniques with a civil engineer’s lens to better understand how we can unlock value from this data. I’ll save you the time and give a quick overview of what we can do in industry now, and share some learnings from talented researchers working on a Centre for Digital Built Britain (CDBB) project on satellite monitoring to support digital twin journeys.
I hope to see you next Tuesday 1st February at the Gemini Call for an introduction to the topic and some signposting on where you can go to find out more and make the most of such data for your own assets.
AEC information containers based on ISO 21597, the so-called ICDD, are a great way to store data and relations. Widely discussed as a possible structure for any CDE, this standard was made to hand over project data or exchange files of a heterogeneous nature in an open and stable container format, and will therefore become the starting point of many digital twins.
The standard says: “A container is a file that shall have an extension ‘.icdd’ and shall comply with ISO/IEC 21320–1, also known as ZIP64.”
Information deliveries are often a combination of drawings, information models, text documents, spreadsheets, photos, videos, audio files, etc. In this case, many scans and point clouds came on top. And while we have all the metadata datasets in our system, it is quite hard to hand this over to a client that might have another way of handling it. So we have now put all those specific relationships between information elements into separate link documents, because we believe it contributes significantly to the value of the information delivery.
We successfully handed over a retroBIM project from a nuclear facility in Germany. It was 661,469,018 KB as a ZIP; before zipping, it was around 8 TB! It has a whole archive going back to the 1960s: all models, all the point clouds the models were made from, and all documents produced from the models too. All in all, we have 2,338 documents.
We created an ICDD container that, when represented as an RDF graph (index and links), is composed of 29,762 unique entities, 37,897 literal values and 147,795 triples.
All this information is now transferred independently of any software application, and it is a great foundation to start a digital twin from. Sensor and live data can be added in the same way we connected documents with BIM elements; the only difference is that you do not store them in a ZIP file but rather run them in a graph database. This way you will have not only the most powerful and fastest twin, but also the most future-proof and extendable one you can possibly get.
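A minimal sketch of building such a container with Python's standard library, assuming the ISO 21597-1 folder layout ('Payload documents', 'Payload triples') and using stub RDF content in place of a real index and link set:

```python
import zipfile

# Illustrative stub only: a real container carries full ISO 21597 index
# and link graphs here, not an empty RDF document.
RDF_STUB = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
</rdf:RDF>
"""

def build_icdd(path, documents):
    """Write a minimal .icdd container: an index at the root, payload
    documents, and a links file relating them. The container itself is
    an ordinary ZIP, as the standard requires."""
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("Index.rdf", RDF_STUB)
        for name, payload in documents.items():
            z.writestr(f"Payload documents/{name}", payload)
        z.writestr("Payload triples/links.rdf", RDF_STUB)

build_icdd("handover.icdd", {
    "asbuilt-model.ifc": b"...",        # placeholder payloads
    "inspection-report.pdf": b"...",
})

with zipfile.ZipFile("handover.icdd") as z:
    print(sorted(z.namelist()))
```

Because the format is plain ZIP plus RDF, the same index and link triples can later be loaded straight into a graph database, which is exactly the migration path described above.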
Next week’s Gemini Call will include a presentation by Jack Ostrofsky, Head of Quality and Design at Southern Housing Group and Chair of BIM for Housing Associations.
BIM for Housing Associations (BIM4HAs) is a client led and client funded initiative set up in 2018 to accelerate the uptake of consistent and open standards-based BIM processes across the Housing Association sector.
An urgent priority for this group is building and fire safety, particularly in the context of the development of a Golden Thread of Building Safety Information which is part of the Building Safety Bill which is expected to receive Royal Assent in 2022.
Understanding of BIM and Digital Twins in the residential housing sector is poor, yet as long-term owner-operators of built assets, housing associations are ideally placed to benefit from the efficiencies of BIM and Digital Twins.
In June 2021 BIM4HAs published a Toolkit of resources for housing associations aimed at assisting them in the process of adopting ‘Better Information Management’. The toolkit, which is free to use, translates the requirements of the National BIM Framework into accessible language and practical tools for housing associations.
Jack will describe an example of the challenge housing associations face in using structured data to manage their assets: the transfer of spatial information about buildings. Designers and contractors label buildings as ‘plots’, while development managers and asset managers in housing associations have their own naming conventions, which have evolved in a traditional and disjointed manner. As a result, the metadata links are severed at handover and a great deal of valuable, useable information is lost to the client.
Jack’s employer Southern Housing Group has developed a spatial hierarchy and property reference numbering system which was published in the BIM4HAs Toolkit in June.
The spatial hierarchy and naming system links to commonly understood asset management language and informs Asset Information Requirements that housing associations can use to instruct development and refurbishment projects. This process enables contractors to provide useable metadata to housing associations and will form an essential part of the implementation of a Golden Thread of Building Safety Information.
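As a purely hypothetical illustration of the idea (the real conventions are defined in the published Toolkit documents), a single hierarchical reference can be composed from the levels of a spatial hierarchy so that design, construction and asset-management systems all refer to the same space the same way:

```python
from dataclasses import dataclass

@dataclass
class Space:
    """One level path through a hypothetical spatial hierarchy."""
    scheme: str   # development scheme
    block: str    # building / block
    level: str    # storey
    unit: str     # dwelling (the 'plot' in contractor language)

    def reference(self):
        """Compose one reference usable across all systems, so metadata
        attached during construction survives handover."""
        return "-".join([self.scheme, self.block, self.level, self.unit])

flat = Space(scheme="SCH01", block="B02", level="L03", unit="U12")
print(flat.reference())
```

The point is not the separator or the codes (both invented here) but that the reference is derived from one agreed hierarchy rather than re-labelled by each team.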
In a further development Southern Housing Group, working with members of the BIM4HAs community, have developed and are implementing an Asset Information Model based on the Gemini Principles and aligned with the other BIM4HAs work. This Model will be published for free, for anyone to use, by BIM4HAs as part of an update to the BIM4HAs Toolkit in February.
Please join us on the Gemini Call on 25th January at 10.30 to hear about the spatial hierarchy work and put your questions to Jack.
Download the Spatial Hierarchy Document and ‘The Business Case for BIM’ Document from the links below. Both are part of the Toolkit.
The whole Toolkit can be downloaded for free from the National Housing Federation website here: housing.org.uk/BIM4HAs
BIM for Housing Associations Pt1 The Business Case for BIM.pdf SHG Spatial Hierarchy UPRN Procedures.pdf
In setting up the SPA Enterprise it was acknowledged that BIM principles would drive outperformance in both the project and asset lifecycles, and therefore an early focus ensured that the foundations were in place to enable SPA to maximise benefits from data and information.
To smooth the integration of our physical assets and the associated data and information produced, our enterprise architecture focussed on delivering a solution that would:
Maximise the benefits from the existing Anglian enterprise.
Ensure that data and information would integrate seamlessly with existing Anglian repositories.
Easily be transitioned from the project to the asset information model.
This approach would not hinder bringing in any additional enterprise systems that would benefit Anglian Water, but would ensure that any legacy systems were planned for seamless integration, giving a longer-term benefit (blueprint) for other and future Alliances.
Development of the BIM strategy identified the need for the following BIM tools in line with recommendations in PAS1192-2 (now superseded):
BIM Execution plan – in response to the EIR (Exchange Information Requirements).
Common Data Environment (CDE) – to allow exchange of information within the project team and the wider supply chain eco-system - GIS (Geospatial Information System), BIM360, Azure, SharePoint.
Master Information Delivery Plan (MIDP) and Task Information Delivery Plan (TIDP) – to manage delivery of information during a project.
Supply chain EIR.
Asset Information Model.
During the initial period SPA has had to work closely with Anglian Water to ensure that we have the following in place:
Clear information repositories.
Approved data structures.
Collaborative communication mechanisms.
Appropriate security and authentication checks.
Clearly defined and agreed processes.
As an early adopter on the Project 13 programme (Centre for Digital Built Britain), the relational development of our supply chain eco-system was essential.
All our suppliers complete a Collaboration Request Form (MS Flow Automate), and a BIM Capability Assessment (MS Flow Automate). We work through the SPA Supplier EIR with all partners to share our information management standards and determine how much we need to work with them to ensure the benefits of BIM are realised.
Part of this induction is being clear on the expected deliverables and the format of these, and how they can interact with our common data environment. For all suppliers we set up a dedicated folder in our SharePoint and BIM360 environments for all information exchange and should there be a need for the supplier to access GIS or BIM models we assist them from a technological and behavioural perspective.
We have created an automated OCRA (Originate Check Review Approve) process that SPA end-users use for Quality Assurance (QA) in SharePoint and BIM360. With BIM360 the OCRA workflows functionality is built in, and we can create new, customisable checking procedures at will.
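The OCRA gates can be pictured as a small state machine; this is an illustrative sketch, not the actual BIM 360 workflow configuration:

```python
OCRA_STEPS = ["Originate", "Check", "Review", "Approve"]

class Document:
    """A document moving through an Originate-Check-Review-Approve gate."""

    def __init__(self, name):
        self.name = name
        self.step = 0  # index into OCRA_STEPS

    @property
    def status(self):
        return OCRA_STEPS[self.step]

    def advance(self, ok=True):
        """Move to the next gate if the current one passed; otherwise
        return the document to the originator for rework."""
        if not ok:
            self.step = 0
        elif self.step < len(OCRA_STEPS) - 1:
            self.step += 1
        return self.status

doc = Document("PIPE-001-layout")
doc.advance()          # to Check
doc.advance()          # to Review
print(doc.advance())   # to Approve
```

A real workflow adds reviewers, deadlines and audit records at each gate, but the state-machine core is what makes the QA status of every deliverable unambiguous.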
The CDE storage philosophy for project deliverable information is data driven, utilising file metadata to structure, sort and search for information. ‘Containerisation’ of information using subfolder subsystems is kept minimal, thereby facilitating transparency and consistency in the storage of our information across all projects.
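The principle can be sketched in a few lines: deliverables sit in one flat store and are retrieved by metadata filters rather than by folder paths (the file names and fields below are invented for illustration):

```python
# A flat document store: each deliverable is described by metadata,
# not by where it sits in a folder tree.
documents = [
    {"file": "SPA-PL-001.pdf", "section": "A", "discipline": "Pipeline", "status": "Approved"},
    {"file": "SPA-GI-014.pdf", "section": "A", "discipline": "Ground Investigation", "status": "In Review"},
    {"file": "SPA-PL-002.pdf", "section": "B", "discipline": "Pipeline", "status": "Approved"},
]

def find(store, **criteria):
    """Return documents whose metadata matches every given criterion."""
    return [d for d in store if all(d.get(k) == v for k, v in criteria.items())]

approved_pipeline = find(documents, discipline="Pipeline", status="Approved")
print([d["file"] for d in approved_pipeline])
```

Any combination of attributes becomes a query, which is why a metadata-driven store stays consistent across projects where a deep folder hierarchy would fragment.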
A Digital Delivery lead was put in place by SPA as the platform owner for BIM 360 supported by a team of BIM Engineers. The setup, configuration and management of the platform is governed by the BIM Execution Plan and the CAD (Computer Aided Design) strategy.
Throughout the design phase of projects in SPA, the various teams have endeavoured to create, and use coordinated, internally consistent, computable information about the project and provide this information to project stakeholders in the most valuable format possible. Following the statutory process and environmental impact study phases for the initial projects, the project moved towards detailed design with a multi-disciplinary design team. With support from the senior leadership in SPA, the design team have embraced a production-based approach which has entailed the adoption of 3D modelling techniques and BIM workflows.
Data is transferred from analysis and design applications directly into an integrated model, leveraging 3D modelling techniques to enable clash detection, design visualisation and ‘optioneering’ as part of SPA’s Digital Rehearsal approach. The 3D and 2D information models not only serve as a visual communication tool to convey the infrastructure design to the various teams, statutory bodies and public stakeholders, but are also a vital tool to inform Anglian Water of the development of the assets it will own and operate. The project team have utilised various BIM and GIS technologies to enhance and communicate the various constraints (environmental, legislative, physical, ecological, hydraulic, geotechnical etc.) and the complex design effectively to all stakeholders. This has been achieved in many formats utilising various software products throughout the project’s life cycle, and will include the use of a virtual reality (VR) gaming engine and the direct importation of the single integrated 3D tunnelling compound model into the GIS environment.
This means that design conflicts are identified and rectified before construction drawings are completed and issued. Similarly, 3D simulations help promote safety and avoid costly inefficiency by identifying potential issues and mitigating against them in advance.
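At the heart of any clash check is a geometric overlap test; below is a minimal sketch of the axis-aligned bounding-box version, with invented coordinates (production BIM tools use far richer geometry and tolerance rules):

```python
def boxes_clash(a, b):
    """Each box is ((min_x, min_y, min_z), (max_x, max_y, max_z));
    two boxes clash when their extents overlap on all three axes."""
    (a_min, a_max), (b_min, b_max) = a, b
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

# Illustrative envelopes for three routed elements.
pipe  = ((0, 0, 1), (10, 1, 2))
duct  = ((5, 0, 1.5), (6, 1, 3))   # crosses the pipe's envelope
cable = ((0, 5, 0), (10, 6, 1))    # routed clear of the pipe

print(boxes_clash(pipe, duct), boxes_clash(pipe, cable))
```

Running such tests across every pair of elements in the integrated model is what surfaces conflicts while they are still cheap to fix on screen rather than on site.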
It is estimated that setting up this framework will generate at least £1,723k in net savings over the project period using BIM, based on the reduction in individual designers’ time as well as project time saved.
It should be noted that many non-financial benefits have also been identified, including benefits to safety (better identification of safety changes), to the wellbeing of our staff (reduced driving, as collaboration in the model can be remote), and to the environment (reduced carbon, as fewer miles are driven to meetings). There will also be operational (opex) savings because of the way that we collate, capture, manage and re-use data within the asset information model; these cost savings are yet to be quantified. There are also non-quantifiable benefits expected from a reduction in rework and prolongation.
In conclusion the introduction of BIM techniques has greatly benefitted the Alliance and will continue to do so throughout the project and asset lifecycle.
The climate emergency and the transition to a net zero economy means businesses, governments and individuals need access to new information to ensure that we can mitigate and adapt to the effects of environmental and climate change. Environmental Intelligence will be a critical tool in tackling the climate and ecological crises, and will support us as we move towards more sustainable interaction with the natural environment, and delivery of net zero.
Environmental Intelligence is a fast-developing new field that brings together environmental data and knowledge with Artificial Intelligence to provide the meaningful insight that informs decision-making, improved risk management, and the technological innovation that will lead us towards a sustainable interaction with the natural environment. It is inherently inter-disciplinary, bringing together research in environment, climate, society, economics, human health, complex eco-systems, data science and AI.
The Joint Centre for Excellence in Environmental Intelligence (JCEEI) is a world-leading collaboration between the UK Met Office and the University of Exeter, together with The Alan Turing Institute and other strategic regional and national collaborators. This centre of excellence brings together internationally renowned expertise and assets in climate change and biodiversity, with data science, digital innovation, artificial intelligence and high-performance computing.
The JCEEI’s Climate Impacts Mitigation, Adaption and Resilience (CLIMAR) framework uses Data Science and AI to integrate multiple sources of data to quantify and visualise the risks of climate change on populations, infrastructure and the economy in a form that will be accessible to a wide variety of audiences, including policy makers, businesses and the public.
CLIMAR is based on the Intergovernmental Panel on Climate Change’s (IPCC; https://www.ipcc.ch) risk model, which conceptualises the risk of climate-related impacts as the result of the interaction of climate-related hazards (including hazardous events and trends) with the vulnerability and exposure of human and natural systems. Hazards are defined as ‘the potential occurrence of a natural or human-induced physical event or trend or physical impact that may cause loss of life, injury, or other health impacts, as well as damage and loss to property, infrastructure, livelihoods, service provision, ecosystems, and environmental services’; exposure as ‘the presence of people, livelihoods, species or ecosystems, environmental functions, services, and resources, infrastructure, or economic, social or cultural assets in places and settings that could be adversely affected’; and vulnerability as ‘the propensity or predisposition to be adversely affected’, which encompasses sensitivity or susceptibility to harm and lack of capacity to cope and adapt.
A mathematical model is used to express the risk of a climate-related impact, e.g. an adverse health outcome associated with increased temperatures, or a building flooding in times of increased precipitation. Risk is defined as the probability that an event happens in a defined time period and location, and is a combination of the probability of the hazard occurring together with probability models for exposure and vulnerability. In the simplest case, the probabilities (of hazard, exposure and vulnerability) would be treated as independent, but in reality the situation is much more complex and the different components will often depend on each other, which requires conditional probability models to be used. For example, people’s exposure to environmental hazards (e.g. air pollution) may depend on their vulnerability (e.g. existing health conditions).
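Numerically, the decomposition looks like this (the probability values are illustrative only, not CLIMAR outputs):

```python
def risk_independent(p_hazard, p_exposure, p_vulnerable):
    """Risk when hazard, exposure and vulnerability are treated as
    independent: the product of the three probabilities."""
    return p_hazard * p_exposure * p_vulnerable

def risk_dependent(p_hazard, p_exposure_given_vulnerable, p_vulnerable):
    """Exposure conditioned on vulnerability, e.g. people with existing
    health conditions being more exposed to air pollution."""
    return p_hazard * p_exposure_given_vulnerable * p_vulnerable

# Illustrative numbers: an extreme-heat hazard affecting a vulnerable group.
print(round(risk_independent(0.1, 0.5, 0.2), 6))  # independence assumption
print(round(risk_dependent(0.1, 0.8, 0.2), 6))    # dependence raises the risk
```

The dependent case uses the same product form but with a conditional exposure term, which is why ignoring the dependence between components systematically understates risk for vulnerable groups.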
The UKCP18 high-resolution climate projections are used to inform models for hazards and provide information on how the climate of the UK may change over the 21st century (https://www.metoffice.gov.uk/research/approach/collaboration/ukcp/index). This enables the exploration of future changes in daily and hourly extremes (e.g. storms, summer downpours, severe wind gusts), hydrological impacts modelling (e.g. flash floods) and climate change for cities (e.g. urban extremes). The headline results from UKCP18 are a greater chance of warmer, wetter winters and hotter, drier summers, along with an increase in the frequency and intensity of extremes. By the end of the 21st century, all areas of the UK are projected to be warmer and hot summers are expected to become more common. The projections also suggest significant increases in hourly precipitation extremes, with the rainfall associated with an event that occurs typically once every 2 years increasing by 25%, and the frequency of days with hourly rainfall > 30 mm/h almost doubling, by the 2070s; increasing from the UK average of once every 10 years now to almost once every 5 years.
CLIMAR is currently being used in a range of real-world applications based on the UKCP18 projections across sectors that will be affected by changes in the climate, including energy system security, telecommunications, critical infrastructure, water and sewage networks, and health. Two examples are:
working with Bristol City Council on the effects of climate change on urban heat, inequalities between population groups, and the efficacy of methods for adapting building stock (e.g. improved ventilation, double glazing) to keep people cool and safe in periods of extreme heat;
working with a consortium led by the National Digital Twin Programme and the Centre for Digital Built Britain to develop a Climate Resilience Demonstrator, integrating climate projections with asset information and operational models to develop a Digital Twin that can be used to assess the future risks of flooding to critical infrastructure, including energy, communications, and water and sewage networks. This will provide a step-change in our understanding of the potential effects of climate change on critical infrastructure and demonstrate the power of inter-disciplinary partnerships, spanning academia and industry, that will be crucial in unlocking the enormous potential for Digital Twins to enhance our resilience to climate change across a wide variety of sectors.
For further information on CLIMAR and associated projects, please see https://jceei.org/projects/climar/ and for information on the National Digital Twin Climate Resilience Demonstrator (CReDo) see https://digitaltwinhub.co.uk/projects/credo/
The bigger and more complicated the engineering problem, the more likely it is to have a digital twin. Firms that build rockets, planes and ships, for example, have been creating digital twins since the early 2000s, seeing significant operational efficiencies and cost-savings as a result. To date, however, few firms have been able to realise the full potential of this technology by using it to develop new value-added services for their customers. This article describes a framework designed to help scale the value of digital twins beyond operational efficiency towards new revenue streams.
In spite of the hype surrounding digital twins, there is little guidance for executives to help them make sense of the business opportunities the technology presents, beyond cost savings and operational efficiencies.
Many businesses are keen to get a greater return on their digital twin investment by capitalising on the innovation – and revenue-generating – opportunities that may arise from a deeper understanding of how customers use their products. However, because very few firms are making significant progress in this regard, there is no blueprint to follow. New business models are evolving, but the business opportunities for suppliers, technology partners and end-users are yet to be fully documented.
Most businesses will be familiar with the business model canvas as a tool to identify current and future business model opportunities. Our 4 Values (4Vs) framework for digital twins is a more concise version of the tool, developed to help executives better understand potential new business models. It was developed from a literature review and then validated and refined through industry interviews.
The 4Vs framework covers: the value proposition for the product or service being offered; the value architecture, or the infrastructure that the firm creates and maintains in order to generate sustainable revenues; the value network, representing the firm's infrastructure and network of partners needed to create value and to maintain good customer relationships; and value finance, such as cost and revenue structures.
Four types of digital twin business models
From extensive interviews with middle and top management on services offered by digital twins, we identified four different types of business models and applied our 4Vs approach to understand how those models are configured and how they generate value.
The first of these business models was found in information, data and system services industries. Its value proposition is to provide a data marketplace that orchestrates the different players in the ecosystem and provides anonymised performance data from, for example, vehicle engines or heating systems for buildings. Value Finance consists of recurring monthly revenues levied through a platform, which itself takes a fee and allocates the rest according to the partnership arrangements.
A second business model is prevalent in the world of complex assets, such as chemical processing plants and buildings. Its value proposition lies in providing additional insights to the customer on the maintenance of their assets to provide just-in-time services. What-if analysis and scenario planning are used to augment the services provided with the physical asset that is sold. Value Architecture is both open and closed, as these firms play in ecosystems but also create their own. They control the supply chain, how they design the asset, how they test it and deliver it. The Value Network consists of strategic partners in process modelling, 3D visualisation, CAD, infrastructure and telecommunications. Value Finance includes software and services which provide a good margin within a subscription model. Clients are more likely to take add-on services that show significant cost savings.
A third business model tends to be found in the transport sector, where it is important to maximise the uptime of the aircraft, train or vehicle.
The value proposition centres on keeping these vehicles operational, through predictive maintenance for vehicle/aircraft fleet management and, in the case of HGVs, route optimisation. Value Architecture is transitioning from closed to open ecosystems. There are fewer lock-in solutions as customers increasingly want an ecosystem approach. Typically, it is distributors, head offices and workshops that interact with the digital twin rather than the end-customer. The Value Network is open at the design and assembly lifecycle stages but becomes closed during sustainment phases. For direct customers, digital twins are built in-house and are therefore less reliant on third-party solutions. Value Finance is focused on customers paying a fee to maximise the uptime of the vehicle or aircraft, guaranteeing, for example, access five days a week between certain hours.
The fourth business model focuses on delivering the necessary outcome to the customer. It tends to be found with government clients in the defence and aerospace sector. Value propositions are centred around improving the efficacy of support and maintenance/operator insight and guaranteeing mission success or completion. These business models suffer from a complex landscape of ownership for systems integrators, as much of the data does not make it to the sustainment stages.
Value Architecture is designed to deliver a series of digital threads in a decentralised manner. Immersive technologies are used for training purposes or improved operator experience. The Value Network is more closed than open, as these industries focus on critical missions involving highly secure assets. Service providers are therefore more security-minded and wary of relying on third-party platforms for digital twin services. Semi-open architecture is used to connect to different hierarchies of digital twins/digital threads. On Value Finance, our interviews revealed that existing pricing models, contracts and commercial models are not yet mature enough to transition into platform-based revenue models. Insights as a service is a future direction but challenging at the moment, with the market not yet ready for outcome-based pricing.
For B2B service providers looking to generate new revenue from their digital twins, it is important to consider how the business model should be configured and to identify the major barriers to success. Our research found that the barriers most often cited were cost, cybersecurity, cultural acceptance of the technology, commercial or market needs and, perhaps most significantly, a lack of buy-in from business leaders. Our 4Vs framework has been designed to help those leaders arrive at a better understanding of the business opportunities digital twin services can provide. We hope this will drive innovation and help digital twins realise their full business potential.
Our research to date has been through in-depth qualitative interviews across industry but we wish to expand this research and gather quantitative information on specific business model outcomes from digital twins across industry.
If you would like to support this research and learn more about the business model outcomes from digital twins, then please participate in our survey!
Take part in our survey here: https://cambridge.eu.qualtrics.com/jfe/form/SV_0PXRkrDsXwtCnXg
Like many companies, Atkins, a member of the SNC-Lavalin Group, is investing heavily in digital transformation, and we all know that skills are a key enabler. We wanted to be clear about the skills needed by our workforce to be able to deliver digitally. The starting point was finding out what digital skills we have in the company. Then we could identify gaps and how we might bridge them.
But what are digital skills…and how do we measure them?
As we pondered this, we realised that there were many, many challenges we would need to address. Atkins is a ‘broad church’ comprising many professionals and technical specialisms. Digital transformation is challenging the business in many different ways. Articulating a single set of digital skills that reflects needs across the business is complicated by language, terminology and digital maturity. Furthermore, unlike corporate engagement surveys, there is no established industry baseline that we can use to benchmark our corporate digital skills against. To evaluate a skills gap would require an estimate of both the quantity and types of skills that will be required in the future – something that is far from certain given our industry’s vulnerability to disruption.
We knew we were trying to grasp something universal and not sector or domain specific, so this is the definition we decided to use: Digital skills enable the individual to apply new/digital technology in their professional domain.
That left the question around how we measure Digital Skills.
We did some research and explored several frameworks including Skills For the Information Age (SFIA), the EU Science Hub’s DigiComp and the DQ Institute’s framework. As we were doing this, we became aware that the CDBB Skills and Competence Framework (SCF) was being launched and we immediately sensed it could be just what we were looking for.
Why? Apart from being right up to date, it has a simple and straightforward structure and can be tailored for an organisation. The proficiency levels are very recognisable - Awareness, Practitioner, Working and Expert - and it is in the public domain. But most importantly, it seemed like a good fit because most of what we do at Atkins is in some way related to infrastructure and therefore within the domain of digital twins.
But we needed to test that fit. Our hypothesis was “…that the CDBB SCF had sufficient skills to represent the ability of our staff”. We tested this with a survey, interviews, and a series of workshops.
In the survey we looked at how individuals from different groups in the company (differentiated by their use of technology) understood the 12 skill definitions and the extent to which they needed and used each skill in their day-to-day role. We also explored whether there were other digital skills that respondents felt were not recognised by the framework. We followed up the survey with interviews to clarify some of the responses and then used workshops to play back our findings and sense-check the conclusions.
Our overall conclusion was that we had good evidence to support our hypothesis, i.e. that the CDBB SCF was a good fit for our workforce. However, we realised we would need to bring the indicators to life so that users could relate them to their roles, particularly with people at the Awareness and Working levels. This is not unexpected. Generally, people with lower levels of competence don't know what they don't know.
Another conclusion was that we needed a fifth, null competency indicator to recognise that not everyone needs to know about everything.
In terms of next steps, we are working with a supplier to develop an online assessment tool so that we can apply the framework at scale. We have rewritten the skills definitions and competence indicators to omit any reference to the NDT programme etc., although these were very minor changes.
We are working on ways to bring the skill level indicators to life for our employees e.g. through guidance materials, FAQs etc. We’re also developing an initial ‘rough and ready’ set of learning materials related to each of the digital indicators at Awareness and Practitioner levels. We expect the CDBB’s Training Skills Register to feature prominently in this!
Some of things we have parked for the moment are: (1) Moderation and accreditation of the assessments. Our first wave will be self-assessed only, and (2) Integrating the skills into role definitions.
We’re very grateful to CDBB for the timely creation of the SCF and I look forward to sharing our onward journey with the DT Hub community.
Anglian Water is an early adopter of digital twins within the water sector, working closely with the Centre for Digital Built Britain to help develop the market and showcase how digital twins can support an organisation’s strategic outcomes.
Anglian Water has a 15-year vision to develop a digital twin to sit alongside its physical assets.
From an Anglian Water perspective, the Digital Twin is essentially an accurate digital representation of their physical assets, enabling insight, supporting decision making and leading to better outcomes. Aligning the digital twin objectives to Anglian Water’s regulated outcomes, as defined by the regulator OFWAT, has been a key step in developing the business case.
With the initial vision and roadmap outlined, the next step was to implement a proof of concept to explore the value created by digital twins. Anglian Water undertook a discovery phase and a proof of concept for a Digital Twin with Black & Veatch back in 2019, and started to define how a Digital Twin would benefit the delivery and management of physical assets.
The discovery phase looked to understand the current landscape, further enhancing the vision and roadmap, and establish persona requirements. It proved vital to really understand the organisation and the impact on people during this early exploratory work.
The proof of concept looked at delivering three main outputs, focused on a pumping station to keep the scope focused and value measurable:
To demonstrate an asset intelligence capability
To demonstrate a visualisation capability
To examine the asset data and architecture.
Alongside the proof of concept, other initiatives were kick-started to consider how other elements of digital twin might add value, with a focus on more enhanced use of hydraulic models to explore how water networks could be further optimised. Anglian Water recognised early on that by integrating and enhancing many of the existing enterprise systems, existing investments could be leveraged and technology gaps identified.
Learning from the proof of concept and other early works, Anglian Water looked to the next step of the roadmap, a scaled demonstrator on the Strategic Pipeline Alliance (SPA). The Strategic Pipeline Alliance was set up to deliver up to 500km of large-scale pipeline and, alongside this, to start defining and delivering the first phase of the digital twin. SPA's 2025 vision is to deliver a large-scale, holistically linked water transfer resilience system. This will be operated, performance managed and maintained using advanced digital technology.
The SPA team set about developing a digital twin strategy which is based on the wider corporate vision and enhances the proof of concept work. The basic premise of the SPA digital twin is to integrate traditionally siloed business functions and systems, to deliver enhanced capability across the asset lifecycle.
As with Anglian Water, the SPA strategy is focused on using the technology available and developing a robust enterprise, integration and data architecture to create a foundation for the digital twin. Taking this a step further, it was decided to adopt a product-based approach, thinking about the development of digital twin products aligned to physical assets that could be re-used across the wider Anglian Water enterprise.
This whole-life, product-based approach threw up some interesting challenges, namely how to build a business case that delivered benefit to SPA and also enabled Anglian Water's future ambitions, taking a lifecycle view of the value delivered.
Achieving this meant considering and assessing the value both to SPA during the capital delivery phase and to Anglian Water during the operational phases. This process also highlighted that certain elements of the digital twin deliver value to SPA and Anglian Water equally, and could therefore be considered a shared benefit.
The resulting benefits register helped to identify the value delivered to the alliance partners, which was vital to securing the delivery board sign-off. As Anglian Water is a partner in the alliance, the ability to demonstrate value in the operational phase, with SPA developing the technical foundation, was another key element in securing the investment.
As part of the overall process, the SPA board were keen to see how the investment would be allocated, so the strategy evolved to incorporate the capabilities to be developed within SPA to enable the digital twin. This helped to inform and validate the make-up of the digital twin delivery team.
With the capabilities and organisational chart resolved, a governance framework was put into place to allow the digital twin evolution to be managed effectively, putting in place the right checks and balances. This has included input and oversight from the wider Anglian Water team as ultimately, they will be responsible for managing the various digital twins long term.
To validate the digital twin against the SPA outcomes and objectives, the various elements of the digital twin were incorporated into the overall enterprise architecture. This has proved to be an important part of the process to ensure alignment to the wider capabilities and, importantly, ensure the right technology is in place. The enterprise architecture continues to evolve to include information objects below the application layer, again building on the product-based approach, so that the enterprise architecture can be utilised in the wider Anglian Water alliances.
In total, the development of the strategy, business case and capabilities has taken six months; however, it is important to note that this builds on the earlier proof of concept and ideation during the initial mobilisation of SPA. Given this approach, a key next step is to work with Anglian Water to explore accelerated deployment of SPA digital twins on other major schemes, to put the product approach to the test and maximise the investment made.
We have learnt from the early developments on SPA that articulating a whole-life view of value is vital, and that focusing on the capital/operational stages is equally important, so that the appropriate budget holders can see the value being delivered. We have also learnt the importance of having a bold vision, which must be matched by a clear definition of the first few steps, showing a long-term roadmap for achieving an enterprise digital twin.
What is certainly clear is that we still have a lot to learn; however, by following architectural best practice and placing people and our environment at the heart of the digital twin, we have put in place a good foundation from which to build.
If you would like to know more, please get in touch through the DT Hub.
How manufacturers can structure and share data safely and sustainably.
Manufacturers of construction products produce a significant part of the information required to bring about a safer construction industry, but currently, this information isn’t structured or shared in a consistent way.
If UK construction is to meet the challenges of a digital future and respond to the requirements of a new building safety regulatory system, it needs manufacturers to structure and share their data safely and sustainably.
There’s no need to wait to digitise your product information. Making the correct changes now will bring immediate benefits to your business and long-term competitive advantage. This guide will help you identify what those changes are.
Our guide helps decision-makers in manufacturing identify why supplying structured data is important, how to avoid poor investment decisions, safe ways to share information about products across the supply chain, and more.
The Guide https://www.theiet.org/media/8843/digitisation-for-construction-product-manufacturers-main-guide.pdf
8 Page Summary https://www.theiet.org/media/8856/digitisation-for-construction-product-manufacturers-summary.pdf
2 Page Key facts and Summary https://www.theiet.org/media/8856/digitisation-for-construction-product-manufacturers-summary.pdf
The UK’s approach to delivering complex infrastructure projects is obsolete, leading to far too many projects failing to meet the expectations of their sponsors and the public. That was the conclusion reached following a detailed review commissioned by the Institution of Civil Engineers. I was lucky enough to work on the review, which was published as A Systems Approach to Infrastructure Delivery (SAID) in December 2020.
At the centre of our findings is a call for a fundamental change of mindset. The review team, led by former ICE Vice President Andrew McNaughton, concluded that even relatively small projects are now best seen as interventions into existing complex systems, made up of a mix of physical, human and digital components.
In this world, traditional civil engineering works, while still a large capital cost, only exist to support (or perhaps just keep dry!) these systems. It is easy to see that the system – not the civils – provides the infrastructure services on which people rely. More importantly, as Crossrail has shown us so clearly, the greatest sources of risk to a project now lie not in managing tunnelling or any other piece of heroic construction but in integrating and commissioning a fully functioning system – trains, stations, tracks, digital signalling, safety and communications, driver behaviour.
SAID proposes 8 principles for better projects that can shift the infrastructure industry in this direction. Principle 8, 'Data Oils your Project', was built on detailed interviews with leading practitioners from inside and outside the infrastructure sector. Again and again we heard about the importance of all the project participants having access to consistent, timely and reliable information. Client and owner organisations recognised that it is their responsibility to fix the data plumbing. This means having the capability to define what information is needed to deliver and operate the asset. It also means ensuring that the project's systems convert raw data into meaningful information that flows to team members as and when they need it to make decisions.
Much of this is, I think, a no-brainer. What was really interesting was to see how thinking about data and digital is helping to generate a shift away from a traditional project mindset and towards a systems approach grounded in an understanding of the importance of what is already there.
Every asset owner we spoke to recognised that we are now firmly in a world where projects deliver a cyber-physical asset. They also get that their digital twin can be the basis for a robust delivery and commissioning plan that integrates the project's physical outputs into the existing network.
It was also fascinating to hear about the challenge of how a single project's digital outputs can be effectively integrated into the existing cyber-physical system to create the kind of golden loop of information described in CDBB's Flourishing Systems report of 2020. This would be real systems thinking, putting projects in their proper place in relation to the systems and human needs they are meant to be serving.
The response to SAID has been overwhelmingly positive. In response, ICE has commissioned a second phase of work in which we are exploring the SAID principles with a series of live projects and infrastructure sector organisations. Later in 2021 ICE will be publishing practical advice on implementing the 8 principles based on the insight generated by these discussions. I hope that this blog can start a discussion with the CDBB network that will generate insight we can include in this advice and help the infrastructure world embrace a Systems Approach to Infrastructure Delivery.
In November 2020 DNV published the energy industry’s first recommended practice (RP) on how to quality-assure digital twins. Our new RP, which we developed in collaboration with TechnipFMC, aims to set a benchmark for the sector’s various approaches to building and operating the technology. It guides industry professionals through:
· assessing whether a digital twin will deliver to stakeholders’ expectations from the inception of a project
· establishing confidence in the data and computational models that a digital twin runs on
· evaluating an organization’s readiness to work with and evolve alongside a digital twin.
DNV’s RP is intended to provide valuable guidance for digital twin developers; to introduce a contractual reference between suppliers and users; and to act as a framework for verification and validation of the technology. It builds upon the principles of DNV’s Recommended Practices for the qualification of novel hardware technology and assurance of data and data-driven models.
Making digital twins a real asset
Physical assets are built to perform to the highest standards and undergo rigorous assurance processes throughout their life. However, there has been no requirement for their digital counterparts to go through the same procedures. Our new recommended practice seeks to remedy this issue as the technology begins a path of significant scaling across the sector. We believe it is time to prove that twins can be trusted and that the investments made in them give the right return!
The methodology behind DNV’s new RP has been piloted on 10 projects with companies including Aker BP and Kongsberg Digital. It has also been through an extensive external hearing process involving the industry at large. In addition, TechnipFMC’s deep domain knowledge and expertise in digital technologies and oil and gas infrastructure have made an essential contribution to jointly developing the RP.
A framework to handle complex requirements
The framework provides clarity on the definition of a digital twin; required data quality and algorithm performance; and requirements on the interaction between the digital twin and the operating system. It addresses three distinct parts: the physical asset, the virtual representation, and the connection between the two. This connection amounts to the data streams that flow from the physical asset to the digital twin, and the information made available from the digital twin to the asset and the operator for decision-making.
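The three-part structure can be pictured with a minimal sketch: data flows one way from the asset into the virtual representation, and decision-support information flows back to the operator. This is an illustration of the general pattern, not DNV's RP; all class, field and sensor-tag names below are hypothetical:

```python
# Minimal sketch of the three parts: physical asset (represented here only
# by its sensor readings), virtual representation, and the two-way
# connection between them. All names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class SensorReading:
    tag: str      # hypothetical sensor tag on the physical asset
    value: float

@dataclass
class VirtualRepresentation:
    """The digital twin side: holds the latest state and derives insight."""
    state: dict = field(default_factory=dict)

    def ingest(self, reading: SensorReading) -> None:
        # Connection, direction 1: data stream from asset to twin.
        self.state[reading.tag] = reading.value

    def advise(self, tag: str, limit: float) -> str:
        # Connection, direction 2: information back to the operator
        # for decision-making.
        value = self.state.get(tag)
        if value is None:
            return f"no data for {tag}"
        return "within limits" if value <= limit else "investigate"

twin = VirtualRepresentation()
twin.ingest(SensorReading("PT-101", 87.5))
print(twin.advise("PT-101", limit=100.0))  # within limits
```

Even in a sketch this small, the assurance questions the RP raises are visible: is the ingested data trustworthy, is the model behind `advise` fit for purpose, and is the organisation ready to act on what it returns?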
A preview copy of our recommended practice can be downloaded from our website: https://www.dnv.com/oilgas/digital-twins/preview-DNVGL-RP-A204-qualification-and-assurance-of-digital-twins.html
We’d love to get your comments and feedback on our work – and look forward to giving a short overview of our methodology at the Gemini call on 3rd August 2021.
Head of Growth and Innovation UK & Ireland – Energy Systems
Footnote: Who are DNV?
We’re an independent assurance and risk management company; part of our service offering includes the provision of software, platforms, cyber and other digital solutions to the energy sector. We have a specific focus on helping our customers manage the risk and complexity linked to the energy transition, specifically their ongoing decarbonization and digitalization journeys.
Company website: www.dnv.com
Link to digital twin services: https://www.dnv.com/oilgas/digital-twins/services.html
The building stock is a city’s most significant socio-cultural and economic resource and its largest capital asset. Buildings are also where we spend most of our lives and most of our money, and where enormous potential for energy and waste reduction lies.
To help improve the quality, sustainability and resilience of building stocks, and to help reduce emissions from them, comprehensive information on their composition, operation and dynamic behaviour is required. However, in many countries relevant data are extremely difficult to obtain, often highly fragmented, restricted, missing or only available in aggregated form.
Colouring Cities sets out to address this issue. The initiative develops open code to facilitate the construction and management of low cost public databases, which double as knowledge exchange platforms, providing open data on buildings, at building level. These are provided to answer questions such as: How many buildings do we have? Which building types, uses, construction systems, ages, styles and sizes are located where? How repairable, adaptable and extendable are they? How long can they last if properly maintained? How energy efficient are they? Can they easily be retrofitted? Who built them and what is their ownership type, and how well do local communities think they work?
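One way to picture the building-level open data these questions call for is as a simple per-building record. This is a hypothetical illustration of the idea, not the Colouring London schema, and every field name is an assumption:

```python
# Hypothetical sketch of a building-level open-data record covering the
# kinds of questions listed above. This is NOT the Colouring London data
# model; all field names are illustrative.

from dataclasses import dataclass
from typing import Optional

@dataclass
class BuildingRecord:
    building_id: str
    use: Optional[str] = None              # e.g. "residential", "office"
    construction_system: Optional[str] = None
    year_built: Optional[int] = None
    storeys: Optional[int] = None
    energy_rating: Optional[str] = None    # e.g. an energy performance band
    retrofit_potential: Optional[str] = None
    community_rating: Optional[float] = None  # crowdsourced score

# Fields left as None simply haven't been collated or crowdsourced yet,
# which mirrors how such databases are enriched incrementally.
record = BuildingRecord("blg-0001", use="residential", year_built=1936)
print(record.use, record.year_built)  # residential 1936
```

Structuring the data this way, with explicit gaps, is what allows the four data-provision routes described below (open uploads, computational generation, crowdsourcing and live streaming) to fill in different fields over time.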
Colouring Cities also looks to advance a more efficient, whole-of-society approach to knowledge sharing on buildings and cities, allowing for permanent databases to be collaboratively maintained and enriched, year-on-year, by citizens, academia, government, industry and the voluntary sector. Colouring London https://colouringlondon.org/, our live prototype, has been built and tested over the past five years using a step-by-step collaborative approach which has involved consultation with academia, government, industry, the voluntary sector and the community (working across science, the humanities and the arts). It looks to test four approaches to data provision: collation of existing open uploads, computational generation, local crowdsourcing and live streaming.
In 2020 the Colouring Cities Research Programme was set up at The Alan Turing Institute to support international research institutions wishing to reproduce and co-work on Colouring Cities code at city or country level. We are currently collaborating with academic partners in Lebanon, Bahrain, Australia, Germany, Greece and Switzerland.
Watch the Hub Insight to learn more about the project and the opportunity to get involved.
If you'd like to get involved please do test our site and add any recommendations for features you would like in our discussion thread https://discuss.colouring.london/. Or, if you are a public body or DTHub industry member wishing to increase open access to your infrastructure datasets, and/or to digital twin visualisations, relating to the building stock, please contact Polly Hudson at Turing.
Game engine technology is at the heart of a new age of content creation, immersive storytelling, design-driven development and business process innovation. These tools are now being used to work alongside your data to create a visual front end for a digital twin, allowing for a more immersive, controllable and completely customisable digital twin application.
Unreal Engine is a game engine created by Epic Games to allow developers to create their own games and immersive 3D worlds. The technology has seen fast adoption across a number of industries including manufacturing, automotive, film and media, and architecture, engineering and construction (AEC). As the need to collaborate virtually with stakeholders and end-users has increased, and the need to customise unique applications and visualise our 3D models and data has become more important, the role of game engines in AEC is making its mark. Unreal Engine is free to use, with its source code available to creators developing their own custom real-time experiences.
Unreal Engine and Digital twins
Data alone can often be confusing and hard to understand; it's not until the data is contextualised that you can turn it into information that benefits the project. This is where Unreal Engine can support the digital twin community, with its ability to aggregate data sources ranging from 3D geometry and BIM metadata to 4D construction data and IoT hubs. Users gain a centralised place to contextualise the data in its native environment and to build custom applications around it.
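The point about contextualisation can be sketched in code. The snippet below is an illustration of the general idea, not Unreal Engine code: the sensor IDs, asset names and field names are all hypothetical. It shows how raw IoT readings only become useful information once joined with asset context (location, units) of the kind a BIM model or game-engine front end would supply.

```python
# Illustrative sketch -- raw readings are hard to interpret on their own.
RAW_READINGS = [
    {"sensor_id": "S-101", "value": 4.2},
    {"sensor_id": "S-102", "value": 71.0},
]

# Hypothetical asset metadata, e.g. exported from a BIM model.
ASSET_METADATA = {
    "S-101": {"asset": "Pump P-7", "zone": "Plant Room B", "unit": "bar"},
    "S-102": {"asset": "AHU-3", "zone": "Roof Plant", "unit": "degC"},
}

def contextualise(readings, metadata):
    """Join raw readings with asset context to turn data into information."""
    enriched = []
    for reading in readings:
        context = metadata.get(reading["sensor_id"], {})
        enriched.append({**reading, **context})
    return enriched

for row in contextualise(RAW_READINGS, ASSET_METADATA):
    print(f'{row["asset"]} ({row["zone"]}): {row["value"]} {row["unit"]}')
```

In a real digital twin the join would happen against live geometry and streaming data rather than dictionaries, but the principle is the same: the value comes from the aggregation, not the raw feed.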
Getting involved in our future roadmap...
As more and more companies develop large-scale digital twin applications, here at Epic Games we want to make sure we provide everything you need to build your own digital twin applications with Unreal Engine, integrating your existing data, geometry and IoT hub information into a visual platform for sharing with the world.
We'd love to hear from you about how you see the world of digital twins evolving. Going forward, which tools and features will you find most valuable in creating digital twins? What kinds of training and support would you like to have access to from Epic Games on this?
To help us serve you better, please take our survey about the current state of digital twins, and share your ideas or what you would like to see happen.
Take the survey here
Results of this survey will be shared with the community for wider awareness. In the meantime, you can check out a recent article we shared about one of our customers in China:
Good day to you!
I am a member of the BSI e-committee tasked with producing the attached draft of BS 99001:2021 Quality management systems.
It has been produced with the intention of being used alongside BS EN ISO 9001:2015 in the UK construction sector, as it adds specific requirements for the built environment sector.
It is out for public consultation until 24 July 2021. Thereafter, BSI will hold comment resolution meeting(s) to address and resolve the comments received.
The aim of this new quality management standard is to ensure that, in the wake of the Grenfell Tower Inquiry, BS EN ISO 9001:2015 remains relevant to the UK construction industry.
Because the NDTp is such an important element of the ever-changing landscape of the UK construction industry, the BSI e-committee would very much appreciate feedback on the draft version of BS 99001:2021 from those heavily involved in the digitalisation of the built environment in general, and those committed to the NDTp in particular. Feedback on this question would be especially gratefully received:
Will this new, built-environment-centric quality management system actually help the NDTp achieve its vision, by not only supporting that vision but acting as a key enabler of it?
Please do make comments using the online SDP system. Please note comments need to be saved and submitted individually.
Obviously if you have any questions, please do contact me.
Sincere and grateful thanks in advance everyone,
BSI 99001.pdf
The attached guide was put together from discussions and knowledge sharing within the Infrastructure Asset Data Dictionary for the UK (IADD4UK) group between 2013 and 2018. It has been updated where appropriate to include the most recent standards and some additional thought leadership.
The IADD4UK initiative was formed of the foremost asset owners, major projects, delivery partners and interested parties, under the chairmanship of the COMIT innovations group. A list of participants can be found at the end of this guide.
Early in our BIM journey it was recognised that data, and its slightly more refined form, information, would be key. We had standards for how to classify it, manage it, secure it, procure it and exchange it, but nothing about what “it” actually was.
It was also understood that this required information would have an impact on everything we do with our assets, across the entirety of their lifecycles. That impact had a relationship with the outcomes delivered to clients, whether an end user, a consumer, a member of the public, a shareholder or the country itself. Delivering those outcomes gave the information its value, without which its upkeep would not be possible.
The IADD4UK group was put together with an agreement to research and document the best way to create information requirements, not to write them; it was agreed, however, that if organisations came together when writing them, the costs and risks could be shared and the benefits multiplied.
The reason for the increased benefits was that when assets were transferred from one owner to another, or between delivery partners, they would be described in the same way, negating the risks of translating and converting information from one system to another. Key infrastructure assets are basically the same whether they are owned by a transport, communications, energy or water company: they face the same questions, tasks and decisions during their lifecycle. The answers will be different, but the basic information requirement will be largely the same. This commonality across owners could help reduce procurement costs and the risks of generating, managing and exchanging each information set, with the side effect of reducing interoperability issues between software packages.
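The idea of a shared information requirement can be sketched concretely. The example below is purely illustrative: the field names and the `validate` helper are hypothetical, not the IADD4UK schema. It shows how a single dictionary entry for a "pump" could be reused by any owner, with each owner supplying different answers against the same required fields, so an asset handed between owners needs no translation of its data.

```python
# Hypothetical common dictionary entry shared by all owners (field names
# are illustrative, not taken from any real data dictionary).
COMMON_REQUIREMENT = {
    "asset_type": "pump",
    "required_fields": [
        "manufacturer", "model", "rated_flow_l_per_s",
        "install_date", "maintenance_interval_months",
    ],
}

def validate(record, requirement):
    """Return the required fields missing from an owner's asset record."""
    return [f for f in requirement["required_fields"] if f not in record]

# A water company's record for one pump -- the answers are theirs, but the
# questions come from the shared requirement.
water_co_pump = {
    "manufacturer": "Acme",
    "model": "WP-200",
    "rated_flow_l_per_s": 35.0,
    "install_date": "2015-06-01",
}

# Any other owner validating against the SAME entry would flag the same
# gaps, so transferred assets carry directly comparable information.
print(validate(water_co_pump, COMMON_REQUIREMENT))
```

The benefit described above falls out of this structure: because every owner checks records against one shared definition, exchanging an asset between organisations exchanges a dataset both sides already understand.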
In 2017 the IADD4UK organisation was put on hold for various reasons, chiefly lack of funding to both create and curate a common information requirements dictionary. This meant that the participants in the initiative dispersed to create their own data dictionaries utilising some of the methods and processes shared with you in this guide.
Writing information requriements by IADD4UK.pdf