
Public Resources

36 files

  1. Digital Twin Toolkit

    The purpose of this toolkit is to help you on your digital twin journey. It is intended to help you and your team think about why you need a digital twin and what it can be used for.
    DT Hub members asked for support in making the business case for digital twins. Through the Gemini programme, we put together a team of volunteers working in the area of digital twins, who contributed on a pro bono basis to the development of a DT Toolkit.
    The result is a DT toolkit report which takes you through:
    · What is a digital twin?
    · What can digital twins be used for?
    · A case study register on the DT Hub
    · A business case template for a digital twin
    · A roadmap to implementing your digital twin
    This is the first version of the DT toolkit and we’re looking for your involvement in testing this toolkit and developing it further. Please comment here.
    The toolkit includes a business case template (available below or upon clicking "download this file"), which is intended to help you put together the business case for a digital twin.
    To watch Sarah Hayes' presentation and the DT Toolkit launch, click here.
     
     
    DT toolkit business case template.docx

    2,197 downloads

    0 comments

    Updated

  2. How Finance and Digital Twins Can Shape a Better Future for the Planet

    by Alexandra Bolton, Executive Director, Centre for Digital Built Britain; Mark Coates, International Director of Public Policy and Advocacy, Bentley Systems; Peter El Hajj, National Digital Twin Programme Lead, Centre for Digital Built Britain
     
     
    The finance industry has long been at the forefront of using data and technology to make better decisions, to de-risk and improve return on investments, and to create better outcomes. Tackling the big challenges of our day—such as climate change, energy, and healthcare—relies on having the right technologies and data to make the right interventions. 
    One of the most significant technology developments of the past decade, the digital twin—a digital replica of a physical asset or world—is key to building infrastructure that supports future generations. 
    The Centre for Digital Built Britain (CDBB) has seen first-hand how digital twins can improve decision-making in the planning, design, build, and operation of assets, as well as the benefits of connecting the technology across organisations and sectors. 
    Based on the Gemini Principles, the CDBB spearheaded the development of the U.K.’s national digital twin project, building an ecosystem of connected digital twins that can securely share infrastructure information in real-time to support better outcomes.
    If this goal seems one step too far into science fiction, then we can take inspiration from Singapore, which recently became the first country to create a country-wide digital twin. The technology will help create more sustainable, resilient, and smart development; help in the rollout of renewable energy; and protect against climate change and rising sea levels.
    Cross River Rail in Brisbane, Australia, is another example of a publicly sponsored mega-project that can provide the catalyst for a city-level digital twin. Featuring multi-environment digital twins that constantly talk to one another, the AUD 5.4 billion (GBP 3 billion) project aims to introduce the benefits of a federated model approach to digital engineering.
    The issues in Singapore and Brisbane that are being solved through digital twins are similar to those that are affecting, or soon will affect, the U.K. However, there are other use cases for the technology. The Grenfell Tower tragedy, for example, has ushered in a renewed focus from government and regulators on the safety of higher-risk buildings, while national and international banks are having to demonstrate the impact of their investments against environmental, social, and governance (ESG) targets. 
    As key investors, the finance community has an opportunity to realise the benefits of digital twins by using them to add value, track sustainability targets, attract new investments, and manage risk better. 
    Having spoken with a wide range of investors, we have concluded that there is huge untapped potential for investors to influence how data is used to improve infrastructure decision-making. By taking on a greater role in the digital transformation of infrastructure, investors can also be involved in providing better outcomes for businesses, people, and nature. 
    There is also a significant opportunity to leverage digital twins to support key challenges facing the investment community: where to allocate capital; screening and managing risk; enhancing asset value by improving performance and reliability; and complying with environmental, social, and governance (ESG) requirements. 
    The key to unlocking this potential is to apply the fundamentals of the information value chain. 
    By collaborating with the wider industry to develop practical use cases, the finance community can help use the insights derived from data to solve their most pressing problems. 
    The infrastructure sector needs to play its part and approach this dialogue with openness and flexibility. Infrastructure professionals need to understand that investors generate their return on investment (ROI) in a variety of ways, via different types of assets and at different stages of the infrastructure lifecycle. Some of their use cases overlap with those already developed for supply chain businesses or operators while others will not. 
    We first need to improve the quality of the dialogue between investors and other parts of the infrastructure sector, re-imagine the information value chain from an investor’s perspective, explore how investors can expand their leadership role, and share some use cases investors are currently pursuing.
    From an infrastructure industry perspective, there are three important steps to achieve this goal:
    1. Understand the variety of infrastructure investors and what that means for the different ways that they can benefit from digital twins.
    2. Understand how investors categorise infrastructure.
    3. Relate digital twin use cases to different investor strategies.
    Collaboration across the infrastructure industry and investors is key to building smarter, more sustainable infrastructure. When we collaborate, across boundaries and across borders, we can do amazing things. We can make better business decisions that drive better economic, social, and environmental outcomes.
    There is already progress being made, with digital twin spending set to reach USD 27.6 billion by 2040, according to an article published by Twinview. It will be exciting to see how the finance community will join with the wider infrastructure community and solution providers to use higher quality data and digital twins to improve investment returns, meet ESG goals, and create the sustainable future we all want to see.

    20 downloads

    0 comments

    Updated

  3. Public resources on DAFNI

    Public resources on DAFNI

    6 downloads

    0 comments

    Updated

  4. CReDo SIPOC Diagram

    CReDo SIPOC Diagram

    3 downloads

    0 comments

    Updated

  5. CReDo Task List and Dependencies

    CReDo Public Task List and Dependencies

    7 downloads

    0 comments

    Updated

  6. Identifying the expected impacts of CReDo - executive summary

    Executive summary of the report into identifying the expected impacts of CReDo.

    9 downloads

    0 comments

    Updated

  7. CReDo technical report 1: Building a Cross-Sector Digital Twin

    CReDo aims to demonstrate how the National Digital Twin programme could use connected digital twins to increase climate resilience. This first phase of the project investigates how to implement a digital twin to share data across sectors to investigate the impact of extreme weather, in particular flooding, on energy, water and telecoms networks. The current digital twin integrates flood simulations for different climate change scenarios with descriptions of the energy, water and telecoms networks, and models the interdependence of the infrastructure to describe the resilience of the combined network.
    CMCL Innovations were engaged by the Centre for Digital Built Britain (CDBB) and the Connected Places Catapult (CPC) as part of CReDo to develop a digital twin of assets from Anglian Water, BT and UK Power Networks. The digital twin combines a description of the logical connectivity between the assets with flood data to resolve the effect of floods on individual assets and the corresponding cascade of effects across the combined network. It demonstrates how to achieve basic interoperability between data from different sectors, and how this data might be combined with flood data for different climate scenarios to begin to explore the resilience of the combined network and identify vulnerabilities to support strategic decision making and capital planning.
    The first phase of the digital twin and an accompanying visualisation were implemented on DAFNI, the Data & Analytics Facility for National Infrastructure. This report describes the use and technical implementation of the current digital twin. Recommendations are made for how it could be extended to improve its ability to support decision making, and how the approach could be scaled up by the National Digital Twin programme.
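    To give a feel for the cascade logic described above, the minimal Python sketch below uses invented asset names, flood depths, failure thresholds and dependencies (none of them CReDo data) to show how directly flooded assets can propagate failures across a simple cross-sector dependency graph.
```python
# Minimal sketch of a flood-driven failure cascade across interdependent assets.
# Asset names, depths and thresholds are illustrative, not CReDo data.
from collections import deque

# Flood depth (m) at each asset for one climate scenario, and the depth at which it fails.
flood_depth = {"substation_A": 0.6, "pumping_station_B": 0.1, "telecoms_mast_C": 0.0}
failure_threshold = {"substation_A": 0.3, "pumping_station_B": 0.5, "telecoms_mast_C": 0.4}

# depends_on[x] = assets that x needs to stay operational (cross-sector dependencies).
depends_on = {
    "pumping_station_B": ["substation_A"],   # water pumping needs power
    "telecoms_mast_C": ["substation_A"],     # telecoms needs power
}

# Assets flooded beyond their threshold fail directly.
failed = {a for a, d in flood_depth.items() if d >= failure_threshold[a]}

# Propagate failures through the dependency graph (breadth-first).
queue = deque(failed)
while queue:
    down = queue.popleft()
    for asset, deps in depends_on.items():
        if asset not in failed and down in deps:
            failed.add(asset)
            queue.append(asset)

print(sorted(failed))  # ['pumping_station_B', 'substation_A', 'telecoms_mast_C']
```
    In the full digital twin the equivalent of the dependency graph and thresholds comes from the asset owners' data, and the flood depths from the climate scenarios; the sketch only illustrates the cascade step.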

    171 downloads

    0 comments

    Updated

  8. CReDo: an overview

    An overview of the CReDo project.

    200 downloads

    0 comments

    Updated

  9. Identifying the expected impacts of CReDo report

    A report prepared by Frontier Economics for the Centre for Digital Built Britain identifying the expected impacts of CReDo.

    34 downloads

    0 comments

    Submitted

  10. How to access CReDo on DAFNI platform

    Guidance for accessing CReDo through the DAFNI platform.

    11 downloads

    0 comments

    Submitted

  11. CReDo Data Exploration License Template

    A data exploration license template for use with the CReDo project.

    21 downloads

    0 comments

    Submitted

  12. CReDo: an overview - executive summary

    Executive summary of the overview of CReDo.

    27 downloads

    0 comments

    Submitted

  13. CReDo Technical Report 3: Assessing Asset Failure

    Climate change is increasing the frequency with which UK infrastructure is threatened by extreme weather events. To explore the potential impact of future climate conditions, the CReDo project is working to develop a digital twin of key infrastructure networks. This digital twin can be used to help make decisions to better protect the networks in advance of extreme weather events, and ultimately to help inform a real-time response to them. The novel feature of this tool is that it will provide the collaborating asset owners, and also crisis management teams, not only with assessments of the impact of a weather-induced flooding incident in a future climate on the infrastructure and networks monitored by the individual asset owners, but also with the operability of assets owned by other companies, where the failure of those assets impinges on the functionality of their own. The highly interdependent nature of these infrastructure networks, such as telephone lines relying on power supplies being operational, means that reliably modelling the impact of an extreme weather event requires accounting for such connections. It is planned that the shared appreciation of the mutual threats described by the digital twin will encourage further coordination between the companies in their strategic plans to mitigate these increasing threats.
    This report outlines just one component of this development. We demonstrate how it is possible to elicit from asset owners the probabilities that each of their assets might fail in a particular future flood scenario, one that takes into consideration the impact of climate change on extreme weather patterns. Working with teams of domain experts drawn from asset owners associated with the local power, water and telecommunication companies, our team demonstrates how to elicit probability distributions for the failure of each asset and their connections within the network. This information would then be fed to operational researchers, who can calculate the knock-on effect on the whole network of each simulated future incident. From a decision-analytic perspective, the digital twin would thus consist of connected digital twins representing hydrology, the failure modes of assets, and the system in which the assets sit, with a decision support layer sitting above this.
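    As a rough illustration of how elicited failure probabilities could feed such calculations, the Python sketch below uses invented probabilities and dependencies (not the values elicited in CReDo) to estimate, by simple Monte Carlo simulation, how often an entire toy network is lost in a given flood scenario.
```python
# Illustrative Monte Carlo over elicited failure probabilities (made-up numbers,
# not the elicited CReDo values) for one future flood scenario.
import random

random.seed(1)

# Probability that each asset fails given the flood scenario, as elicited from experts.
p_fail = {"substation": 0.40, "pump": 0.10, "exchange": 0.05}

# Which assets each asset depends on to remain operational.
depends_on = {"pump": ["substation"], "exchange": ["substation"]}

def simulate_once():
    """One realisation: sample direct failures, then apply dependency knock-ons."""
    down = {a for a, p in p_fail.items() if random.random() < p}
    changed = True
    while changed:
        changed = False
        for asset, deps in depends_on.items():
            if asset not in down and any(d in down for d in deps):
                down.add(asset)
                changed = True
    return down

runs = 10_000
whole_network_loss = sum(len(simulate_once()) == len(p_fail) for _ in range(runs)) / runs
print(f"Estimated probability all three assets are lost: {whole_network_loss:.3f}")
```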
     

    41 downloads

    0 comments

    Updated

  14. CReDo technical report 2: Generating Flood Data

    Climate change will bring far-reaching consequences across many aspects of society, including our health, prosperity and future security. The latest climate projections from the UK Met Office indicate that we will experience warmer, wetter winters and hotter, drier summers, together with an increase in the frequency and intensity of extremes. Substantial increases in hourly precipitation extremes are expected, with the frequency of days with hourly rainfall > 30 mm/h almost doubling by the 2070s. The increase in short, intense rainfall events can be expected to manifest in flooding, which poses serious threats to society and the economy.
    This report provides details of how flood data was generated within the CReDo project. Different types of flooding (river, coastal, surface water) are considered, together with an outline of standard industry approaches and requirements for quantifying probabilities of occurrence. We provide a summary of the information available within the UKCP18 projections and how it can be used for assessing changes in precipitation under climate change scenarios. This includes the UKCP18 local projections, consisting of hourly data at a 2.2 km resolution for 12 simulations from a convection-permitting model, with a bias correction applied, and the probabilistic extremes dataset (PPCE), with discussion of what information these products can and cannot provide.
    Information on the risk of river and tidal flooding in the study region is provided from Environment Agency models. UKCP18 does not provide direct information on flooding, so the flood model HiPIMS was used to convert precipitation to surface water flooding. For generating storm events, FEH methodology was used in combination with uplifts from different sources to represent the effects of climate change, together with a discussion of how UKCP18 products may augment this approach, given appropriate consideration of the challenges in using them for decision making.
    Using HiPIMS allowed the provision of multiple surface water flooding scenarios for different storm lengths, return periods (1 in 100, 1 in 1000 year events) and climate change scenarios, giving spatio-temporal maps of flood depth over time, in a form that can be used to assess the vulnerability of assets and consider how changes in the climate will affect the likelihood, and extent, of flooding in the future.
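    To show how such spatio-temporal flood maps might be consumed downstream, here is a small hypothetical Python sketch; the grid, timesteps, asset coordinates and failure threshold are all invented, and a real analysis would read the HiPIMS output for the chosen storm length, return period and climate scenario.
```python
# Sketch of reading asset vulnerability off a spatio-temporal flood map.
# The grid, timestamps and asset coordinates are invented for illustration;
# a real run would use the HiPIMS output for a chosen storm and return period.
import numpy as np

# depth[t, y, x]: flood depth (m) on a regular grid at each output timestep.
depth = np.zeros((4, 100, 100))
depth[2, 40:60, 40:60] = 0.8   # a patch of deep surface water at timestep 2

assets = {"substation_A": (45, 52), "exchange_B": (10, 90)}   # (row, col) grid cells
threshold = 0.3                                               # failure depth (m)

for name, (row, col) in assets.items():
    series = depth[:, row, col]                # depth at the asset over time
    flooded = np.flatnonzero(series >= threshold)
    if flooded.size:
        print(f"{name}: exceeds {threshold} m from timestep {flooded[0]} "
              f"(peak depth {series.max():.2f} m)")
    else:
        print(f"{name}: below threshold for this scenario")
```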

    78 downloads

    0 comments

    Updated

  15. Annual Benchmark Report 2021

    'The growth of the Digital Twin Hub (DT Hub) over the last two years has exceeded all our expectations as numbers have leapt from an initial group of six, to an amazing 3,400 members, representing more than 1,600 organisations from over 77 countries across the globe.   

    'The DT Hub has become a vibrant meeting place for people wherever they are on their digital twin journey and the ‘go-to’ place for anyone wanting to find out more about connected digital twins. It has been a game-changer, showing the real need and desire for this DT community, and that collaboration, connection, and knowledge exchange are vital if we are to achieve our goal of connected digital twins across the built and natural environments. As the DT Hub approaches its second-year anniversary, we would like to share our progress and learnings.'
    Alexandra Bolton, Executive Director, CDBB
     

    256 downloads

    2 comments

    Updated

  16. CReDo Technical Report 4: Modelling System Impact

    This paper describes the work done on understanding infrastructure interdependencies and their impact on the overall system. The work on the model described in this report started in September 2021. Access to the data was given at the end of October 2021 and the technical work ran until mid-January 2022.
    The work was led by Lars Schewe and primarily carried out by Mariel Reyes Salazar. The integration of the multiple different networks was carried out by Maksims Abalenkovs. We demonstrated that data from a digital twin can be integrated into component network models and that these can be connected with an overarching coordinating algorithm. This allows us to propagate failures within the networks and then analyse the impacts on the different networks. The observed runtimes for the test networks indicate that the implemented methods will work on realistic networks and that implementing more complex models is feasible in follow-up projects.
    The technical work planned in the work package was to model each of the component networks, build models that allow failures to be propagated through each of them, and propose methods to propagate failures between them.
    To structure the work, the team proposed three levels of detail for the network models and two levels for the integration. In addition, the objective functions for the underlying optimisation problems were to be developed. Due to the unavailability of data and the short timescale, it was decided to focus on the first levels for all networks and for the integration. As no data was available that could guide the definition of an objective function, this work was not undertaken.
    The basic models were implemented in Python and tested on a small-scale model of part of a UK town. This demonstrated that the overall methodology is sound, that data from a digital twin can be transferred to more detailed network models, and that the results can be played back to the digital twin.
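    The toy Python sketch below illustrates the coordination idea in miniature, under the assumption of two invented sector networks and a single cross-sector coupling; it is not the project's code, but it shows how an overarching loop can pass failures between per-sector models until no further failures appear.
```python
# Toy version of the coordination idea described above: each sector keeps its own
# network model, and an overarching loop exchanges failed-asset sets between them
# until no further failures appear. Networks and couplings are illustrative only.

class SectorNetwork:
    """A minimal component network: assets plus in-sector dependencies."""
    def __init__(self, assets, internal_deps):
        self.assets = set(assets)
        self.internal_deps = internal_deps   # asset -> assets it needs in this sector

    def propagate(self, failed):
        """Grow the failed set using this sector's own dependency rules."""
        changed = True
        while changed:
            changed = False
            for asset, deps in self.internal_deps.items():
                if asset not in failed and any(d in failed for d in deps):
                    failed.add(asset)
                    changed = True
        return failed

power = SectorNetwork({"sub1", "sub2"}, {})
water = SectorNetwork({"pump1", "works1"}, {"works1": ["pump1"]})

# Cross-sector couplings handled by the coordinator, not by any single network model.
cross_deps = {"pump1": ["sub1"]}

failed = {"sub1"}                                  # e.g. a flooded substation
previous = None
while failed != previous:                          # coordinator loop: iterate to a fixed point
    previous = set(failed)
    for network in (power, water):
        failed = network.propagate(failed)
    for asset, deps in cross_deps.items():
        if any(d in failed for d in deps):
            failed.add(asset)

print(sorted(failed))   # ['pump1', 'sub1', 'works1']
```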

    66 downloads

    0 comments

    Submitted

  17. 2021 DT Hub Digital Maturity Benchmarking Report

    The DT Hub Digital Maturity Benchmarking Report for 2021.

    167 downloads

    0 comments

    Updated

  18. Digital Twins: Ethics and the Gemini Principles

    The Digital Twins: Ethics and the Gemini Principles report is the result of academic and industry research into the ethics of connected digital twins, in relation to the Gemini Principles, combined with insight from three workshops with members of our community. This work into digital twin ethics is intrinsic to the National Digital Twin programme and the foundation for future discussion.

    278 downloads

    0 comments

    Submitted

  19. A Framework for Composition: A Step Towards a Foundation for Assembly

    A Framework for Composition: A Step Towards a Foundation for Assembly, written by the Information Management Framework team in collaboration with the Construction Innovation Hub, sets out an ontological pattern that contributes to the Information Management Framework’s Foundation Data Model: a data model, grounded in science and engineering principles, that provides the structure and meaning of data. The Foundation Data Model is the focus of the fifth circle of the 7 circles of Information Management; the intended audiences for the report are data architects and ontologists.

    The report outlines the foundation for composition – a general formal structure at the heart of engineering breakdowns, including the assembly of assets from parts. Most current information systems have been designed around specific breakdowns, without considering their general underlying formal structure. This is understandable, given the focus on devising the breakdown and the absence of a readily available formal structure to build upon. The report is a step towards providing a solid general structure; one that will not only be reusable across all kinds of breakdowns but will also provide a solid foundation for the future.
     

    56 downloads

    0 comments

    Updated

  20. Cyber-Physical Fabric Summit Recording and Transcript

    About the Event
    The emerging perspective is that the pandemic has driven seven years of digital progress in one year. That acceleration has taught us many things, for example: the importance of data, particularly real-time, crowdsourced data on our national systems and infrastructure; the need for better tools for remote immersive collaboration, innovation and learning; the importance of models and simulations to better understand our current world and predict its futures; and all manner of smart machines to power agile, configurable manufacturing of critical supplies and the orchestrated, scalable logistics for their distribution.
    The pandemic may have amplified these needs, but even in normal times solutions to these challenges are needed to drive greater prosperity, resilience, security and sustainability for the UK, ready for when future exogenous shocks strike, such as climate change, global cyber attacks or solar storms.
    On the back of the Government’s previously published R&D Strategy, Build Back Better - Plan for Growth and the forthcoming Innovation Strategy, there is a groundswell of interest and support for developing such solutions. Several publicly funded initiatives are underway in data, AI, connected digital twins, and smart machines, complemented by innovation within the private and public sectors. However, there is a need to build greater awareness, understanding and alignment across this complex and diverse technological landscape.
    This summit pulled together stakeholders across this landscape. It explored a bold, expansive vision for a cyber-physical fabric at a national scale to power prosperity and take time, cost and risk out of many vertical initiatives and moonshots. This new horizontal infrastructure would stitch together our physical and digital worlds, weaving together threads such as data, AI, synthetic environments, living labs, smart machines and social science. It could be as transformative as the world wide web and, like the web, would be owned by nobody but used by everybody.
    This public online event was organised through the Royal Academy of Engineering, hosted on the Digital Twin Hub, and supported by the Centre for Digital Built Britain, Robotics Growth Partnership, BEIS, UKRI, Go Science and Alan Turing Institute. As well as hearing from these stakeholders across several panel sessions, there was also ample opportunity for audience questions and engagement within the chat.
    A full version of the chat transcript from the day can be downloaded by clicking the download button on the right.

    66 downloads

    0 comments

    Updated

  21. A Survey of Industry Data Models and Reference Data Libraries

    A Survey of Industry Data Models and Reference Data Libraries, published in November 2020, is the initial version of ongoing work to identify and assess existing Industry Data Models and Reference Data Libraries in terms of scope, quality and formalisation of the documentation, formalisation of the representation, maintenance frequency, usage …
    This survey is key to identifying existing industry data models that may have to be mapped into the NDT’s Foundation Data Model and Reference Data Library (RDL). Additionally, this work is intended to help the technical team to identify potential authoritative sources for key classes that will become part of the NDT’s RDL (for instance, units of measure).
    The list is open, and more standards are being added to the survey on the DT Hub. Please refer to this page to see the most up-to-date list and don’t hesitate to suggest standards for the team to add to the list.

    111 downloads

    2 comments

    Updated

  22. A survey of Top-Level Ontologies

    To underpin the sharing of data across organisations and sectors, the National Digital Twin programme (NDTp) aims to develop an ontology – a theory of what exists, i.e. the things that exist and the rules that govern them – capable of describing “life, the universe and everything” (Adams, 1980). As set out in the Pathway towards the IMF, this ontology will consist of:
    · a Foundation Data Model – the rules and constraints on how data is structured
    · an ecosystem of Reference Data Libraries, compliant with the Foundation Data Model – the particular set of classes and properties we use to describe digital twins.
    To achieve the consistency required to share information across organisations and sectors, the Foundation Data Model needs to be underpinned by a Top-Level Ontology (TLO), i.e. the top-level categories (“thing”, “class”, …) and the fundamental relationships between them that are sufficient to cover the scope of a maximally broad range of domains.
    As a starting point to define the NDT’s TLO, the technical team has reviewed and classified existing TLOs in A survey of Top-Level Ontologies, published November 2020, and assessed them against a set of requirements.
    This has led to the identification of 4 potential candidates for the NDT’s TLO: BORO, IDEAS, HQDM and ISO 15926-2, as set out in The Approach to Develop the FDM for the IMF publication.
    Ontologies continue to be added to the survey. Please refer to this page to see the most up-to-date list and to suggest other ontologies for the team to consider.

    138 downloads

    0 comments

    Updated

  23. Integration Architecture Pattern and Principles

    As set out in the Pathway to the Information Management Framework, the Integration Architecture is one of the three key technical components of the Information Management Framework, along with the Reference Data Library and the Foundation Data Model. It consists of the protocols that will enable the managed sharing of data across the National Digital Twin.
    In the Integration Architecture Pattern and Principles paper, the National Digital Twin programme’s (NDTp) technical team sets out key architectural principles and functional components for the creation of this critical technical component. The team defines a redeployable architectural pattern that allows the publication, protection, discovery, query and retrieval of data that conforms to the NDT’s ecosystem of Reference Data Libraries and the NDT’s Foundation Data Model. 
    The paper will take you through:
    A requirement overview: a series of use cases that the Integration Architecture needs to enable, including:
    · routine operational use cases: where data from a diverse set of organisations can be shared and analysed for a single purpose (e.g. to support legal and regulatory requirements)
    · the ability to respond to an emergency: pulling together data from across different communities in a way that was not foreseen before the incident that caused the requirement
    · ‘business as usual’ NDT maintenance use cases, such as publishing a Digital Twin or adding a user to the NDT ecosystem.
      Architectural principles: key architectural principles that must be adhered to, regardless of the type of architecture that is implemented, including:
    · Data quality: data quality needs to be measurable and published with the data itself
    · Privacy of the published data: the Integration Architecture shall ensure that data is shared and used only according to the conditions under which it was published
    · Security: ensuring that all data and functions are secure from bad actors. Encryption will be a particularly key aspect of the security features in the Integration Architecture.
      Recommended integration architecture pattern:
    Three general architectural pattern options are explored in the paper (centralised, distributed, and federated). The benefits and concerns for each pattern are discussed with respect to the requirements. The recommended architectural pattern is a hybrid of these three approaches – centralising certain functions, whilst distributing and federating others. 
    The recommended pattern is intended to allow datasets to be shared locally (i.e., within an NDT Node), but will also allow for inter-node discovery, authorisation and data sharing to take place. NDT Nodes may be established by individual organisations, regulators and industry associations, or service providers, and will be able to handle Digital Twins on behalf of their constituent organisations and provide a secure sharing boundary.
    In the recommended architecture:

    Datasets are published by the data owner (1); these are then made available to the organisations within the community of interest and, in addition, an event is issued to register the publication with the Core (2).

    When queries are submitted (A), the dataset can then be discovered by organisations in other communities of interest (B) and retrieved where appropriate (C).

    The release, discovery and retrieval are carried out according to the authorisation service so that access is controlled as specified by the data owner.
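    To make this flow concrete, the in-memory Python sketch below mirrors steps (1), (2) and (A)–(C) with hypothetical Core and NDTNode classes; in the real Integration Architecture these would be secure, distributed services rather than Python objects, and all names here are assumptions for illustration only.
```python
# A deliberately simplified, in-memory sketch of the publish / discover / retrieve
# flow described above. All class and method names are hypothetical.

class Core:
    """Core services: a catalogue of published datasets and an authorisation check."""
    def __init__(self):
        self.catalogue = {}          # dataset id -> (owning node, permitted organisations)

    def register(self, dataset_id, node, allowed_orgs):
        self.catalogue[dataset_id] = (node, set(allowed_orgs))   # step (2)

    def discover(self, keyword):
        return [d for d in self.catalogue if keyword in d]       # steps (A)/(B)

    def authorised(self, dataset_id, requester):
        _, allowed = self.catalogue[dataset_id]
        return requester in allowed                              # owner-specified conditions


class NDTNode:
    """An NDT Node holding datasets on behalf of its constituent organisations."""
    def __init__(self, name, core):
        self.name, self.core, self.datasets = name, core, {}

    def publish(self, dataset_id, data, allowed_orgs):
        self.datasets[dataset_id] = data                          # step (1)
        self.core.register(dataset_id, self, allowed_orgs)

    def retrieve(self, dataset_id, requester):
        if not self.core.authorised(dataset_id, requester):       # step (C), access controlled
            raise PermissionError(f"{requester} may not access {dataset_id}")
        return self.datasets[dataset_id]


core = Core()
water_node = NDTNode("water-node", core)
water_node.publish("flood-zones", {"zones": []}, allowed_orgs={"energy-org"})

print(core.discover("flood"))                          # another community finds the dataset
print(water_node.retrieve("flood-zones", "energy-org"))  # retrieval permitted by the owner
```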
     
     
    Detail of the functional components:
    The Core Services are likely to be quite thin, consisting mainly of:
    · a master NDT Catalogue that holds the location of available NDT Datasets across the ecosystem
    · the master FDM/RDL that will synchronise with the subset that is relevant for each NDT Node
    · a publish/subscribe model to propagate data changes to parties that have an interest and an appropriate contract in place.
    The Core and each NDT Node shall interact through a microservice layer, with which they shall have to be compliant.
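    The publish/subscribe element could look something like the minimal sketch below, in which only subscribers holding an appropriate contract receive change notifications; the class names and the contract check are illustrative assumptions, not part of the published paper.
```python
# Minimal publish/subscribe sketch for the change-propagation idea above.
# Topics, node names and the contract check are illustrative assumptions only.

class ChangeBus:
    def __init__(self):
        self.subscribers = {}        # dataset id -> list of (node, has_contract)

    def subscribe(self, dataset_id, node, has_contract):
        self.subscribers.setdefault(dataset_id, []).append((node, has_contract))

    def publish_change(self, dataset_id, change):
        # Only parties with an interest *and* an appropriate contract are notified.
        for node, has_contract in self.subscribers.get(dataset_id, []):
            if has_contract:
                node.on_change(dataset_id, change)

class Node:
    def __init__(self, name):
        self.name = name
    def on_change(self, dataset_id, change):
        print(f"{self.name}: {dataset_id} changed -> {change}")

bus = ChangeBus()
bus.subscribe("flood-zones", Node("energy-node"), has_contract=True)
bus.subscribe("flood-zones", Node("public-node"), has_contract=False)
bus.publish_change("flood-zones", "new scenario added")   # only energy-node is notified
```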
     
     
    Next steps:
    The paper concludes with a list of ten key tasks to develop the Integration Architecture components further. We will keep you informed of progress and, in the meantime, we look forward to hearing your questions and comments.

    126 downloads

    1 comment

    Updated

  24. Pathway Towards an Information Management Framework: Summary of Consultation Responses

    In May 2020 the National Digital Twin programme (NDTp) published the Pathway to an Information Management Framework (IMF). The publication was accompanied by an open consultation to seek feedback on our proposed approach and to hear from across the community about how they thought the IMF should develop to support their use and adoption of it.  
    The consultation ran until the end of August and, with ongoing engagement with the programme’s technical stakeholders, we received a great deal of valuable feedback. The full summary of the IMF Pathway Consultation Responses, written by Miranda Sharp, NDTp Commons Lead, is published here.

    45 downloads

    0 comments

    Updated

  25. The Approach to Develop the Foundation Data Model for the Information Management Framework

    The Approach to Develop the Foundation Data Model published in March 2021 follows up on the Survey of Top-level Ontologies (TLO) published in November 2020.
    It sets out the Top-Level Ontology requirements for the NDT's Foundation Data Model. Drawing upon the assessment of the TLOs listed in the Survey of Top-level Ontologies, it identifies 4 potential candidates for the NDT’s TLO: BORO, IDEAS, HQDM and ISO 15926-2.

    The four candidates are distinct from the other TLOs in being 4-dimensionalist, i.e. they consider individual objects as four-dimensional, with both spatial and temporal parts.
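    For readers unfamiliar with the 4-dimensionalist view, the small Python sketch below models an individual as the whole of its spatio-temporal extent, with states as temporal parts; it conveys the general idea only, and the class names and fields are illustrative rather than the specific constructs of BORO, IDEAS, HQDM or ISO 15926-2.
```python
# A small sketch of the 4-dimensionalist view: an individual is the whole
# spatio-temporal extent, and its states are temporal parts of that extent.
# Class names and fields are illustrative, not taken from any of the candidate TLOs.
from dataclasses import dataclass, field

@dataclass
class State:
    """A temporal part of an individual: the individual during some period."""
    begins: int          # e.g. year the state starts
    ends: int            # year the state ends
    description: str

@dataclass
class Individual:
    """A 4D individual: identified by its whole life, not by a single snapshot."""
    name: str
    states: list[State] = field(default_factory=list)

    def state_at(self, year):
        return [s for s in self.states if s.begins <= year < s.ends]

pump = Individual("pumping_station_17")
pump.states.append(State(1995, 2010, "operated by utility A"))
pump.states.append(State(2010, 2030, "operated by utility B"))

print([s.description for s in pump.state_at(2012)])   # ['operated by utility B']
```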
     

    101 downloads

    0 comments

    Updated

