
Public Resources

15 files

  1. Digital Twin Toolkit

    The purpose of this toolkit is to help you on your digital twin journey. It is intended to help you and your team think about why you need a digital twin and what it can be used for.
    DT Hub members asked for support in making the business case for digital twins. Through the Gemini programme, we put together a team of volunteers working in the area of digital twins, who contributed on a pro bono basis to the development of a DT Toolkit.
    The result is a DT toolkit report which takes you through:
    ·        What is a digital twin?
    ·        What can digital twins be used for?
    ·        A case study register on the DT Hub
    ·        A business case template for a digital twin
    ·        A roadmap to implementing your digital twin
    This is the first version of the DT toolkit and we’re looking for your involvement in testing this toolkit and developing it further. Please comment here.
    The toolkit includes a business case template (available below or upon clicking "download this file"), which is intended to help you put together the business case for a digital twin.
    To watch Sarah Hayes' presentation and the DT Toolkit launch, click here.
    DT toolkit business case template.docx

    845 downloads

    0 comments

    Updated

  2. Cyber-Physical Fabric Summit Recording and Transcript

    About the Event
    The emerging perspective is that the pandemic has driven seven years of digital progress in one year. That acceleration has taught us many things, for example: the importance of data, particularly real-time, crowdsourced data on our national systems and infrastructure; the need for better tools for remote, immersive collaboration, innovation and learning; the importance of models and simulations to better understand our current world and predict its futures; and the value of all manner of smart machines to power agile, configurable manufacturing of critical supplies, along with the orchestrated, scalable logistics for their distribution.
    The pandemic may have amplified these needs, but even in normal times solutions to these challenges are needed to drive greater prosperity, resilience, security and sustainability for the UK, and to be ready when future exogenous shocks strike, such as climate change, global cyber attacks or solar storms.
    On the back of the Government’s previously published R&D Strategy, Build Back Better - Plan for Growth and the forthcoming Innovation Strategy, there is a groundswell of interest and support for developing such solutions. Several publicly funded initiatives are underway in data, AI, connected digital twins, and smart machines, complemented by innovation within the private and public sectors. However, there is a need to build greater awareness, understanding and alignment across this complex and diverse technological landscape.
    This summit pulled together stakeholders across this landscape. It explored a bold, expansive vision for a cyber-physical fabric at a national scale to power prosperity and take time, cost and risk out of many vertical initiatives and moonshots. This new horizontal infrastructure would stitch together our physical and digital worlds, weaving together threads such as data, AI, synthetic environments, living labs, smart machines and social science. It could be as transformative as the world wide web and, like the web, would be owned by nobody but used by everybody.
    This public online event was organised through the Royal Academy of Engineering, hosted on the Digital Twin Hub, and supported by the Centre for Digital Built Britain, Robotics Growth Partnership, BEIS, UKRI, Go Science and Alan Turing Institute. As well as hearing from these stakeholders across several panel sessions, there was also ample opportunity for audience questions and engagement within the chat.
    A full version of the chat transcript from the day can be downloaded by clicking the download button on the right.

    18 downloads

    0 comments

    Updated

  3. An integrated approach to information management: Identifying decisions and the information required for them using activity and process models

    The Information Management Framework (IMF) is intended to enable better information management and information sharing at a national scale and provide the standards, guidance and shared resources to be (re)used by those wishing to participate in the National Digital Twin ecosystem. 

     

    While the scope of the IMF is broad, the "7 circles of Information Management" diagram is a pragmatic way to divide the information management space into coherent areas of concern that can be addressed relatively independently whilst supporting each other.
    As part of the second circle of the diagram, the IMF technical team has released this paper outlining our recommended approach to developing information requirements, based on the analysis of process models.
    The methodology first identifies an organisation's processes, then the decisions taken as part of those processes, and finally the information requirements that support those decisions. These requirements are communicated to those who create the information. This provides a systematic approach to identifying the information requirements and when the information is most cost-effectively created. Managed appropriately, this information capture can avoid costly activity to recreate information by surveying or inspecting in-use assets.
    To allow this anticipation of information needs, the methodology set out in the paper recommends the following steps:
    ·        identify the lifecycle activities that an organisation performs
    ·        decompose the activities in order to identify the material "participants" involved ("things" required for each activity: people, resources, assets, equipment, products, other activities, …)
    ·        identify the decisions critical for these activities
    ·        identify the information requirements for those decisions and the quality required.
    Read more in the blog containing a video introduction to the "7 circles of Information Management" by IMF Technical Team Lead, Matthew West, followed by a deep dive into the second circle – Process Model based Information Requirements – presented by Al Cook, main author of this paper.
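    The activity → decision → information-requirement chain that the methodology describes can be sketched in code. This is a minimal illustration only; all class and field names below are hypothetical, not taken from the paper:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class InformationRequirement:
        description: str   # what information the decision needs
        quality: str       # the quality required (accuracy, currency, ...)

    @dataclass
    class Decision:
        name: str
        requirements: list = field(default_factory=list)

    @dataclass
    class Activity:
        name: str
        participants: list = field(default_factory=list)  # people, assets, equipment, ...
        decisions: list = field(default_factory=list)

    # Worked through top-down, as the methodology recommends:
    inspect = Activity(
        name="Inspect bridge bearings",
        participants=["inspector", "bridge", "access equipment"],
        decisions=[Decision(
            name="Replace or retain bearing?",
            requirements=[InformationRequirement(
                description="Bearing installation date and wear measurements",
                quality="Measured within the last 12 months",
            )],
        )],
    )

    # Collect every information requirement an organisation's activities imply:
    def requirements_for(activities):
        return [r for a in activities for d in a.decisions for r in d.requirements]
    ```

    Walking the structure from activities down to requirements makes explicit which activity generates each piece of information, which is what lets an organisation plan when it is cheapest to capture it.
    
    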

    57 downloads

    0 comments

    Updated

  4. The Approach to Develop the Foundation Data Model for the Information Management Framework

    The Approach to Develop the Foundation Data Model published in March 2021 follows up on the Survey of Top-level Ontologies (TLO) published in November 2020.
    It sets out the Top-Level Ontology requirements for the NDT's Foundation Data Model. Drawing upon the assessment of the TLOs listed in the Survey of Top-level Ontologies, it identifies four potential candidates for the NDT's TLO: BORO, IDEAS, HQDM and ISO 15926-2.

    The four candidates are distinct from the other TLOs in being 4-dimensionalist, i.e. they consider individual objects as four-dimensional, with both spatial and temporal parts.
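    To illustrate the 4-dimensionalist idea in a few lines of code (a hypothetical sketch, not drawn from any of the four candidate ontologies), an individual is modelled as an extent in space-time, so a changing "state" of an object is simply a temporal part of it:

    ```python
    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass(frozen=True)
    class SpatioTemporalExtent:
        """An individual: something extended in both space and time."""
        name: str
        begins: date
        ends: Optional[date] = None  # None means the extent is still ongoing

        def temporal_part(self, part_name, begins, ends):
            """A state of this individual: a part of it bounded in time."""
            return SpatioTemporalExtent(f"{self.name} / {part_name}", begins, ends)

    # A pump, and "the pump while installed at site A" as a temporal part of it:
    pump = SpatioTemporalExtent("pump-42", begins=date(2015, 3, 1))
    installed = pump.temporal_part("installed at site A",
                                   date(2016, 1, 1), date(2020, 6, 30))
    ```

    The point of the pattern is that time-varying properties (location, condition, ownership) attach to temporal parts rather than to the whole object, which avoids contradictory statements about a single individual.
    
    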
     

    5 downloads

    0 comments

    Updated

  5. A survey of Top-Level Ontologies

    To underpin the sharing of data across organisations and sectors, the National Digital Twin programme (NDTp) aims to develop an ontology – a theory of what exists, i.e. the things that exist and the rules that govern them – capable of describing "life, the universe and everything" (Adams, 1980). As set out in the Pathway towards the IMF, this ontology will consist of:
    ·        a Foundation Data Model – the rules and constraints on how data is structured
    ·        an ecosystem of Reference Data Libraries, compliant with the Foundation Data Model – the particular set of classes and properties we use to describe digital twins.
    To achieve the consistency required to share information across organisations and sectors, the Foundation Data Model needs to be underpinned by a Top-Level Ontology (TLO), i.e. the top-level categories ("thing", "class", …) and the fundamental relationships between them that are sufficient to cover the scope of a maximally broad range of domains.
    As a starting point to define the NDT’s TLO, the technical team has reviewed and classified existing TLOs in A survey of Top-Level Ontologies, published November 2020, and assessed them against a set of requirements.
    This has led to the identification of 4 potential candidates for the NDT’s TLO: BORO, IDEAS, HQDM and ISO 15926-2, as set out in The Approach to Develop the FDM for the IMF publication.
    Ontologies continue to be added to the survey. Please refer to this page to see the most up-to-date list and to suggest other ontologies for the team to consider.

    0 downloads

    0 comments

    Updated

  6. A Survey of Industry Data Models and Reference Data Libraries

    A Survey of Industry Data Models and Reference Data Libraries, published in November 2020, is the initial version of ongoing work to identify and assess existing Industry Data Models and Reference Data Libraries in terms of scope, quality and formalisation of the documentation, formalisation of the representation, maintenance frequency, usage …
    This survey is key to identifying existing industry data models that may have to be mapped into the NDT’s Foundation Data Model and Reference Data Library (RDL). Additionally, this work is intended to help the technical team to identify potential authoritative sources for key classes that will become part of the NDT’s RDL (for instance, units of measure).
    The list is open, and more standards are being added to the survey on the DT Hub. Please refer to this page to see the most up-to-date list and don’t hesitate to suggest standards for the team to add to the list.

    2 downloads

    0 comments

    Updated

  7. The pathway towards an Information Management Framework

    Following a year-long consultation exercise bringing together leading experts from the data science and information management communities, The pathway towards an Information Management Framework (IMF) report was published in May 2020. This report sets out the technical approach to deliver the Information Management Framework, a common language by which digital twins of the built and natural environment can communicate securely and effectively to support improved decision taking by those operating, maintaining and using built assets and the services they provide to society.
    The report outlines three building blocks to form an appropriately functioning technical core:
    ·        Foundation Data Model (FDM): a consistent, clear understanding of what constitutes the world of digital twins
    ·        Reference Data Library (RDL): the particular set of classes and properties we will use to describe our digital twins
    ·        Integration Architecture (IA): the protocols that will enable the managed sharing of data.
    A webinar, The pathway towards an Information Management Framework, was held on 8 June 2020; you can watch it here:
    Following the publication of the report, an open consultation was run to gather feedback on the proposed methodology; the feedback was consolidated in the following summary.

    2 downloads

    0 comments

    Updated

  8. Agile Standards White Paper

    This white paper explores an exciting new direction in the future of good practice – the potential to take an “agile” approach to the development of standards. It considers what is meant by, and included in, an agile approach and where applying this could be beneficial. It also forms an important contribution to the debate over how standards making can adapt to a new, faster paced world where best practices need to be shaped and refined by stakeholder communities in a form of “dynamic consensus”. The paper investigates the challenges and opportunities for everyone involved and offers guidance on when agile standards are most suitable and where they can deliver greatest value.
     

    44 downloads

    0 comments

    Submitted

  9. Our Vision for the Built Environment

    With contributions from over 75 industry leaders and endorsed by more than 35 cross-industry bodies spanning the UK built environment sector, the Vision describes the future we want: a built environment whose explicit purpose is to enable people and nature to flourish together for generations.
    For the first time in history, we have the tools and technology to work together to make this future vision a reality and to manage the built environment in a way that addresses the biggest challenges of our time, from the climate crisis to the COVID-19 pandemic. Aligned to the UK Government’s ambition to build back better, the simple yet radical Vision calls for people and nature to be at the heart of how we design, build, operate, and use our existing built environment. We must make better decisions now to deliver a better future for all.
    Across its programmes the Centre for Digital Built Britain (CDBB) is working towards better outcomes for people and the environment by using digital data well. As well as building a resilient and world-leading UK construction sector, this will serve as a platform for a sustainable and flourishing economy, society and environment.

    14 downloads

    0 comments

    Submitted

  10. Digital Twin Navigator

    The Construction Innovation Hub (CIH) has published the Digital Twin Navigator, an interactive guide intended to help client bodies embed digital twinning into a project from the earliest opportunity.
    Drawing upon some of the key definitions set out in the DT toolkit report (the use case framework, the DT sophistication levels, …), the Digital Twin Navigator offers specific guidance at an individual project level, focusing on how digital twins can be implemented for capital projects. Taking the construction of a new hospital as a worked example, the navigator suggests a five-stage approach, from making the initial business case for a digital twin to refining and evaluating it throughout the different phases of the project.
    It also sets out key considerations for defining the requirements for a digital twin, highlighting the importance of interfacing and aligning the digital twin strategy with the BIM, soft landings and facilities management strategies, to build a holistic set of information requirements for the project covering both static and dynamic elements. The DT Navigator also emphasises the importance of a "no regrets" approach, where alignment with the IMF will ensure that a digital twin can ultimately be part of an ecosystem of connected digital twins – a part of the National Digital Twin.
     

    45 downloads

    1 comment

    Submitted

  11. Integration Architecture Pattern and Principles

    As set out in the Pathway to the Information Management Framework, the Integration Architecture is one of the three key technical components of the Information Management Framework, along with the Reference Data Library and the Foundation Data Model. It consists of the protocols that will enable the managed sharing of data across the National Digital Twin.
    In the Integration Architecture Pattern and Principles paper, the National Digital Twin programme’s (NDTp) technical team sets out key architectural principles and functional components for the creation of this critical technical component. The team defines a redeployable architectural pattern that allows the publication, protection, discovery, query and retrieval of data that conforms to the NDT’s ecosystem of Reference Data Libraries and the NDT’s Foundation Data Model. 
    The paper will take you through:
    A requirements overview: a series of use cases that the Integration Architecture needs to enable, including:
    ·        routine operational use cases: where data from a diverse set of organisations can be shared and analysed for a single purpose (e.g. to support legal and regulatory requirements)
    ·        the ability to respond to an emergency: pulling together data from across different communities in a way that was not foreseen before the incident that caused the requirement
    ·        'business as usual' NDT maintenance use cases, such as publishing a Digital Twin or adding a user to the NDT ecosystem.
    Architectural principles: key architectural principles that must be adhered to, regardless of the type of architecture that is implemented, including:
    ·        Data quality: data quality needs to be measurable and published with the data itself
    ·        Privacy of the published data: the Integration Architecture shall ensure that data is shared and used only according to the conditions under which it was published
    ·        Security: ensuring that all data and functions are secure from bad actors. Encryption will be a particularly key aspect of the security features in the Integration Architecture.
      Recommended integration architecture pattern:
    Three general architectural pattern options are explored in the paper (centralised, distributed, and federated). The benefits and concerns for each pattern are discussed with respect to the requirements. The recommended architectural pattern is a hybrid of these three approaches – centralising certain functions, whilst distributing and federating others. 
    The recommended pattern is intended to allow datasets to be shared locally (i.e., within an NDT Node, see figure below), but will also allow for inter-node discovery, authorisation and data sharing to take place. NDT Nodes may be established by individual organisations, regulators and industry associations, or service providers and will be able to handle Digital Twins on behalf of their constituent organisations and provide a secure sharing boundary.
    In the recommended architecture:

    Datasets are published by the data owner (1). These are then made available to the organisations within the community of interest; in addition, an event is issued to register the publication with the Core (2).

    When queries are submitted (A), the dataset can then be discovered by organisations in other communities of interest (B) and retrieved where appropriate (C).

    The release, discovery and retrieval are carried out according to the authorisation service so that access is controlled as specified by the data owner.
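    The flow above can be sketched as a short simulation. This is a rough illustration only; the class and method names are hypothetical (the paper defines the actual services), and the "catalogue" here stands in for the master NDT Catalogue and authorisation service:

    ```python
    class CoreCatalogue:
        """Thin core service: records where datasets live, not the data itself."""
        def __init__(self):
            self._entries = {}  # dataset id -> (hosting node, allowed communities)

        def register(self, dataset_id, node, allowed):          # step (2)
            self._entries[dataset_id] = (node, set(allowed))

        def discover(self, query, community):                   # steps (A)-(B)
            return [d for d, (_, allowed) in self._entries.items()
                    if query in d and community in allowed]

    class NDTNode:
        """Holds datasets on behalf of its constituent organisations."""
        def __init__(self, name, core):
            self.name, self.core, self._data = name, core, {}

        def publish(self, dataset_id, payload, allowed):        # step (1)
            self._data[dataset_id] = payload
            self.core.register(dataset_id, self, allowed)

        def retrieve(self, dataset_id, community):              # step (C)
            node, allowed = self.core._entries[dataset_id]
            if community not in allowed:  # access as specified by the data owner
                raise PermissionError("access not granted by the data owner")
            return node._data[dataset_id]

    core = CoreCatalogue()
    node_a = NDTNode("water-utility", core)
    node_a.publish("flood-sensors-2021", {"readings": [1.2, 1.4]},
                   allowed={"emergency-response"})

    # Another community discovers the dataset and retrieves it where authorised:
    found = core.discover("flood", community="emergency-response")
    ```

    The hybrid shape of the recommended pattern shows through even in this toy version: the data stays with the node (distributed), while discovery and authorisation go through the thin core (centralised).
    
    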
     
     
    Detail of the functional components:
    The Core Services are likely to be quite thin, consisting mainly of:
    ·        a master NDT Catalogue that holds the location of available NDT Datasets across the ecosystem
    ·        the master FDM/RDL, which will synchronise with the subset that is relevant for each NDT Node
    ·        a publish/subscribe model to propagate data changes to parties that have an interest and an appropriate contract in place.
    The Core and each NDT Node shall interact through a microservice layer, with which they will need to comply.
    Next steps:
    The paper concludes with a list of ten key tasks to further develop the Integration Architecture components. We will keep you informed of progress and, in the meantime, we look forward to hearing your questions and comments.

    31 downloads

    1 comment

    Updated

  12. Annual Benchmark Report

    31 March 2021 marks one year since the launch of the DT Hub. As we reflect on the year that has been and look ahead to the coming year, we are pleased to present the DT Hub's first Annual Benchmark Report. This first report aims to share our progress and lessons learned. Included in the report are a retrospective summary and analysis of the year, as well as highlights, key findings and recommendations for the year ahead.

    61 downloads

    0 comments

    Submitted

  13. Pathway Towards an Information Management Framework: Summary of Consultation Responses

    In May 2020 the National Digital Twin programme (NDTp) published the Pathway to an Information Management Framework (IMF). The publication was accompanied by an open consultation to seek feedback on our proposed approach and to hear from across the community about how they thought the IMF should develop to support their use and adoption of it.  
    The consultation ran until the end of August and, through ongoing engagement with the programme's technical stakeholders, we received a great deal of valuable feedback. The full summary of the IMF Pathway Consultation Responses, written by Miranda Sharp, NDTp Commons Lead, is published here.

    26 downloads

    0 comments

    Updated

  14. Data for the Public Good

    Report from the National Infrastructure Commission, which recommends the creation of a National Digital Twin.

    31 downloads

    0 comments

    Updated

  15. The Gemini Principles

    Digital twins of physical assets are helping organisations to make better-informed decisions, leading to improved outcomes.
    Creating an ecosystem of connected digital twins – a national digital twin – opens the opportunity to release even greater value, using data for the public good.
    This paper sets out proposed principles to guide the national digital twin and the information management framework that will enable it.

    49 downloads

    0 comments

    Updated

