Showing results for tags 'digital twins'.

Found 24 results

  1. The International Organization for Standardization (ISO) has published over 22,000 formal standards, disseminating good practice across sectors from agriculture to retail. Given this breadth of topics, it is difficult to conceive of a domain which hasn't been at least partially standardized. In fact, as of 2019, ISO had published four standards which referenced digital twins:

    ISO 14033 (Quantitative environmental information);
    ISO 15704 (Requirements for enterprise-referencing architectures);
    ISO 18101-1 (Oil and gas interoperability); and
    ISO 30146 (Smart city ICT indicators).

    More interestingly, one of these contains the first definition of a digital twin to appear within an ISO document.

    Within ISO, there are several requirements which need to be conformed to when producing a definition. These are outlined within two standards: ISO 10241-1, which covers the structure of a terminological entry, including how to structure a definition and referencing; and ISO 704, which covers the principles and methods of terminology work. These standards state that a definition should:

    Be a single phrase specifying the concept and, if possible, representing that concept within a larger system. The digital twin definition from ISO/TS 18101 does so by referencing other key terms, such as digital assets and services, providing a relationship to other related terms. In doing so, the definition makes a digital twin a type of digital asset being used to create value.

    Be general enough to cover the use of the term elsewhere. The definition is specific enough to capture what a digital twin is, while being sufficiently generic that the same definition can be used in other standards. This is vital to achieve a harmonization of concepts across a disparate suite of documentation.

    Not include any requirements. The definition doesn't say what needs to be done for something to be considered a digital twin. This is important, as definitions are meant to inform, not instruct.

    Be able to substitute for the term within a sentence. This is possibly the most challenging requirement. For example: "This exemplar organization utilizes a digital twin to improve the effectiveness of its predictive maintenance systems" becomes "This exemplar organization utilizes a digital asset on which services can be performed that provide value to an organization to improve the effectiveness of its predictive maintenance systems".

    Within the Gemini Principles, there is also another definition to consider. While that definition isn't suitable for ISO, as it wasn't designed to meet these requirements, its inclusion of "realistic digital representation" might help enhance the ISO definition.

    And there we have it. The ISO definition of digital twin is, technically speaking, a good example of an ISO definition. But does it sufficiently capture the correct concepts and relationships outlined within the Gemini Principles? Following the criteria above, how would you define a digital twin?
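The substitution test can even be mechanized: swap the term for the definition and check the sentence still reads. A toy illustration (the sentence and definition are taken from the post above; the script itself is just a sketch):

```python
TERM = "digital twin"
DEFINITION = ("digital asset on which services can be performed "
              "that provide value to an organization")

sentence = ("This exemplar organization utilizes a digital twin to improve "
            "the effectiveness of its predictive maintenance systems")

# Substituting the definition for the term should still yield a valid sentence.
substituted = sentence.replace(TERM, DEFINITION)
print(substituted)
```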
  2. In discrete manufacturing, smart decisions are made with the help of digital threads and twins. If anyone would like a free session on how a digital thread and twin are established with the help of CAD, PLM, digital mock-up, MES, augmented reality and IoT, please let me know.
  3. The idea of a Digital Twin [DT] needs to advance in a standardized and formal manner. As @DRossiter87 pointed out, this is necessary to "support the consistent adoption of, and interoperability between, digital twins for the built environment within the UK". To aid in this endeavour, it is useful to delineate the difference between the following terms:

    "DT General Use Case" [GUC]: a very short sentence, possibly two or three words, ideally a verb followed by a noun, giving a brief and blunt statement of the central business aim motivating the use and implementation of a DT, e.g. 'optimize traffic'.

    "DT Use Case Scenario" [UCS]: a documentation of the sequence of actions, i.e. the DT-user interactions, executed through a particular DT case study or an actual real-world DT project.

    "DT Use": a typical technical function or action executed by any DT throughout the course of any UCS.

    Accordingly, the DT uses are the standard building blocks on which a standard common language can be founded. Such a language, which could also be machine-readable, can be used to document and detail DT-user interactions in a standard format, facilitating the publishing and sharing of knowledge.

    Below is a figure of a "DT uses taxonomy". It is made up of three hierarchical levels:

    'Included Uses': four high-level cornerstone uses that are, barring rare exceptions, included in and executed by almost every DT throughout any UCS (i.e. Mirror, Analyse, Communicate and Control);

    'Specialized Uses': special forms of the Included Uses, each with unique strengths suited to specific purposes; and

    'Specialized Sub-Uses': the lowest level of the taxonomy, which further differentiates variant types within a Specialized Use by virtue of fine inherent variations, enhancing the DT's practical adequacy in dealing with alternative contexts and user-defined purposes.

    The table below gives a formal definition of each DT Use, with synonyms in brackets.

    01 Mirror: To duplicate a physical system in the real world in the form of a virtual system in the cyber world. (Replicate, twin, model, shadow, mimic)
      1.1 Capture: Express in a digital format within the virtual world the status of a physical system at a point in time. (Usually offline DT.) (Collect, scan, survey, digitalize)
      1.2 Monitor: Collect information related to the performance of a physical system. (Online or offline DT.) (Sense, observe, measure)
      1.3 Quantify: Measure the quantity of a physical system's particulars, instances or incidents. (Online or offline DT.) (Quantify, takeoff, count)
      1.4 Qualify: Track the ongoing status of a physical system. (Online or offline DT.) (Qualify, follow, track)

    02 Analyse: To create new knowledge and provide insights for users and stakeholders about a physical system. (Examine, manage)
      2.1 Compute: Perform conventional arithmetic calculations, traditional mathematical operations and functions, and simple statistical techniques such as correlations. (Calculate, add, subtract, multiply, divide)
      2.2 Mine: Uncover, identify and recognize the web of interdependencies, interconnected mechanisms, complex processes, interwoven feedback loops, masked classes, clusters or typologies, and hidden trends and patterns within the physical system. (Learn, recognize, identify, detect, AI, ML, BDA)
      2.3 Simulate: Explore and discover the implications and possible emergent behaviours of a complex web of interacting variables.
        2.3.1 Scenario: Find the implications, impacts or consequences of implementing pre-defined scenarios (akin to non-destructive tests). (What-if, evaluate, assess)
        2.3.2 Stress-Test: Identify the scenarios that may lead to failure or breakdown of the physical system (akin to destructive tests). (Test, inspect, investigate)
      2.4 Predict: Concerned with futures studies.
        2.4.1 Forecast: Predict the most likely state of a real system in the future by projecting the known current trends forward over a specified time horizon. (Foresee)
        2.4.2 Back-cast: Question or prove, in a prospective manner, how the physical system is operating towards achieving the pre-set aims and goals. (Manage, confirm)
      2.5 Qualitize: Enhance and improve the quality of the outcomes or deliverables produced by an intervention in the real world.
        2.5.1 Verify: Verify conformance and compliance of the physical system with standards, specifications and best practice. (Validate, check, comply, conform)
        2.5.2 Improve: Inform the future updating, modifying or enhancing of current standards, bringing them into better coherence and harmony with actual operational and usage behaviours and patterns. (Update, upgrade, revise)

    03 Communicate: To exchange collected and analysed information amongst stakeholders. (Interact)
      3.1 Visualize: Form a realistic representation or model of the current or predicted physical system. (Review, visioning)
      3.2 Immerse: Involve interested stakeholders in real-like experiences using immersive technologies such as VR, AR and MR. (Involve)
      3.3 Document: Document and represent gathered and/or analysed data in a professional manner using technical language, forms or symbols. (Present)
      3.4 Transform: Modify, process or standardize information so it can be published and received by other DTs or other DT users (e.g. a National DT), or to overcome interoperability issues. (Translate, map)
      3.5 Engage: Involve citizens and large groups of people, including marginalized groups, in policy and decision-making processes. (Empower, include)

    04 Control: To leverage the collected and analysed information to intervene back into the real world to achieve a desirable state. (Implement, execute)
      4.1 Inform: Support human decision-making throughout the implementation of interventions in the real world. (Support, aid)
      4.2 Actuate: Use CPS and actuators to implement changes to the physical system. (Regulate, manipulate, direct, automate, self-govern)

    A standardised set of 'DT Uses' can help avoid miscommunication and confusion when sharing or publishing DT Use Case Scenarios and their content, explicitly explaining the 'know-how'. It can also support the procurement of DT services by ensuring the use of one common language across the supply chain and stakeholders.

    Al-Sehrawy R., Kumar B. (@Bimal Kumar) and Watson R. (2021). Digital Twin Uses Classification System for Urban Planning & Infrastructure Program Management. In: Dawood N., Rahimian F., Seyedzadeh S., Sheikhkhoshkar M. (eds) Enabling the Development and Implementation of Digital Twins. Proceedings of the 20th International Conference on Construction Applications of Virtual Reality. Teesside University Press, UK.
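The post suggests the taxonomy could underpin a machine-readable common language. As a rough sketch only (the structure and names are transcribed from the taxonomy table above; the encoding itself is my own illustration, not part of the cited paper), the top two levels might be captured as a nested mapping:

```python
# Hypothetical sketch: the four Included Uses and their Specialized Uses,
# transcribed from the DT uses taxonomy above into a machine-readable mapping.
DT_USES = {
    "Mirror": ["Capture", "Monitor", "Quantify", "Qualify"],
    "Analyse": ["Compute", "Mine", "Simulate", "Predict", "Qualitize"],
    "Communicate": ["Visualize", "Immerse", "Document", "Transform", "Engage"],
    "Control": ["Inform", "Actuate"],
}

def uses_in_scenario(included_uses):
    """Expand a list of Included Uses into their Specialized Uses."""
    return {use: DT_USES[use] for use in included_uses}

# e.g. a hypothetical 'optimize traffic' UCS that mirrors and analyses a network:
print(uses_in_scenario(["Mirror", "Analyse"]))
```

A structure like this is what would let two parties compare Use Case Scenarios term by term rather than prose by prose.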
  4. Henry Fenby-Taylor

    Digital Twin education

    I was looking at the landscape for Digital Twin education, so of course I went to Udemy, my go-to resource for learning anything. Does anyone remember Googlewhacks? The education landscape for Digital Twins feels a bit like that. https://www.udemy.com/course/digital-twin-a-comprehensive-overview/learn/lecture/14566716#overview https://www.udemy.com/course/digital-twins-philosophy-for-the-new-digital-revolution/ I wonder what people think should be out there?
  5. It is proposed that the Information Management Framework (IMF) for the creation of a National Digital Twin will consist of three technical elements: the Foundation Data Model (FDM), the Reference Data Library (RDL) and the Integration Architecture (IA). The IMF will underpin the creation of an environment which supports the use, management and integration of digital information across the life-cycle of assets. The IMF will also enable secure, resilient information sharing between organisations and will facilitate better decision-making across sectors. The National Digital Twin programme has initiated work investigating this approach with a thin slice of the IMF for the Construction Innovation Hub (CIH), to support the development of the CIH's Platform Ecosystem. This thin slice of the IMF is called the FDM Seed. The FDM describes basic concepts, such as space-time, which are applicable across all areas of our industry. In developing this, the FDM provides a way to explore relationships between these different areas. The FDM Seed is an inception of the above concept: start small and let the development grow, like a seed. The first step of the FDM Seed project is to survey the landscape: to investigate which ontologies and data models are already in use, what they can do and what their limitations are, and to assess which tools may be useful as a starting point for the FDM and the RDL. The starting point for the FDM is a top-level ontology, which contains the fundamental and generic types of things that exist and the fundamental relationships between them. The survey of Top-Level Ontologies (TLOs) uncovered a surprisingly high number of candidate TLOs, with 40 identified and reviewed, many more than we could have imagined. (Fig 1. General classification of the TLOs, taken from A Survey of Top-Level Ontologies.) The final survey of top-level ontologies is, we think, the first of its kind.

    We were looking for an ontology that was rigorous and simple, with sufficient explanatory detail to cover our scope of interest, which is very broad. The TLOs fall roughly into two groups, Foundational and Generic. The Foundational TLOs have rigorous, principled foundations and provide a basis for consistent development, and so would be suitable for the FDM. The Generic TLOs tend to be generalisations of lower levels rather than principled, and lack a principled basis for extension; they are therefore not suitable for the structure of the FDM, though they are likely to be of use for the FDM's generic lower levels. An RDL provides the classes and properties needed to describe the detail of an asset. The survey aimed to identify the most prominent industry data models and the best starting point for the IMF RDL. There are many different RDLs in use across sectors. For the purposes of the FDM Seed a limited analysis was carried out, but the list is open, and more candidates will be added for future consideration. Surveying and analysing the most commonly used RDLs will mean we are able to give guidance to organisations when mapping their existing RDLs to the NDT.

    Next steps: The survey papers have now been published. We encourage you to engage with the National Digital Twin programme to find out more about the approach, the results of the survey, and the assessments of the TLOs and industry data models and RDLs. You can find these resources under the 'Files' tab. The programme is now gathering its recommendations for the TLOs with which to start work on the FDM Seed thin slice. We anticipate basing the FDM development on one of the TLOs, bringing in elements from others, based on the survey and analysis.
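To make the FDM/RDL layering concrete, here is a minimal, purely illustrative sketch. All class names are invented for illustration and are not drawn from any surveyed TLO or RDL: the idea is only that a top-level ontology supplies the fundamental types, and an RDL supplies the domain classes that specialize them.

```python
# Illustrative only: a toy hierarchy showing how RDL domain classes might
# specialize the fundamental types supplied by a top-level ontology.
# All names here are hypothetical, not taken from any candidate TLO or RDL.

# Top-level ontology layer: fundamental, generic types.
class SpatioTemporalExtent:
    """Anything with an extent in space-time."""

class PhysicalObject(SpatioTemporalExtent):
    """A material thing occupying space-time."""

# Reference Data Library layer: domain-specific classes and properties.
class Pump(PhysicalObject):
    def __init__(self, asset_id: str, rated_flow_l_per_s: float):
        self.asset_id = asset_id
        self.rated_flow_l_per_s = rated_flow_l_per_s

# An asset described via the RDL automatically carries the TLO semantics,
# which is what makes integration across RDLs tractable.
p = Pump("P-001", 12.5)
print(isinstance(p, SpatioTemporalExtent))
```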
  6. Version 1.0.0


    Make Network Data Valuable, For the Entire Lifecycle. Expanding permutations of flow and instrument readings should never hinder a utility’s ability to deliver safe, reliable, and affordable water. Every day, you need to make the right decisions with quantitative network information, but solving all those equations can feel like an insurmountable task. Download this e-book to learn how you can make accurate, real-time decisions in every phase of the infrastructure lifecycle—from master planning to capital planning and design, to operations and maintenance.
  7. I've decided to stop dithering and try to set out a concise, to-the-point Digital Twin Policy for HE. Here's what I have in terms of outline contents, with some help from @Neil Thomspon. What am I missing? Topics to be covered:

    Definition: To agree a working definition of Digital Twin for Highways England, and provide some context on the National Digital Twin programme.
    Principles: To establish a consistent and generally accepted set of principles for the creation and use of Digital Twins by Highways England and associated supply chain projects, and align these to the Information Principles described in our Information Vision & Strategy.
    Architecture: To describe the common data and technology underpinnings of Digital Twin development within Highways England, including infrastructure, integration, and interfaces, aligned with the National Digital Twin programme's Information Management Framework.
    Capability: To highlight the skills we require as an organisation in order to be an informed client and custodian of Digital Twins.
    Ethics: To set guidelines around the ethical implications of using Digital Twins to manage the Strategic Road Network.
    Governance: To document how we will govern Digital Twins within Highways England as a collaborative body of practice, as well as how we will quantify and capture the benefits of investment in Digital Twins.
    Federation: Collectively, the definition, principles, architecture, ethics, and governance should allow different parts of Highways England to conduct Digital Twin development whilst minimising the risk of inconsistent, redundant, unaligned, or unethical development.
  8.

    "Recommendation 5 Visibility of infrastructure and asset: digital system map The Taskforce recommends the development of a Digital System Map that will help unlock the opportunities of a Modern Digitalised Energy System. This recommendation builds on those put forward by the Centre for Digital Built Britain, Digital Framework Task Group and supported by the National Infrastructure Commission which recommend that work begins on a digital system map of GB network infrastructure with the overall goal of developing a full digital twin of Energy System infrastructure. Additional detail is included in Appendix 4." Source: https://es.catapult.org.uk/reports/energy-data-taskforce-report/ Energy Data Taskforce: A Strategy for a Modern Digitalised Energy System Published: 12 June 2019 The Energy Data Taskforce, commissioned by Government, Ofgem, and Innovate UK, has set out five key recommendations that will modernise the UK energy system and drive it towards a net zero carbon future through an integrated data and digital strategy throughout the sector. The recommendations highlight that to move towards a ‘Modern, Digitalised Energy System’ is being hindered by often poor quality, inaccurate, or missing data, while valuable data is often restricted or hard to find. The Taskforce run by Energy Systems Catapult and chaired by Laura Sandys, has delivered a strategy centred around two key principles – filling in the data gaps through requiring new and better-quality data, and maximising its value by embedding the presumption that data is open. These two principles will start to unlock the opportunities of a modern, decarbonised and decentralised Energy System for the benefit of consumers. 
Key findings The Energy Data Taskforce identified that a staged approach needed to be taken to achieve a Modern, Digitalised Energy System in order to fill the data gaps and maximise data value: Data Visibility: Understanding the data that exists, the data that is missing, which datasets are important, and making it easier to access and understand data. Infrastructure and Asset Visibility: Revealing system assets and infrastructure, where they are located and their capabilities, to inform system planning and management. Operational Optimisation: Enabling operational data to be layered across the assets to support system optimisation and facilitating multiple actors to participate at all levels across the system. Open Markets: Achieving much better price discovery, through unlocking new markets, informed by time, location and service value data. Agile Regulation: Enabling regulators to adopt a much more agile and risk reflective approach to regulation of the sector, by giving them access to more and better data. Recommendations Based on those findings, the Taskforce developed five recommendations for Government, Ofgem, and Innovate UK: Recommendation 1: Digitalisation of the Energy System – Government and Ofgem should direct the sector to adopt the principle of Digitalisation of the Energy System in the consumers’ interest, using their range of existing legislative and regulatory measures as appropriate, in line with the supporting principles of ‘New Data Needs’ ‘Continuous Improvement’ and ‘Digitalisation Strategies’. Recommendation 2: Maximising the Value of Data – Government and Ofgem should direct the sector to adopt the principle that Energy System Data should be Presumed Open, using their range of existing legislative and regulatory measures as appropriate, supported by requirements that data is ‘Discoverable, Searchable, Understandable’, with common ‘Structures, Interfaces and Standards’ and is ‘Secure and Resilient’. 
Recommendation 3: Visibility of Data – A Data Catalogue should be established to provide visibility through standardised metadata of Energy System Datasets across Government, the regulator and industry. Government and Ofgem should mandate industry participation though regulatory and policy frameworks. Recommendation 4: Coordination of Asset Registration – An Asset Registration Strategy should be established to coordinate registration of energy assets, simplifying the experience for consumers through a user-friendly interface in order to increase registration compliance, improve the reliability of data and improve the efficiency of data collection. Recommendation 5: Visibility of Infrastructure and Assets – A unified Digital System Map of the Energy System should be established to increase visibility of the Energy System infrastructure and assets, enable optimisation of investment and inform the creation of new markets. Appendicies EDTF Report Appendix 1 – Recommendation Actions EDTF Report Appendix 2 – Data Catalogue EDTF Report Appendix 3 – Asset Registration EDTF Report Appendix 4 – Digital System Map EDTF Report Appendix 5 – Data for Multi-SO EDTF Report Appendix 6 – Standards EDTF Report Appendix 7 – Glossary
  9. Hello, I'm looking to subcontract a company with expertise in designing and developing my digital twin smart cities and energy management software application.
  10. Join us for the next video in our series on Tuesday. Kevin Reeves and the CDBB team will host a live chat session at 10.30. Bring your questions.
  11. Mark Coates

    Demonstrator project

    One for the calendar: we recently did a demonstrator project with Microsoft's Azure Digital Twins team for the Build conference, which will be featured as a Deep Dive on Microsoft's Channel 9 IoT Show on August 3rd. https://channel9.msdn.com/Shows/Internet-of-Things-Show/Deep-Dive-Integrating-3D-Models-and-IoT-data-with-iTwin-and-Azure-Digital-Twins In this session we demonstrate an application that combines 3D models, 2D maps and reality mesh into a single environment for visualisation. Within that environment we demonstrate a live, real-time, seamless visualisation of IoT data streams. Next we walk through the architecture that enables the application: we show how we have mapped the IoT data to the assets within the digital twin, and how to keep the digital twin in step with engineering changes by automating the generation of the Digital Twins Definition Language (DTDL) JSON. Bentley's iTwin platform and iModel.js open-source programming library provide powerful capabilities for aggregating 3D, 2D, reality data and other sources, and for linking them with IoT data, enabling "single pane of glass" visualisation, analytics and simulation so users can make more effective decisions in a timely manner. By integrating the iTwin platform with Azure Digital Twins and other Azure IoT services, Bentley and Microsoft are making it easier for developers, integrators and customers to build digital twins of their infrastructure assets. It would be great to get feedback from as many of you as possible once the show has aired.
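The post mentions automating the generation of DTDL JSON from engineering data. As a hedged sketch (the asset model and `dtmi` identifier below are invented for illustration; this is not Bentley's actual pipeline), a minimal DTDL v2 interface with one Telemetry entry per sensor stream could be emitted like this:

```python
import json

def make_dtdl_interface(model_id: str, display_name: str, telemetry: dict) -> str:
    """Emit a minimal DTDL v2 Interface as JSON.

    `telemetry` maps telemetry names to DTDL primitive schemas (e.g. "double").
    """
    interface = {
        "@context": "dtmi:dtdl:context;2",
        "@id": model_id,  # a DTMI, e.g. "dtmi:example:Pump;1" (hypothetical)
        "@type": "Interface",
        "displayName": display_name,
        "contents": [
            {"@type": "Telemetry", "name": name, "schema": schema}
            for name, schema in telemetry.items()
        ],
    }
    return json.dumps(interface, indent=2)

# Hypothetical asset: a pump with two sensor streams.
print(make_dtdl_interface("dtmi:example:Pump;1", "Pump",
                          {"flowRate": "double", "temperature": "double"}))
```

Generating the model document from the engineering source this way is what keeps the twin definition in step with design changes, as the session describes.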
  12. Version 1.0.0


    This guide has been produced by CDBB for members of the DT Hub community to provide a foundational knowledge set for all of our members. The purpose of this guide is not to be exhaustive but to document, at a high level, knowledge and examples of Digital Twin use cases that have been shared through the development of the DT Hub and engagement with our founding members. It is hoped that by sharing this knowledge, all members of the DT Hub will benefit from a common understanding of foundational concepts and the 'How, What and Why' of Digital Twins over a Built Environment asset lifecycle. We hope this guide stimulates your thinking, and we actively encourage you to bring these thoughts into the DT Hub for discussion with other members. The content of this guide will evolve based on the feedback provided by you and all of our other members.
  13. Version 1.0.0


    Title: Why the Industry Needs an Open Approach Author: Brian Robins, Vice President, Product and Industry Marketing, Bentley Systems Every infrastructure asset is documented in data, with everything you ever need to know about the asset. Most of the time, however, this data is scattered across various platforms and in various formats. It is not contextualized nor is it easily accessible. This article calls for the infrastructure industry to move toward an open environment, including the adoption of iTwins, or infrastructure digital twins. Unless data can be aligned and synchronized, it will remain “dark data,” and any digital twin will not have veracity or fidelity. The foundation for digital twins, therefore, must be an open, connected data environment. Infrastructure digital twins allow users to bring data together, contextualize it, validate it, and visualize it. The opportunities for connecting infrastructure digital twins are vast and diverse. But with the importance of digital twins rising in the infrastructure industry, the best way to create value from data is through an open source platform for digital twins.
  14. DRossiter87

    Digital Twin Talks

    To explore how digital twins are defined and the overarching concepts around them, the DT Hub hosted a five-part talk series (available here). These talks were introduced by Sam Chorlton, chair of the Digital Twin Hub, who highlighted that digital twins are not a new concept, but rather that the technologies are now at a point where they can have a meaningful impact. With the national digital twin (NDT) programme leveraging these now-matured technologies and principles, the talks were aimed at exploring how they could be utilized within the built environment. In each case, a video from the speaker was used to spark an online discussion involving a mix of stakeholders and experts from across the value chain. This first series of talks included:

    Olivier Thereaux (ODI), Towards a Web of Digital Twins;
    Brian Matthews (DAFNI), Meeting the Digital Twin Challenge;
    Tanguy Coenen (IMEC), Urban Digital Twins;
    Neil Thompson (Atkins), Twinfrastructure; and
    Simon Evans (Arup), Digital Roundtable.

    Towards a Web of Digital Twins

    Beginning the series, Olivier Thereaux from the Open Data Institute (ODI) considered the parallels between the world wide web and the need to connect digital twins to form a national digital twin. By first citing the Gemini Principles and establishing what a digital twin is, Olivier articulated the rationale for their adoption, explaining the concept of a digital twin as an approach to collecting data to inform decision-making within an interactive cycle. Olivier provided further detail about the need to both share and receive data from external datasets (e.g. weather data) and other related digital twins. To enable this exchange, he proposed the need for data infrastructure such as standards and common taxonomies. As these connections develop, Olivier foresees the emergence of a "network of twins" that regularly exchange data. By scaling these networks, a national digital twin could be achieved.
    Responding to Olivier's talk, DT Hub members and guests asked a wide range of questions, including on the adherence of technologies to standards, with Olivier confirming the existence of suitable standards and referring to the work done by W3C and others. In addition, questions were posed around connecting twins across sectors and the need to ensure trust in data. The full Q&A discussion transcript can be found here. Olivier has also kindly produced an article on the topic of his talk, which can be found here.

    Meeting the Digital Twin Challenge

    Following Olivier, Dr. Brian Matthews from DAFNI presented on the DAFNI platform and the challenges related to developing an ecosystem of connected digital twins. Citing Sir John Armitt and Data for the Public Good, Brian emphasized how data is now considered as important as concrete or steel in regard to UK national infrastructure. Building on the digital twin definition given by Olivier, Brian proposed two types of digital twin:

    Reactive: a dynamic model with input from live (near real-time) data; and
    Predictive: a static model with input from corporate systems.

    Linking to the Gemini Principles, Brian acknowledged that a single digital twin is impossible; an ecosystem is required to achieve a national digital twin. Delving deeper, Brian looked at some of the associated technical challenges related to scaling and integration. He also talked about how the DAFNI platform can meet these challenges, by enabling connections between data and models, in support of the NDT programme. Responding to Brian's talk, participants asked whether "historic" could be considered an additional digital twin type, with Brian confirming that historical data are considered within the proposed types. A lot of the discussion focused on data and datasets, including the data exchanged by DAFNI, with Brian confirming the use of the standardized DCAT metadata vocabulary for datasets, which DAFNI is planning to publish.
    There were also questions to contextualize DAFNI within the NDT programme. The full Q&A discussion transcript can be found here.

    Urban Digital Twins

    Following Brian, Tanguy Coenen from IMEC presented on IMEC's built environment digital twin (BuDi) as well as the idea of a city-scale digital twin. Explaining BuDi's role as a decision-making tool informed by near real-time data via sensors and IoT devices, Tanguy articulated how BuDi can support several use cases. In addition, Tanguy considered digital twin use case types in terms of:

    Yesterday: historical;
    Today: real-time; and
    Tomorrow: predictive.

    Considering current smart cities as a set of silos, Tanguy expressed a desire for interoperability and data connectivity between these disparate datasets, to form an urban digital twin that can support both public and private asset collections. Responding to Tanguy's talk, questions were asked about terminology and the relationship to the ISO smart cities initiatives, as well as the importance of standards around open data. Tanguy confirmed IMEC's desire to support and align with these efforts. When asked about high-value use cases, Tanguy referred to people flow, air quality and flooding as key urban-scale use cases. The full Q&A discussion transcript can be found here.

    Twinfrastructure

    Continuing the series, Neil Thompson from Atkins introduced the Commons workstream and the Glossary, a key mechanism for enabling a common language to support the NDT programme. Neil described the Commons' mission to build capability through an evidence-based approach, and drew several parallels between the Commons and the creation of the internet, including the use of open and agile methodologies. As thinking develops, Neil sees the Commons as the location for discussion and consensus gathering, supporting formal standardization once consensus has been achieved.
Responding to Neil’s talk, questions were asked about where a similar approach to consensus building had taken place, with Neil referring to examples such as GitHub and Stack Overflow. Questions were also asked about the Glossary’s relationship to existing resources, with Neil referring to its ability to record whether an entry is a “shared” term. The full Q&A discussion transcript can be found here. In addition, the Glossary that Neil referred to can be found here.

Digital Roundtable

Finally, to conclude the digital twin talk series, Simon Evans from Arup moderated a round table discussion between the previous speakers. Brian, Tanguy, Neil and Simon provided their reflections and insights and answered questions from the audience. The round table dealt with a wide array of topics, such as:

What makes digital twins different for the built environment compared to other sectors? The roundtable agreed that the aspects that constitute a digital twin have long been present in the built environment, but that the use of the term reflects an evolution of thinking: the need for data connectivity, an outcome focus, and data-driven decision making.

How will the NDT programme address security and interoperability challenges? The roundtable referred to the Information Management Framework Pathway and a future pathway related to security and security-mindedness.

How might a digital twin support social distancing? The roundtable gave examples of using hydrodynamic modelling and occupant monitoring via camera data to monitor and support social distancing policies.

The videos of each of the talks, as well as the round table discussion, can be found here. And there we have it. This series of digital twin talks was developed to explore how digital twins are defined and the overarching concepts around them. Thank you for contributing to the discussions. Your level of engagement and willingness to share are what have made these talks a success.
Please let me know what topics you would like future digital twin talks to address, whether you have any suggestions on how to improve these talks, or who you would like to hear a talk from in the future.
  15. Samuel A Chorlton

    Digital Twin Online Resources

Whilst based at home, many of us may have more time available for research supporting either personal interests or work activities. I would like to encourage sharing of these resources as much as possible amongst the members, to try and grow our collective knowledge base. Please feel free to comment on the shared resources, as critical insights will prove to be an invaluable asset going forward. Here are some starting resources:

https://theodi.org/article/digital-twins-user-research/ - Interesting views on the conceptualisation of digital twins.
https://www.cdbb.cam.ac.uk/blog - Collection of blogs published by the Centre for Digital Built Britain providing many views and perspectives on Digital Twins.
https://www.theiet.org/media/4719/digital-twins-for-the-built-environment.pdf - Publication from the IET on the opportunities, benefits, challenges and risks for Digital Twins in the Built Environment.
As everyone who works within the built environment sector knows, the essential starting point for any successful construction project is the establishment of a solid foundation. With that in mind, the Digital Twin Hub is thrilled to announce the publication of its first ever digital twin foundation guide: Digital Twins for the Built Environment.

The Purpose

The purpose of this guide is not to be exhaustive but to document, at a high level, knowledge and examples of Digital Twin use cases that have been shared through the development of the DT Hub and engagement with our early members. It is hoped that by sharing this knowledge all members of the DT Hub will benefit from a common understanding of foundational concepts and the ‘How, What and Why’ of Digital Twins, and that this shared knowledge will enable more meaningful discussions within the DT Hub.

The Structure

To provide a relatable structure, we have broken down the concepts into the different phases of the asset lifecycle. This should give a clearer sense of how Digital Twins can be applied to support real business problems, against tangible examples.

The Role of the Community

The creation of this guide has demonstrated that there is complexity in distilling foundational concepts. For this publication we have focused on what we hope will benefit the community. To maximise its value we must therefore develop, refine and iterate this guide in partnership with the members. We actively encourage the community to provide feedback, both positive and negative. More importantly, we hope that as part of this feedback process the community will suggest potential alterations or amendments to continue increasing the value offering of the document. DTHUb_NewbieGuide_May2020_(1).pdf
  17. DRossiter87

    45 minute cities

Hi all, Alasdair Rae of the University of Sheffield has published a blog piece about travel within 26 cities, using open data to visualize how far you can travel to reach a central train station by 08:45 on a Monday morning. http://www.statsmapsnpix.com/2020/03/45-minute-cities.html The ability to produce this visualization seems very relevant to the NDT programme, particularly if you could simulate traffic interventions to see how they improve access across the city. Cardiff has fared quite well. How has your city performed?
A lot of the early thinking on digital twins has been led by manufacturers. So, what do digital twins mean to them, and what insights could this provide for the built environment? This blog is the second in a series that looks at what we can learn from the development of digital twins in other sectors. It draws on key findings from a report by the High Value Manufacturing Catapult. This includes industry perspectives on:

The definition of digital twins
Key components of digital twins
Types of twin and related high-level applications and value

The report “Feasibility of an immersive digital twin: The definition of a digital twin and discussions around the benefit of immersion” looks partly at the potential for the use of immersive environments. But, in the main, it asks a range of questions about digital twins that should be of interest to this community. The findings in the report were based on an industry workshop and an online survey with around 150 respondents. We’ve already seen that there are many views on what does or does not constitute a digital twin. Several options were given in the survey, and the most popular definition, resonating with 90% of respondents, was:

A virtual replica of the physical asset which can be used to monitor and evaluate its performance

When it comes to key components of digital twins, the report suggests that these should include:

A model of the physical object or system, which provides context
Connectivity between digital and physical assets, which transmits data in at least one direction
The ability to monitor the physical system in real time.

By contrast, in the built environment, digital twins may not always need to be “real-time”. However, looking at the overall document, the position appears to be more nuanced and dependent on the type of application; in which case, “real-time” could be interpreted as “right-time” or “timely”. In addition, analytics, control and simulation are seen as optional or value-added components.
Interestingly, 3D representations are seen by many as “nice to have” – though this will vary according to the type of application. In a similar fashion to some of our discussions with DT Hub members, the report looks at several types of digital twin (it is difficult to think of all twins as being the same!). The types relate to the level of interactivity, control and prediction:

Supervisory or observational twins that have a monitoring role, receiving and analysing data, but that may not have direct feedback to the physical asset or system
Interactive digital twins that provide a degree of control over the physical things themselves
Predictive digital twins that use simulations along with data from the physical objects or systems, as well as wider contextual data, to predict performance and optimise operations (e.g. to increase output from a wind farm by optimising the pitch of the blades).

These types of twin represent increasing levels of richness or complexity: interactive twins include all the elements of supervisory twins, and predictive twins incorporate the capabilities of all three types. Not surprisingly, the range of feasible applications relates to the type of twin. Supervisory twins can be used to monitor processes and inform non-automated decisions. Interactive twins enable control, which can be remote from the shop floor or facility. Predictive twins, meanwhile, support predictive maintenance approaches, and can help reduce downtime and improve productivity. More sophisticated twins – potentially combining data across twins – can provide insight into the rapid introduction (and, I could imagine, customisation) of products or supply chains. Another way of looking at this is to think about which existing processes or business systems could be replaced or complemented by digital twins.
This has also come up in some of our discussions with DT Hub members and other built environment stakeholders – in the sense that investments in digital twins should either improve a specific business process/system or mean that it is no longer needed (otherwise DT investments could just mean extra costs). From the survey:

Over 80% of respondents felt that digital twins could complement or replace systems for monitoring or prediction (either simple models or discrete event simulation)
Around two-thirds felt the same for aspects related to analysis and control (trend analysis, remote interaction and prescriptive maintenance), with over half seeing a similar opportunity for next-generation product design
Remote monitoring and quality were seen as the areas with greatest potential value.

Cost reduction in operations and New Product Development (NPD) also features as an area of value generation, as do costs related to warranty and servicing. The latter reflects increasing servitisation in manufacturing. This could also become more important in the built environment, with growing interest in gain-share type arrangements through asset lifecycles, as well as increasing use of components that have been manufactured off-site. It would be great if you would like to share your views on any of the points raised above. For example, do you think built environment twins need the same or different components to those described above? And can digital twins for applications like remote monitoring and quality management also deliver significant value in the built environment?
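The three twin types described in the report can be read as an incremental capability hierarchy, where each level inherits the capabilities of the one below. A minimal sketch follows; the class and method names are our own illustrations, not terms from the report, and the prediction logic is a trivial stand-in for a real simulation:

```python
class SupervisoryTwin:
    """Monitoring role: receives and analyses data, no feedback to the asset."""

    def __init__(self, asset_id: str):
        self.asset_id = asset_id
        self.readings: list[float] = []

    def ingest(self, value: float) -> None:
        self.readings.append(value)

    def latest(self) -> float:
        return self.readings[-1]


class InteractiveTwin(SupervisoryTwin):
    """Adds a degree of control over the physical thing itself."""

    def send_command(self, command: str) -> str:
        # In a real system this would dispatch to the asset's controller.
        return f"{self.asset_id} <- {command}"


class PredictiveTwin(InteractiveTwin):
    """Adds simulation/prediction on top of monitoring and control."""

    def predict_next(self) -> float:
        # Trivial stand-in for simulation: extrapolate the last trend.
        if len(self.readings) < 2:
            return self.latest()
        return self.readings[-1] + (self.readings[-1] - self.readings[-2])


# A predictive twin also supervises and controls, mirroring the report's
# claim that each type incorporates the capabilities of the previous one.
twin = PredictiveTwin("wind-farm-01")
twin.ingest(1.0)
twin.ingest(1.5)
print(twin.predict_next())  # extrapolates the trend: 2.0
```

The inheritance chain is the point: choosing a twin type is choosing how far up this capability ladder a given use case needs to go.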
Digital technologies are no longer considered tools to satisfy a need for efficiency; they are active agents in value creation and new value propositions [1]. The term “digital twin” has entered the regular vocabulary across a myriad of sectors. It’s consistently used as an example of industry revolution and is considered fundamental to transformation, but the broad scope of the concept makes a common definition difficult. Yet it’s only once we understand and demystify the idea - and can see a path to making it reality - that we will start to realise the benefits. Heavy promotion by technology and service providers has inflated expectations, with most focusing on what a digital twin can potentially achieve when fully implemented - which is like being sold a unicorn, and currently cost-prohibitive. Few refer to the milestones along the journey, or to incremental value-proving developments. This is evidenced, in part, by the fact that only 5% of enterprises have started implementing digital twins, and less than 1% of assets have one [2]. Over the course of three blogs, I will attempt to demystify the concept and break through the platitudes, answering the fundamental questions:

What is a digital twin?
What type of new skills and capabilities are required?
Will a digital twin generate value? And will it support better decision making?

“Digital” in context

Digital twins are symptomatic of the broader trend toward digitalisation, which is having a profound effect on businesses and society. Widely cited as the “fourth industrial revolution” [3] or Industry 4.0 (broadly following steam power (c1760-c1840), electricity (c1870-c1914) and microchips (c1970)), it is characterised by a fusion of technologies that blur the lines between the physical, digital, and biological spheres – such as artificial intelligence, robotics, autonomous vehicles and the Internet of Things (IoT).
Though the exact dates of the earlier revolutions are disputed, they unfolded far more slowly than today’s disruption - and still they saw companies and individuals that were slow or reluctant to embrace change being left behind. The digital revolution is unique, deriving in part from a new ability to massively improve quality and productivity by converging technologies and sources of data within a collaborative framework, which inherently challenges the business and organisational models of the past. Not only this, but the online connection of all assets together (the Internet of Things) is the key enabler of the next phase of industrial development. The complexity of assets, and the cost of developing and operating them, makes any promise of efficiency gains and improved performance immensely attractive. However, the reality of digital transformation has too often fallen short of these rewards. The failure comes from a rush to introduce digital technologies, products, and services without understanding the work processes in which they will be used, or the associated behaviours and joined-up thinking required to make them effective. While individual products and services have their place, significant gains in efficiency and productivity will only come by weaving a constellation of technologies together and connecting them with data sources, followed by supporting management and application of that data through project, asset and organisational developments. With data apparently the “new oil”, or maybe the “new asbestos”, and against a backdrop of digital transformation being viewed by many sceptics as a fashionable buzzword, how can industry start tangibly executing and harnessing the benefits of the digital twin concept?
Digital twin basics

Fundamentally, a digital twin is just a digital representation (model) of a physical thing - its ‘twin’ - and therein lies the complexity of this industry-agnostic concept. Other commonly used terms, such as Building Information Modelling (BIM), Building Lifecycle Management (BLM) and Product Lifecycle Management (PLM), represent similar concepts with some important distinctions; all are part of the same theme of data generation and information management. The term “digital twin” first appeared in 2010, developing from the conceptual evolution of PLM in 2002 [4]. Since then, its meaning has evolved from simply defining a PLM tool into an integral digital business decision assistant and an agent for new value and service creation [5]. Over time many have attempted to define the digital twin, but often these definitions focus on just a small part of the asset lifecycle, such as operations. A digital twin can range from a simple 2D or 3D model of a local component, with a basic level of detail, all the way to a fully integrated and highly accurate model of an asset, an entire facility, or even a country [6], with each component dynamically linked to engineering, construction, and operational data. There is no single solution or platform used to provide a digital twin, just as there isn’t one CAD package used to create a drawing or 3D model. It’s a process and methodology, not a technology; a concept of leveraging experience-based wisdom by managing and manipulating a multitude of datasets. While a fully developed digital model of a facility remains an objective, practically speaking, we are currently delivering only the “low-hanging fruit” pieces of this concept for most facilities.
These fractional elements, however, all point towards a common goal: each contributes a value-added piece that is consistent with the overall concept of the digital twin. As technology and techniques improve, we predict the convergence of the individual parts and the emergence of much more complete digital twins for industrial-scale facilities, and ultimately entire countries. The ultimate aim is to create a “single version of truth” for an asset, where all data can be accessed and viewed throughout the design-build-operate lifecycle. This is distinctly different from a “single source of truth”, as a digital twin is about using a constellation, or ecosystem, of technologies that work and connect together. The digital twin promises more effective asset design, project execution, and facility operations by dynamically integrating data and information throughout the asset lifecycle to achieve short- and long-term efficiency and productivity gains. As such, there is an intrinsic link between the digital twin and all the ‘technologies’ of the fourth industrial revolution, principally IoT, artificial intelligence and machine learning. As sensors further connect our physical world, monitoring state and condition, the digital twin can be considered the point of convergence of the internet-era technologies, made possible by their maturity: for example, the falling costs of storage, sensors and data capture, and the abundance of processing power and connectivity. The digital twin is a data resource that can be used to improve the design of a new facility, understand the condition of an existing asset, verify the as-built situation, run ‘what if’ simulations and scenarios, or provide a digital snapshot for future works.
This vastly reduces the potential for errors and discontinuity present in more traditional methods of information management. As asset owners pivot away from document silos and toward dynamic and integrated data systems, the digital twin should become an embedded part of the enterprise. Like the financial or HR systems that we expect to be dynamic and accurate, the digital twin should represent a living as-built representation of the operating asset, standing ready at all times to deliver value to the business. Each digital twin fits into the organisation’s overall digital ecosystem like a jigsaw piece, alongside potentially many other digital twins for different assets or systems. These can be ‘federated’, or connected via securely shared data - making interoperability and data governance key. In simple terms, this overall digital ecosystem consists of all the organisational and operational systems, providing a so-called ‘digital thread’.

Author: Simon Evans. Digital Energy Leader, Arup. Delivery Team Lead, National Digital Twin Programme

[1] Herterich, M. M., Eck, A., and Uebernickel, F. (2016). Exploring how digitized products enable industrial service innovation. 24th European Conference on Information Systems; 1–17.
[2] Gartner, Hype Cycle for O&G
[3] https://www.weforum.org/agenda/2016/01/digital-disruption-has-only-just-begun/
[4] Digital Twin: Manufacturing Excellence through Virtual Factory Replication. White Paper, pages 1–7
[5] Service business model innovation: the digital twin technology
[6] Centre for Digital Built Britain, The Gemini Principles
  20. If you weren't allowed to use the terms "digital" and "twin" how would you describe what a digital twin is (in less than 20 words) to someone not in the know?
  21. andy cooney

    Digital Twin and IoT

Sort of 2 questions:

1. What piece of IoT kit would make your Digital Twin more effective?
2. What IoT have you found that transforms your knowledge of your business (i.e. helps you build a better DT)?

So what IoT would you like, and what have you found/used?
Could Digital Twins support the conservation of historic assets? The team behind the restoration of Notre-Dame seem to think so: https://news.cnrs.fr/articles/a-digital-twin-for-notre-dame
  23. Guest

    Digital Twin: Terms

    There are several historic terms that relate to the overall concept of a digital twin, including terms like "information model" and "cyber-physical systems". What others are you aware of?
  24. Nicholas

    Data for the Public Good

    Version 1.0.0


    Report from the National Infrastructure Commission which recommends the creation of a National Digital Twin