Showing results for tags 'Concept and principles'.
  1. You’re invited to a webinar on 2nd March to find out how collaboration through connected digital twins can help plan resilient cities and infrastructure. The National Digital Twin programme has developed the Climate Resilience Demonstrator (CReDo), a pioneering climate change adaptation digital twin project that provides a practical example of how connected data can improve climate adaptation and resilience across a system of systems. Watch the film Tomorrow Today, and try the interactive app, to see what CReDo has been working towards.

    The CReDo team will use synthetic data developed through the project to show how it is possible to better understand infrastructure interdependencies and increase resilience. Join the webinar to hear from the CReDo team about the work that has happened behind the scenes of developing a connected digital twin.

    CReDo is the result of a first-of-its-kind collaboration between Anglian Water, BT and UK Power Networks, in partnership with several academic institutions. The project has been funded by Connected Places Catapult (CPC) and the University of Cambridge, and technical development was led by CMCL and the Hartree Centre. This collaboration produced a demonstrator that looks at the impact of flooding on energy, water and telecoms networks. CReDo demonstrates how owners and operators of these networks can use secure, resilient information sharing across sector boundaries to adapt to and mitigate the effect of flooding on network performance and service delivery. It also provides an important template to build on and apply to other challenges, such as climate change mitigation and Net Zero.

    Hear from members of the CReDo team – including the asset owners, CPC, and the technical development team – about the demonstrator they have delivered and the lessons they learned. If you’re interested in using connected digital twins to forge the path to Net Zero, then this event is for you. Register for our end-of-project webinar on 2nd March, 10:30 – 12:00: https://www.eventbrite.co.uk/e/credo-collaborating-and-resilience-through-connected-digital-twins-tickets-228349628887
  2. Health Facilities Scotland (HFS), part of National Services Scotland (NSS), has developed this guide on behalf of NHS Scotland in conjunction with the Construction Innovation Hub (the Hub). This Digital Twin framework establishes a set of principles to ensure that digital technology and processes are considered at every stage of the built asset’s lifecycle, supporting our mission by driving the adoption of digital approaches that improve the delivery, resilience and performance of infrastructure. It is part of a series of navigators produced by the Hub to provide a framework for client organisations considering embedding digital twinning into an individual project. This interactive guide is aimed at Boards that are involved in producing business cases or procuring capital projects with Digital Twin considerations.

    To learn more: PowerPoint Presentation (cam.ac.uk); Digitising Scotland's Healthcare Estate | Centre for Digital Built Britain (cam.ac.uk)
  3. Hi all, Please have a look at this presentation and share your thoughts! https://digital-twins.kumu.io/describing-digital-twins
  4. Firstly, thank you to everyone who joined the concepts and principles standards workshop on the 11th of February. With 70+ attendees and a wealth of engagement, I feel that we managed to make some real progress in establishing the DT Hub community's views on the future BSI Flex standard. As mentioned during the workshop, the slide deck presented will be given to the technical author and advisory group for consideration, acting as a seed for further standardisation development. A copy of the slide deck, with the comments and changes incorporated, is available here. In this article, I wanted to highlight some of the key insights that came from the workshop, as well as provide you with the ability to keep track of this work as it progresses.

    Scope: Generally, the community appeared to be content with the draft scope, which had used the scope of ISO/DIS 23247-1 (Digital Twins for Manufacturing) as its basis. The comments received focused on the types of assets, systems and processes which should be highlighted. Of particular note was the desire to include information assets, natural assets, and safety and control systems, which have all now been included. There was also a strong desire to highlight the relationship to information and information management; a comment has been included to ask that this is done within the introduction as opposed to the scope.

    Concepts: After I had introduced a series of different conceptual diagrams, I was surprised to see a preference for the figure within ISO/DIS 23247-1. However, while this figure appeared to be preferred, several attendees pointed out the need to articulate the scalability of the built environment, with mention made of effective visuals previously used by @Mark Enzer to explain the relationship between components, systems, and systems of systems. In addition, previous comments around the need to highlight the natural environment as a distinct facet were echoed. This led to the introduction of another figure from Flourishing Systems, which highlights the built environment as economic infrastructure, social infrastructure and the natural environment.

    Principles: Having discussed the overall concept, we moved to the principles that should govern digital twins for the built environment. Using the original Gemini Principles as a basis, the community did not challenge the existing principles but did suggest a potential 10th: Provenance. Distinct from Quality, Provenance would focus on the origin of information, enabling someone to assess its potential accuracy or trustworthiness.

    Terminology: After discussing observable elements and references, we concluded with terminology. Using the existing terms in the DT Hub Glossary as a basis, the community suggested several additional terms, such as: interoperability, asset, system, process, system of systems, semantics and syntax. In addition, @Ramy, a Northumbria University PhD student, shared a figure and thoughts around Digital Twin uses and a “DT uses taxonomy”, which he has also published on the DT Hub, here. It is this sort of engagement that makes the DT Hub a community – thank you.

    As I mentioned, the outcomes of this workshop will be fed into the development of BSI’s Flex standard around digital twins for the built environment. And there we have it. Please feel free to keep the conversation going by commenting below or directly on the slide deck. Stay tuned on the DT Hub for more news, updates and ways to get involved in the development of BSI’s Flex standard.
  5. Hi IMF Community, you may find this workshop interesting: "4-Dimensionalism in Large Scale Data Sharing and Integration". Full details and registration can be found at: https://gateway.newton.ac.uk/event/tgmw80. The workshop will feature six presentations on state-of-the-art research from experts on 4-Dimensionalism in large scale data sharing and integration, followed by a chaired Presenters' Panel. Each presentation will cover aspects of 4-Dimensionalism, from the basics to Top Level Ontologies and Co-Constructional Ontology, with each answering the question posed by the previous presentation.
  6. As everyone who works within the built environment sector knows, the essential starting point for any successful construction project is the establishment of a solid foundation. With that in mind, the Digital Twin Hub is thrilled to announce the publication of its first ever digital twin foundation guide: Digital Twins for the Built Environment.

    The Purpose: The purpose of this guide is not to be exhaustive but to document, at a high level, knowledge and examples of Digital Twin use cases that have been shared through the development of the DT Hub and engagement with our early members. It is hoped that by sharing this knowledge, all members of the DT Hub will benefit from a common understanding of foundational concepts and the ‘How, What and Why’ of Digital Twins, and that this shared knowledge will enable more meaningful discussions within the DT Hub.

    The Structure: To provide a relatable structure, we have broken down the concepts into the different phases of the asset lifecycle. This should provide a greater sense of clarity about how Digital Twins can be applied to support real business problems, against tangible examples.

    The Role of the Community: The creation of this guide has demonstrated that there is complexity in distilling foundational concepts. For this publication we have focused on what we hope will benefit the community. To maximise the value, we must therefore develop, refine and iterate this guide in partnership with the members. We actively encourage the community to provide feedback, both positive and negative in nature. More importantly, we hope that as part of this feedback process the community will be able to suggest potential alterations or amendments to continue increasing the value the document offers.

    DTHUb_NewbieGuide_May2020_(1).pdf
  7. Dan Rossiter, BSI Sector Lead, Built Environment, will be hosting a follow-on webinar to explore what content and resources should be considered when drafting a formalised set of concepts and principles for digital twins in the built environment. The aim of the session is to provide an overview of what constitutes formal concepts and principles, as well as an opportunity to discuss what additional aspects the DT Hub community want included. The discussion around this has already begun on the DT Hub here!

    During this webinar we will discuss:
      • Purpose of a concepts and principles standard
      • Outline structure for consideration
      • Additional aspects to be included
      • Additional content/resources to cite/consider

    We would be delighted if you are able to join us and start collaborating with the wider DT Hub community. Please note that this session is for DT Hub members only, but please feel free to invite a colleague from your organization if you feel this would be particularly relevant to them or if you cannot attend yourself. Register to receive joining instructions.

    Read the discussion so far: Consolidating Concepts: Scope & Consolidating Concepts: Gemini Principles
  8. Casey Rutland

    Is it? Or is it not?

    For a few years now, parts of our sector, and indeed other sectors, have been researching, defining and promoting digital twins. If we observe anything, it’s that chatter (including within the DT Hub) has been rife with ‘what is/isn’t a digital twin...’. I’m no expert, and don’t yet claim to offer a resolution that clarifies the topic, but I do think a discussion hosted within the DT Hub would be of use. Such a discussion would provide greater clarity for those less involved in this definition process and yet vitally important to the delivery of whatever a digital twin of the future is destined to be.

    Let’s learn from BIM implementation. I wear many hats in my career and most of them are related to the implementation and ‘normalisation’ of BIM processes. As Vice Chair of the UK BIM Alliance and Chair of the UK & Ireland Chapter of buildingSMART International, I’m afforded a view of the sector from various different levels of stakeholders and the challenges they face in an ever-changing world as they prepare to digitalise. The silent majority are perhaps the key to unlocking the transformation to a digital sector, and it’s vital that the BIM message reaches them and connects in a meaningful way to each and every one of them. BIM in the UK has been ongoing for over a decade and my feeling is that there is at least another decade to go before we reach ‘business as usual’. It’s exactly the same for Digital Twins. All vocal parties involved here in the DT Hub seem keen to navigate more smoothly through the same sectoral challenges, and one of those, in a similar vein to BIM, is “is this a Digital Twin or not?”

    BIM in the UK has already been through the same sector engagement, and we can see similar issues appearing now, with the concept behind Digital Twins being taken over by technology providers rather than sector stakeholders and subsequently being marketed in that way. It’s by no means a UK-only challenge, with many global discussions observed. Hence, we’re rapidly on the way to Digital Twins being defined by technologies rather than their use and value to us as people. A human-centric approach to any digital transformation will almost always achieve greater adoption, and ultimately ‘success’, than one led purely by technology. Hence the CDBB National Digital Twin Programme envisages the built environment as a system of systems, comprising economic infrastructure, social infrastructure and the natural environment. The CDBB Gemini Principles neatly position Digital Twins in a way that forces one to consider the overall business need (the ‘why’) and all the potential societal benefits.

    Other DT Hub discussions have touched on the possibility of a Turing-type test. The original Turing test was created by Alan Turing to determine whether or not a machine was discernible from a human. Whilst the test is valuable for determining artificial intelligence, it is evaluated by humans and hence it is quite challenging to ensure all evaluators are equal. Perhaps a technology-driven test that provides both a score and a ‘time taken’, introducing a level of competition between creators of Digital Twin systems, might help adoption.

    So here’s the proposition: we hold a workshop (or two) to discuss and investigate the need for a test, the type of test, ‘what’ is being tested, what the thresholds might be, and anything else that’s relevant to the topic of ascertaining whether or not someone’s proposed Digital Twin is actually a Digital Twin. I have three questions to start the discussion here in this thread:

    1. Do you feel the need for a ‘test’ to determine whether or not a Digital Twin is a Digital Twin? Can we continue without a formal ‘test’, or should we actively seek to develop something absolute to filter out what we’ve been able to do for many years, focus on true Digital Twin solutions, and continue the search for the elusive Digital Twin unicorn?!

    2. If we do need a test, will a simple yes/no suffice? Or does a ‘score’ have more longevity? If you ever saw the HBO series Silicon Valley, you may be familiar with the Weissman Score, a fictional test and score for online file compression. It enabled the fictional tech companies to demonstrate the success of their software and algorithms by testing their performance for file compression. Would a similar test be suitable for our purposes, with a threshold for determining if a proposed Digital Twin is a Digital Twin, and would it then cater for future digital developments and emerging technologies?

    3. Finally, are you keen and able to join a virtual workshop?
  9. Neil Thompson

    Digital twins are an extension of the internet

    We have embarked on several industrial ages, and long before the arrival of the digital age there were the spinning jenny, coal mines, steam engines and telegraph poles, which created the momentum for the economy we have today. This platform of industrial progress has enabled our urbanisation, the travel between urban centres and, ultimately, the digital connection between them to support a global industrialised commerce system.

    The Internet of Computers. The Internet is a loose arrangement of connected but autonomous networks of devices. The devices act as hosts (or connected computers) for a specific purpose (a website host, for example) and communicate through a protocol that ties together the interconnected set of networks and devices. It is not only the backdrop of our new industrial age that makes the Internet fascinating; it was the culture that emerged from its creation. ‘Request for Comments’, created by junior team members of the ARPANET project, enabled a loose and counter-hierarchical method for building consensus and developing standards. That counter-culture was to have a profound impact on the culture of collaborators in internet-engineering circles. These collaborators maintained a meritocracy which was open and inclusive. Hacker culture was born from this and, ultimately, the first internet protocol, the Network Control Protocol. The founders of this interconnected network said: Open source and hacking were founding behaviours within the culture of early internet engineers. But the Internet was only the first step in our journey to today’s digital economy. You have to keep in mind that computing in the 1960s was exclusive to national governments, the military and businesses. However, the proliferation of the telephone provided a vision of the future for connected computing. In the 1970s, to meet the demand for connecting multiple computers together, Local Area Networks (LANs) were created. The demand for connectivity did not stop there: the Transmission Control Protocol and Internet Protocol (TCP/IP) opened LANs to connect to other LANs. In 1981 there were only 200 interconnected computers. Despite the vision of an interconnected community of people linked through purpose and interests, instead of proximity, in practice it was still a long way off.

    Internet of Business. What about the dot-com-Silicon-Valley fairytale of rags to riches? CERN, the European Organisation for Nuclear Research and home of the world’s largest particle physics laboratory, had a wicked problem to solve: how can CERN map the complex relationships between the various people, programs and systems? The answer was Enquire, a program that Tim Berners-Lee created to attempt to achieve that outcome at a local level. This effort eventually led to the creation of a World Wide Web of information. It was no longer about merely connecting computers to the Internet; it became our foundation for publishing the information on those computers to the world. Despite the creation of hypertext mark-up language (HTML) and facilitating connections through uniform resource locators (URLs) (the addresses we use to visit data on the web), there was little interest in the WWW. The initiative was twice shelved and worked on without any formal approval. Eventually, through creating a browser for this WWW, its benefits were realised. The first ‘thing’ connected was a toaster in 1990.

    In 1995 the state ownership of the Internet ended (until then, the fair use policy had restricted commercialisation), unleashing the commercial opportunity of the Internet. From connecting millions of computers to selling millions of products on eBay, the web rapidly went from a data sharing and discovery tool to a fully functioning marketplace. Investors marvelled at the most extensive initial public offering in history (1995) when Netscape (an internet browser), at just two years old, went public. Berners-Lee was vindicated: only a few years previously he had been begging uninterested students to develop web software. By July 1997, there were 19.6 million connected computers, and Amazon had over 1 million customers. No brief history of the web would be complete without a mention of Google. A play on the word googol, which denotes a massive number, Page and Brin set out to make the WWW discoverable. Yahoo! offered to buy Google for $3bn; Google rejected the offer and eventually generated a need for the Oxford English Dictionary to add the verb Google. The end of the ’90s saw the first dot-com bubble burst, with the NASDAQ peaking at 500% higher than it had been when Netscape offered its IPO five years earlier. The market contraction was significant; Alan Greenspan coined the phrase “irrational exuberance”, and it captured the economic problem well.

    The Internet of Media. While the commercial aspect of the Internet faltered, hacking and the open-source movement were still active. Wikipedia demonstrated the power of open and collaborative systems: it had 20,000 articles in 2002 and grew to 2.5 million by 2009; today it contains 28 billion words in 52 million articles in 309 languages. Web 2.0 was to take the plastic nature of digital information and extend the Internet into a platform for connecting people with rich media. The printing press, the compact disc, and the physical statements from your bank were unable to match the plasticity of the Internet. A simple example is Craigslist, a user-driven website that allows its users to buy and sell anything. It was started by Craig Newmark, who circulated e-mail newsletters among friends with notices of events in San Francisco. By utilising the Internet, it became a website with 20 billion page views a month! It did not stop there: in 1996 the song ‘Until It Sleeps’ by Metallica became the first track to be illegally copied from CD and made available on the Internet as an MP3. It paved the way for a generation to think that music and other digital creative output should be digital, easy to access and nearly free. 64% of teenagers in 2007 had created content to upload to the Internet. Solving the problem of compression, to enable media to be streamed over the Internet, redefined the entertainment industry and shaped today’s internet culture, which is now considered pop culture.

    The Internet of Things. There are 20 billion devices connected to the Internet today. In 2013 Hal Varian, Google’s Chief Economist, wrote: We have reached a moment where the website is almost obsolete, and our interface with the WWW is purely through streamed data through services (like Netflix and video games on Steam) or specific applications (like Facebook and TikTok on mobiles). It is clear from the rolling history of the Internet that there is still opportunity for its extensibility. The early founding students on ARPANET set the tone of a culture of openness and agility, leading to computers being connected to computers, networks to networks, and toasters to other things. That might sound like an obvious thing to say; however, I honestly believe we are still in the early stages of an internet that will converge vast networks of national infrastructure to the benefit of the citizen. We must preserve the playful and collaborative nature found in internet culture.

    Today, the Internet of the Built Environment. From connecting a toaster in 1990 to connecting our built environment, the Internet has been on a rapid journey, and that journey does not stop here. What next for the Internet? More than data and databases, more than information management, it will help us understand our built and natural environments in new and profound ways. The vision of the Internet enabling an interconnected community of people linked through purpose and interests, instead of proximity, is a reality today. The Flourishing Systems paper has developed today’s vision of the Internet. That flourishing, converging network of infrastructure systems is enabled by the National Digital Twin programme, and it draws some interesting parallels from the creation of the modern Internet. The Commons is a place where we create the protocols needed to connect economic and social infrastructure digitally. A fundamental founding principle of the Commons is setting the behaviour of collaborators. We aim to capture the essence of open source and collaborate openly with the members of the DT Hub. With that cultural underpinning, the Commons is also like a zipper: we have a foundation that makes the initial connection, and the slider (the Commons) moves to connect the following elements together. The foundation data model and the reference data libraries are like the TCP/IP and HTML frameworks: they form the protocols for connecting digital twins together and enable the built environment to communicate digitally (a loose illustration follows this post). This extension of the Internet is a platform for creativity and profound economic growth. Much like the Internet, whose founders did not predict its impact on creative industries and political power through empowering communities, we will not know the future impact of this technology, but it will be impactful. Lastly, it is our only chance to adapt our built environment to operate in harmony with our natural environment. The National Digital Twin Programme is standing at the beginning of a new wave of interconnectedness, and with open and inclusive collaboration, we will take the first step into a new future.
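    As a loose illustration of the zipper analogy, here is a toy sketch in Python of two twins whose local vocabularies are mapped onto a shared reference data library, giving them a common ‘protocol’ to exchange data through. All names and IDs are invented for illustration; this is not the NDT programme’s actual foundation data model:

      # Toy reference data library: shared concept IDs with definitions.
      REFERENCE_LIBRARY = {
          "RDL:0001": "electricity substation",
          "RDL:0002": "water pumping station",
      }

      # Each twin keeps its own local vocabulary, mapped once to shared IDs.
      POWER_TWIN_TERMS = {"substation": "RDL:0001"}
      WATER_TWIN_TERMS = {"pump_station": "RDL:0002", "grid_supply": "RDL:0001"}

      def shared_concepts(terms_a: dict, terms_b: dict) -> set:
          """Concepts both twins can exchange data about via the shared IDs."""
          return set(terms_a.values()) & set(terms_b.values())

      # The power twin's "substation" and the water twin's "grid_supply"
      # resolve to the same reference concept, so the two twins can be
      # "zipped" together at that point.
      print(shared_concepts(POWER_TWIN_TERMS, WATER_TWIN_TERMS))  # {'RDL:0001'}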
  10. To explore how digital twins are defined and the overarching concepts around them, the DT Hub hosted a five-part talk series (available here). These talks were introduced by Sam Chorlton, chair of the Digital Twin Hub, who highlighted the fact that digital twins are not a new concept, but rather that the technologies are now at a point where they can have a meaningful impact. With the national digital twin (NDT) programme leveraging these now-matured technologies and principles, the talks were aimed at exploring how they could be utilized within the built environment. In each case, a video from the speaker was used to spark an online discussion involving a mix of stakeholders and experts from across the value chain. This first series of talks included: Olivier Thereaux (ODI), Towards a web of digital twins; Brian Matthews (DAFNI), Meeting the Digital Twin Challenge; Tanguy Coenen (IMEC), Urban Digital Twins; Neil Thompson (Atkins), Twinfrastructure; and Simon Evans (Arup), Digital Roundtable.

    Towards a Web of Digital Twins. Beginning the series, Olivier Thereaux from the Open Data Institute (ODI) considered the parallels between the World Wide Web and the need to connect digital twins to form a national digital twin. By first citing the Gemini Principles and establishing what a digital twin is, Olivier articulated the rationale for their adoption by explaining the concept of a digital twin as an approach to collecting data to inform decision making within an interactive cycle. Olivier provided further detail about the need to both share and receive data from external datasets (e.g. weather data) and other related digital twins. To enable this exchange, he proposed the need for data infrastructure such as standards and common taxonomies. As these connections develop, Olivier foresees the development of a “network of twins” that regularly exchange data. By scaling these networks, a national digital twin could be achieved. Responding to Olivier’s talk, DT Hub members and guests asked a wide range of questions, including on the adherence of technologies to standards, with Olivier confirming the existence of suitable standards and referring to the work done by W3C and others. In addition, questions were posed around connecting twins that span sectors, and the need to ensure trust in data. The full Q&A discussion transcript can be found here. In addition, Olivier has kindly produced an article on the topic of his talk, which can be found here.

    Meeting the Digital Twin Challenge. Following Olivier, Dr. Brian Matthews from DAFNI presented on the DAFNI platform and the challenges related to developing an ecosystem of connected digital twins. Citing Sir John Armitt and Data for the Public Good, Brian emphasized how data is now considered as important as concrete or steel in regard to UK national infrastructure. Building on the digital twin definition given by Olivier, Brian proposed two types of digital twin: Reactive, a dynamic model with input from live (near real-time) data; and Predictive, a static model with input from corporate systems. Linking to the Gemini Principles, Brian acknowledged that a single digital twin is impossible; an ecosystem is required to achieve a national digital twin. Delving deeper, Brian looked at some of the associated technical challenges related to scaling and integration. He also talked about how the DAFNI platform can meet these challenges, by enabling connections between data and models, in support of the NDT programme. Responding to Brian’s talk, participants asked whether “historic” could be considered an additional digital twin type, with Brian confirming that historical twins are considered within the proposed types. A lot of the discussion focused on data and datasets, including the exchange data used by DAFNI, with Brian confirming the use of a standardized dataset-description vocabulary called DCAT, which DAFNI is planning to publish (an illustrative DCAT record follows this post). There were also questions to contextualize DAFNI within the NDT programme. The full Q&A discussion transcript can be found here.

    Urban Digital Twins. Following Brian, Tanguy Coenen from IMEC presented on IMEC’s built environment digital twin (BuDi) as well as the idea of a city-scale digital twin. Explaining BuDi’s role as a decision-making tool informed by near real-time data via sensors and IoT devices, Tanguy articulated how BuDi can support several use cases. In addition, Tanguy considered digital twin use case types in terms of: yesterday (historical), today (real-time) and tomorrow (predictive). Considering current smart cities as a set of silos, Tanguy expressed a desire for interoperability and data connectivity between these disparate datasets, to form an urban digital twin that can support both public and private asset collections. Responding to Tanguy’s talk, questions were asked about terminology and the relationship to the ISO smart cities initiatives, as well as the importance of standards around open data; Tanguy confirmed IMEC’s desire to support and align with these efforts. When asked about high-value use cases, Tanguy referred to people flow, air quality and flooding as key urban-scale use cases. The full Q&A discussion transcript can be found here.

    Twinfrastructure. Continuing the series, Neil Thompson from Atkins introduced the Commons workstream and the Glossary, a key mechanism to enable a common language to support the NDT programme. Neil described the Commons mission to build capability through an evidence-based approach, and drew several parallels between the Commons and the creation of the internet, including utilizing open and agile methodologies. As thinking develops, Neil sees the Commons as the location for discussion and consensus gathering, to support formal standardization once consensus has been achieved. Responding to Neil’s talk, questions were asked about where a similar approach to consensus building had taken place, with Neil referring to examples such as GitHub and Stack Overflow. Questions were also asked about the Glossary’s relationship to existing resources, with Neil referring to its ability to record whether an entry is a “shared” term. The full Q&A discussion transcript can be found here. In addition, the Glossary that Neil referred to can be found here.

    Digital Roundtable. Finally, to conclude the series, Simon Evans from Arup moderated a round table discussion between the previous speakers. Brian, Tanguy, Neil and Simon provided their reflections and insight and answered questions from the audience. The round table dealt with a wide array of topics, such as: What makes digital twins different for the built environment compared to other sectors? (The roundtable agreed that the aspects that constitute a digital twin have long been present in the built environment, but that the use of the term demonstrates an evolution of thinking: the need for data connectivity, outcome focus, and data-driven decision making.) How will the NDT programme address security and interoperability challenges? (The roundtable referred to the Information Management Framework Pathway and a future pathway related to security and security-mindedness.) How might a digital twin support social distancing? (The roundtable provided examples of using hydrodynamic modelling and occupant monitoring via camera data to monitor and support social distancing policies.) The videos of each of the talks, as well as the round table discussion, can be found here.

    And there we have it. This series of digital twin talks was developed to explore how digital twins are defined and the overarching concepts around them. Thank you for contributing to the discussions; your level of engagement and willingness to share are what have made these talks a success. Please let me know what topics you would like future digital twin talks to address, whether you have any suggestions on how to improve these talks, and who you would like to hear from in the future.
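    Since Brian mentioned DCAT, here is a minimal illustrative example. DCAT is the W3C Data Catalog Vocabulary for describing datasets; the sketch below builds a small DCAT record as JSON-LD using a plain Python dict. The dataset and URL are invented, and this is not DAFNI’s actual metadata profile:

      # A minimal DCAT (W3C Data Catalog Vocabulary) dataset record as
      # JSON-LD, built as a plain Python dict. The dataset itself is an
      # invented example, not a real DAFNI record.
      import json

      dataset = {
          "@context": {"dcat": "http://www.w3.org/ns/dcat#",
                       "dct": "http://purl.org/dc/terms/"},
          "@type": "dcat:Dataset",
          "dct:title": "River gauge levels (example)",
          "dct:description": "15-minute river level readings for one catchment.",
          "dcat:keyword": ["flooding", "digital twin", "sensor"],
          "dcat:distribution": {
              "@type": "dcat:Distribution",
              "dct:format": "text/csv",
              "dcat:accessURL": "https://example.org/data/river-levels.csv",
          },
      }

      print(json.dumps(dataset, indent=2))  # ready to publish to a catalogue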
  11. A lot of the early thinking on digital twins has been led by manufacturers. So, what do digital twins mean to them, and what insights could this provide for the built environment? This blog is the second in a series that looks at what we can learn from the development of digital twins in other sectors. It draws on key findings from a report by the High Value Manufacturing Catapult, including industry perspectives on:
      • The definition of digital twins
      • Key components of digital twins
      • Types of twin and related high-level applications and value

    The report, “Feasibility of an immersive digital twin: The definition of a digital twin and discussions around the benefit of immersion”, looks partly at the potential for the use of immersive environments. But, in the main, it asks a range of questions about digital twins that should be of interest to this community. The findings in the report were based on an industry workshop and an online survey with around 150 respondents. We’ve already seen that there are many views on what does or does not constitute a digital twin. Several options were given in the survey, and the most popular definition, resonating with 90% of respondents, was: a virtual replica of the physical asset which can be used to monitor and evaluate its performance.

    When it comes to key components of digital twins, the report suggests that these should include:
      • A model of the physical object or system, which provides context
      • Connectivity between digital and physical assets, which transmits data in at least one direction
      • The ability to monitor the physical system in real time

    By contrast, in the built environment, digital twins may not always need to be “real-time”. However, looking at the overall document, the position appears to be more nuanced and dependent on the type of application; in which case, “real-time” could be interpreted as “right-time” or “timely”. In addition, analytics, control and simulation are seen as optional or value-added components. Interestingly, 3D representations are seen by many as “nice to have” – though this will vary according to the type of application.

    In a similar fashion to some of our discussions with DT Hub members, the report looks at several types of digital twin (it is difficult to think of all twins as being the same!). The types relate to the level of interactivity, control and prediction (a rough sketch of these levels follows this post):
      • Supervisory or observational twins that have a monitoring role, receiving and analysing data, but that may not have direct feedback to the physical asset or system
      • Interactive digital twins that provide a degree of control over the physical things themselves
      • Predictive digital twins that use simulations along with data from the physical objects or systems, as well as wider contextual data, to predict performance and optimise operations (e.g. to increase output from a wind farm by optimising the pitch of the blades)

    These types of twin are presented as representing increasing levels of richness or complexity: interactive twins include all the elements of supervisory twins, and predictive twins incorporate the capabilities of all three types. Not surprisingly, the range of feasible applications relates to the type of twin. Supervisory twins can be used to monitor processes and inform non-automated decisions. Interactive twins enable control, which can be remote from the shop floor or facility. Predictive twins, meanwhile, support predictive maintenance approaches, and can help reduce downtime and improve productivity. More sophisticated twins – potentially combining data across twins – can provide insight into the rapid introduction (and, I could imagine, customisation) of products or supply chains.

    Another way of looking at this is to think about which existing processes or business systems could be replaced or complemented by digital twins. This has also come up in some of our discussions with DT Hub members and other built environment stakeholders – in the sense that investments in digital twins should either improve a specific business process/system or mean that it is no longer needed (otherwise DT investments could just mean extra costs). From the survey:
      • Over 80% of respondents felt that digital twins could complement or replace systems for monitoring or prediction (either simple models or discrete event simulation)
      • Around two-thirds felt the same for aspects related to analysis and control (trend analysis, remote interaction and prescriptive maintenance), with over half seeing a similar opportunity for next-generation product design
      • Remote monitoring and quality were seen as the areas with greatest potential value

    Cost reduction in operations and New Product Development (NPD) also feature as areas of value generation, as well as costs related to warranty and servicing. The latter reflects increasing servitisation in manufacturing. This could also become more important in the built environment, with growing interest in gain-share type arrangements through asset lifecycles, as well as increasing use of components that have been manufactured off-site. It would be great if you would like to share your views on any of the points raised above. For example, do you think built environment twins need the same or different components to those described above? And can digital twins for applications like remote monitoring and quality management also deliver significant value in the built environment?
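    As a rough sketch of the report’s three twin types as increasing capability levels, each including the capabilities of the one below, consider the following Python classes. The class and method names are my own illustration, not taken from the HVM Catapult report:

      class SupervisoryTwin:
          """Monitoring role: receives and analyses data, no feedback."""
          def ingest(self, reading: dict) -> None:
              self.latest = reading

          def report(self) -> dict:
              return getattr(self, "latest", {})

      class InteractiveTwin(SupervisoryTwin):
          """Adds a degree of control over the physical thing itself."""
          def command(self, setting: str, value: float) -> None:
              print(f"sending {setting}={value} to the physical asset")

      class PredictiveTwin(InteractiveTwin):
          """Adds simulation to predict performance and optimise operation."""
          def predict(self, horizon_hours: int) -> float:
              # Placeholder model: assume output holds at the last reading.
              return self.report().get("output_mw", 0.0) * horizon_hours

      farm = PredictiveTwin()                # a wind-farm twin, say
      farm.ingest({"output_mw": 3.2})        # supervisory capability
      farm.command("blade_pitch_deg", 4.5)   # interactive capability
      print(farm.predict(horizon_hours=24))  # 76.8 (MWh, crude estimate)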
  12. Digital technologies are no longer considered tools to satisfy a need for efficiency; they are active agents in value creation and new value propositions [1]. The term “digital twin” has entered the regular vocabulary across a myriad of sectors. It’s consistently used as an example of industry revolution and is considered fundamental to transformation, but the broad scope of the concept makes a common definition difficult. Yet it’s only once we understand and demystify the idea – and can see a path to making it reality – that we will start to realise the benefits. Heavy promotion by technology and service providers has inflated expectations, with most focusing on what a digital twin can potentially achieve when fully implemented – a unicorn that is currently cost-prohibitive. Few refer to the milestones along the journey, or to incremental value-proving developments. This is evidenced, in part, by the fact that only 5% of enterprises have started implementing digital twins, and less than 1% of assets have one [2]. Over the course of three blogs, I will attempt to demystify the concept and break through the platitudes, answering the fundamental questions: What is a digital twin? What type of new skills and capabilities are required? Will a digital twin generate value, and will it support better decision making?

    “Digital” in context. Digital twins are symptomatic of the broader trend toward digitalisation, which is having a profound effect on businesses and society. Widely cited as the “fourth industrial revolution” [3] or Industry 4.0 (broadly following steam power (c1760–c1840), electricity (c1870–c1914) and microchips (c1970)), it’s characterized by a fusion of technologies that blur the lines between the physical, digital, and biological spheres – such as artificial intelligence, robotics, autonomous vehicles and the Internet of Things (IoT). Though the exact dates of the earlier revolutions are disputed, their timeframes were slower than the rapid pace and scale of today’s disruption, and still they saw companies and individuals that were slow or reluctant to embrace change being left behind. The digital revolution is unique: it derives in part from a new ability to massively improve quality and productivity by converging technologies and sources of data within a collaborative framework, which inherently challenges the business and organisational models of the past. Not only this, but the online connection of all assets together (the Internet of Things) is the key enabler of the next phase of industrial development. The complexity of assets, and the cost of developing and operating them, makes any promise of efficiency gains and improved performance immensely attractive. However, the reality of digital transformation has too often fallen short of these rewards. The failure comes from a rush to introduce digital technologies, products, and services without understanding the work processes in which they will be used, or the associated behaviours and joined-up thinking required to make them effective. While individual products and services have their place, significant gains in efficiency and productivity will only come by weaving a constellation of technologies together and connecting them with data sources, followed by supporting management and application of that data through project, asset and organisational developments. With data apparently the “new oil”, or maybe the “new asbestos”, and against a backdrop of digital transformation being viewed by many sceptics as a fashionable buzzword, how can industry start tangibly executing and harnessing the benefits of the digital twin concept?

    Digital twin basics. Fundamentally, a digital twin is just a digital representation (model) of a physical thing – its ‘twin’ – and therein lies the complexity of this industry-agnostic concept. Other commonly used terms, such as Building Information Modelling (BIM), Building Lifecycle Management (BLM) and Product Lifecycle Management (PLM), represent similar concepts with some important distinctions; all are part of the same theme of data generation and information management. The term “digital twin” first appeared in 2010, developing from the conceptual evolution of PLM in 2002 [4]. Since then, its meaning has evolved from simply defining a PLM tool into an integral digital business decision assistant and an agent for new value and service creation [5]. Over time many have attempted to define the digital twin, but often these definitions focus on just a small part of the asset lifecycle, such as operations. A digital twin can range from a simple 2D or 3D model of a local component, with a basic level of detail, all the way to a fully integrated and highly accurate model of an asset, an entire facility, or even a country [6], with each component dynamically linked to engineering, construction, and operational data. There is no single solution or platform used to provide a digital twin, just as there isn’t one CAD package used to create a drawing or 3D model. It’s a process and methodology, not a technology: a concept of leveraging experience-based wisdom by managing and manipulating a multitude of datasets. While a fully developed digital model of a facility remains an objective, practically speaking we are delivering only the “low-hanging fruit” pieces of this concept for most facilities now. These fractional elements, however, all point towards a common goal: to contribute a value-added piece that is consistent with the overall concept of the digital twin. As technology and techniques improve, we predict the convergence of the individual parts and the emergence of much more complete digital twins for industrial-scale facilities, and ultimately entire countries.

    The ultimate aim is to create a “single version of truth” for an asset, where all data can be accessed and viewed throughout the design-build-operate lifecycle. This is distinctly different to a “single source of truth”, as a digital twin is about using a constellation, or ecosystem, of technologies that work and connect (a minimal sketch follows this post). The digital twin promises more effective asset design, project execution, and facility operations by dynamically integrating data and information throughout the asset lifecycle to achieve short- and long-term efficiency and productivity gains. As such, there is an intrinsic link between the digital twin and all the ‘technologies’ of the fourth industrial revolution, principally IoT, artificial intelligence and machine learning. As sensors further connect our physical world together, monitoring its state and condition, the digital twin can be considered the point of convergence of the internet-era technologies, made possible by their maturity: for example, the reducing costs of storage, sensors and data capture, and the abundance of processing power and connectivity. The digital twin is a data resource that can be used to improve the design of a new facility, understand the condition of an existing asset, verify the as-built situation, run ‘what if’ simulations and scenarios, or provide a digital snapshot for future works. This vastly reduces the potential for errors and discontinuity present in more traditional methods of information management. As asset owners pivot away from document silos and toward dynamic and integrated data systems, the digital twin should become an embedded part of the enterprise. Like the financial or HR systems that we expect to be dynamic and accurate, the digital twin should represent a living as-built representation of the operating asset, standing ready at all times to deliver value to the business. Each digital twin fits into the organisation’s overall digital ecosystem like a jigsaw piece, alongside potentially many other digital twins for different assets or systems. These can be ‘federated’, or connected via securely shared data – making interoperability and data governance key. In simple terms, this overall digital ecosystem consists of all the organisational and operational systems, providing a so-called ‘digital thread’.

    Author: Simon Evans, Digital Energy Leader, Arup; Delivery Team Lead, National Digital Twin Programme.

    [1] Herterich, M. M., Eck, A., and Uebernickel, F. (2016). Exploring how digitized products enable industrial service innovation. 24th European Conference on Information Systems; 1–17.
    [2] Gartner, Hype Cycle for O&G.
    [3] https://www.weforum.org/agenda/2016/01/digital-disruption-has-only-just-begun/
    [4] Digital Twin: Manufacturing Excellence through Virtual Factory Replication. White Paper, pp. 1–7.
    [5] Service business model innovation: the digital twin technology.
    [6] Centre for Digital Built Britain, The Gemini Principles.
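    As a minimal sketch of the “single version of truth” idea, assuming a simple reference-based design (the URIs and lifecycle phases are invented examples), a twin can hold references to lifecycle datasets rather than copies, so every consumer resolves the same current data:

      from dataclasses import dataclass, field

      @dataclass
      class DigitalTwin:
          asset_id: str
          sources: dict = field(default_factory=dict)  # phase -> dataset URI

          def link(self, phase: str, uri: str) -> None:
              # Storing a reference, not a copy: re-linking updates the
              # view for every consumer, keeping one version of truth.
              self.sources[phase] = uri

      pump = DigitalTwin("PUMP-0042")
      pump.link("engineering", "https://example.org/cad/pump-0042.step")
      pump.link("construction", "https://example.org/qa/pump-0042-asbuilt.json")
      pump.link("operations", "https://example.org/scada/pump-0042/stream")

      # Federation: twins connect by securely sharing such references,
      # which is why interoperability and data governance are key.
      print(pump.sources["operations"])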
  13. This blog was first produced following discussions with digital twin owners about the importance of learning more from other industries. It also relates to the first “theme” that we identified as a priority for the DT Hub, which looks at digital twin definitions and concepts. We hope you enjoy reading this piece and welcome your comments, as well as your thoughts on other topics where you would like to hear more from us.

    The idea of digital twins in space may seem like science fiction – or at least a long way removed from the day-to-day challenges of the built environment. But, in fact, the aerospace industry has been at the forefront of many of the technology innovations that have transformed other areas. Before Michael Grieves coined the term digital twin in 2002, NASA was using pairing technology to operate and repair remote systems in space. Digital twins in the aerospace sector have since gone way beyond simulations, driven by a need to accurately reflect the actual condition of spacecraft and equipment and to predict potential future issues. While the crew of Apollo 13 may have relied on a physical double as well as digital data, future space stations and trips beyond our atmosphere will be using digital twins to deliver the right kinds of insights, decision support and automation needed to achieve their missions.

    Despite the great distances and the technological advancement of space technologies, there are valuable parallels with industries back on earth. For example, digital twins of remote and autonomous vehicles (like the Mars Exploration Rover) could provide useful lessons for similar vehicles on earth, from robots in nuclear facilities and sub-sea environments through to delivery vehicles in a logistics centre or drones on a building site. More specifically, a 2012 paper co-authored by NASA provided several insights into the approach to digital twins in aerospace, including the following definition: a Digital Twin is an integrated multiphysics, multiscale, probabilistic simulation of an as-built vehicle or system that uses the best available physical models, sensor updates, fleet history, etc., to mirror the life of its corresponding flying twin.

    Digital twins could represent a significant shift away from a heuristic (i.e. past-experience-based) approach to one using sophisticated modelling combined with real-life data. This shift impacts design and build, certification and ongoing operation. The drivers behind this change include a need to withstand more extreme conditions, increased loads and extended service life. (Imagine a manned trip to Mars, or one of the new commercial space ventures that call for vehicles to be used again and again.) The paper also looked at some of the needs and priority areas for digital twins, including more accurate prediction of potential materials failures, as well as of the condition of other systems in space vehicles, by connecting multiple models with data from the physical twin. If digital twins can add value in the harshest environment imaginable, what applications could this have for the built environment? One example is the interesting parallel between assessing the risks of cracks and failures in long-life space vehicles and long-term structural monitoring of bridges and other infrastructure (a toy probabilistic sketch follows this post). The required level of fidelity (i.e. the level of detail and accuracy), as well as the extent to which real-time data is needed, may vary considerably – but many of the same principles could apply.

    More widely, the authors of the paper felt that the parallels and benefits from developing digital twins for aerospace could extend across manufacturing, infrastructure and nanotechnology. The ideas explored in the paper also go well beyond monitoring and towards automation. For complex space missions, vehicles may not be able to get external help and will need to be self-aware, with “real-time management of complex materials, structures and systems”. As the authors put it: “If various best-physics (i.e., the most accurate, physically realistic and robust) models can be integrated with one another and with on-board sensor suites, they will form a basis for certification of vehicles by simulation and for real-time, continuous, health management of those vehicles during their missions. They will form the foundation of a Digital Twin.” Such a digital twin could continuously forecast the health of vehicles and systems, predict system responses, and mitigate damage by activating self-healing mechanisms or recommending in-flight changes to the mission profile.

    While the context may be very different, our discussions with DT Hub members and others in the market suggest that built environment infrastructure owners and operators are aiming to achieve many of the same aspirations as NASA – from better prediction of potential issues through to actuation and self-healing. Which space twin applications and ideas do you think we could apply to the built environment? We would welcome your comments on this piece, as well as your thoughts on other topics where you would like to hear more from us.
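    To make the “probabilistic simulation” part of the NASA definition concrete, here is a toy Monte Carlo sketch of fatigue-crack growth using the Paris law (da/dN = C·ΔK^m). All parameter values are illustrative placeholders, not real material data for any vehicle or structure:

      # Toy Monte Carlo of fatigue-crack growth via the Paris law.
      # Units: crack length in metres, stress range in MPa,
      # C in (m/cycle)/(MPa*sqrt(m))^m. Values are illustrative only.
      import math
      import random

      def cycles_to_failure(a0, a_crit, C, m, dsigma, Y=1.0, da=1e-4):
          """Integrate da/dN = C * dK^m from a0 to a_crit; return cycles."""
          a, n = a0, 0.0
          while a < a_crit:
              dK = Y * dsigma * math.sqrt(math.pi * a)  # stress intensity range
              n += da / (C * dK ** m)                   # cycles to grow by da
              a += da
          return n

      random.seed(0)
      lives = []
      for _ in range(1000):
          a0 = random.lognormvariate(math.log(1e-3), 0.2)  # initial flaw size
          C = random.lognormvariate(math.log(1e-11), 0.3)  # material scatter
          lives.append(cycles_to_failure(a0, a_crit=0.02, C=C, m=3.0, dsigma=100.0))

      lives.sort()
      print(f"median life ~{lives[500]:.3g} cycles; "
            f"5th percentile ~{lives[50]:.3g} cycles")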
  14. Context: DT Hub activities focus on a set of thematic areas (themes) that are based on shared opportunities and challenges for members. These themes are areas where collaboration can help members to gain greater understanding and make progress towards realising the potential benefits of digital twins. The first theme is called “Testing digital twin concepts”. The focus is on: Helping DT Hub members increase common understanding of digital twin definitions and concepts, then test and refine this thinking in specific use cases and projects where there is potential to deliver significant value Why start with this theme? Each of the members we spoke to felt that it is difficult to systematically plan and progress digital twins without a clear understanding of what digital twins are, when something is classified as a digital twin, and what are the key concepts or building blocks that lie behind these twins. In other words, discussing digital twin concepts and definitions is a “foundational” activity that is needed to underpin future activities. Moreover, it is difficult to think about making digital twins more connected and interoperable, as stepping-stones towards a national digital twin, if the approach to each twin is inconsistent. Scope: This theme will build on work being done through the NDT Programme, including the Gemini Principles, and feed back ideas and recommendations based on real-world experience from members and from the wider market. There are other reasons why this work is needed. As with other major technology developments, from the internet of things to AI, a growing range of players will claim to have tech solutions for digital twins. There is a risk here of “twinwash”. If every sharp-looking 3D visualization is labelled as a “digital twin”, regardless of whether or not it bears any relation to real-world physical assets, this can create confusion and risks devaluing “real” twins. Tackling this theme can help digital twin owners start to address questions like: What makes my digital twin a twin? What types of areas (e.g. related to data, models, analytics, control systems etc) do I need to consider in creating digital twin strategies and planning for individual digital twin projects? What can I learn from the approaches taken by others to defining and scoping digital twins -including how this relates to specific use cases? How do I relate and connect multiple digital twins within my organization? How does my twin (and the approach I’m taking to it) relate to other third-party twins? For example, how will a water pipeline twin connect with a highways or city twin? Related to the first bullet above, at least some DT Hub members would like to see the creation of a “Turing test for twins”. In other words, to have an agreed set of criteria established as the minimum threshold for a twin to be considered a twin. At the same time, there is also a desire for flexibility - the scope of twins will vary according to the intended purpose and specific use case. For example, not all twins will involve real-time control and actuation. 
Objectives: The main objectives for this theme are to:
• Provide insights on “good” approaches to describing and classifying digital twins and their constituent elements, building on the Gemini Principles
• Understand (from examples) how other industries have advanced their digital twin journeys
• Apply this thinking to specific use cases in existing or planned founding-member digital twins in areas where there is the potential to deliver significant value
• Help DT Hub members to achieve greater consistency across their organizations and with supply chains and partners
• Develop an intuitive “test” for what constitutes a digital twin
• Feed back learnings into the evolution of the “Commons” and the Gemini Principles

Activities: We’ve already started on the first set of activities for this theme and created some content for you to dive into, including:
• A webinar to start to relate this to the Gemini Principles and to identify some initial use case priorities
• Research into interesting examples from other industries of approaches to defining and developing twins
• Creation of blog-style “conversation starters” (for example, insights from aerospace and manufacturing, as well as thoughts on approaches to defining twins) and links to interesting external sources based on this research. We are adding these to a dedicated space for theme 1

What next? There are still plenty of opportunities for you to get involved, including activities to flesh out this theme. This includes an online “jam” – a virtual event that we’ll host on the DT Hub, dates to be confirmed. We want this theme to be driven by members’ views and priorities, so it would be great if you could comment on this post, including on:
• Existing initiatives that could feed into this work
• Use cases that we should prioritise to test emerging thinking on digital twin concepts
• Specific digital twin projects you are working on
• Your views on what makes a digital twin a twin
  15. DRossiter87

    Reflection: Right vs Real

    Our collective understanding of digital twins is still rather nascent. To ensure that we operate from the same base information, there is a need to periodically reflect on the concepts and principles we have outlined. This blog post is one in a series which reflects on previously published concepts to consider whether our collective thinking has advanced.

As we develop the thinking, tools and resources relating to digital twins, a lot of discussion is taking place regarding their scope, scale and accuracy. The Gemini Principles define a digital twin as a realistic digital representation of assets, processes and systems in the built or natural environment. I want to reflect on this statement; in particular, the use of “realistic”. For something to be realistic, according to the Oxford English Dictionary, it must represent something in a way that is accurate and true to life. For example, for something to be “photo-realistic” it must appear as if it were a photograph. However, the Gemini Principles also state that a digital twin must represent physical reality at the level of accuracy suited to its purpose.

Interestingly, while undertaking discovery interviews with DT Hub members we saw this issue realized. Interview insight: “Several members commented on how people in their organizations would try to extend the use of their digital twins beyond their intended purposes.” This was seen as both a positive and a negative outcome. The positive: members of these organizations saw the value in these digital twins and wanted to harness their insight. The negative: these digital twins did not have the information or, where it was available, did not have the level of accuracy required to be used for these extended purposes. For these extended needs, these digital twins were not realistic.

Amongst DT Hub members there appears to be a shared view that digital twins are, fundamentally, purpose-driven. Therefore, digital twins might not be “real” representations, but instead the “right” representation to support a purpose. Consider an example. An air traffic control system utilizes a “digital twin” of runways, aircraft and their flight paths, along with sensor information (e.g. weather and radar), to help prevent collisions and to organize and control the landing and departure of aircraft. In this example, while real-time information and analytics are used, none of the physical elements (planes, control towers) have realistic representations; they instead use basic representations to support the air traffic controller.

Instinctively, an air traffic control system does everything we want a digital twin to do: it is a digital representation of physical assets which incorporates sensor information, with a link between the physical assets and their digital representation. Given this, it should be fairly clear that an air traffic control system would be considered a digital twin. However, this does not appear to be the case. A poll was placed on Twitter asking “would you consider an air traffic control system a digital twin?”. After 62 votes were cast, the result was exactly 50:50. What does this tell us? Perhaps public messages on what a digital twin is aren’t sufficiently defined? Perhaps the question was poorly worded? Or perhaps, for some, the lack of a realistic representation is the reason they said no? Unfortunately, context for each vote isn’t available. At the very least we can be sure that our shared view may not be shared by everyone. In an age where many consider data to be the new oil, perhaps we should consider using our data sparingly.
So long as the data provided is sufficient for its intended purpose, a realistic representation may not always be required. And there we have it: realism and its place within digital twins. Do you believe that a digital twin has to be realistic? Can something be a digital twin without being a realistic representation? Had you voted in this poll, would you have considered an air traffic control system a digital twin? (For one way to picture a “right, not real” representation, see the sketch below.)
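As an illustration of “right, not real”, here is a minimal Python sketch of how an air-traffic-style twin might represent aircraft for a collision-alerting purpose. The class, attributes and separation thresholds are all hypothetical, invented for this example; the point is that the representation is deliberately sparse, because the purpose does not require geometry or visual realism.

```python
# Illustrative sketch: a purpose-driven (not photo-realistic) twin of
# aircraft for a collision-alerting purpose. Only the attributes the
# purpose requires are modelled; geometry and appearance are omitted.
# All names and thresholds are hypothetical.

import math
from dataclasses import dataclass


@dataclass
class AircraftState:
    callsign: str
    x_km: float         # position east of the tower
    y_km: float         # position north of the tower
    altitude_ft: float


def too_close(a: AircraftState, b: AircraftState,
              lateral_km: float = 9.3, vertical_ft: float = 1000.0) -> bool:
    """Flag a loss of separation (thresholds are illustrative only)."""
    lateral = math.hypot(a.x_km - b.x_km, a.y_km - b.y_km)
    vertical = abs(a.altitude_ft - b.altitude_ft)
    return lateral < lateral_km and vertical < vertical_ft


if __name__ == "__main__":
    a = AircraftState("BAW123", 10.0, 4.0, 31000)
    b = AircraftState("EZY456", 12.0, 5.0, 31500)
    print("separation alert:", too_close(a, b))
```

Four fields per aircraft are enough for this purpose; a “realistic” 3D model of each plane would add nothing to the decision being supported.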
  16. Nicholas

    Concepts & Definitions Theme Summary

    The first theme that we are addressing in the Hub is “Testing digital twin concepts”. This was identified as a key foundational element by the DT Hub members – to help increase understanding and inform the development of strategies and projects. It will build on the Gemini Principles and generate recommendations to feed into the National Digital Twin (NDT) “Commons” stream. The theme is summarised as “helping DT Hub members to increase common understanding related to digital twin definitions and concepts, and then to test and refine this thinking within specific use cases and projects where there is potential to deliver significant value.” You can find out more on the objectives, activities and selection process for the theme in the attached document. DT Hub Theme 1 report 19 December 2019.pdf
  17. Neil Thomspon

    What is the difference between Twins & BIMs

    BIM          | Twin
    What is      | What if
    Files        | Queries
    Physical     | Real
    Asset        | Function
    Time stamp   | Time graph
    Transaction  | Enterprise
    Outputs      | Outcomes
    As Designed? | As Intended?

(For discussion.) I wanted to share some early thinking with you – please consider this a consultation, not a formal announcement of direction. Following the latency post from @DRossiter87 and some conversations with people in different markets, I have found a useful framework for separating BIM from digital twins. There is a caveat to what follows: this is not a statement of which is better. Both BIM and digital twinning have key benefits. Much like a chef has a collection of knives for different use cases, the same is true for BIM and twins. BIM, as defined in the available standards, sets out how data can be procured in a transactional model: a client can set the information requirements for a supply chain to author and deliver information for a particular purpose. The table above sets out a series of differences, and I will work through them one by one to explain what they mean and how they differ.

1. What is vs what if. A BIM will tell you what something is; it cannot answer the question “what if”. The IMF sets out a pathway for asking such questions of datasets, for example “What if I turned this valve off?”.

2. Files vs queries. Very similar to the above, but with a view on functionality. The BIM sets out the container of the data and the files within, such as CSV files or SQL databases. The query in the twin space is an operation on the dataset or file.

3. Physical vs real. The BIM space treats physical elements as assets. Those assets would be on some form of register which lists ‘tangible things’, and they generally develop over time in line with the level of information need. In the twin space this representation of the physical is abstracted up into its function. The real aspect is how the object interacts with reality. This interaction is both physical within the system (a pump pumping water) and part of a broader service or organisational purpose (the pump provides a minimum pressure to supply water to customers and is linked to the revenue stream that, for example, is charged by the cubic metre of water).

4. Asset vs function. Related to the above, the asset focus is purely on the performance specification and range of the asset’s performance in isolation. The twin considers the function the asset plays. @Simon Scott explained a great example of this: the function of a level crossing is to ensure two types of mobile infrastructure do not collide (please correct me if I’m wrong here, Simon). Asking for a level crossing (an asset) and asking for two infrastructures not to collide (a function) are fundamentally two different questions.

5. Time stamp vs time graph. Time in BIM is a time stamp against a transaction – a digital snapshot of an asset. The twin aspect is the time graph: the status of a person or thing changes over time, and queries from the twin understand the historical elements of an asset. For example, when searching for an actor on Google, it can piece together data about that person from a series of datasets, allowing a comprehensive history of that actor to be rendered.

6. Transaction vs enterprise. The BIM standards describe a process for multiple parties to transact data. They set out how data can be procured, authored and delivered as a series of transactions. The twin represents an enterprise view, where data flows with purpose, aligned with agreed outcomes.
7. Outputs vs outcomes. Through its focus on transactions and assets, BIM can only provide insight on outputs; because twins focus on functions and the enterprise, they can provide insights on outcomes.

8. 3D rendition vs abstracted. BIM requires a 3D rendition of an asset, as set out in the level of information need / requirements. For the digital twin – to use @DRossiter87’s example of a BMS – there is no need for a full representation of the asset. All that is required is the data needed to execute a decision, whether by a machine or a human. Of course, if the “what if” question includes a spatial requirement, a boundary condition for the geometry is required. A non-geographic example: the BMS wants to know which rooms to heat for the day in a school, and a key input could be the lesson plans from the teaching staff, to understand occupancy of each space. A geographic example, on the other hand: the AHU requires a filter replacement and the plantroom is tight for space, so there would be a need for a physical representation of the space.

To make the “what is” vs “what if” distinction concrete, I have included a short illustrative sketch below. I welcome the discussion and feedback!
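Here is a minimal Python sketch contrasting the two views: a BIM-style record that can only tell you what the asset is, versus a twin-style model that can be queried about a hypothetical intervention. All class and method names (PumpRecord, NetworkTwin, what_if_valve_closed) are invented for illustration and are not drawn from any BIM standard or from the IMF.

```python
# Illustrative contrast between a BIM-style static record ("what is")
# and a twin-style queryable model ("what if"). Names are hypothetical.

from dataclasses import dataclass


@dataclass(frozen=True)
class PumpRecord:
    """BIM-style: a transactional snapshot of what the asset *is*."""
    asset_id: str
    rated_flow_lps: float   # litres per second
    installed: str          # time stamp of the procurement transaction


class NetworkTwin:
    """Twin-style: models each pump's *function* in the wider system."""

    def __init__(self, pumps: dict[str, float], demand_lps: float):
        self.pumps = pumps          # asset_id -> flow contribution (l/s)
        self.demand_lps = demand_lps

    def what_if_valve_closed(self, asset_id: str) -> str:
        """Query: is demand still met if this pump is isolated?"""
        remaining = sum(f for pid, f in self.pumps.items() if pid != asset_id)
        if remaining >= self.demand_lps:
            return f"demand still met ({remaining:.0f} l/s available)"
        return f"shortfall of {self.demand_lps - remaining:.0f} l/s"


if __name__ == "__main__":
    record = PumpRecord("PMP-01", 40.0, "2020-05-01T09:00Z")
    twin = NetworkTwin({"PMP-01": 40.0, "PMP-02": 35.0}, demand_lps=60.0)
    print(record)                               # "what is": read the record back
    print(twin.what_if_valve_closed("PMP-01"))  # "what if": query the model
```

The record can be read back exactly as it was delivered; only the model of the pump’s function in the network can answer the “what if” question.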
  18. olivierthereaux

    Towards a Web of Digital Twins - Article

    Following the successful webinar / online discussion last month, I have now posted an article version of "Towards a Web of Digital Twins". The document summarises the research conducted by the team at the ODI (Open Data Institute) on what it means to connect digital twins and how the concept can scale to a "web" of twins across domains, sectors, and geography. The synthesis of our research is also available as an annotated deck, released under an open licence.
  19. University College London’s (UCL) Infrastructure Systems Institute is working with the DT Hub to collect academic, industry and government knowledge and experience on digital twins. This survey is an activity that contributes to our research portfolio in economic infrastructure (energy, transport, water, waste, telecoms), helping to deliver public and environmental good through self-healing and anticipatory systems, sustainable innovation and resilience. For this survey we want to determine the scope of a digital twin and to hear about your examples – both real and planned! Here is the survey link: https://ucl.onlinesurveys.ac.uk/digital-twin-survey. In preparation for the survey we have conducted a literature review and a small number of expert interviews. This survey provides a way to validate or challenge our preliminary findings and collect up-to-date examples, which is important given that this area is evolving rapidly. You will be able to respond to the survey anonymously, but if you provide text or examples you should anonymise them unless you are happy for them to be publicised. There will be a chance to provide your email if you want to get more involved in CDBB and the National Digital Twin programme, or if you want to be notified when the results are published (planned for end of December 2020). Thanking you in advance! Liz Varga, Professor of Complex Systems, UCL
  20. On the 17th November, the DT Hub community came together to share their thoughts in a webinar led by BSI on an intelligence test for digital twins in the built environment. Watch to find out what the community thought.
  21. The event will take place at 14:00–16:00 on 17th November. Please register for the webinar through Eventbrite. This event will explore whether an intuitive test for digital twins would be valuable to have. The aim of the session is to provide an overview of the topic and initial community feedback, as well as an opportunity to share your thoughts on existing tests we could learn from and build on. The discussion around this has already begun on the DT Hub here! During this webinar we will discuss:
• ‘The why’: the pros and cons of a DT test
• Objectives and activities for looking at intuitive tests for digital twins
• A summary of initial industry feedback
• A yes/no result vs a sliding scale/score
• Existing ‘test’ examples that could be leveraged
• What elements make up a digital twin
We would be delighted if you are able to join us and start collaborating with the wider DT Hub community. Please note that this session is for DT Hub members, but please feel free to invite a colleague from your organization if you feel this would be particularly relevant to them or if you cannot attend yourself.
  22. The National Digital Twin Programme has initiated work to create a thin slice of the IMF for the Construction Innovation Hub, to support the development of CIH’s Platform Ecosystem. This thin slice of the IMF is called the FDM Seed.

Fig 1. General classification of the TLO – taken from A Survey of Top-level Ontologies

The first step of the FDM Seed project is to survey the landscape: to investigate what ontologies and data models are already in use, what they can do and what their limitations are, and to assess what tools may be useful as a starting point for the FDM and the RDL. The NDTp Commons Technical team have undertaken the landscape survey and have now published two reports:
• A survey of Top-level Ontologies (TLOs)
• A Survey of Industry Data Models (IDMs) and Reference Data Libraries (RDLs)
The final survey of top-level ontologies is, we think, the first of its kind. To take part in the discussion on the surveys and their implications, we invite you to become a member of the Digital Twin Hub and join the Digital Twin Hub IMF Community Network.

What next? The Programme is now in the process of gathering recommendations for which TLOs to use to start the work on the FDM Seed thin slice. We anticipate basing the FDM development on one of the TLOs, bringing in elements from others, informed by the survey and analysis.
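As a loose illustration of why the choice of TLO matters, the Python sketch below shows domain terms from a reference data library grounded in a handful of top-level categories: terms are only interoperable if they bottom out in the same shared categories. The class names are invented for this example and are not taken from the surveys or from the FDM itself.

```python
# Hypothetical sketch only: the hierarchy below is invented for
# illustration and is not the FDM, nor any of the surveyed TLOs.

class Thing: ...                  # top-level category
class PhysicalObject(Thing): ...  # things with spatial extent
class Activity(Thing): ...        # things that happen over time

# RDL-style domain terms, each grounded in a top-level category:
class Pump(PhysicalObject): ...
class MaintenanceVisit(Activity): ...

def top_level_category(cls: type) -> str:
    """Walk the class hierarchy to find the category directly under Thing."""
    for base in cls.__mro__:
        if Thing in base.__bases__:
            return base.__name__
    raise ValueError(f"{cls.__name__} is not grounded in the top-level ontology")

print(top_level_category(Pump))              # PhysicalObject
print(top_level_category(MaintenanceVisit))  # Activity
```

Two data models that ground “pump” in different top-level categories will disagree about what can be said of it, which is why the FDM Seed work starts from a survey of the TLO landscape.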
  23. The National Digital Twin programme is a national programme, built on consensus. As the open consultation on our proposed approach to an IMF draws to a close, Miranda Sharp, IMF Engagement Lead at CDBB, shares how DT Hub members can continue to shape its development. We believe the IMF is the key to enabling secure, resilient data sharing between organisations and sectors in the built environment, and we want to work with you to develop it.

Greater use of digital technologies and information in the built environment increases capacity, efficiency, reliability and resilience. This in turn enables existing assets to enhance service provision for people, as well as improving efficiency in the design and delivery of new assets through a better understanding of the whole-life performance of those assets already in place. We know that by working collaboratively with members of the DT Hub, who share in that vision, we will achieve a better result.

My role within the programme is to help make your voice heard, and to open conversations where you can ask the challenging questions we need to find answers to. During the next phase of the consultation we will be running in-depth interviews with practitioners to understand the challenges the proposed approach faces, and how these could be resolved. We want to know: should the approach be top down or bottom up – or both? We want to hear your thoughts, ideas and reflections, both positive and negative.

In working collaboratively to establish the IMF we will enable a National Digital Twin that is implementable and usable, enabling society to tackle the urgent, cross-silo challenges of achieving carbon reduction targets and effectively coordinating disaster response. We will also be able to derive the greater benefits of securely connecting our data assets. This process will require debate and deliberation along the way, and will invite many questions to which answers might not be immediate or clear. That is because our vision to create a digital built Britain is not complete or static; it is an evolving development emerging from multiple voices and viewpoints across a wide range of organisations – big and small, public and private, clients and contractors.

Our webinar marking the publication of the report attracted a range of questions, and this is precisely what is needed to interrogate the approach: to challenge its strengths, identify weaknesses and test resilience. Some people are keen to know if CDBB has started to build a prototype to demonstrate the framework but, as the pathway document explains, we must first build consensus on the proposed approach to an IMF – one cannot happen without the other. Work is underway to create a thin slice of an IMF to start to establish and test a common language, apply this early framework to a platform being developed by the Construction Innovation Hub (CIH), and put it under scrutiny. But first we need to map out the approach that will enable a demonstrable piece of the framework. The top-level academic contribution to the developing framework will also be studied closely to ensure it is robust and resilient, able to withstand change, to grow and expand. Challenge will simultaneously come from the bottom up, as organisations provide input to test competency.
The IMF is designed to make connections between digital twins and provide operators and decision-makers with resilient, secure, verified data sharing, enabling a wider view of the implications of decisions and insights that invite timely interventions and potentially better outcomes. The decision-making itself, and the form it takes, is the responsibility of organisations and businesses themselves and is beyond the scope of this report – but we do want to hear your views.

With that in mind we have created two new communities here on the DT Hub – Architects and Developers – dedicated to the discussion around the development of the IMF and the operational implementation of the Framework within organisations. Our Architects Community has been established for those involved in real-world application and implementation of information management. The group will test and calibrate the NDT programme’s approach to the IMF and provide a forum for discussion on the challenges, opportunities and practicalities of implementation. Our Developer Community, comprising information management, data science and integration specialists, brings scrutiny to the IMF approach and has been established to provide a rich discussion area for the core concepts, tenets and philosophies of the Framework and its constituent parts.

If you have ideas, questions or observations about the Pathway document then please engage with us via these two new communities. Achieving the alignment, interoperability, protocols, governance and standards that allow individual businesses to flourish while serving the interests of society needs engagement and contribution from as many of you as possible. It is time to collaborate at scale. I look forward to working together to shape an IMF that secures the sharing of data and enables insight to drive informed decision-making, unlocking value and delivering better outcomes for the greater good.