
The IMF & Connecting Digital Twins



  1. Last week
  2. Ioana R

    Digital twin Internship

Hello 🖐️ I am an architecture graduate interested in digital twins, and I am looking for an internship. I would appreciate it if anyone knows of companies that recruit in this domain. Thank you! 🙃
  3. Digital Twins are a way of getting better insights about the assets you commission, design, construct or manage, enhancing their performance and the outcomes for the people who use them. But how do you get started creating one? What questions do you need to ask yourself, and what challenges might you face? What lessons have been learned that may have slipped through the cracks of academic papers and published case studies?

    'Digital Twin Journeys' will present lessons learned by our researchers in digital twin projects enabled by the Construction Innovation Hub. Culminating in a report for industry professionals who are developing their first digital twins (March 2022), this series of outputs will highlight, in a variety of engaging formats, the processes, decisions and insights our researchers have explored during their own digital twin journeys. Hear from the researchers themselves about how they have developed digital twin processes and tools, and the key themes that run through their projects. The outputs will be shared on the DT Hub blog and collated on a dedicated page on the CDBB website.

    But first, we want to hear from you! Let us know in the comments what you still want to know about the process of developing digital twins.
  4. Peter Parslow

    Describing a digital twin - seeking feedback

    On "intervention" in "city scale digital twins", which have until recently been the domain of "smart cities" discussions: the language is still settling down, but the ISO/IEC Smart Cities working group talks about "city models" ("appropriate set of data which models those physical and social aspects of the city that are relevant for its objectives"). Sometimes those objectives are just to "follow", and sometimes they include to "influence" or change. Setting and agreeing those objectives - and hence the scope of the 'twin' - is a social/governance question, with its own set of standards and approaches.

    ISO/IEC 30146 Smart city ICT indicators gives a set of quality indicators for city models; in the current edition, these focus on a model being complete (for purpose), up to date, and used in operational, crisis, analytical and strategic business processes - deliberately echoing ISO 30182 Smart city concept model - Guidance for establishing a model for data interoperability (which is itself based on BSI PAS 182 of the same name, the result of work across some 80 UK local authorities). I could envisage a sort of maturity level for city digital twins, 'starting' with those which try to keep up and working through to those which actively manage agreed aspects of the city. ISO/IEC 30145-3 Smart city ICT reference framework - engineering framework considers sensors and actuators, where sensors sense the environment and actuators change it (open/close flood gates, change traffic lights to influence traffic flow, adjust building heating controls).
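The sensor/actuator split described above can be sketched as a minimal control step. The flood-gate scenario comes from the post; the threshold values and hysteresis logic below are invented for illustration and are not taken from any of the cited standards.

```python
# Hypothetical sense-decide-actuate step for a flood-gate controller.
# Thresholds are illustrative; a real twin would calibrate them from data.

def control_step(river_level_m: float, gate_open: bool, threshold_m: float = 2.5) -> bool:
    """Return the new gate state given a sensed river level (metres)."""
    if river_level_m > threshold_m:
        return True            # actuator: open the flood gate
    if river_level_m < threshold_m - 0.5:
        return False           # hysteresis: close only once safely below
    return gate_open           # otherwise hold the current state

print(control_step(3.0, gate_open=False))  # high level: open
print(control_step(1.8, gate_open=True))   # safely low: close
```

The hysteresis band keeps the actuator from chattering when the sensed level hovers near the threshold, which is the usual reason a "follow" twin that merely observes is simpler than one that actively manages the city.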
  5. A4I round 6 launches tomorrow, 29/07/2021. This funding is aimed at SMEs looking to solve analysis or measurement problems. Below are some example ideas which might be eligible for A4I funding and relevant to Digital Twin development:

    • Collection of real-time data: accessing new sensing technologies, analytical tools and methodologies for input into Digital Twins
    • Data analysis techniques: developing new analytical techniques or systems to improve existing Digital Twins, e.g. data quality verification, or generating new insights using AI
    • Measurement of Digital Twin performance

    Note that this is a fast-tracked funding round, so please pay close attention to the closing dates. Link to the full information on the A4I funding: https://apply-for-innovation-funding.service.gov.uk/competition/975/overview For projects requiring Hartree Centre capabilities (AI, Data Science, HPC) you can also contact me directly to discuss the project and the funding submission process. Examples of previous A4I projects: https://www.a4i.info/a4i_case_studies/data-performance-consultancy-limited/?bpage=1 https://www.a4i.info/a4i_case_studies/riskaware/?bpage=1
  6. Earlier
  7. If you read the DT promotion documents carefully, it is all about capitalist VC and ivory-tower academic research being supported, and NOTHING FOR THE GENERAL PUBLIC. VIVA BREXIT - no tech infrastructure support for SMEs and innovative communities.
  8. Does this help? An elevator pitch (also known as an elevator speech) is a short, persuasive speech you use to introduce yourself, your product, or your company. Its purpose is to explain the concept quickly and clearly, to spark interest in who you are and what you do.
  9. The pandemic has highlighted the need to make better, data-driven decisions that are focused on creating better outcomes. It has shown how digital technologies, and the data that drives them, are key to putting the right information in the right hands at the right time, so that we make the right decisions to achieve the right outcomes. Connected ecosystems of digital twins, part of the cyber-physical fabric, will allow us to share data across sectors, in a secure and resilient fashion, to ensure that we can make those important decisions for the outcomes that we need. They provide us with a transformative tool to tackle the major issues of our time, such as climate change, global healthcare and food inequality. We must use digital twins for the public good, as set out in "Data for the Public Good", and we must also use those digital twins to create a better future for people and the planet.

    The recent publication of the Vision for the Built Environment sets out a pioneering vision for the built environment, and we want to see that vision expanded further to include other sectors, such as health, education, manufacturing and agriculture. As the UK considers what a national digital twin might look like, we draw on the experience of the past three years to add to the discussion. A UK national digital twin must have a purpose-built delivery vehicle that works through coordination, alignment and collaboration. It needs to bring together those working in the field, across sectors, across industries and across government departments. It must balance the need for research, both within academic institutions and industry, with the industry implementation and adoption that is already underway. And it must ensure that the programme is socio-technical in nature; if we concentrate solely on the technical factors while failing to address the equally important social considerations, we risk creating a solution that cannot or will not be adopted - a beautiful, shiny, perfect piece of 'tech' that sits on a shelf gathering dust.

    There are many in the UK doing fantastic work in the digital twin space, and in the wider cyber-physical fabric of which connected digital twins are a part. We know from experience that we get much better outcomes when we work together as a diverse team, rather than in silos that lead to fragmentation. Industry is already creating digital twins and connecting them to form ecosystems. If we are to avoid divergence, we have to act now.

    To start the discussion and allow the sharing of thoughts and experience, the Royal Academy of Engineering has convened an open summit, hosted by the DT Hub on 19 July from 10:00 to 16:00. The day will start with an introduction laying out the opportunities and challenges we face as a nation and as a planet. This will be followed by four expert-led panels, each with a Q&A session: the first, chaired by Paul Clarke CBE, on the cyber-physical fabric; followed by a panel on data and technical interoperability chaired by Professor Dame Wendy Hall; after lunch, Professor David Lane CBE will chair a panel on research; followed by a panel on adoption chaired by Mark Enzer OBE. The four panel chairs will then convene a final plenary session.

    I do hope you will join us, to hear the experiences of others and to add your own expertise and knowledge to the conversation. To register for the Summit, click here.
  10. Version 1.0.0


    The Information Management Framework (IMF) is intended to enable better information management and information sharing at a national scale, and to provide the standards, guidance and shared resources to be (re)used by those wishing to participate in the National Digital Twin ecosystem. While the scope of the IMF is broad, the "7 circles of Information Management" diagram is a pragmatic way to divide the information management space into coherent areas of concern that support each other but can be addressed relatively independently.

    As part of the second circle of the diagram, the IMF technical team has released this paper outlining our recommended approach to developing information requirements, based on the analysis of process models. The methodology first identifies an organisation's processes, then the decisions taken as part of those processes, and finally the information requirements needed to support those decisions. These requirements are communicated to those who create the information. This provides a systematic approach to identifying what information is required and when it is most cost-effectively created. Managed appropriately, this information capture can avoid costly activity to create information by surveying or inspecting in-use assets. To allow this anticipation of information needs, the methodology set out in the paper recommends the following steps:

    • identify the lifecycle activities that an organisation performs
    • decompose the activities in order to identify the material "participants" involved ("things" required for each activity: people, resources, assets, equipment, products, other activities, …)
    • identify the decisions critical for these activities
    • identify the information requirements for those decisions, and the quality required.

    Read more in the blog containing a video introduction to the "7 circles of Information Management" by IMF Technical Team Lead, Matthew West, followed by a deep dive into the second circle - Process Model based Information Requirements - presented by Al Cook, the main author of this paper.
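As a rough illustration of the steps above, the chain from activities to participants to decisions to information requirements could be modelled as a small data structure. All class, field and example names below are invented for illustration and are not taken from the IMF paper.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: lifecycle activities have participants, decisions are
# attached to activities, and each decision implies information requirements
# with a stated quality, so requirements can be anticipated up front.

@dataclass
class InformationRequirement:
    description: str
    required_quality: str          # e.g. "logged daily", "surveyed to +/- 10 mm"

@dataclass
class Decision:
    question: str
    requirements: list[InformationRequirement] = field(default_factory=list)

@dataclass
class Activity:
    name: str
    participants: list[str] = field(default_factory=list)  # people, assets, equipment, ...
    decisions: list[Decision] = field(default_factory=list)

def information_requirements(activities: list[Activity]) -> list[InformationRequirement]:
    """Collect every requirement implied by the decisions in each activity."""
    return [req for act in activities for dec in act.decisions for req in dec.requirements]

maintain = Activity(
    name="Maintain pumping station",
    participants=["maintenance crew", "pump P-101", "spares inventory"],
    decisions=[Decision(
        question="Replace or refurbish the pump?",
        requirements=[InformationRequirement("running hours since overhaul", "logged daily")],
    )],
)

print([r.description for r in information_requirements([maintain])])
```

Walking the structure from activities downwards yields the requirements before the asset is in use, which is the point of the methodology: the information is specified at the moment it is cheapest to capture.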
  11. Dave Murray

    Test Engineering and DTs

    Following several conversations, I was persuaded that this approach was too narrow. The original topic could be included in a wider objective to look at the rapidly emerging ‘new-look’ of defence, its Digital Backbone and its nascent DT journey. See the new network “The Defence Digital Network” for the outcome!
  12. frank doherty

    Planning Golden Thread

    Why is building and fire safety being referred to under "planning", when planning and building control are two separate professions and disciplines?
  13. Hello, as a master's student I am currently doing my thesis related to digital twins, and I have just proposed a cloud-based tool that can control and monitor IoT devices through the cloud using a BIM model file. I've tested the tool and it is now working, but my worry is: which standard or ontology states the relation between BIM and IoT devices, especially for controlling? Thanks in advance.
  14. Ontopop is available to everyone in our society for free with full features, and can be extended and customised as you wish. Thank you @Ian Gordon - (But your GitHub link doesn't work, and there is no Ontopop repository there.) https://github.com/hyperlearningai/ontopop https://github.com/leipzig/ontopop
  15. Version 1.0.0


    The Approach to Develop the Foundation Data Model, published in March 2021, follows up on the Survey of Top-Level Ontologies (TLO) published in November 2020. It sets out the Top-Level Ontology requirements for the NDT's Foundation Data Model. Drawing upon the assessment of the TLOs listed in the survey, it identifies four potential candidates for the NDT's TLO: BORO, IDEAS, HQDM and ISO 15926-2. The four candidates are distinct from the other TLOs in being 4-dimensionalist, i.e. they consider individual objects as four-dimensional, with both spatial and temporal parts.
  16. Version 1.0.0


    To underpin the sharing of data across organisations and sectors, the National Digital Twin programme (NDTp) aims to develop an ontology - a theory of what exists, i.e. the things that exist and the rules that govern them - capable of describing "life, the universe and everything" (Adams, 1980). As set out in the Pathway towards the IMF, this ontology will consist of:

    • a Foundation Data Model - the rules and constraints on how data is structured
    • an ecosystem of Reference Data Libraries, compliant with the Foundation Data Model - the particular set of classes and properties we use to describe digital twins.

    To achieve the consistency required to share information across organisations and sectors, the Foundation Data Model needs to be underpinned by a Top-Level Ontology (TLO), i.e. the top-level categories ("thing", "class", …) and the fundamental relationships between them that are sufficient to cover a maximally broad range of domains. As a starting point for defining the NDT's TLO, the technical team has reviewed and classified existing TLOs in A Survey of Top-Level Ontologies, published in November 2020, and assessed them against a set of requirements. This has led to the identification of four potential candidates for the NDT's TLO: BORO, IDEAS, HQDM and ISO 15926-2, as set out in the publication The Approach to Develop the FDM for the IMF. Ontologies continue to be added to the survey. Please refer to this page to see the most up-to-date list and to suggest other ontologies for the team to consider.
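The 4-dimensionalism shared by the four candidate TLOs can be sketched minimally: an individual is a spatio-temporal extent, and its "states" (temporal parts) are themselves extents that must lie within the whole. The class name, method name and simplified integer timestamps below are illustrative assumptions and do not follow any one candidate's actual vocabulary.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a 4-dimensionalist individual: a pump is a
# spatio-temporal extent, and "the pump while in service" is a temporal
# part of it, not a separate object. Names are invented for illustration.

@dataclass
class SpatioTemporalExtent:
    name: str
    begins: int    # simplified year timestamps for illustration
    ends: int
    parts: list["SpatioTemporalExtent"] = field(default_factory=list)

    def temporal_part(self, name: str, begins: int, ends: int) -> "SpatioTemporalExtent":
        # A temporal part must fall entirely within the whole's extent.
        assert self.begins <= begins <= ends <= self.ends, "part must lie within the whole"
        part = SpatioTemporalExtent(name, begins, ends)
        self.parts.append(part)
        return part

pump = SpatioTemporalExtent("pump P-101", begins=2010, ends=2040)
in_service = pump.temporal_part("P-101 while in service", 2011, 2035)
print(in_service.name, in_service.begins, in_service.ends)
```

The practical benefit for data sharing is that statements about an object at a time ("the pump's condition in 2020") attach to a well-defined temporal part rather than to an ambiguous "current" object, which is one reason the programme favours this family of ontologies.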
  17. Version 1.0.0


    A Survey of Industry Data Models and Reference Data Libraries, published in November 2020, is the initial version of ongoing work to identify and assess existing Industry Data Models and Reference Data Libraries in terms of scope, quality and formalisation of the documentation, formalisation of the representation, maintenance frequency, and usage. This survey is key to identifying existing industry data models that may have to be mapped into the NDT's Foundation Data Model and Reference Data Library (RDL). Additionally, this work is intended to help the technical team identify potential authoritative sources for key classes that will become part of the NDT's RDL (for instance, units of measure). The list is open, and more standards are being added to the survey on the DT Hub. Please refer to this page to see the most up-to-date list, and don't hesitate to suggest standards for the team to add.
  18. Version 1.0.0


    Following a year-long consultation exercise bringing together leading experts from the data science and information management communities, The Pathway towards an Information Management Framework (IMF) report was published in May 2020. The report sets out the technical approach to delivering the Information Management Framework: a common language by which digital twins of the built and natural environment can communicate securely and effectively, to support improved decision taking by those operating, maintaining and using built assets and the services they provide to society. The report outlines three building blocks that form an appropriately functioning technical core:

    • Foundation Data Model (FDM): a consistent, clear understanding of what constitutes the world of digital twins
    • Reference Data Library (RDL): the particular set of classes and properties we will use to describe our digital twins
    • Integration Architecture (IA): the protocols that will enable the managed sharing of data.

    A webinar, The pathway towards an Information Management Framework, was held on 8 June 2020; you can watch it here. Following the publication of the report, an open consultation was run to gather feedback on the proposed methodology, and the feedback was consolidated in the following summary.
  19. As part of the Climate REsilience DemOnstrator (CReDo) project - a collaboration between the National Digital Twin programme and the Connected Places Catapult - we are looking for a supplier to deliver the data engineering underpinning the demonstrator Digital Twin. The tasks to perform include:

    • data engineering, with data scientists and modellers as users;
    • descriptive data visualisation, showcasing the fusing of disparate datasets and computer models to paint a picture of multiple infrastructure systems in one place; and
    • sourcing complementary datasets; joining, cleaning and enhancing existing datasets; dealing with missingness; and creative data fusion.

    We believe this is an excellent opportunity for the supplier to showcase their skills and capabilities, not just to the National Digital Twin community but to the wider world, through a high-profile demonstration (hopefully at, or linked to, COP26). While we expect the outcomes of the project to be owned by the CPC (and disseminated widely to stimulate development and uptake of Digital Twins), we are very supportive of the supplier leveraging the outcomes to generate new business for themselves. The deadline for submission of proposals is 14 June, and the contract is expected to start on 5 July. More details, including the official tender document, can be found here.
  20. As set out in the Pathway to the Information Management Framework, the Integration Architecture is one of the key technical components of the Information Management Framework: the protocols that will enable the managed sharing of data across the National Digital Twin. In the recently released Integration Architecture Pattern and Principles paper, the NDTp's technical team set out key architectural principles and functional components for the creation of this critical technical component. The team defines a redeployable architectural pattern that allows the publication, protection, discovery, query and retrieval of data that conforms to the NDT's ecosystem of Reference Data Libraries and the NDT's Foundation Data Model. Download the Integration Architecture Pattern and Principles paper.

    The paper will take you through:

    • A requirements overview: a series of use cases that the Integration Architecture needs to enable, including routine operational use cases, where data from a diverse set of organisations can be shared and analysed for a single purpose (e.g. to support legal and regulatory requirements); the ability to respond to an emergency, pulling together data from across different communities in a way that was not foreseen before the incident that caused the requirement; and 'business as usual' NDT maintenance use cases, such as publishing a Digital Twin or adding a user to the NDT ecosystem.
    • Architectural principles: key principles that must be adhered to regardless of the type of architecture that is implemented, including data quality (quality needs to be measurable and published with the data itself); privacy of the published data (the Integration Architecture shall ensure that data is shared and used only according to the conditions under which it was published); and security (ensuring that all data and functions are secure from bad actors). Encryption will be a particularly key aspect of the security features in the Integration Architecture.
    • The recommended integration architecture pattern: three general architectural pattern options are explored in the paper (centralised, distributed and federated), and the benefits and concerns of each are discussed with respect to the requirements. The recommended pattern is a hybrid of the three approaches - centralising certain functions whilst distributing and federating others. It is intended to allow datasets to be shared locally (i.e. within an NDT Node, see figure below), while also allowing inter-node discovery, authorisation and data sharing to take place. NDT Nodes may be established by individual organisations, regulators and industry associations, or service providers, and will be able to handle Digital Twins on behalf of their constituent organisations and provide a secure sharing boundary. In the recommended architecture, datasets are published by the data owner (1) and made available to the organisations within the community of interest; in addition, an event is issued to register the publication with the Core (2). When queries are submitted (A), the dataset can be discovered by organisations in other communities of interest (B) and retrieved where appropriate (C). Release, discovery and retrieval are carried out according to the authorisation service, so that access is controlled as specified by the data owner.
    • Detail of the functional components: the Core Services are likely to be quite thin, comprising mainly of a master NDT Catalogue that holds the location of available NDT Datasets across the ecosystem; the master FDM/RDL, which will synchronise with the subset that is relevant for each NDT Node; and a publish/subscribe model to propagate data changes to parties that have an interest and an appropriate contract in place. The Core and each NDT Node will interact through a microservice layer with which they will have to be compliant.

    Next steps: the paper concludes with a list of 10 key tasks to develop the Integration Architecture components further. We will keep you informed on progress and, in the meantime, we look forward to hearing your questions and comments on the paper!
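The publish (1), register (2), query (A), discover (B) and retrieve (C) flow described above can be sketched in a few lines. The class names, method names and in-memory catalogue below are illustrative assumptions for this sketch, not the paper's API or the actual microservice layer.

```python
# Hypothetical sketch of the recommended hybrid pattern: a thin Core holds
# only the catalogue; datasets stay with the owning NDT Node, which enforces
# the access conditions set by the data owner. All names are illustrative.

class CoreCatalogue:
    """Thin core service: knows where datasets live, not their contents."""
    def __init__(self):
        self._index = {}                      # dataset id -> owning node

    def register(self, dataset_id, node):     # (2) publication event
        self._index[dataset_id] = node

    def discover(self, dataset_id):           # (B) locate the owning node
        return self._index.get(dataset_id)

class NDTNode:
    def __init__(self, name, core):
        self.name, self.core = name, core
        self._datasets = {}                   # dataset id -> data
        self._allowed = {}                    # dataset id -> permitted node names

    def publish(self, dataset_id, data, allowed):
        self._datasets[dataset_id] = data     # (1) publish within the node
        self._allowed[dataset_id] = set(allowed)
        self.core.register(dataset_id, self)  # (2) register with the Core

    def serve(self, dataset_id, requester):
        # Authorisation check: access only under the owner's conditions.
        if requester not in self._allowed.get(dataset_id, set()):
            raise PermissionError("access not granted by the data owner")
        return self._datasets[dataset_id]

    def retrieve(self, dataset_id):
        owner = self.core.discover(dataset_id)       # (A)/(B) query, discover
        return owner.serve(dataset_id, self.name)    # (C) retrieve if authorised

core = CoreCatalogue()
water = NDTNode("water-utility", core)
power = NDTNode("power-utility", core)
water.publish("flood-sensors-2021", {"site-3": "ok"}, allowed={"power-utility"})
print(power.retrieve("flood-sensors-2021"))
```

Note how the Core never holds the data itself, only its location: this is the "thin core" property the paper describes, with privacy enforced at the owning node's sharing boundary.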
  21. Digital Water

    Climate Accounting - Hyperledger

    An example was given of applying accounting principles that justified cutting down trees to encourage the growth of a Sphagnum bog, because Sphagnum is apparently 200 times more efficient at carbon capture and water retention than a wooded forest covering the same ground area, and is faster growing.
  22. HenryFT

    Insights on performance saves money

    I completely believe this; you can't manage something you can't see.
  23. Last May the National Digital Twin programme (NDTp) published our proposed Pathway to an Information Management Framework (IMF). The publication was accompanied by an open consultation to seek feedback on our proposed approach, and to hear from across the community about how they thought the IMF should develop to support their use and adoption of it. The consultation ran until the end of August and, alongside ongoing engagement with the programme's technical stakeholders, we received a great deal of valuable feedback. The full summary of the IMF Pathway Consultation Responses is published today, written by Miranda Sharp, NDTp Commons Lead.

    Overall, the responses to the Pathway were positive, and respondents welcomed the opportunity to give feedback and contribute to its improvement. This was hugely gratifying for everyone who has contributed to the work over the last 18 months. Some of the responses challenged the proposed approach, and we are keen to keep learning from these differences of opinion and perspective. In the paper we have summarised the range of responses as follows:

    • Positive response themes: the work is welcome, and progress towards it is considered consistent with the Gemini Principles; the plans to build on existing work are particularly welcome. There was broad agreement that the IMF should consist of a FDM, RDL and IA. The models and protocols described in the report were seen as comprehensive.
    • Nuanced response themes and questions: discussion of the technical challenge is valid, but respondents called for the human factors associated with change to be explored in parallel. Representatives from organisations often sought an indication of tangible next steps. There were specific asks for advice on data quality, security, legal provenance and the securing of benefits from investment.
    • Critical response themes: a small number of respondents rejected the approach as too "top down". Some respondents stated that more than a single Integration Architecture is required. Several responses disputed the possibility and validity of a single FDM.

    More details about the responses can be found in the paper, but we are hoping to use this space in the IMF Community to discuss and work on the themes raised here. Do you agree or disagree with these themes? Do you think any are missing? What work do you think could be done to address the questions and criticisms? Some work has already begun:

    • The legal, commercial and regulatory elements of resilient and secure sharing of information. You can read about our first steps on this journey, the Legal Roundtables held in November, here under Digital Commons/Legal.
    • The need for a demonstrator, and guidance for communicating the benefits of a National Digital Twin and how to begin readying organisations for the change. There was demand for use cases and case studies, which is being addressed through the Gemini Programme and the DT Toolkit.
    • Work has been undertaken this year, with funding from the Construction Innovation Hub (CIH), to create an FDM Seed for the CIH's Platform Design Programme. We hope this will be the first demonstrator, of sorts, for the technical work that is being developed by the NDTp's Technical Team. You can see the outputs of the Technical Team here, and we will be releasing the next paper, Approach to Develop the Foundation Data Model (FDM), here soon.

    We are planning further demonstrators that show the tangible benefits of the National Digital Twin, and we hope to be able to share our plans with you in the near future. The programme also strives to continue to build a body of evidence ('Corpus'), as per the tasks set out in the Pathway, to build other demonstrators for the programme. Alongside this work, the publication of the Response to the IMF Pathway Consultation will contribute to an updated Pathway document that will refocus the efforts of the NDTp. We hope to share that with you in the coming months.
  24. Version 1.0.0


    In May 2020 the National Digital Twin programme (NDTp) published the Pathway to an Information Management Framework (IMF). The publication was accompanied by an open consultation to seek feedback on our proposed approach, and to hear from across the community about how they thought the IMF should develop to support their use and adoption of it. The consultation ran until the end of August and, alongside ongoing engagement with the programme's technical stakeholders, we received a great deal of valuable feedback. The full summary of the IMF Pathway Consultation Responses is published here today, written by Miranda Sharp, NDTp Commons Lead.