Showing results for tags 'Information Management Framework (IMF)'.


  1. Described in the Pathway to the Information Management Framework, the Integration Architecture is one of the three key technical components of the Information Management Framework (IMF), along with the Reference Data Library and the Foundation Data Model. It consists of the technology and protocols that will enable the managed sharing of data across the National Digital Twin (NDT). The IMF Integration Architecture (IA) team began designing and building the IA in April 2021. This blog gives an insight into its progress to date.

Principles

First, it is worth covering some of the key principles guiding the design and build of the IA:

  • Open Source: It is vital that the software and technology that drive the IA are not held in proprietary systems that raise barriers to entry and prevent community engagement and growth. The IA will be open source, allowing everyone to utilise the capability and drive it forward.
  • Federated: The IA does not create a single monolithic twin. When Data Owners establish their NDT Node, the IA will allow them to publish details of the data they want to share to an NDT data catalogue; other users can then browse, select and subscribe to the data they need to build a twin relevant to their needs. This subscription is on a node-to-node basis, not via a central twin or data hub, and Owners can specify the access, use, or time constraints they may wish to apply to that subscriber. Once subscribed, the IA takes care of authenticating users and updating and synchronising data between nodes.
  • Data-driven access control: To build trust in the IA, Data Owners must be completely comfortable that they retain full control over who can access the data they share to the NDT. The IA will use an ABAC (attribute-based access control) security model to allow owners to specify in fine-grained detail who can access their data, and permissions can be added or revoked simply and transparently. This is implemented as data labels which accompany the data, providing instructions to receiving systems on how to protect the data.
  • IMF Ontology Driven: NDT information needs to be accessed seamlessly. The NDT needs a common language so that data can be shared consistently, and this language is being described in the IMF Ontology and Foundation Data Model being developed by another element of the IMF team. The IA team is working closely with them to create capabilities that will automate conversion of incoming data to the ontology and transact it across the architecture without requiring further "data wrangling" by users.
  • Simple Integration: To minimise the risk of implementation failure or poor engagement due to architectural incompatibility or high cost of implementation, the IA needs to be simple to integrate into client environments. The IA will use well-understood architectural patterns and technologies (for example REST, GraphQL) to minimise local disruption when data owners create an NDT node, and to ensure that, once implemented, the ongoing focus of owner activity is on where the value is – the data – rather than on maintenance of the systems that support it.
  • Cloud and On-Prem: An increasing number of organisations are moving operations to the cloud, but the IA team recognises that this may not be an option for everyone. Even when cloud strategies are adopted, the journey can be long and difficult, with hybridised options potentially being used in the medium to long term. The IA will support all these operating modes, ensuring that membership of the NDT does not negatively impact existing or emerging environment strategies.
  • Open Standards: For similar reasons to making the IA open source, the IA team is committed to ensuring that data in the NDT IA are never locked in or held in inaccessible proprietary formats.

What has the IA team been up to this year?

The IMF chose to adopt the existing open-source Telicent CORE platform to handle the ingest, transformation and publishing of data to the IMF ontology within NDT nodes, and the focus has been on beginning to build and prove some of the additional technical elements required to make the cross-node transactional and security elements of the IA function. Key focus areas were:

  • Creation of a federation capability to allow Asset Owners to publish, share and consume data across nodes
  • Adding ABAC security to allow Asset Owners to specify fine-grained access to data
  • Building a 'Model Railway' to create an end-to-end test bed for the NDT Integration Architecture, and to prove out deployment in containers
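The data-label idea described above can be illustrated with a minimal sketch. This is not the actual IA schema: the label fields, attribute names and organisation names below are hypothetical, chosen only to show how an ABAC-style check against a label travelling with the data might work.

```python
from dataclasses import dataclass, field
from typing import Optional, Set

@dataclass
class DataLabel:
    """Hypothetical security label that accompanies a shared dataset."""
    owner: str
    allowed_orgs: Set[str] = field(default_factory=set)      # who may access
    allowed_purposes: Set[str] = field(default_factory=set)  # why they may access
    expires: Optional[str] = None                            # ISO date, or None

@dataclass
class Subscriber:
    org: str
    purpose: str

def can_access(label: DataLabel, user: Subscriber, today: str) -> bool:
    """ABAC-style check: every attribute constraint on the label must hold."""
    if user.org not in label.allowed_orgs:
        return False
    if label.allowed_purposes and user.purpose not in label.allowed_purposes:
        return False
    if label.expires is not None and today > label.expires:
        return False
    return True

label = DataLabel(owner="Water Co",
                  allowed_orgs={"Energy Co", "Telecoms Co"},
                  allowed_purposes={"flood-resilience"},
                  expires="2022-12-31")

print(can_access(label, Subscriber("Energy Co", "flood-resilience"), "2022-06-01"))  # True
print(can_access(label, Subscriber("Energy Co", "marketing"), "2022-06-01"))         # False
```

Revoking or granting access is then simply a matter of editing the label, which is what makes the model transparent for Data Owners.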
  2. The vision of a National Digital Twin as an ecosystem of connected digital twins, enabling better social and economic outcomes across the built environment, continues to gain wide support. But to make it a reality, we need people with the right skills to put it into play. "Collaborate on the rules and compete on the game" is a phrase we use to describe how we want connected digital twins to evolve. The sporting analogy carries over well into skills: we want the best teams to deliver on the National Digital Twin, not just a team of strikers or goalkeepers, but diverse teams with a range of skillsets and capabilities. Diversity has to be at the heart of a skills strategy, ensuring that the future workforce is more effective. The Skills and Competency Framework sets out the skills needed to manage information and work with data in the future. These aren't just what we might see as hardcore technical skills, such as data modelling and analytics, which are described as digital skills, but also business skills, like transformational leadership, which recognise the benefits of getting information management right. The Capability Enhancement programme sets out pathways for individuals and organisations to get the right skills in place, depending on aspirations at both the personal and the organisational level. Have a go at the self-assessment questionnaire to assess what training might be helpful to you, and take a look at the training register to find a suitable course. The National Digital Twin is a long-term journey and there is time to get the right skills in place to reach our destination.
  3. You're invited to a webinar on 2nd March to find out how collaboration through connected digital twins can help plan resilient cities and infrastructure. The National Digital Twin programme has developed the Climate Resilience Demonstrator (CReDo), a pioneering climate adaptation digital twin project that provides a practical example of how connected data can improve climate adaptation and resilience across a system of systems. Watch the film Tomorrow Today, and try the interactive app, to see what CReDo has been working towards. The CReDo team will use synthetic data developed through the project to show how it is possible to better understand infrastructure interdependencies and increase resilience. Join the webinar to hear from the CReDo team about the work that has happened behind the scenes of developing a connected digital twin.

CReDo is the result of a first-of-its-kind collaboration between Anglian Water, BT and UK Power Networks, in partnership with several academic institutions. The project has been funded by Connected Places Catapult (CPC) and the University of Cambridge, and technical development was led by CMCL and the Hartree Centre. This collaboration produced a demonstrator that looks at the impact of flooding on energy, water and telecoms networks. CReDo demonstrates how owners and operators of these networks can use secure, resilient information sharing across sector boundaries to adapt to and mitigate the effect of flooding on network performance and service delivery. It also provides an important template to build on and apply to other challenges, such as climate change mitigation and Net Zero.

Hear from members of the CReDo team – including the asset owners, CPC, and the technical development team – about the demonstrator they have delivered and the lessons they learned. If you're interested in using connected digital twins to forge the path to Net Zero, then this event is for you.

Register for our end-of-project webinar on 2nd March, 10:30 – 12:00: https://www.eventbrite.co.uk/e/credo-collaborating-and-resilience-through-connected-digital-twins-tickets-228349628887
  4. Version 1.0.0

    6 downloads

    After introducing the Information Management Framework (IMF) and its scope, Matthew West, Technical Lead, National Digital Twin Programme, touches on why the IMF doesn't already exist and provides insights into the macro-economic enablers needed to support its development and adoption. One of the key points is looking into the value and benefits of information in order to establish the business case for the required market enablement. Click the download button on the right to access the slide deck and watch the recording below.
  5. At this morning's Gemini Call NDTp update, @Peter El Hajj presented a timeline of milestones from across the programme in 2021. From breaking through the 3,000-member barrier here in the DT Hub community, to projects that have received wide recognition such as the CReDo film, a lot has happened that you may not know about. Here is the timeline, and I have taken the liberty of sharing links to information on all of the key milestones below:

  • Digital Twin Toolkit and Collaborative Workshop
  • 2020 Benchmark Report
  • Skills and Competency Framework
  • Capability Enhancement programme
  • Agile Standards white paper
  • The 7 Circles of Information Management and Process Model-based Information Requirements
  • Cyber Physical Summit
  • 1 Year of Gemini Calls
  • The journey towards 'grounded semantics'
  • CReDo webinar and film
  • Ethics and the Gemini Principles report

Would you like to give some recognition for a job well done in 2021? Whether it's for any of the CDBB team or even within your own organisation, feel free to share it below. On a final note, it's been a pleasure to be your Community Manager for the last two months and to have seen so many faces on Gemini Calls, and I'm excited to establish our brand new DT Hub Community Council in 2022!
  6. Anne GUINARD

    Updates to the Network

    We would like to announce a couple of changes to the IMF Community network.

You may be aware that the content of this open network has been publicly available since 30 November 2021. This means that posts in the forum are also visible to guests of the website (non-registered members).

In line with the change in permissions, we have also refreshed the description of the network, to align with an enriched definition of the IMF. Following the publication of the Pathway towards an Information Management Framework, we have enriched the definition of the IMF to emphasise the importance of the non-technical elements of the framework, which have always been recognised as crucial for the realisation of a National Digital Twin. Non-technical and technical elements are placed on an equal footing: the Information Management Framework is a collection of open, technical and non-technical standards, guidance and common resources to enable the seamless sharing of data across organisations and sectors.

The changes to the network's reach and description reflect the widening scope and relevance of the IMF, and position the network as a window into the IMF team's thinking and as a unique channel for members to contact the team and ask questions about IMF resources.
  7. Matthew West, Technical Lead for the NDTp, recently gave an internal presentation on the Information Management Framework (IMF) with a focus on its Change dimension. After introducing the IMF and its scope, the presentation touches on why the IMF doesn't already exist and provides insights into the macro-economic enablers needed to support its development and adoption. One of the key points is looking into the value and benefits of information in order to establish the business case for the required market enablement. Access the slide deck and watch the recording below:
  8. Bringing CReDo to life

With COP26 on the horizon, we are fully immersed in preparing to showcase the Climate Resilience Demonstrator (CReDo). We have appointed two partners to help us communicate the story in an engaging and inspiring way and demonstrate the huge potential of information sharing. Firstly, we are working with Crocodile Media to develop a short, dramatic film that will tell the story of a flooding event and how connected digital twins may provide a better response to climate disasters. The second partnership is with ESRI, a provider of online maps and 3D models of cities, who are developing an interactive demonstrator that will allow the public to test out various scenarios on a made-up city. The purpose of both is to demonstrate how information sharing across organisational boundaries is a key enabler of improving the resilience of infrastructure systems. We have organised an event, "Increasing our climate resilience through connected digital twins", on the 2nd of November to watch the film, see the interactive tool in action and find out more about how connected digital twins can help to tackle climate change. We're delighted that the project doesn't end with COP26: the technical development of CReDo will continue into next year and will be delivered through a collaboration of research centres and industry partners. The Universities of Cambridge, Edinburgh, Exeter, Newcastle and Warwick will work alongside the Hartree Centre, DAFNI, the Science and Technology Facilities Council, CMCL Innovations, the Joint Centre of Excellence in Environmental Intelligence, CPNI and Mott MacDonald. We are also delighted to be working in partnership with three major UK utility providers – Anglian Water, BT and UK Power Networks – who are equally committed to making bold steps towards resilient infrastructure.

Progress on the IMF's seven circles

We have been moving forward with all seven circles of the Information Management Framework, from top-level ontologies to integration architecture to information quality management. One document I particularly want to highlight is 'Managing Shared Data', an exciting piece of work being developed by @Matthew West, Technical Lead for the NDTp. He is bringing together the lessons we've learned over the past three years since publication of the Pathway towards an IMF report, and providing clarity on what it means for organisations to manage information effectively, an essential enabler for connected digital twins. It is in development and we're hoping to release the final document by the end of the year.

DT Hub

There are three main activities to highlight for October:

  • DT Hub website update. We're keen to keep improving the usability and layout of the site, so the new version of the DT Hub will include a public-facing page, with all the resources, to make it easier to access public documents. It also includes a page to host all information related to CReDo.
  • DT Roadblocks workshop series. As the community progresses on their digital twin journeys, it is inevitable there will be a myriad of challenges. The great aspect of being part of a community is that there are others who have faced similar challenges and can share their learnings or provide insights into how to overcome your particular hurdle. Our first workshop is aptly named "Problems shared, problems halved". If you would like to be part of a constructive discussion, do sign up to this series, running until the end of the year.
  • Smart Infrastructure Index. We have just launched our latest SII survey to enable members to measure their digital maturity and benchmark progress against peers. When members complete and submit the survey, the SII will generate a personalised report including a score and targeted recommendations. The idea is that it enables users to identify areas for improvement and supports the prioritisation of future activities. The survey is open until mid-November and can be accessed here.
  9. Danny Murguia

    Network FOuNTAIN Online Survey

    Information ontologies and management in a digitised built environment

The Network FOuNTAIN is inviting professionals in the AEC industry to participate in our survey that investigates the role of information ontologies and management for a digitised built environment. If you have experience in managing digital information, help us understand the level of adoption of information ontologies and information management activities, and their relationship with performance outcomes. To access our survey, please click here: https://lboro.onlinesurveys.ac.uk/network-fountain-survey. For comments and feedback, please contact the Principal Investigator, Dr Peter Demian, at P.Demian@lboro.ac.uk.
  10. The Climate Resilience Demonstrator (CReDo) project from the National Digital Twin programme is holding a webinar to launch the project to a global audience, in conjunction with the COP26 climate conference, on 2nd November at 10:30–12:00. This webinar replaces the weekly Gemini Call, and the DT Hub community are encouraged to sign up, as well as to invite their wider networks to attend. The climate emergency is here now, and connected digital twins are an important part of achieving net zero and climate resilience. The CReDo team will present how the project meets this urgent need, and will premiere two exciting outputs: a short film and an interactive visualisation of how connected data across three infrastructure networks can provide better insights and lead to better resilience of the system-of-systems overall. Only if we come together to securely share data across sectors can we plan a smarter, greener, more resilient built environment. Book your spot today! Keep an eye on the DT Hub website for updates about the CReDo programme.
  11. I came across an EU-funded project, "xr4all", which provides a development environment (among other things) for XR projects. The details are here: https://dev.xr4all.eu Will it be possible for the NDT programme to provide a similar platform for the DT community in the UK? It would help in fostering rapid collaboration and development of the DT ecosystem. Thanks and kind regards, Ajeeth
  12. Version 1.0.0

    98 downloads

    A Survey of Industry Data Models and Reference Data Libraries, published in November 2020, is the initial version of an ongoing piece of work to identify and assess existing Industry Data Models and Reference Data Libraries in terms of scope, quality and formalisation of the documentation, formalisation of the representation, maintenance frequency, usage … This survey is key to identifying existing industry data models that may have to be mapped into the NDT's Foundation Data Model and Reference Data Library (RDL). Additionally, this work is intended to help the technical team identify potential authoritative sources for key classes that will become part of the NDT's RDL (for instance, units of measure). The list is open, and more standards are being added to the survey on the DT Hub. Please refer to this page to see the most up-to-date list, and don't hesitate to suggest standards for the team to add to the list.
  13. Version 1.0.0

    128 downloads

    To underpin the sharing of data across organisations and sectors, the National Digital Twin programme (NDTp) aims to develop an ontology – a theory of what exists, i.e. the things that exist and the rules that govern them – capable of describing "life, the universe and everything" (Adams, 1980). As set out in the Pathway towards the IMF, this ontology will consist of:

  • a Foundation Data Model – the rules and constraints on how data is structured
  • an ecosystem of Reference Data Libraries, compliant with the Foundation Data Model – the particular set of classes and properties we use to describe digital twins.

To achieve the consistency required to share information across organisations and sectors, the Foundation Data Model needs to be underpinned by a Top-Level Ontology (TLO), i.e. the top-level categories ("thing", "class", …) and the fundamental relationships between them that are sufficient to cover the scope of a maximally broad range of domains. As a starting point for defining the NDT's TLO, the technical team has reviewed and classified existing TLOs in A Survey of Top-Level Ontologies, published in November 2020, and assessed them against a set of requirements. This has led to the identification of four potential candidates for the NDT's TLO: BORO, IDEAS, HQDM and ISO 15926-2, as set out in The Approach to Develop the FDM for the IMF publication. Ontologies continue to be added to the survey. Please refer to this page to see the most up-to-date list and to suggest other ontologies for the team to consider.
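The layering described above – a Top-Level Ontology underpinning a Foundation Data Model, with Reference Data Libraries supplying the domain classes – can be sketched informally. The category and class names below are purely illustrative and are not drawn from BORO, IDEAS, HQDM or ISO 15926-2.

```python
# Top-Level Ontology: a handful of maximally general categories.
TLO = {"thing", "class", "relationship"}

# Foundation Data Model: types constrained against the TLO.
# Here, one rule only: every FDM type must specialise a TLO category.
FDM_TYPES = {
    "PhysicalObject": "thing",
    "Activity": "thing",
    "ClassOfPhysicalObject": "class",
}

# Reference Data Library: the particular classes used to describe twins,
# each compliant with (i.e. typed by) an FDM type.
RDL = {
    "Pump": "PhysicalObject",
    "Substation": "PhysicalObject",
    "MaintenanceActivity": "Activity",
}

def rdl_is_compliant(rdl, fdm_types, tlo):
    """Every RDL class must ground out in an FDM type, and every FDM type in a TLO category."""
    return (all(fdm in fdm_types for fdm in rdl.values())
            and all(cat in tlo for cat in fdm_types.values()))

print(rdl_is_compliant(RDL, FDM_TYPES, TLO))   # True
```

The point of the layering is exactly this kind of guarantee: any two RDLs built on the same FDM (and hence the same TLO) describe their twins in a mutually interpretable way.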
  14. 122 downloads

    As set out in the Pathway to the Information Management Framework, the Integration Architecture is one of the three key technical components of the Information Management Framework, along with the Reference Data Library and the Foundation Data Model. It consists of the protocols that will enable the managed sharing of data across the National Digital Twin. In the Integration Architecture Pattern and Principles paper, the National Digital Twin programme's (NDTp) technical team sets out key architectural principles and functional components for the creation of this critical technical component. The team defines a redeployable architectural pattern that allows the publication, protection, discovery, query and retrieval of data that conforms to the NDT's ecosystem of Reference Data Libraries and the NDT's Foundation Data Model. The paper will take you through:

  • A requirements overview: a series of use cases that the Integration Architecture needs to enable, including:
    • routine operational use cases, where data from a diverse set of organisations can be shared and analysed for a single purpose (e.g. to support legal and regulatory requirements)
    • the ability to respond to an emergency, pulling together data from across different communities in a way that was not foreseen before the incident that caused the requirement
    • 'business as usual' NDT maintenance use cases, such as publishing a Digital Twin or adding a user to the NDT ecosystem.
  • Architectural principles: key principles that must be adhered to regardless of the type of architecture that is implemented, including:
    • Data quality: data quality needs to be measurable and published with the data itself
    • Privacy of the published data: the Integration Architecture shall ensure that data is shared and used only according to the conditions under which it was published
    • Security: ensuring that all data and functions are secure from bad actors. Encryption will be a particularly key aspect of the security features in the Integration Architecture.
  • Recommended integration architecture pattern: three general architectural pattern options are explored in the paper (centralised, distributed, and federated), and the benefits and concerns of each are discussed with respect to the requirements. The recommended architectural pattern is a hybrid of these three approaches, centralising certain functions whilst distributing and federating others. It is intended to allow datasets to be shared locally (i.e. within an NDT Node, see figure below), but will also allow inter-node discovery, authorisation and data sharing to take place. NDT Nodes may be established by individual organisations, regulators and industry associations, or service providers, and will be able to handle Digital Twins on behalf of their constituent organisations and provide a secure sharing boundary. In the recommended architecture, datasets are published by the data owner (1) and made available to the organisations within the community of interest; in addition, an event is issued to register publication with the Core (2). When queries are submitted (A), the dataset can then be discovered by organisations in other communities of interest (B) and retrieved where appropriate (C). Release, discovery and retrieval are carried out according to the authorisation service, so that access is controlled as specified by the data owner.
  • Detail of the functional components: the Core Services are likely to be quite thin, comprising mainly:
    • a master NDT Catalogue that holds the location of available NDT Datasets across the ecosystem
    • the master FDM/RDL, which will synchronise with the subset that is relevant for each NDT Node
    • a publish/subscribe model to propagate data changes to parties that have an interest and an appropriate contract in place.
    The Core and each NDT Node shall interact through a microservice layer, with which they shall have to be compliant.
  • Next steps: the paper concludes with a list of ten key tasks to develop the Integration Architecture components further. We will keep you informed of progress and, in the meantime, we look forward to hearing your questions and comments.
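The publish → register → discover → retrieve flow described above (steps 1–2 and A–C) can be sketched in miniature. This is a toy model, not the IA implementation; the class and method names are illustrative only, and a real deployment would route every step through the authorisation service.

```python
class CoreCatalogue:
    """Thin core service: records *where* datasets live, not the data itself."""
    def __init__(self):
        self.entries = {}                       # dataset id -> node holding it

    def register(self, dataset_id, node):       # step 2: publication event
        self.entries[dataset_id] = node

    def discover(self, dataset_id):             # steps A/B: query and discovery
        return self.entries.get(dataset_id)

class NDTNode:
    """A node holds datasets on behalf of its constituent organisations."""
    def __init__(self, name, core):
        self.name, self.core, self.datasets = name, core, {}

    def publish(self, dataset_id, data):        # step 1: owner publishes locally
        self.datasets[dataset_id] = data
        self.core.register(dataset_id, self)    # step 2: notify the catalogue

    def retrieve(self, dataset_id):             # step C: node-to-node fetch
        node = self.core.discover(dataset_id)
        if node is None:
            return None
        # A real IA would consult the authorisation service here
        # before releasing the data to the requesting node.
        return node.datasets[dataset_id]

core = CoreCatalogue()
water = NDTNode("water-utility", core)
energy = NDTNode("energy-network", core)

water.publish("flood-zones-2021", {"zones": 3})
print(energy.retrieve("flood-zones-2021"))      # {'zones': 3}
```

Note that the dataset itself never passes through the Core: only its location does, which is what keeps the Core Services "thin" and the sharing node-to-node.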
  15. Version 1.0.0

    45 downloads

    In May 2020 the National Digital Twin programme (NDTp) published the Pathway to an Information Management Framework (IMF). The publication was accompanied by an open consultation to seek feedback on our proposed approach and to hear from across the community about how they thought the IMF should develop to support their use and adoption of it. The consultation ran until the end of August and, together with ongoing engagement with the programme's technical stakeholders, yielded a great deal of valuable feedback. The full summary of the IMF Pathway Consultation Responses, written by Miranda Sharp, NDTp Commons Lead, is published here today.
  16. Version 1.0.0

    98 downloads

    The Approach to Develop the Foundation Data Model, published in March 2021, follows up on the Survey of Top-Level Ontologies (TLOs) published in November 2020. It sets out the Top-Level Ontology requirements for the NDT's Foundation Data Model. Drawing upon the assessment of the TLOs listed in the survey, it identifies four potential candidates for the NDT's TLO: BORO, IDEAS, HQDM and ISO 15926-2. The four candidates are distinct from the other TLOs in being 4-dimensionalist, i.e. they consider individual objects as four-dimensional, with both spatial and temporal parts.
  17. Version 1.0.0

    174 downloads

    The Information Management Framework (IMF) is intended to enable better information management and information sharing at a national scale, and to provide the standards, guidance and shared resources to be (re)used by those wishing to participate in the National Digital Twin ecosystem. While the scope of the IMF is broad, the "7 circles of Information Management" diagram is a pragmatic way to divide the information management space into coherent areas of concern that can be addressed relatively independently whilst supporting each other.

As part of the second circle of the diagram, the IMF technical team has released this paper outlining our recommended approach to developing information requirements, based on the analysis of process models. The methodology first identifies an organisation's processes, then the decisions taken as part of each process, and finally the information requirements that support those decisions. These requirements are communicated to those who create the information. This provides a systematic approach to identifying the information requirements and the point at which the information is most cost-effectively created. Managed appropriately, this information capture can avoid costly activity to create information by surveying or inspecting in-use assets. To allow this anticipation of information needs, the methodology set out in the paper recommends the following steps:

  • identify the lifecycle activities that an organisation performs
  • decompose the activities in order to identify the material "participants" involved (the "things" required for each activity: people, resources, assets, equipment, products, other activities, …)
  • identify the decisions critical for these activities
  • identify the information requirements for those decisions, and the quality required.

Read more in the blog containing a video introduction to the "7 circles of Information Management" by IMF Technical Team Lead Matthew West, followed by a deep dive into the second circle – Process Model based Information Requirements – presented by Al Cook, main author of this paper.
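The methodology steps above can be sketched as a simple data structure. This is a hypothetical worked example, not the paper's own notation: the activity, participants and decision below are invented to show how requirements roll up from decisions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Decision:
    question: str
    information_required: List[str]   # step 4: requirements (and quality) per decision

@dataclass
class Activity:
    name: str
    participants: List[str]           # step 2: people, resources, assets, equipment, ...
    decisions: List[Decision] = field(default_factory=list)  # step 3

# Step 1: identify a lifecycle activity the organisation performs
maintain = Activity(
    name="Maintain pumping station",
    participants=["maintenance engineer", "pump", "spare parts"],
    decisions=[
        Decision("Replace or repair the pump?",
                 ["pump condition survey (< 6 months old)",
                  "spare-part lead time",
                  "replacement cost"]),
    ],
)

# Roll the requirements up, to communicate to those who create the information
requirements = [req for d in maintain.decisions for req in d.information_required]
print(requirements)
```

Capturing requirements this way, before the information is needed, is what allows the cost-effective creation the paper describes, rather than surveying in-use assets after the fact.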
  18. 106 downloads

    The National Digital Twin programme, in partnership with the Construction Innovation Hub, has released a Capability Enhancement programme as a follow-on from the Skills and Competency Framework published in March 2021. The Capability Enhancement programme aims to provide organisations and individuals with tools and guidance to understand and cultivate the skills and knowledge needed to support the National Digital Twin, as outlined in the Skills and Competency Framework. It includes a self-assessment survey to help individuals assess their competency level against a chosen role, and a training register that provides a starting point for individuals and organisations to develop a training plan:

  • Self-Assessment Questionnaire
  • Training Register

Watch this video for further insights into the steps you can take to help your organisation grow its information management maturity and get ready to become part of the National Digital Twin:
  19. Version 1.0.0

    224 downloads

    Following a year-long consultation exercise bringing together leading experts from the data science and information management communities, The Pathway towards an Information Management Framework (IMF) report was published in May 2020. The report sets out the technical approach to delivering the Information Management Framework: a common language by which digital twins of the built and natural environment can communicate securely and effectively, to support improved decision-taking by those operating, maintaining and using built assets and the services they provide to society. The report outlines three building blocks needed to form an appropriately functioning technical core:

  • Foundation Data Model (FDM): a consistent, clear understanding of what constitutes the world of digital twins
  • Reference Data Library (RDL): the particular set of classes and properties we will use to describe our digital twins
  • Integration Architecture (IA): the protocols that will enable the managed sharing of data.

A webinar, The Pathway towards an Information Management Framework, was held on 8 June 2020; you can watch it here:

Following the publication of the report, an open consultation was run to gather feedback on the proposed methodology, and the feedback was consolidated in the following summary.
  20. During Tuesday’s Gemini call, the above was raised to help promote awareness of the CDBB Digital Twin programme with developers and the like. This struck me as a pretty good idea. So, based on the Gemini Principles and my understanding of the IMF pathway document, the below is a draft suggestion for the pot, to provoke the thoughts and ideas of the community: The IMF is rooted in the Gemini Principles: a collaborative top-down approach, driven by bottom-up integrated processes, embracing holistic systems thinking and pragmatic ontology, enabled by secure digital platforms, to deliver better delivery and asset lifecycle outcomes. Its key value proposition is that it enables the story of an asset, infrastructure system or system of systems to be told, registering its trigger events and the evidence- and risk-based decision-making from cradle to grave: a digital golden thread generating future benefits.
  21. Matthew West, Technical Lead, National Digital Twin programme, introduces a video on the 7 circles of Information Management and Process Model Information Requirements. Join Matthew and Al Cook, a member of the NDTp technical team and an expert in data integration and information security, as they take you through key elements of the Information Management Framework and detail a new approach to effective information management. The video is available to view below, with a live Q&A session from 10:00 to 10:30 on Thursday 15 July 2021.

Access to high-quality, well-managed information is key to supporting decision-making and optimising outcomes at all levels of an organisation. Decisions based on poor-quality information, or no information at all, can significantly increase the risk of mistakes or even disasters. Systematic information management ensures the ability to deliver the right information to the right decision-makers at the right time. It is a critical success factor for the National Digital Twin (NDT), an ecosystem of connected digital twins where high-quality data is shared securely, at massive scale, to improve decision-making across the UK.

The “7 circles of Information Management”: developing the Information Management Framework

The Information Management Framework (IMF), a collection of open technical and non-technical standards, guidance and common resources, is intended to enable better information management and information sharing at a national scale, and to provide the building blocks to be (re)used by those wishing to be part of the NDT. The scope of the IMF is broad, and the “7 circles diagram” that I introduce in the video below is a pragmatic way to divide the information management space into areas of concern that can be addressed separately while supporting each other. It is intended to help you identify the areas, and associated NDTp deliverables, that are of particular relevance to you.
The technical aspects of the IMF may come to mind first. On top of “information transport” mechanisms, together with authorisation and security protocols to ensure that information can be accessed seamlessly, the NDT needs a language, an interlingua, so that data can be shared consistently and used to support decisions without requiring further “data wrangling”. To develop this common language (the NDT’s ontology), the team is pursuing a principled approach, deeply grounded in mathematics and science, to ensure that it is as extensible and all-encompassing as possible. This is what the deepest circles of the 7 circles diagram are about.

There is, however, much more to the Information Management Framework than the purely technical aspects. As part of the highest circles of the 7 circles diagram, we are developing guidance on how to systematically improve information management, so that producing data that meets the quality standards required to be part of the NDT becomes part of “business as usual”.

A first step towards better information management: defining your information requirements

While the NDT’s ontology is being developed, steps can already be taken towards better information management. Organisations need to recognise the need to address data quality in a way that enables improved decisions within their own business and with those they have data-based relationships with. Defining information requirements (the second circle in the stack) is a key starting point.

Process Model based Information Requirements

Too often, information requirements are incomplete or even absent in organisations, and if requirements are not identified and agreed there is no reason to expect that they will be met.
As part of the second circle of the “7 circles diagram”, the team has released a paper outlining the proposed approach to developing information requirements, based on the analysis of process models. This novel approach ensures the systematic identification of the information needed (no more, no less) to support decisions, and identifies where that information is captured. I encourage you to watch Al Cook’s presentation in the second part of the video to find out more about the approach.

The team and I hope to share more detailed guidance on information management in the near future, helping you to assess your organisation’s current information management maturity, prioritise areas where decision-making can clearly be improved, and start addressing them, so that mistakes can be avoided and better outcomes achieved. As we continue to develop the Information Management Framework, we look forward to accompanying you through the discovery of the other circles of Information Management.

This video contains an introduction to the 7 circles of Information Management, presented by Matthew West, followed by a presentation by Al Cook on a suggested approach to defining information requirements. Al and Matthew look forward to answering your questions and talking about next steps in a live Q&A session on the DT Hub on 15/07 from 10:00 to 10:30.
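In miniature, the process-model-based approach amounts to walking a process model, collecting the information each decision depends on, and comparing that against what the process actually captures. The sketch below illustrates that idea only; the class names, the example "pump maintenance" process and its information items are invented for illustration and are not taken from the NDTp paper.

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    """A decision point identified in a process model."""
    name: str
    needs: list  # information items this decision depends on

@dataclass
class ProcessModel:
    name: str
    decisions: list = field(default_factory=list)
    captured: set = field(default_factory=set)  # information the process already captures

def derive_requirements(model: ProcessModel):
    """Collect the information required by every decision (no more, no less)
    and flag gaps: items a decision needs but the process never captures."""
    required = {item for d in model.decisions for item in d.needs}
    gaps = required - model.captured
    return required, gaps

# Hypothetical example process
model = ProcessModel(
    name="pump maintenance",
    decisions=[
        Decision("schedule inspection", ["asset age", "last inspection date"]),
        Decision("replace or repair", ["condition grade", "replacement cost"]),
    ],
    captured={"asset age", "last inspection date", "condition grade"},
)
required, gaps = derive_requirements(model)
print(sorted(gaps))  # information requirements not yet met
```

The point of the exercise is the gap list: each entry is an information requirement that, once identified and agreed, can actually be met.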
  22. Hello everyone! What approach would you all suggest if the project/idea being worked on is in an upcoming new domain where a DT seems to be a perfect match? My project idea is in the education space, and I think a DT will make a huge difference and revolutionise the way education is delivered. I would like some guidance on how to start with respect to DT. What should be my starting point? I am a software engineer, so I am comfortable with software development, tools and libraries. Thanks and kind regards, Ajeeth
  23. The digital future of the built environment relies on the people who will create it. In our integrated world, over two thirds of UK leaders say their organisation is facing a digital skills gap (Microsoft, 2020). We have both a challenge and an opportunity: to close this gap whilst realising the benefits of the National Digital Twin. Working as part of the Mott MacDonald and Lane4 team appointed by the Construction Innovation Hub, we have developed a Skills and Competency Framework to raise awareness of the skills and roles needed to deliver a National Digital Twin. The skills and roles identified relate specifically to the Information Management Framework (IMF), the core component of the National Digital Twin that will enable digital twins to speak the same language.

The future of the National Digital Twin is in your hands

Seize the opportunity to use this Skills and Competency Framework to underpin digital twin development and IMF adoption. Without understanding the skills and roles required, there is a risk that organisations deploy staff lacking the skills to develop their digital twins. A skills gap could also lead to poorly designed digital twins that do not support interoperability and connectivity with the IMF, or to failed digital twin pilots and projects with direct economic consequences for those organisations.

Accelerating progress with skills development

With the Skills and Competency Framework, we can accelerate progress, reduce the rate of digital twin failure and ensure consistency in the approach to enabling the National Digital Twin, all while establishing a pathway for digital skills and capability enhancement across the UK.
We can do this by:

· Communicating the value of data as infrastructure, and the importance of data literacy, quality and security
· Taking a systems-thinking approach to see data, technology and process as part of an interconnected ecosystem
· Having a collaborative and adaptable culture that is benefits-driven, focused on the outcomes to achieve, and that recognises the role people play in achieving them

Find out how to achieve this by using the Skills and Competency Framework, and stay tuned for a supporting Capability Enhancement Programme with role-based training plans and skills self-assessments.

Learn by doing, progress by sharing

This Skills and Competency Framework is the first of its kind, but the topic of digital skills development in our industry is not new. Throughout the development of the Framework, we have engaged with stakeholders and material from many bodies, such as the Construction Industry Training Board (CITB), the Open Data Institute and other CDBB initiatives around skills. We intend to progress the Framework by sharing it with industry and connecting with other bodies, industries and people whose purposes and goals are similar to CDBB's. We are open, we are collaborative and we are ready to close the skills gap.
  24. We all know that ontologies have a massive role to play in the realisation of the Information Management Framework and the wider National Digital Twin programme. But damn(!), they can be hard to work with. Create an ontology of any scale and existing academic tools such as Protégé become unmanageable pretty quickly. And that's before you try to explain your ontology to any sort of Normal Human. Even the most carefully curated ontology can be flabbergasting to the majority of people. If we are going to use ontologies to define the logic behind digital twins, and crucially if we expect to be able to explain that logic to Normal Humans, then we need a better way of visualising, filtering and editing our ontologies. That's where OntoPop comes in. It is intended as a free-to-use, open-source, non-proprietary ontology visualisation tool. Highways England have funded the OntoPop MVP using innovation funding. Our hope is that we can expand its use across the other infrastructure owners and suppliers involved in the National Digital Twin programme, and use it to co-develop and own functionality that ultimately we are all going to need at some point. The MVP of OntoPop is now available to play with at the link below. Please visit https://ontopop.com/ and tell us what you think; all feedback is appreciated. We're particularly interested in whether you would like to work on this project with us.
  25. As set out in the Pathway to the Information Management Framework, the Integration Architecture is one of the key technical components of the Information Management Framework. It consists of the protocols that will enable the managed sharing of data across the National Digital Twin. In the recently released Integration Architecture Pattern and Principles paper, the NDTp’s technical team set out key architectural principles and functional components for the creation of this critical component. The team defines a redeployable architectural pattern that allows the publication, protection, discovery, query and retrieval of data that conforms to the NDT’s ecosystem of Reference Data Libraries and the NDT’s Foundation Data Model.

Download the Integration Architecture Pattern and Principles paper

The paper will take you through:

A requirements overview: a series of use cases that the Integration Architecture needs to enable, including:
· routine operational use cases, where data from a diverse set of organisations can be shared and analysed for a single purpose (e.g. to support legal and regulatory requirements)
· the ability to respond to an emergency, pulling together data from across different communities in a way that was not foreseen before the incident that caused the requirement
· ‘business as usual’ NDT maintenance use cases, such as publishing a digital twin or adding a user to the NDT ecosystem.

Architectural principles: key principles that must be adhered to, regardless of the type of architecture that is implemented, including:
· Data quality: data quality needs to be measurable and published with the data itself
· Privacy of the published data: the Integration Architecture shall ensure that data is shared and used only according to the conditions under which it was published
· Security: ensuring that all data and functions are secure from bad actors.
Encryption will be a particularly key aspect of the security features of the Integration Architecture.

Recommended integration architecture pattern: three general architectural pattern options are explored in the paper (centralised, distributed and federated), and the benefits and concerns of each are discussed with respect to the requirements. The recommended pattern is a hybrid of the three approaches, centralising certain functions whilst distributing and federating others. It is intended to allow datasets to be shared locally (i.e., within an NDT Node, see figure below), while also allowing inter-node discovery, authorisation and data sharing to take place. NDT Nodes may be established by individual organisations, regulators and industry associations, or service providers; they will be able to handle digital twins on behalf of their constituent organisations and provide a secure sharing boundary.

In the recommended architecture: datasets are published by the data owner (1) and made available to the organisations within the community of interest; in addition, an event is issued to register the publication with the Core (2). When queries are submitted (A), the dataset can be discovered by organisations in other communities of interest (B) and retrieved where appropriate (C). Release, discovery and retrieval are carried out according to the authorisation service, so that access is controlled as specified by the data owner.

Detail of the functional components: the Core Services are likely to be quite thin, comprising mainly:
· a master NDT Catalogue that holds the location of available NDT Datasets across the ecosystem
· the master FDM/RDL, which will synchronise with the subset that is relevant for each NDT Node
· a publish/subscribe model to propagate data changes to parties that have an interest and an appropriate contract in place.
The Core and each NDT Node will interact through a microservice layer with which both must be compliant.

Next steps

The paper concludes with a list of 10 key tasks to further develop the Integration Architecture components. We will keep you informed of progress, and in the meantime we look forward to hearing your questions and comments on the paper!
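The publish (1, 2), query (A), discover (B) and retrieve (C) flow described above can be sketched as follows. This is a minimal illustration only: the `Core` and `Node` classes, the "water-utility" node and the "flood-sensors" dataset are invented for the example, and the real Integration Architecture specifies far richer catalogue, publish/subscribe and authorisation services than the single authorisation check shown here.

```python
class Core:
    """Thin Core Services: a master catalogue of dataset locations
    plus a publish/subscribe hook for registration events."""
    def __init__(self):
        self.catalogue = {}    # dataset id -> owning node
        self.subscribers = []  # callbacks notified of new publications
    def register(self, dataset_id, node):
        self.catalogue[dataset_id] = node
        for notify in self.subscribers:
            notify(dataset_id)
    def discover(self, dataset_id):
        return self.catalogue.get(dataset_id)

class Node:
    """An NDT Node holding datasets on behalf of its community of interest."""
    def __init__(self, name, core):
        self.name, self.core = name, core
        self.datasets = {}  # dataset id -> (data, communities allowed access)
    def publish(self, dataset_id, data, allowed):
        self.datasets[dataset_id] = (data, set(allowed))  # (1) publish locally
        self.core.register(dataset_id, self)              # (2) register with the Core
    def retrieve(self, dataset_id, requesting_community):
        data, allowed = self.datasets[dataset_id]
        # Release is controlled as specified by the data owner
        if requesting_community not in allowed:
            raise PermissionError("not authorised for this dataset")
        return data

core = Core()
water = Node("water-utility", core)
water.publish("flood-sensors", {"river_level_m": 2.4}, allowed={"resilience"})
owner = core.discover("flood-sensors")                # (A)/(B) query and discover
print(owner.retrieve("flood-sensors", "resilience"))  # (C) authorised retrieval
```

Note how the Core holds only the location of the dataset, not the data itself: retrieval always goes back to the owning node, where the owner's access conditions are enforced.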