Showing results for tags 'Information Management Framework (IMF)'.

  1. I came across an EU-funded project, "xr4all", which provides (among other things) a development environment for XR projects. The details are here: https://dev.xr4all.eu Will it be possible for the NDT programme to provide a similar platform for the DT community in the UK? It would help foster rapid collaboration and development of the DT ecosystem. Thanks and kind regards, Ajeeth
  2. Version 1.0.0

    2 downloads

    A Survey of Industry Data Models and Reference Data Libraries, published in November 2020, is the initial version of ongoing work to identify and assess existing industry data models and reference data libraries in terms of scope, quality and formalisation of the documentation, formalisation of the representation, maintenance frequency, usage … This survey is key to identifying existing industry data models that may have to be mapped into the NDT’s Foundation Data Model and Reference Data Library (RDL). Additionally, this work is intended to help the technical team identify potential authoritative sources for key classes that will become part of the NDT’s RDL (for instance, units of measure). The list is open, and more standards are being added to the survey on the DT Hub. Please refer to this page to see the most up-to-date list, and don’t hesitate to suggest standards for the team to add.
  3. Version 1.0.0

    0 downloads

    To underpin the sharing of data across organisations and sectors, the National Digital Twin programme (NDTp) aims to develop an ontology – a theory of what exists, i.e. the things that exist and the rules that govern them – capable of describing “life, the universe and everything” (Adams, 1980). As set out in the Pathway towards the IMF, this ontology will consist of: a Foundation Data Model – the rules and constraints on how data is structured; and an ecosystem of Reference Data Libraries, compliant with the Foundation Data Model – the particular set of classes and properties we use to describe digital twins. To achieve the consistency required to share information across organisations and sectors, the Foundation Data Model needs to be underpinned by a Top-Level Ontology (TLO), i.e. the top-level categories (“thing”, “class”, …) and the fundamental relationships between them that are sufficient to cover a maximally broad range of domains. As a starting point for defining the NDT’s TLO, the technical team has reviewed and classified existing TLOs in A Survey of Top-Level Ontologies, published November 2020, and assessed them against a set of requirements. This has led to the identification of four potential candidates for the NDT’s TLO – BORO, IDEAS, HQDM and ISO 15926-2 – as set out in The Approach to Develop the FDM for the IMF publication. Ontologies continue to be added to the survey. Please refer to this page to see the most up-to-date list and to suggest other ontologies for the team to consider.
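    The layered idea above – a small set of top-level categories underpinning everything else – can be illustrated with a toy sketch. The class names and the extensional subsumption test below are hypothetical illustrations, not the NDTp's actual TLO:

```python
# Toy sketch only: two top-level categories ("thing" and "class") and a
# fundamental relationship between them. Names are hypothetical, not
# taken from the NDTp's Top-Level Ontology.

class Thing:
    """Top-level category: anything that exists."""
    def __init__(self, name):
        self.name = name

class Klass(Thing):
    """A Thing whose instances are other Things."""
    def __init__(self, name, members=()):
        super().__init__(name)
        self.members = frozenset(members)

    def subsumes(self, other):
        """Extensional subsumption: this class contains every member of
        the other class (the 4D candidate TLOs favour extensional
        identity criteria in a similar spirit)."""
        return self.members >= other.members

pump = Thing("pump-001")
asset = Klass("Asset", {pump})
physical_object = Klass("PhysicalObject", {pump, asset})
print(physical_object.subsumes(asset))  # True
```

    Even this toy version shows why a shared TLO matters: two organisations can only compare classes if they agree on what a "class" and its membership mean in the first place.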
  4. 31 downloads

    As set out in the Pathway to the Information Management Framework, the Integration Architecture is one of the three key technical components of the Information Management Framework, along with the Reference Data Library and the Foundation Data Model. It consists of the protocols that will enable the managed sharing of data across the National Digital Twin. In the Integration Architecture Pattern and Principles paper, the National Digital Twin programme’s (NDTp) technical team sets out key architectural principles and functional components for the creation of this critical component. The team defines a redeployable architectural pattern that allows the publication, protection, discovery, query and retrieval of data that conforms to the NDT’s ecosystem of Reference Data Libraries and the NDT’s Foundation Data Model. The paper will take you through:

    A requirements overview – a series of use cases that the Integration Architecture needs to enable, including: routine operational use cases, where data from a diverse set of organisations can be shared and analysed for a single purpose (e.g. to support legal and regulatory requirements); the ability to respond to an emergency, pulling together data from across different communities in a way that was not foreseen before the incident that caused the requirement; and ‘business as usual’ NDT maintenance use cases, such as publishing a digital twin or adding a user to the NDT ecosystem.

    Architectural principles – key principles that must be adhered to regardless of the type of architecture implemented, including: data quality (quality needs to be measurable and published with the data itself); privacy of the published data (the Integration Architecture shall ensure that data is shared and used only according to the conditions under which it was published); and security (ensuring that all data and functions are secure from bad actors; encryption will be a particularly important aspect of the Integration Architecture's security features).

    Recommended integration architecture pattern – three general architectural pattern options are explored in the paper (centralised, distributed and federated), and the benefits and concerns of each are discussed with respect to the requirements. The recommended pattern is a hybrid of these three approaches, centralising certain functions whilst distributing and federating others. It is intended to allow datasets to be shared locally (i.e. within an NDT Node, see figure below) while also allowing inter-node discovery, authorisation and data sharing to take place. NDT Nodes may be established by individual organisations, regulators and industry associations, or service providers, and will be able to handle digital twins on behalf of their constituent organisations and provide a secure sharing boundary. In the recommended architecture, datasets are published by the data owner (1) and made available to the organisations within the community of interest; in addition, an event is issued to register the publication with the Core (2). When queries are submitted (A), the dataset can be discovered by organisations in other communities of interest (B) and retrieved where appropriate (C). Release, discovery and retrieval are carried out according to the authorisation service, so that access is controlled as specified by the data owner.

    Detail of the functional components – the Core Services are likely to be quite thin, comprising mainly: a master NDT Catalogue that holds the location of available NDT Datasets across the ecosystem; the master FDM/RDL, which will synchronise with the subset relevant for each NDT Node; and a publish/subscribe model to propagate data changes to parties that have an interest and an appropriate contract in place. The Core and each NDT Node shall interact through a microservice layer with which they must be compliant.

    Next steps – the paper concludes with a list of ten key tasks to further develop the Integration Architecture components. We will keep you informed of progress and, in the meantime, look forward to hearing your questions and comments.
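    The publish/register/discover/retrieve flow described above (steps 1, 2, A, B, C) can be sketched as a toy protocol. All names (Core, Node, the dataset ids) are illustrative assumptions, not the NDTp's actual services or API:

```python
# Toy sketch of the recommended pattern: a thin Core holding only a
# catalogue, with Nodes owning data and enforcing the owner's
# access conditions. Names are hypothetical.

class Core:
    """Thin core services: a master catalogue of dataset locations."""
    def __init__(self):
        self.catalogue = {}                    # dataset id -> owning node

    def register(self, dataset_id, node):
        self.catalogue[dataset_id] = node      # step (2): register publication

    def discover(self, dataset_id):
        return self.catalogue.get(dataset_id)  # steps (A)/(B): query & discover

class Node:
    """An NDT Node holding datasets on behalf of its community."""
    def __init__(self, name, core):
        self.name, self.core = name, core
        self.datasets, self.acl = {}, {}

    def publish(self, dataset_id, data, allowed):
        self.datasets[dataset_id] = data       # step (1): publish locally
        self.acl[dataset_id] = set(allowed)    # owner-specified conditions
        self.core.register(dataset_id, self)

    def retrieve(self, dataset_id, requester):
        # Step (C): release only as authorised by the data owner.
        if requester not in self.acl.get(dataset_id, set()):
            raise PermissionError("not authorised by data owner")
        return self.datasets[dataset_id]

core = Core()
water = Node("water-utility-node", core)
water.publish("flow-2021", {"m3_per_s": 1.2}, allowed={"regulator"})
owner_node = core.discover("flow-2021")
print(owner_node.retrieve("flow-2021", "regulator"))  # {'m3_per_s': 1.2}
```

    Note how the owning node, not the Core, stores the data and enforces the owner's conditions – mirroring the thin-Core, federated aspect of the recommended pattern.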
  5. Version 1.0.0

    27 downloads

    In May 2020 the National Digital Twin programme (NDTp) published the Pathway to an Information Management Framework (IMF). The publication was accompanied by an open consultation to seek feedback on our proposed approach and to hear from across the community about how they thought the IMF should develop to support their use and adoption of it. The consultation ran until the end of August and, together with ongoing engagement with the programme’s technical stakeholders, yielded a great deal of valuable feedback. The full summary of the IMF Pathway Consultation Responses, written by Miranda Sharp, NDTp Commons Lead, is published here today.
  6. Version 1.0.0

    5 downloads

    The Approach to Develop the Foundation Data Model, published in March 2021, follows up on the Survey of Top-Level Ontologies (TLO) published in November 2020. It sets out the Top-Level Ontology requirements for the NDT's Foundation Data Model. Drawing upon the assessment of the TLOs listed in the survey, it identifies four potential candidates for the NDT’s TLO: BORO, IDEAS, HQDM and ISO 15926-2. The four candidates are distinct from the other TLOs in being 4-dimensionalist, i.e. they consider individual objects as four-dimensional, with both spatial and temporal parts.
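    The 4-dimensionalist view mentioned above can be illustrated with a minimal sketch: an individual is modelled as a spatio-temporal extent, and a "state" of it is simply a temporal part of the same extent. This is a hypothetical illustration, not code from any of the candidate TLOs:

```python
# Minimal illustration of 4-dimensionalism: an object is a region of
# space-time, and its states are temporal parts. Field names and the
# integer time stamps are simplifying assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Extent:
    """A 4D extent: a spatial region over an interval of time."""
    region: str          # placeholder for a real spatial description
    start: int           # e.g. years; real models use richer time types
    end: int

    def temporal_part(self, start, end):
        """The same object restricted to a sub-interval of its life."""
        assert self.start <= start <= end <= self.end
        return Extent(self.region, start, end)

bridge = Extent("site-A", 1900, 2100)               # whole-life individual
bridge_in_service = bridge.temporal_part(1905, 2095)  # a state of the bridge
print(bridge_in_service.start, bridge_in_service.end)  # 1905 2095
```

    The payoff for data integration is that "the bridge" and "the bridge while in service" are not two unrelated records: one is a part of the other, which is what gives the candidate TLOs their simple, extensional identity criteria.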
  7. Version 1.0.0

    60 downloads

    The Information Management Framework (IMF) is intended to enable better information management and information sharing at a national scale, and to provide the standards, guidance and shared resources to be (re)used by those wishing to participate in the National Digital Twin ecosystem. While the scope of the IMF is broad, the “7 circles of Information Management” diagram is a pragmatic way to divide the information management space into coherent areas of concern that can be addressed relatively independently whilst supporting each other. As part of the second circle of the diagram, the IMF technical team has released this paper outlining our recommended approach to developing information requirements, based on the analysis of process models. The methodology first identifies an organisation's processes, then the decisions taken as part of each process, and finally the information requirements that support those decisions; these requirements are communicated to those who create the information. This provides a systematic way of identifying what information is required and when it can most cost-effectively be created. Managed appropriately, this information capture can avoid costly later activity to create information by surveying or inspecting in-use assets. To allow this anticipation of information needs, the methodology set out in the paper recommends the following steps: identify the lifecycle activities that an organisation performs; decompose the activities to identify the material “participants” involved (the “things” required for each activity: people, resources, assets, equipment, products, other activities, …); identify the decisions critical to these activities; and identify the information requirements for those decisions, and the quality required.
    Read more in the blog containing a video introduction to the “7 circles of Information Management” by IMF Technical Team Lead Matthew West, followed by a deep dive into the second circle – Process Model based Information Requirements – presented by Al Cook, main author of this paper.
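    The four steps above can be sketched as simple data structures: activities with their participants, the decisions they involve, and the information each decision requires. The example activity, participants and requirement names are invented for illustration and are not from the paper:

```python
# Sketch of process-model-based information requirements: activities ->
# participants -> decisions -> information requirements. All concrete
# names below are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Decision:
    name: str
    information_required: list        # requirements, incl. quality needed

@dataclass
class Activity:
    name: str
    participants: list = field(default_factory=list)  # people, assets, ...
    decisions: list = field(default_factory=list)

def information_requirements(activities):
    """Roll up, per activity, the information that must be created
    so its critical decisions can be taken."""
    return {
        a.name: [req for d in a.decisions for req in d.information_required]
        for a in activities
    }

maintain = Activity(
    "maintain pump",
    participants=["technician", "pump", "spare parts"],
    decisions=[Decision("replace or repair?",
                        ["condition survey (within last 12 months)",
                         "spare-part lead time"])],
)
print(information_requirements([maintain]))
```

    Walking the lifecycle this way makes explicit which information is needed, by whom, and at which point it is cheapest to capture.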
  8. Version 1.0.0

    2 downloads

  8. Following a year-long consultation exercise bringing together leading experts from the data science and information management communities, The Pathway towards an Information Management Framework (IMF) report was published in May 2020. The report sets out the technical approach to delivering the Information Management Framework: a common language by which digital twins of the built and natural environment can communicate securely and effectively, to support improved decision-taking by those operating, maintaining and using built assets and the services they provide to society. The report outlines three building blocks that form an appropriately functioning technical core: a Foundation Data Model (FDM) – a consistent, clear understanding of what constitutes the world of digital twins; a Reference Data Library (RDL) – the particular set of classes and properties we will use to describe our digital twins; and an Integration Architecture (IA) – the protocols that will enable the managed sharing of data. A webinar, The Pathway towards an Information Management Framework, was held on 8 June 2020; you can watch it here. Following the publication of the report, an open consultation was run to gather feedback on the proposed methodology, and the feedback was consolidated in the following summary.
  9. During Tuesday’s Gemini call, the above was raised to help promote awareness of the CDBB Digital Twin programme with developers and the like. This struck me as a pretty good idea. So, based on the Gemini Principles and my understanding of the IMF Pathway document, the below is a draft suggestion for the pot, to provoke the thoughts and ideas of the community: The IMF is rooted in the Gemini Principles – a collaborative top-down approach, driven by bottom-up integrated processes, embracing holistic systems thinking and pragmatic ontology, enabled by secure digital platforms, to deliver better delivery and asset lifecycle outcomes. Its key value proposition is that it enables the story of an asset, infrastructure system or system of systems to be told, registering its trigger events and its evidence- and risk-based decision-making from cradle to grave – the digital golden thread generating future benefits.
  10. Hello Everyone! What approach do you all suggest one should take if the project/idea being worked on is in a new, emerging domain where a DT seems to be a perfect match? My project idea is in the education space, and I think DT will make a huge difference and revolutionise the way education is delivered. I would like some guidance on how I should start with respect to DT. What should be my starting point? I am a software engineer, so I am comfortable with software development, tools and libraries. Thanks and kind regards, Ajeeth
  11. The digital future of the built environment relies on the people who will create it. In our integrated world, over two thirds of UK leaders say their organisation is facing a digital skills gap (Microsoft, 2020) – we have a challenge, and an opportunity, to close this gap whilst realising the benefits of the National Digital Twin. Working as part of the Mott MacDonald and Lane4 team appointed by the Construction Innovation Hub, we have developed a Skills and Competency Framework to raise awareness of the skills and roles needed to deliver a National Digital Twin. The skills and roles identified relate specifically to the Information Management Framework (IMF) – the core component of the National Digital Twin that will enable digital twins to speak the same language.

    The future of the National Digital Twin is in your hands. Seize the opportunity to use this Skills and Competency Framework to underpin digital twin development and IMF adoption. Without understanding the skills and roles required, there is a risk that organisations may deploy staff lacking the skills needed to develop their digital twins. A skills gap could also lead to poorly designed digital twins that do not support interoperability and connectivity with the IMF, or to failed digital twin pilots and projects, with direct economic consequences for those organisations.

    Accelerating progress with skills development. With the Skills and Competency Framework, we can accelerate progress, reduce the rate of digital twin failure and ensure consistency in the approach to enabling the National Digital Twin – all while establishing a pathway for digital skills and capability enhancement across the UK. We can do this by: communicating the value of data as infrastructure, and the importance of literacy, quality and security; taking a systems-thinking approach to see data, technology and process as part of an interconnected ecosystem; and building a collaborative, adaptable culture that is benefits-driven, focused on outcomes, and recognises the role people play in achieving them. Find out how by using the Skills and Competency Framework, and stay tuned for a supporting Capability Enhancement Programme with role-based training plans and skill self-assessments.

    Learn by doing, progress by sharing. This Skills and Competency Framework is the first of its kind, but the topic of digital skills development in our industry is not. Throughout the development of the Framework, we have engaged with stakeholders and material from many bodies, such as the Construction Industry Training Board (CITB), the Open Data Institute and other CDBB initiatives around skills. We intend to progress the Framework by sharing it with industry and connecting to other bodies, industries and people with similar purposes and goals to CDBB. We are open, we are collaborative and we are ready to close the skills gap.
  12. We all know that ontologies have a massive role to play in the realisation of the Information Management Framework and the wider National Digital Twin programme. But damn(!), they can be hard to work with. Create an ontology of any scale and existing academic tools such as Protégé become pretty unmanageable pretty quickly. And that's before you try to explain your ontology to any sort of Normal Human. Even the most carefully curated ontology can be flabbergasting to the majority of people. If we are going to use ontologies to define the logic behind digital twins, and crucially if we expect to be able to explain that logic to Normal Humans, then we need a better way of visualising, filtering and editing our ontologies. That's where OntoPop comes in. It's intended as a free-to-use, open-source, non-proprietary ontology visualisation tool. Highways England have funded the OntoPop MVP using innovation funding. Our hope is that we can expand its use across the other infrastructure owners and suppliers involved in the National Digital Twin programme, and use it to co-develop and own functionality that ultimately we are all going to need at some point. The MVP of OntoPop is now available to play with at the link below. Please visit https://ontopop.com/ and tell us what you think – all feedback is appreciated. We're particularly interested in whether you would like to work on this project with us.
  13. As set out in the Pathway to the Information Management Framework, the Integration Architecture is one of the key technical components of the Information Management Framework. It consists of the protocols that will enable the managed sharing of data across the National Digital Twin. In the recently released Integration Architecture Pattern and Principles paper, the NDTp’s technical team sets out key architectural principles and functional components for the creation of this critical technical component. The team defines a redeployable architectural pattern that allows the publication, protection, discovery, query and retrieval of data that conforms to the NDT’s ecosystem of Reference Data Libraries and the NDT’s Foundation Data Model. Download the Integration Architecture Pattern and Principles paper. The paper will take you through:

    A requirements overview – a series of use cases that the Integration Architecture needs to enable, including: routine operational use cases, where data from a diverse set of organisations can be shared and analysed for a single purpose (e.g. to support legal and regulatory requirements); the ability to respond to an emergency, pulling together data from across different communities in a way that was not foreseen before the incident that caused the requirement; and ‘business as usual’ NDT maintenance use cases, such as publishing a digital twin or adding a user to the NDT ecosystem.

    Architectural principles – key principles that must be adhered to regardless of the type of architecture implemented, including: data quality (quality needs to be measurable and published with the data itself); privacy of the published data (the Integration Architecture shall ensure that data is shared and used only according to the conditions under which it was published); and security (ensuring that all data and functions are secure from bad actors; encryption will be a particularly important aspect of the Integration Architecture's security features).

    Recommended integration architecture pattern – three general architectural pattern options are explored in the paper (centralised, distributed and federated), and the benefits and concerns of each are discussed with respect to the requirements. The recommended pattern is a hybrid of these three approaches, centralising certain functions whilst distributing and federating others. It is intended to allow datasets to be shared locally (i.e. within an NDT Node, see figure below) while also allowing inter-node discovery, authorisation and data sharing to take place. NDT Nodes may be established by individual organisations, regulators and industry associations, or service providers, and will be able to handle digital twins on behalf of their constituent organisations and provide a secure sharing boundary. In the recommended architecture, datasets are published by the data owner (1) and made available to the organisations within the community of interest; in addition, an event is issued to register the publication with the Core (2). When queries are submitted (A), the dataset can be discovered by organisations in other communities of interest (B) and retrieved where appropriate (C). Release, discovery and retrieval are carried out according to the authorisation service, so that access is controlled as specified by the data owner.

    Detail of the functional components – the Core Services are likely to be quite thin, comprising mainly: a master NDT Catalogue that holds the location of available NDT Datasets across the ecosystem; the master FDM/RDL, which will synchronise with the subset relevant for each NDT Node; and a publish/subscribe model to propagate data changes to parties that have an interest and an appropriate contract in place. The Core and each NDT Node shall interact through a microservice layer with which they must be compliant.

    Next steps – the paper concludes with a list of ten key tasks to further develop the Integration Architecture components. We will keep you informed of progress and, in the meantime, look forward to hearing your questions and comments on the paper!
  14. Last May the National Digital Twin programme (NDTp) published our proposed Pathway to an Information Management Framework (IMF). The publication was accompanied by an open consultation to seek feedback on our proposed approach and to hear from across the community about how they thought the IMF should develop to support their use and adoption of it. The consultation ran until the end of August and, alongside ongoing engagement with the programme’s technical stakeholders, we received a great deal of valuable feedback. The full summary of the IMF Pathway Consultation Responses is published today, written by Miranda Sharp, NDTp Commons Lead. Overall, the responses to the Pathway were positive, and respondents welcomed the opportunity to give feedback and contribute to its improvement. This was hugely gratifying for everyone who has contributed to the work over the last 18 months. Some of the responses challenged the proposed approach, and we are keen to keep learning from these differences of opinion and perspective. In the paper we have summarised the range of responses as follows:

    Positive response themes: the work is welcome and progress towards it is considered consistent with the Gemini Principles, with the plans to build on existing work particularly welcome; there was broad agreement that the IMF should consist of an FDM, RDL and IA; and the models and protocols described in the report were seen as comprehensive.

    Nuanced response themes and questions: discussion of the technical challenge is valid, but respondents called for the human factors associated with change to be explored in parallel; representatives from organisations often sought an indication of tangible next steps; and there were specific asks for advice on data quality, security, legal provenance and the securing of benefits from investment.

    Critical response themes: a small number of respondents rejected the approach as too “top down”; some respondents stated that more than a single Integration Architecture is required; and several responses disputed the possibility and validity of a single FDM.

    More details about the responses can be found in the paper, but we are hoping to use this space in the IMF Community to discuss and work on the themes raised here. Do you agree or disagree with these themes? Do you think any are missing? What work do you think could be done to address the questions and criticisms? Some work has already begun: on the legal, commercial and regulatory elements of resilient and secure sharing of information – you can read about our first steps on this journey, the Legal Roundtables held in November, here under Digital Commons/Legal; and on the need for a demonstrator and guidance for communicating the benefits of a National Digital Twin and how to begin readying organisations for the change – the demand for use cases and case studies is being addressed through the Gemini Programme and the DT Toolkit. Work has been undertaken this year, with funding from the Construction Innovation Hub (CIH), to create an FDM Seed for the CIH’s Platform Design Programme. We hope this will be the first demonstrator, of sorts, for the technical work being developed by the NDTp’s Technical Team. You can see the outputs of the Technical Team here, and we will be releasing the next paper, Approach to Develop the Foundation Data Model (FDM), here soon. We are planning further demonstrators that show the tangible benefits of the National Digital Twin, and we hope to share our plans with you in the near future. The Programme also strives to continue to build a body of evidence (‘Corpus’), as per the tasks set out in the Pathway, to build other demonstrators for the programme. Alongside this work, the publication of the Response to IMF Pathway Consultation will contribute to an updated Pathway document that will refocus the efforts of the NDTp. We hope to share that with you in the coming months.
  15. Today we are delighted to publish the Approach to Develop the Foundation Data Model for the Information Management Framework. This document follows up on the November publication of the Survey of Top-Level Ontologies (TLO) and the Survey of Industry Data Models (IDM) and Reference Data Libraries (RDL). (You can find these publications under Gemini Commons/IMF Technical Documents.) The pragmatic and technical requirements for the Foundation Data Model have now been developed, and consideration has been given to whether any existing Top-Level Ontologies could be used as a suitable starting point. The Approach takes you through these requirements and the assessment of the surveyed TLOs, to the final decision. Four Top-Level Ontologies meet all the technical requirements: BORO, IDEAS, HQDM and ISO 15926-2. They are distinct from the other Top-Level Ontologies in being 4-dimensionalist: they treat individual objects as four-dimensional, having both spatial and temporal parts. You can find the Approach to Develop the FDM for the IMF here.
  16. In November 2020 the National Digital Twin programme published the Survey of Top-Level Ontologies (TLO) and the Survey of Industry Data Models (IDM) and Reference Data Libraries (RDL); you can find these publications under Gemini Commons/IMF Technical Documents. The technical part of the proposed Pathway to an Information Management Framework comprises three main elements – a Foundation Data Model, a Reference Data Library and an Integration Architecture – which together define a common structure and meaning for the consistent and integrated sharing of information. The pragmatic and technical requirements for the Foundation Data Model have now been developed, and consideration has been given to whether any existing Top-Level Ontologies could be used as a suitable starting point. Four Top-Level Ontologies meet all the technical requirements: BORO, IDEAS, HQDM and ISO 15926-2. They are distinct from the other Top-Level Ontologies in being 4-dimensionalist: they treat individual objects as four-dimensional, having both spatial and temporal parts. We are therefore proceeding to develop the Foundation Data Model seed from these 4-dimensionalist Top-Level Ontologies. The Approach to Develop the Foundation Data Model for the Information Management Framework has been published here, alongside the surveys, in Gemini Commons/IMF Technical Documents. If you would like to ask any questions about the publication, the methods taken and the choices made, head over to the IMF Community Network, where the programme team are available to respond.
  17. Data wrangling – importing 300+ datasets a quarter (YouTube). Is this making the case for bread-and-butter digital transformation?
  18. A survey of top-level ontologies. For each entry: acronym, full name, initial release, and the ontology's self-description.

- BFO (Basic Formal Ontology, 2002): The framework developed by Barry Smith and his associates consists of a series of sub-ontologies at different levels of granularity. The ontologies are divided into two varieties: those relating to continuant entities, such as three-dimensional enduring objects, and those relating to occurrent entities, primarily processes conceived as unfolding in successive phases through time. BFO thus incorporates both three-dimensionalist and four-dimensionalist perspectives on reality within a single framework. Interrelations are defined between the two types of ontologies in a way that gives BFO the facility to deal with both static/spatial and dynamic/temporal features of reality. A continuant domain ontology descending from BFO can be conceived as an inventory of entities existing at a time; each occurrent ontology can be conceived as an inventory of processes unfolding through a given interval of time. Both BFO itself and each of its extension sub-ontologies can be conceived as a window on a certain portion of reality at a given level of granularity.
- BORO (Business Objects Reference Ontology, late 1980s): An upper ontology designed for developing ontological or semantic models for large, complex operational applications, consisting of a top ontology as well as a process for constructing the ontology. It is built upon a series of clear metaphysical choices to provide a solid (metaphysical) foundation. A key choice was for an extensional (and hence four-dimensional) ontology, which provides a simple criterion of identity. Elements of it have appeared in a number of standards; for example, the ISO standard ISO 15926 (Industrial automation systems and integration) was heavily influenced by an early version. The IDEAS (International Defence Enterprise Architecture Specification for exchange) standard is based upon BORO, which in turn was used to develop DODAF 2.0.
- CIDOC CRM (CIDOC object-oriented Conceptual Reference Model, 1999): Although the CRM is a domain ontology, specialised to the purposes of representing cultural heritage, a subset called CRM Core is a generic upper ontology, including:
  - Space-Time: title/identifier, place, era/period, time-span, relationship to persistent items
  - Events: title/identifier, beginning/ending of existence, participants (people, either individually or in groups), creation/modification of things (physical or conceptual), relationship to persistent items
  - Material Things: title/identifier, place, the information object the material thing carries, part-of relationships, relationship to persistent items
  - Immaterial Things: title/identifier, information objects (propositional or symbolic), conceptual things, part-of relationships
- CIM (Common Information Model, 1999): An open standard that defines how managed elements in an IT environment are represented as a common set of objects and the relationships between them.
- COSMO (COmmon Semantic MOdel, pre-2006): Developed with the goal of providing a foundation ontology that can enable broad, general semantic interoperability.
- Cyc (1984): An artificial intelligence project that aims to assemble a comprehensive ontology and knowledge base spanning the basic concepts and rules about how the world works.
- DC (The Dublin Core ontology, 1995): A lightweight RDFS vocabulary for describing generic metadata.
- DOLCE (Descriptive Ontology for Linguistic and Cognitive Engineering, 2002): Oriented toward capturing the ontological categories underlying natural language and human common sense.
- EMMO (The European Materials Modelling Ontology, 2019?): The EMMO top level is the group of fundamental axioms that constitute its philosophical foundation. Adopting a physicalistic/nominalistic perspective, the EMMO defines real-world objects as 4D objects that are always extended in space and time (i.e. real-world objects cannot be spaceless or timeless); abstract objects, i.e. objects that do not extend in space and time, are therefore forbidden in the EMMO. It was instigated by materials science and provides the connection between the physical world, the experimental world (materials characterisation) and the simulation world (materials modelling).
- FIBO (Financial Industry Business Ontology, 2010?): Defines the sets of things that are of interest in financial business applications and the ways those things can relate to one another.
- FrameNet (2000?): A project building a lexical database of English that is both human- and machine-readable, based on annotating examples of how words are used in actual texts.
- GFO (General Formal Ontology, 2006): A realistic ontology integrating processes and objects. It attempts to include many aspects of recent philosophy, which is reflected both in its taxonomic tree and its axiomatisations.
- gist (2007): Designed to have the maximum coverage of typical business ontology concepts with the fewest primitives and the least ambiguity.
- HQDM (High Quality Data Models, 2011): The HQDM Framework is a four-dimensionalist top-level ontology with extensional identity criteria that aims to support large-scale data integration, and as such aims to ensure consistency among data created using the framework. It is based on work developing and using ISO 15926, and on lessons learnt from BORO, which influenced ISO 15926.
- IDEAS (International Defence Enterprise Architecture Specification, 2006): The upper ontology developed by the IDEAS Group is higher-order, extensional and 4D, and was developed using the BORO method. The IDEAS ontology is not intended for reasoning and inference purposes; its purpose is to be a precise model of business.
- IEC 62541 (OPC Unified Architecture, 2006): OPC UA is a machine-to-machine communication protocol for industrial automation developed by the OPC Foundation.
- IEC 63088 (Smart manufacturing - Reference architecture model industry 4.0 (RAMI 4.0), 2017): IEC PAS 63088:2017(E) describes a reference architecture model in the form of a cubic layer model, which shows technical objects (assets) in the form of layers, and allows them to be described, tracked over their entire lifetime (or "vita") and assigned to technical and/or organisational hierarchies. It also describes the structure and function of Industry 4.0 components as essential parts of the virtual representation of assets.
- ISO 12006-3 (ISO 12006-3:2007 - Building construction - Organization of information about construction works - Part 3: Framework for object-oriented information, 2007): Specifies a language-independent information model that can be used for the development of dictionaries used to store or provide information about construction works. It enables classification systems, information models, object models and process models to be referenced from within a common framework.
- ISO 15926-2 (Industrial automation systems and integration - Integration of life-cycle data for process plants including oil and gas production facilities, 2003): ISO 15926 is a standard for data integration, sharing, exchange and hand-over between computer systems.
- KR Ontology (1999): Defined in the book Knowledge Representation by John F. Sowa. Its categories have been derived from a synthesis of various sources, the two major influences being the semiotics of Charles Sanders Peirce and the categories of existence of Alfred North Whitehead. The primitive categories are: Independent, Relative or Mediating; Physical or Abstract; Continuant or Occurrent.
- MarineTLO (Marine Top Level Ontology, 2013?): A top-level ontology generic enough to provide consistent abstractions or specifications of concepts included in all data models or ontologies of marine data sources, and to provide the properties needed to make this distributed knowledge base a coherent source of facts relating observational data to the respective spatiotemporal context and categorical (systematic) domain knowledge.
- MIMOSA CCOM (Machinery Information Management Open Systems Alliance - Common Conceptual Object Model, date unknown): Serves as an information model for the exchange of asset information. Its core mission is to facilitate standards-based interoperability between systems, providing an XML model that allows systems to exchange data electronically.
- OWL-2 (Web Ontology Language, 2004): OWL is a family of knowledge representation languages for authoring ontologies. Ontologies are a formal way to describe taxonomies and classification networks, essentially defining the structure of knowledge for various domains: nouns representing classes of objects and verbs representing relations between the objects.
- PROTON (PROTo ONtology, 2005?): Designed as a lightweight upper-level ontology for use in knowledge management and Semantic Web applications.
- Schema.org (2011): A collaborative, community activity with a mission to create, maintain and promote schemas for structured data on the Internet, on web pages, in email messages and beyond.
- SENSUS (2001): A 70,000-node terminology taxonomy constructed as a framework into which additional knowledge can be placed. SENSUS is an extension and reorganisation of WordNet.
- SKOS (Simple Knowledge Organization System, 2009): An area of work developing specifications and standards to support the use of knowledge organization systems (KOS), such as thesauri, classification schemes, subject heading systems and taxonomies, within the framework of the Semantic Web.
- SUMO (Suggested Upper Merged Ontology, 2000): An upper ontology intended as a foundation ontology for a variety of computer information processing systems.
- TMRM (The Topic Maps Reference Model, late 1990s): A topic map is a standard for the representation and interchange of knowledge, with an emphasis on the findability of information.
- UFO (Unified Foundational Ontology, 2005): Incorporates developments from GFO, DOLCE and the Ontology of Universals underlying OntoClean in a single coherent foundational ontology.
- UMBEL (Upper Mapping and Binding Exchange Layer, 2008): A logically organised knowledge graph of 34,000 concepts and entity types that can be used in information science for relating information from disparate sources to one another. Since UMBEL is an open-source extract of the OpenCyc knowledge base, it can also take advantage of the reasoning capabilities within Cyc.
- UML (Unified Modeling Language, 1994): A general-purpose, developmental modeling language in the field of software engineering, intended to provide a standard way to visualise the design of a system.
- UMLS (Unified Medical Language System, 1986): A compendium of many controlled vocabularies in the biomedical sciences. It provides a mapping structure among these vocabularies, allowing translation among the various terminology systems, and may also be viewed as a comprehensive thesaurus and ontology of biomedical concepts. UMLS further provides facilities for natural language processing and is intended mainly for developers of systems in medical informatics.
- WordNet (1985): A large lexical database of English. Nouns, verbs, adjectives and adverbs are grouped into sets of cognitive synonyms (synsets), each expressing a distinct concept.
- YAMATO (Yet Another More Advanced Top-level Ontology, 1999): Developed to cover three features, quality description, representation and process/event, in a better way than existing ontologies. It has been extensively used for developing other, more applied, ontologies.
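Several of the surveyed ontologies (BORO, HQDM, IDEAS, EMMO) share a four-dimensionalist, extensional stance that can be hard to picture from prose alone. The following sketch is purely illustrative, not drawn from any of the surveyed standards: the names are hypothetical, and a one-dimensional time interval stands in for a full spatio-temporal extent. It shows the core idea that an individual is identified with its extent, and a "state" of an individual is simply a temporal part of that extent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Extent:
    """A crude stand-in for a spatio-temporal extent: a labelled time interval."""
    label: str
    begin: int  # e.g. year the extent starts
    end: int    # e.g. year the extent ends

def is_temporal_part_of(part: Extent, whole: Extent) -> bool:
    """True if part's interval lies within whole's interval,
    i.e. part is a state (temporal part) of whole."""
    return whole.begin <= part.begin and part.end <= whole.end

# A pump installed in 2005 and decommissioned in 2030; its "in service"
# state from 2006 to 2020 is a temporal part of the pump itself.
pump = Extent("pump-101", 2005, 2030)
in_service = Extent("pump-101 in service", 2006, 2020)

print(is_temporal_part_of(in_service, pump))   # True
print(is_temporal_part_of(pump, in_service))   # False
```

In a genuinely extensional ontology, identity itself would be determined by the extent rather than by the label; the sketch only illustrates the parthood relation.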
  19. What key resource has been instrumental in giving you or your team the right skills for tackling information management, digital transformation and thinking about designing and operating digital twins? Your opinions and knowledge are vital as we develop training plans to deliver a National Digital Twin, and we ask that you kindly comment below. Suggestions could be online resources, higher education or professional institution courses, or any other sources of training that have helped. Working as part of the Mott MacDonald and Lane4 team appointed by the Construction Innovation Hub, we are developing a Skills and Competency Framework with targeted role-based training plans to upskill the wider workforce in the key skills and competencies needed to design and operate digital twins. This moves beyond the technical ruleset, toolset and mindset of the Information Management Framework (IMF) and will address the training needed to engage a workforce ripe for progression and change. All submissions and comments sent before 12 February will be included in our training research, but we encourage the conversation and sharing of training materials to continue beyond this date. We look forward to sharing the outputs with you.
  20. Realising the benefits of a National Digital Twin is not just a technical challenge. To establish a digital ecosystem that enables the interoperability, integration and linking of data and models across the built and natural environments, we need a workforce with the appropriate skills who, together, can make it happen. Working as part of the Mott MacDonald and Lane4 team appointed by the Construction Innovation Hub, we are developing a Skills and Competency Framework. In setting out the key roles and skills needed to deliver a National Digital Twin, we recognise the key contributions of people in this endeavour, and with this in mind we are developing a career pathway for digital twin-related roles to address the industry skills gap. Moving beyond the technical ruleset, toolset and mindset of the Information Management Framework (IMF), we will address the skillset requirements to engage a workforce ripe for progression and change. This project will offer greater awareness of the roles and skills needed to develop and implement the IMF at national and organisational level and, through competency assessment, a view of the key skills and capability gaps, to drive early intervention and inform targeted role-based training plans that increase IMF and digital twin adoption. Our framework will in turn help to generate targeted role-based training plans to upskill the wider workforce in the key skills and competencies needed to design and operate digital twins. Without sufficiently enabling and empowering the workforce, the extent to which our industry will benefit from the common ruleset, toolset and mindset promoted through the IMF will be limited. There is, however, an opportunity to accelerate progress, reduce the rate of digital twin failure and ensure consistency of approach to enable the National Digital Twin, all while establishing a pathway for digital skills and capability enhancement across the UK.
Seizing the opportunity to develop a skills and capability framework to support the IMF is essential to future success. Without this understanding, there is a risk that organisations may deploy staff lacking the technical skills or knowledge to develop their digital twins, which could erode confidence in the IMF and digital twins in general. A skills gap also risks poorly designed digital twins that do not support interoperability and connectivity, or failed digital twin pilots and projects with direct economic consequences for those organisations. As part of our project to develop a skills and capability framework, we ran a series of workshops towards the end of 2020. These brought together subject matter experts from across industry, academia and government to discuss the potential roles and skills required to support the IMF and National Digital Twin. A longlist of roles and accompanying skill areas was identified, and has now been rationalised and prioritised. One of the key themes that emerged from the workshops was the need for people to understand and be able to communicate the value of data, and the importance of data quality and data literacy. While digital twin development is likely to be driven by information managers, technologists and business leaders, every member of our industry has a role to play in collecting and managing good-quality data. Without the right culture in place, both nationally and at organisational level, supported by fundamental data skills, the scale of benefits offered by a National Digital Twin won't be fully realised. We are now at a stage in the project where we want to hear from you, the DT Hub community. Please let us know your thoughts and opinions on this topic by commenting or submitting a question.
We also have questions we would like to ask you:
- Do you have any thoughts on the roles, skills and capabilities needed to develop and implement the IMF and National Digital Twin?
- What key skills are at the forefront of your mind and your organisation's future thinking in this space?
- What skills gaps are apparent in the industry or your organisation? What are you doing to address these gaps?
As a community of digital twin owners and information management experts, your opinions, knowledge and experience are vital to paving the way for our digital twin future. In the coming weeks we will be putting the finishing touches to the framework and look forward to sharing the outputs with you. David Plummer, Global Practice Lead for Digital Transformation at Mott MacDonald, is part of the team developing the Skills and Capability Framework to foster an empowered workforce capable of delivering the National Digital Twin. The Construction Innovation Hub brings together world-class expertise from the BRE, the Manufacturing Technology Centre (MTC) and the Centre for Digital Built Britain (CDBB) to transform the UK construction industry.
  21. The National Digital Twin: Legal Implications. It was with great foresight that the Digital Framework Task Group (DFTG) recognised, in advance of furthering the National Digital Twin (NDT) programme, that the legal implications would be plentiful and would require debate at a high level. This proactive attitude to the legal side of the NDT is intended to provide initial thoughts and guidance for the DFTG going forward. Lawyers are often seen as a last resort and are therefore too often forced to work reactively; this fresh approach by the DFTG will hopefully flush out potential pitfalls and provoke further legal debate in this fast-moving and highly exciting area of technology. An initial group of leading private practice lawyers gave their time to a series of legal roundtables to discuss and debate the possible legal outcomes of the NDT. The following participants brought their expertise in their individual practice areas:
- Sarah Rock - Gowling WLG - Construction
- Ian Mason - Gowling WLG - Financial Regulation
- Diana France - HFW - Energy
- Tamara Quinn - Osborne Clarke LLP - IP and Data Privacy
- Clare Fielding - Town Legal LLP - Planning
- Fleur Ruda - MHCLG Legal Advisors - Government and Public Good
- Alan Stone - RPC LLP - Insurance
- Serena Tierney - Veale Wasbrough Vizards LLP - IP and Technology
- Naveen Vijh - BCLP Law LLP - Funding and Finance
The group met on four occasions, chaired by Sarah Rock and organised by Miranda Sharp and James Harris, who led the NDTp Commons stream alongside Rachel Judson. The lawyers debated the initial responses about the NDT from each practice area, looked at the live National Underground Asset Register, had a general discussion to bring the various streams together and finally debated a theoretical case study of a digital twin. The roundtable discussions have been converted into a report, which was presented to the DFTG and is now available for review on the DT Hub. The four main outcomes for further discussion were:
1. Governance. The legal governance of the NDT was a point raised again and again during the roundtable sessions. The legal experts felt there was a gap in governance which could perhaps best be filled with legislation or mandatory action at Government level, and that a top-down approach would work best. There was concern to avoid competitive advantage being given to some, or even being perceived as such.
2. Early engagement. The lawyers were all thrilled to be involved so early on with the NDT and saw it as very wise for the DFTG to be proactive in the legal space, rather than taking the all-too-common reactive path. The legal experts felt that continued engagement during the early years of the NDT would be very sensible. In addition, it was suggested that other non-technical areas might also benefit from early engagement, specifically insurance, finance and regulators.
3. Interaction of stakeholders. It was highlighted that there are plenty of legal challenges for the interoperability of the NDT. Existing contracts and methods of procurement will need to adapt to respond to the changes ahead. It was also noted that the DFTG needs to ensure its thinking is always a tad wider than the NDT; this was highlighted in particular in the energy sector, where the interaction between the UK and international operators is critical.
4. IPR, data and access. The IP created in the NDT was debated at length and needs to be further considered. This circles back to governance: the governance of the NDT needs to set out clearly where the IP sits, and the levels of access ought to be decided in a top-down fashion. Liability and obligations regarding the NDT can be assessed and allocated, but the legal experts felt that a Government-led approach would be crucial.
Overall, the legal experts posed various questions for the DFTG to consider in its implementation of the NDT. None of the issues raised were thought too difficult to resolve legally: the project was deemed legally achievable, but only with continued legal engagement so that further considerations can be explored, issues raised and solutions formed. Please do take the time to read and consider the legal report; the DFTG welcomes further comment on it. The DFTG recognises the time commitment of the legal experts involved and once again thanks them for their efforts and their collaborative approach. Roundtable Outcomes Report v1.0.pdf
  22. We hope that you have had the opportunity to look through the two survey documents published by the NDTp technical development team: A Survey of Top-Level Ontologies, and A Survey of Industry Data Models and Reference Data Libraries. If you think there have been omissions and would like to submit others for assessment, Matthew and the team would be happy to receive them. Linked here is a proforma that we ask you to complete: Ontology Assessment Criteria - Proforma. You can post your completed form to the files here in the network and we can pick it up from there. We appreciate your interest and involvement in ensuring the surveys are as complete as they can be.
  23. People in the ontology (Miranda Sharp): Hello all, I took a question after the Gemini call this morning. Apologies for my naivety in not knowing the answer myself, but where are people in the TLO? Miranda
  24. Morning All, Currently investigating a case for a Digital Twin concept - is there any guidance on structuring the data so that it aligns with a more national structure? Any useful guidance to help the beginnings of this case move in the right direction? Thanks, Lewis
  25. The IMF Architects and IMF Developers networks are merging today into a single group. Any members of the IMF Architects network not currently in the IMF Developers will be invited to join that group, and the newly merged group will be renamed 'IMF Community Network'. The merger ensures a single point for information and discussion, and removes the confusion and unhelpful boundary-setting between the two communities.