Ramy posted a topic in General discussion

The idea of a Digital Twin (DT) needs to advance in a standardised and formal manner. As @DRossiter87 pointed out, this is necessary to "support the consistent adoption of, and interoperability between, digital twins for the built environment within the UK". To aid in this endeavour, it is useful to distinguish between the following terms:

"DT General Use Case" (GUC): a very short phrase, ideally a verb followed by a noun, that bluntly states the central business aim motivating the use and implementation of a DT, e.g. 'optimise traffic'.

"DT Use Case Scenario" (UCS): documentation of the sequence of actions, i.e. the DT-user interactions, executed through a particular DT case study or an actual real-world DT project.

"DT Use": a typical technical function or action executed by any DT in the course of any UCS.

Accordingly, DT uses are the standard building blocks on which a common language can be founded. Such a language, which could also be machine-readable, can be used to document DT-user interactions in a standard format, facilitating the publishing and sharing of know-how.

Below is a figure of a "DT uses taxonomy". It is made up of three hierarchical levels:

- 'Included Uses': four high-level cornerstone uses that, barring rare exceptions, are included in and executed by almost every DT throughout any UCS (i.e. Mirror, Analyse, Communicate and Control);
- 'Specialized Uses': special forms of the Included Uses, where each specialized use has unique strengths suited to specific purposes;
- 'Specialized Sub-Uses': the lowest level of the taxonomy, which distinguishes variant types within a Specialized Use by fine inherent variations, enhancing the DT's practical adequacy in dealing with alternative contexts and user-defined purposes.

The table below gives a formal definition of each DT Use.

| #     | DT Use      | Definition | Synonyms |
|-------|-------------|------------|----------|
| 01    | Mirror      | To duplicate a physical system in the real world as a virtual system in the cyber world. | Replicate, twin, model, shadow, mimic |
| 1.1   | Capture     | Express in a digital format the status of a physical system at a point in time. (Usually offline DT) | Collect, scan, survey, digitalize |
| 1.2   | Monitor     | Collect information related to the performance of a physical system. (Online or offline DT) | Sense, observe, measure |
| 1.3   | Quantify    | Measure quantities of a physical system's particulars, instances or incidents. (Online or offline DT) | Takeoff, count |
| 1.4   | Qualify     | Track the ongoing status of a physical system. (Online or offline DT) | Follow, track |
| 02    | Analyse     | To create new knowledge and provide insights for users and stakeholders about a physical system. | Examine, manage |
| 2.1   | Compute     | To perform conventional arithmetical calculations, traditional mathematical operations and functions, and simple statistical techniques such as correlations. | Calculate, add, subtract, multiply, divide |
| 2.2   | Mine        | To uncover, identify and recognise the web of interdependencies, interconnected mechanisms, complex processes, interwoven feedback loops, masked classes, clusters or typologies, and hidden trends and patterns within the physical system. | Learn, recognize, identify, detect, AI, ML, BDA |
| 2.3   | Simulate    | To explore and discover the implications and possible emergent behaviours of a complex web of interacting variables. | |
| 2.3.1 | Scenario    | To find out the implications, impacts or consequences of implementing pre-defined scenarios (akin to non-destructive tests). | What-if, evaluate, assess |
| 2.3.2 | Stress-Test | To identify the scenarios that may lead to failure or breakdown of the physical system (akin to destructive tests). | Test, inspect, investigate |
| 2.4   | Predict     | Concerned with futures studies. | |
| 2.4.1 | Forecast    | To predict the most likely state of a real system in the future by projecting known current trends forward over a specified time horizon. | Foresee |
| 2.4.2 | Back-cast   | To question or prove, in a prospective manner, how the physical system is operating towards achieving its pre-set aims and goals. | Manage, confirm |
| 2.5   | Qualitize   | Enhance and improve the quality of the outcomes or deliverables produced by an intervention in the real world. | |
| 2.5.1 | Verify      | Verify conformance and compliance of the physical system with standards, specifications and best practice. | Validate, check, comply, conform |
| 2.5.2 | Improve     | Inform the future updating, modifying or enhancing of current standards to better cohere with actual operational and usage behaviours and patterns. | Update, upgrade, revise |
| 03    | Communicate | To exchange collected and analysed information amongst stakeholders. | Interact |
| 3.1   | Visualize   | To form a realistic representation or model of the current or predicted physical system. | Review, visioning |
| 3.2   | Immerse     | To involve interested stakeholders in real-like experiences using immersive technologies such as VR, AR and MR. | Involve |
| 3.3   | Document    | Document and represent gathered and/or analysed data in a professional manner and technical language, forms or symbols. | Present |
| 3.4   | Transform   | To modify, process or standardize information so it can be published and received by other DT(s) or DT users (e.g. a National DT), or to overcome interoperability issues. | Translate, map |
| 3.5   | Engage      | To involve citizens and large groups of people, including marginalized groups, in policy and decision-making processes. | Empower, include |
| 04    | Control     | To leverage the collected and analysed information to intervene back into the real world to achieve a desirable state. | Implement, execute |
| 4.1   | Inform      | To support human decision-making throughout the implementation of interventions in the real world. | Support, aid |
| 4.2   | Actuate     | Use CPS and actuators to implement changes to the physical system. | Regulate, manipulate, direct, automate, self-govern |

A standardised set of 'DT Uses' can help avoid miscommunication and confusion when sharing or publishing DT Use Case Scenarios and the know-how they explicitly describe. It can also support the procurement of DT services by ensuring the use of one common language across the supply chain and stakeholders.

Al-Sehrawy R., Kumar B. (@Bimal Kumar) and Watson R. (2021). Digital Twin Uses Classification System for Urban Planning & Infrastructure Program Management. In: Dawood N., Rahimian F., Seyedzadeh S., Sheikhkhoshkar M. (eds) Enabling The Development and Implementation of Digital Twins. Proceedings of the 20th International Conference on Construction Applications of Virtual Reality. Teesside University Press, UK.
RachelJudson posted a topic in IMF Community's Forum

It is proposed that the Information Management Framework (IMF) for the creation of a National Digital Twin will consist of three technical elements: the Foundation Data Model (FDM), the Reference Data Library (RDL) and the Integration Architecture (IA). The IMF will underpin the creation of an environment which supports the use, management and integration of digital information across the life-cycle of assets. It will also enable secure, resilient information sharing between organisations and will facilitate better decision-making across sectors.

The National Digital Twin Programme has initiated work investigating this approach with a thin slice of the IMF for the Construction Innovation Hub, to support the development of CIH's Platform Ecosystem. This thin slice of the IMF is called the FDM Seed. The FDM describes basic concepts, such as space-time, which apply across all areas of our industry; by developing this, the FDM provides a way to explore relationships between these different areas. The FDM Seed puts this concept into practice by starting small and letting the development grow, similar to a seed.

The first step of the FDM Seed project is to survey the landscape: to investigate what ontologies and data models are already in use, what they can do and what their limitations are, and to assess what tools may be useful as a starting point for the FDM and the RDL. The starting point for the FDM is a top-level ontology, which contains the fundamental, generic types of things that exist and the fundamental relationships between them. The survey of Top-Level Ontologies (TLOs) uncovered a surprisingly high number of candidate TLOs, with 40 identified and reviewed, many more than we could have imagined.

Fig 1. General classification of the TLOs – taken from A Survey of Top-level Ontologies

The final survey of top-level ontologies is, we think, the first of its kind. We were looking for an ontology that was rigorous, simple, and detailed enough to cover our scope of interest, which is very broad. The TLOs fall into roughly two groups, Foundational and Generic. The Foundational ontologies rest on rigorous, principled foundations, provide a basis for consistent development, and would be suitable for the FDM. The Generic ontologies tend to be generalisations of lower-level concepts rather than principled constructions; they lack a principled basis for extension and are therefore not suitable for the structure of the FDM, though they are likely to be of use for the FDM's generic lower levels.

An RDL provides the classes and properties needed to describe the detail of an asset. The survey aimed to identify the most prominent industry data models and to show the best starting point for the IMF RDL. There are many different RDLs in use across sectors. For the purposes of the FDM Seed a limited analysis was carried out, but the list is open, and more candidates will be added for future consideration. Surveying and analysing the most commonly used RDLs will allow us to give guidance to organisations when mapping their existing RDLs to the NDT.

Next steps

The survey papers have now been published. We encourage you to engage with the National Digital Twin Programme to find out more about the approach, the results of the survey, and the assessments of the TLOs and industry data models & RDLs. You can find these resources under the 'Files' tab. The Programme is now in the process of gathering its recommendations for the TLOs to use to start work on the FDM Seed thin slice. We anticipate basing the FDM development on one of the TLOs, bringing in elements from others based on the survey and analysis.
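To make concrete what "mapping an existing RDL to the NDT" might involve in practice, here is a toy subsumption check over a tiny class hierarchy in the style of a top-level ontology. Everything in it (the class names, the `PARENT` table, the `is_subtype` helper, the `RDL_TO_FDM` mapping) is a hypothetical sketch for illustration, not the IMF's actual data model:

```python
# Toy sketch: a tiny TLO-style taxonomy as a child -> parent table.
# All class names are invented examples, not FDM content.
PARENT = {
    "SpatioTemporalExtent": None,               # hypothetical top-level type
    "PhysicalObject": "SpatioTemporalExtent",
    "Activity": "SpatioTemporalExtent",
    "Pump": "PhysicalObject",
}

def is_subtype(cls: str, ancestor: str) -> bool:
    """True if `cls` is `ancestor` or sits below it in the taxonomy."""
    while cls is not None:
        if cls == ancestor:
            return True
        cls = PARENT.get(cls)  # walk up; unknown classes fall off the top
    return False

# Mapping an organisation's RDL term onto the shared model then amounts to
# asserting a link like this, after which subsumption queries just work:
RDL_TO_FDM = {"CentrifugalPump": "Pump"}  # hypothetical org term -> shared class
```

The point of grounding the FDM in a principled TLO is that checks like `is_subtype` give consistent answers no matter which sector's RDL contributed the class.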
Dear DT Hub Community

Does it feel like getting started on a Digital Twin journey is too hard? Firstly, I agree with you, and I believe it won't stay this way for long (believe... he says!). After several years working in UK infra, I now see more of the "realities" of digital twins: I've worked as the buyer, seller, and builder of digital twins. I am trialling this theory with a major UK infrastructure operator (early stages); however, I would really like to gauge the interest of this group because I want to draft a proposal/action plan. To give you an idea of the organisations I have worked with, and who have influenced my thinking:

Clients - Network Rail, Highways England, Connect Plus Services, National Grid, Thames Water, Northumbrian Water, EDF
Contractors - Costain, Balfour Beatty, BAM Nuttall, Kier...
Consultants - Mott MacDonald, Turner and Townsend, PWC, Deloitte...

1. Digitisation of existing physical assets is seen as a laborious process that doesn't deliver immediate results. This is primarily due to a lack of awareness of the people, processes, and products involved. It is in fact becoming far cheaper and more reliable than ever before. The range and "sophistication" of mass data capture systems and semi/fully autonomous mapping hardware means that all visible assets can now be readily scanned/digitised/mapped to a high degree of fidelity (see the wide range of ROVs, UAVs, autonomous submersibles, and network route scanning by land, air and sea for rail, road and utilities), not to mention the advances in Persistent Scatterer Interferometric Synthetic Aperture Radar. Creating an accurate digital replica of visible assets is becoming cheaper, safer, more available and more intelligent every six months.

2. Organisation and structuring of data are often viewed as highly necessary; however, due to their perceived complexity/risk/cost, they are always approached hesitantly or with old thinking. There are many options for automatically structuring/re-structuring/transforming data of most types. For fully autonomous data cleansing and structuring, there are now several information classifier systems (see NET CAD and many more). Even for semi-autonomous cleansing, the functionality of Excel, Power BI and SQL databases is fit for most of these activities and can drive immediate near-term value using recently trained in-house staff. This is commonplace and has been happening in other industries for decades. We recently completed an internal activity using an auto data cleanser and classifier and a standard taxonomy; the lesson learned for us was that there is no point getting the tech right if you forget to train the people! But we all make mistakes.

3. Analysis and use of data. Companies that don't do much of this believe that to do a lot of it they need one of three things: expert third-party support, extensively upskilled internal staff, or hired-in talent. Most organisations agonise between buying in supplier services to do this job quickly (and well!) and wanting to develop their in-house data analysis skills (because who wants to be beholden to suppliers?), either through training or hiring. There is a fourth option though! Many organisations around the world now focus on building data analysis capability at client organisations through the provision of "standard algorithm marketplaces". This is an online Ocado for algorithms: sometimes you want to buy a ready meal (a fully finished algorithm), and sometimes you want to build a soup from several base ingredients. In the modern day, you don't need to know how to grow a carrot to make carrot soup; you just need to combine ingredients. And the same thinking should be applied to analysis of data: you don't need to be able to write code, or even understand how code is written, to develop very powerful algorithms. Just like the farmer who cultivates the carrot for you, these new suppliers cultivate very powerful modular algorithms that do specific things. Let's take satellite imagery analysis as an example: there is no longer complex code to write, just drag-and-drop tools that each do one thing (pick out trees, trace the road markings, determine total blacktop surface). So now all you need to do is plug these functional algorithms (tools) together and you are up and running!

4. Monetising the data is hard. This bit is actually very simple to do! This is the fun part: there are loads of management consulting and business start-up exercises that help show how to monetise data, information, and analytics. If you are worried only about this bit, then you haven't got much to worry about.

5. People, people, people... people. People need more attention! I have been told that this community is eager to hear from suppliers who are already delivering digital twin solutions, already delivering value, and want a large partner to work alongside. It is not unusual for a rapidly emerging industry to see sudden drops in the "barriers to entry". I believe we are entering one now, so it is a really good time to talk about what might and might not work for us.

Much respect,
Peter Slater MIET MEng - "Digital Maturity has no finish line"
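The "plug functional algorithms together" idea from point 3 above can be sketched as plain function composition. The imagery-analysis steps here are hypothetical stand-ins for the drag-and-drop tools the post mentions, shown only to illustrate the pipeline pattern:

```python
# Sketch of a modular analysis pipeline: each "tool" is a small function,
# and a pipeline is just their left-to-right composition.
from functools import reduce

def pipeline(*steps):
    """Compose single-argument steps left to right into one callable."""
    return lambda data: reduce(lambda acc, step: step(acc), steps, data)

# Hypothetical modular "ingredients" operating on a simple results dict:
def load_imagery(scene):
    return {"scene": scene, "layers": []}

def pick_out_trees(result):
    result["layers"].append("trees")
    return result

def trace_road_markings(result):
    result["layers"].append("road_markings")
    return result

# Plug the tools together and run:
analyse = pipeline(load_imagery, pick_out_trees, trace_road_markings)
print(analyse("tile_001")["layers"])  # ['trees', 'road_markings']
```

Swapping, reordering, or adding a tool changes only the `pipeline(...)` call, which is the property that makes an "algorithm marketplace" of interchangeable parts workable.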