
The IMF & Connecting Digital Twins

Showing all content tagged 'Information Management Framework (IMF)', 'Ontology', 'Data quality', 'Data value', 'Data sharing', 'Data visualisation', 'Regulation', 'Ethics', 'Legal', 'Security', 'National Digital Twin' and 'Interoperability' and posted in the last 365 days.


  1. Last week
  2. You can criticize Ontopop, but you can't compare it with your very expensive Graphshare, which offers only a demo for free, and the next price step is too high for most SMEs or individuals. The Graphshare website also offers nothing but vague claims, with no exact information or video illustrations. Ontopop is available to everyone for free with full features, and can be extended and customised as you wish. Thank you @Ian Gordon. (But your GitHub link doesn't work, and there is no Ontopop repository there.) https://github.com/hyperlearningai/ontopop https://github.com/leipzig/ontopop
  3. Humanner

    Planning Golden Thread

    Hi Rachel, thank you for your answer to my comment. I suggest researching what people think of UK housing design. You will get quite negative feedback from those who come from Europe or other developed countries. This template building style is not a big tourist attraction. Most houses are built with weak walls by big housing construction companies, so you can't hang any relatively heavy furniture or a TV set on them. Also, the garages are too small for modern cars, but nobody is willing to address this problem, so all new garages are used for storage only. That is a comedy.
  4. Dave Murray

    Test Engineering and DTs

    Thanks for posting the slides Alexandra and apologies again to all for the internet bandwidth problem that I experienced during the call today. The Defence Sector is actively looking at ways to use DTs. It is coming to this from a strong background in Modelling and Simulation. There is opportunity for useful 2-way transfer of information and knowledge-sharing that would be beneficial to all participants. My suggestion is that a Lifecycle V&V network would be a good way to facilitate that, but all suggestions welcomed! Dave Murray
  5. The fundamental concepts, tools and platforms with which we design and implement information systems prevent us from making those systems flexible and “organic”. They tie us to their components and versions, hide structure in huge text files, and grow more rigid and fragile with every change. To step forward and address the truly living and evolving nature of every system, we need a radically new approach. This would be important before starting to solve any particular task, but it is essential in mitigating critical situations like the Corona crisis.
  6. Earlier
  7. Thanks for sharing @Peter Foxley. I always love a tree-based analogy... especially an oak-based analogy!
  8. Humanner

    Digital twin Internship

    Hi Amir, if you are interested in the scope of social system innovation, it would be great to have a discussion about transdisciplinary public engagement. Cross sector: Cultural heritage https://www.linkedin.com/pulse/citizen-social-science-age-alpha-generation-humanner-/ These are not directly relevant: Transport, Energy, Residential buildings, Commercial buildings, Industrial buildings.
  9. As set out in the Pathway to the Information Management Framework, the Integration Architecture is one of the key technical components of the Information Management Framework. It consists of the protocols that will enable the managed sharing of data across the National Digital Twin. In the recently released Integration Architecture Pattern and Principles paper, the NDTp's technical team set out the key architectural principles and functional components for the creation of this critical technical component. The team defines a redeployable architectural pattern that allows the publication, protection, discovery, query and retrieval of data that conforms to the NDT's ecosystem of Reference Data Libraries and the NDT's Foundation Data Model.

     Download the Integration Architecture Pattern and Principles paper. It will take you through:

     A requirements overview: a series of use cases that the Integration Architecture needs to enable, including:
     - routine operational use cases, where data from a diverse set of organisations can be shared and analysed for a single purpose (e.g. to support legal and regulatory requirements)
     - the ability to respond to an emergency, pulling together data from across different communities in a way that was not foreseen before the incident that caused the requirement
     - 'business as usual' NDT maintenance use cases, such as publishing a Digital Twin or adding a user to the NDT ecosystem.

     Architectural principles: key principles that must be adhered to regardless of the type of architecture that is implemented, including:
     - Data quality: data quality needs to be measurable and published with the data itself.
     - Privacy of the published data: the Integration Architecture shall ensure that data is shared and used only according to the conditions under which it was published.
     - Security: ensuring that all data and functions are secure from bad actors. Encryption will be a particularly key aspect of the security features in the Integration Architecture.

     Recommended integration architecture pattern: three general architectural pattern options are explored in the paper (centralised, distributed, and federated). The benefits and concerns of each pattern are discussed with respect to the requirements. The recommended architectural pattern is a hybrid of these three approaches, centralising certain functions whilst distributing and federating others. The recommended pattern is intended to allow datasets to be shared locally (i.e., within an NDT Node, see figure below), but will also allow inter-node discovery, authorisation and data sharing to take place. NDT Nodes may be established by individual organisations, regulators and industry associations, or service providers, and will be able to handle Digital Twins on behalf of their constituent organisations and provide a secure sharing boundary.

     In the recommended architecture: datasets are published by the data owner (1); these are then made available to the organisations within the community of interest, and an event is issued to register the publication with the Core (2). When queries are submitted (A), the dataset can be discovered by organisations in other communities of interest (B) and retrieved where appropriate (C). The release, discovery and retrieval are carried out according to the authorisation service, so that access is controlled as specified by the data owner.

     Detail of the functional components: the Core Services are likely to be quite thin, comprising mainly:
     - a master NDT Catalogue that holds the location of available NDT Datasets across the ecosystem
     - the master FDM/RDL, which will synchronise with the subset that is relevant for each NDT Node
     - a publish/subscribe model to propagate data changes to parties that have an interest and an appropriate contract in place.

     The Core and each NDT Node shall interact through a microservice layer, with which they shall have to be compliant.

     Next steps: the paper concludes with a list of 10 key tasks to develop the Integration Architecture components further. We will make sure to inform you of progress; in the meantime, we look forward to hearing your questions and comments on the paper!
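     To make the flow above concrete, here is a minimal, hypothetical sketch of the publish, discover and retrieve steps. All class and method names are assumptions for illustration only, not an API defined by the paper:

```python
# Hypothetical sketch of the publish/discover/retrieve flow described above.
# Names (Dataset, CoreCatalogue, NDTNode) are illustrative assumptions; the
# paper defines functional components, not a concrete API.

from dataclasses import dataclass

@dataclass
class Dataset:
    dataset_id: str
    owner: str
    quality_score: float   # principle above: quality is published with the data
    payload: dict

class CoreCatalogue:
    """Thin Core service: holds only the location of published datasets."""
    def __init__(self):
        self.locations = {}   # dataset_id -> NDTNode

    def register(self, dataset_id, node):
        # step (2): a publication event registers the dataset with the Core
        self.locations[dataset_id] = node

    def discover(self, dataset_id):
        # step (B): inter-node discovery via the master catalogue
        return self.locations.get(dataset_id)

class NDTNode:
    """A local, secure sharing boundary for a community of interest."""
    def __init__(self, name, core):
        self.name = name
        self.core = core
        self.datasets = {}
        self.acl = {}         # dataset_id -> parties authorised by the owner

    def publish(self, dataset, authorised_parties):
        # step (1): the owner publishes locally, fixing the access conditions
        self.datasets[dataset.dataset_id] = dataset
        self.acl[dataset.dataset_id] = set(authorised_parties)
        self.core.register(dataset.dataset_id, self)

    def retrieve(self, dataset_id, requester):
        # step (C): release only under the conditions set at publication
        if requester not in self.acl.get(dataset_id, set()):
            raise PermissionError(f"{requester} not authorised for {dataset_id}")
        return self.datasets[dataset_id]

core = CoreCatalogue()
node_a = NDTNode("water-sector-node", core)
node_a.publish(Dataset("flood-model-2021", "UtilityCo", 0.92, {"rows": []}),
               authorised_parties={"RegulatorX"})

# step (A): a query arrives from another community of interest
found = core.discover("flood-model-2021")                           # (B)
print(found.retrieve("flood-model-2021", "RegulatorX").dataset_id)  # (C)
```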
  10. Digital Water

    Climate Accounting - Hyperledger

    An example was given of applying accounting principles that justified cutting down trees to encourage the growth of a sphagnum bog, because sphagnum is apparently 200 times more efficient at carbon capture and water retention than a wooded forest covering the same ground area, and faster growing.
  11. HenryFT

    Insights on performance saves money

    I completely believe this; you can't manage something you can't see.
  12. David McK

    The value of, and from, Data

    Thanks, Tom. Some people might be interested in this Forbes magazine interview with Davin Crowley-Sweet, Chief Data Officer at Highways England https://www.forbes.com/sites/douglaslaney/2021/02/01/data-valuation-paves-the-road-to-the-future-for-highways-england/?sh=4510bb46612c&utm_medium=email&_hsmi=109498131&_hsenc=p2ANqtz-9xpRbsKEFsqfxsup9dXUfcWRvrxHE687_X-_FPyl93QAm2b520GA-mCmlwizF_e8UyDiX908Lbfr2dPkVcCnumXnwADw&utm_content=109498131&utm_source=hs_email
  13. Today we are delighted to publish the Approach to Develop the Foundation Data Model for the Information Management Framework. This document follows up on the November publication of the Survey of Top-level Ontologies (TLO) and the Survey of Industry Data Models (IDM) and Reference Data Libraries (RDL). (You can find these publications under Gemini Commons/IMF Technical Documents.) The pragmatic and technical requirements for the Foundation Data Model have now been developed, and consideration has been given to whether any existing Top-Level Ontologies could be used as a suitable starting point. The Approach takes you through these requirements and the assessment of the surveyed TLOs to the final decision. Four Top-Level Ontologies meet all the technical requirements: BORO, IDEAS, HQDM and ISO 15926-2. They are distinct from the other Top-Level Ontologies in being 4-dimensionalist: they allow us to see individual objects as four-dimensional, having both spatial and temporal parts. You can find the Approach to Develop the FDM for the IMF here.
  14. In November 2020 the National Digital Twin programme published the Survey of Top-level Ontologies (TLO) and the Survey of Industry Data Models (IDM) and Reference Data Libraries (RDL). You can find these publications under Gemini Commons/IMF Technical Documents. The technical part of the proposed pathway to an Information Management Framework comprises three main elements, which define a common structure and meaning for the consistent and integrated sharing of information:
     - a Foundation Data Model
     - a Reference Data Library, and
     - an Integration Architecture.
     The pragmatic and technical requirements for the Foundation Data Model have now been developed, and consideration has been given to whether any existing Top-Level Ontologies could be used as a suitable starting point. Four Top-Level Ontologies meet all the technical requirements: BORO, IDEAS, HQDM and ISO 15926-2. They are distinct from the other Top-Level Ontologies in being 4-dimensionalist: they allow us to see individual objects as four-dimensional, having both spatial and temporal parts. We are therefore proceeding to develop the Foundation Data Model seed from these 4-dimensionalist Top-Level Ontologies. The Approach to Develop the Foundation Data Model for the Information Management Framework has been published here alongside the Surveys in the Gemini Commons/IMF Technical Documents. If you would like to ask any questions about the publication, the methods taken and the choices made, head over to the IMF Community Network, where the programme team are available to respond.
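     For readers new to the 4-dimensionalist view, a minimal illustrative sketch follows. The class and field names are assumptions for illustration only; they are not drawn from BORO, IDEAS, HQDM or ISO 15926-2:

```python
# Illustrative sketch only: one way to express the 4-dimensionalist view
# (objects have temporal as well as spatial parts) in code.

from dataclasses import dataclass

@dataclass(frozen=True)
class State:
    """A temporal part of an individual: its spatial extent over an interval."""
    individual_id: str
    begins: int            # e.g. year the state starts
    ends: int              # e.g. year the state ends
    spatial_extent: str    # placeholder for a real geometry

# A pump seen four-dimensionally: the same individual, two temporal parts.
pump_in_service = State("pump-42", 2005, 2018, "site-A")
pump_in_storage = State("pump-42", 2018, 2025, "warehouse-B")

# Queries about "the pump" become queries over its states, which is what
# makes consistent integration of time-varying data tractable.
states = [pump_in_service, pump_in_storage]
where_in_2010 = next(s.spatial_extent for s in states
                     if s.begins <= 2010 < s.ends)
print(where_in_2010)  # site-A
```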
  15. Andy Parnell-Hopkinson

    Wrangling hostile data sources

    It certainly makes the case for automation. Imagine if your job was importing and structuring that data every day.
  16. The digital future of the built environment relies on the people who will create it. In our integrated world, over two thirds of UK leaders say their organisation is facing a digital skills gap (Microsoft, 2020): we have a challenge and an opportunity to close this gap whilst realising the benefits of the National Digital Twin. Working as part of the Mott MacDonald and Lane4 team appointed by the Construction Innovation Hub, we have developed a Skills and Competency Framework to raise awareness of the skills and roles needed to deliver a National Digital Twin. The skills and roles identified relate specifically to the Information Management Framework (IMF), the core component of the National Digital Twin that will enable digital twins to speak the same language.

     The future of the National Digital Twin is in your hands. Seize the opportunity to use this Skills and Competency Framework to underpin digital twin development and IMF adoption. Without understanding the skills and roles required, there is a risk that organisations may deploy staff lacking sufficient skills to develop their digital twins. A skills gap could also lead to poorly designed digital twins that do not support interoperability and connectivity with the IMF, or to failed digital twin pilots and projects, which have direct economic consequences for those organisations.

     Accelerating progress with skills development. With the Skills and Competency Framework, we can accelerate progress, reduce the rate of digital twin failure and ensure consistency in the approach to enable the National Digital Twin, all while establishing a pathway for digital skills and capability enhancement across the UK. We can do this by:
     - communicating the value of data as infrastructure, and the importance of literacy, quality and security
     - taking a systems-thinking approach to see data, technology and process as part of an interconnected ecosystem
     - having a collaborative and adaptable culture that is benefits-driven, focused on the outcomes to achieve, and that recognises the role people play in achieving them.
     Find out how to achieve this by using the Skills and Competency Framework, and stay tuned for a supporting Capability Enhancement Programme with role-based training plans and skill self-assessments.

     Learn by doing, progress by sharing. This Skills and Competency Framework is the first of its kind, but the topic of digital skills development in our industry is not. Throughout the development of the Framework, we have engaged with stakeholders and material from many bodies, such as the Construction Industry Training Board (CITB), the Open Data Institute and other CDBB initiatives around skills. We intend to progress the Framework by sharing it with the industry and connecting to other bodies, industries and people with similar purposes and goals to CDBB's. We are open, we are collaborative and we are ready to close the skills gap.
  17. Hi IMF Community, you may find this workshop interesting: "4-Dimensionalism in Large Scale Data Sharing and Integration". Full details and registration can be found at https://gateway.newton.ac.uk/event/tgmw80 . The workshop will feature six presentations on state-of-the-art research from experts on 4-Dimensionalism in large-scale data sharing and integration, followed by a chaired Presenters' Panel. Each presentation will cover aspects of 4-Dimensionalism, from the basics to Top Level Ontologies and Co-Constructional Ontology, with each answering the question posed by the previous presentation.
  18. Hello, there are papers by AMRC (https://www.amrc.co.uk/files/document/404/1604658922_AMRC_Digital_Twin_AW.pdf) and the Leading Edge Forum (https://leadingedgeforum.com/insights/digital-twins-a-guide-to-the-labyrinth/), and Iotics is also featuring a series of podcasts (https://www.iotics.com/about/digital-reflections/) by thought leaders, which are most inspiring with regard to the ongoing definition debate, the future roadmap and the art of the probable. I find these helpful. Best wishes, Sophie
  19. The idea of a Digital Twin [DT] needs to advance in a standardized and formal manner. As @DRossiter87 pointed out, this is necessary to "support the consistent adoption of, and interoperability between, digital twins for the built environment within the UK". To aid in this endeavour, it is useful to delineate the difference between the following terms:

     "DT General Use Case" [GUC]: a very short sentence, possibly consisting of two or three words (ideally a verb followed by a noun), stating briefly and bluntly the central business aim motivating the use and implementation of a DT, e.g. 'optimize traffic'.

     "DT Use Case Scenario" [UCS]: a documentation of the sequence of actions, in other words the DT-user interactions, executed through a particular DT case study or an actual DT real-world project.

     "DT Use": a typical technical function or action executed by any DT throughout the course of any UCS.

     Accordingly, DT Uses are the standard building blocks from which a standard common language can be founded. Such a standard language, which could also be machine-readable, can be used to document and detail DT-user interactions in a standard format, to facilitate the publishing and sharing of knowledge.

     Below is a "DT uses taxonomy". It is made up of three hierarchical levels: 'Included Uses', four high-level cornerstone uses that, barring rare exceptions, are included in and executed by almost every DT throughout any UCS (i.e. Mirror, Analyse, Communicate and Control); 'Specialized Uses', special forms of the Included Uses, each with unique strengths suitable for specific purposes; and 'Specialized Sub-Uses', at the lowest level of the taxonomy, which further differentiate between variants within a Specialized Use by virtue of fine inherent variations, and thus enhance the DT's practical adequacy in dealing with alternative contexts and user-defined purposes. The listing below gives a formal definition of each DT Use, with synonyms:

     01 Mirror: to duplicate a physical system in the real world in the form of a virtual system in the cyber world. (Synonyms: replicate, twin, model, shadow, mimic)
        1.1 Capture: express in a digital format within the virtual world the status of a physical system at a point in time. (Usually offline DT.) (Synonyms: collect, scan, survey, digitalize)
        1.2 Monitor: collect information related to the performance of a physical system. (Online or offline DT.) (Synonyms: sense, observe, measure)
        1.3 Quantify: measure the quantity of a physical system's particulars, instances or incidents. (Online or offline DT.) (Synonyms: takeoff, count)
        1.4 Qualify: track the ongoing status of a physical system. (Online or offline DT.) (Synonyms: follow, track)

     02 Analyse: to create new knowledge and provide insights for users and stakeholders about a physical system. (Synonyms: examine, manage)
        2.1 Compute: perform conventional arithmetical calculations, traditional mathematical operations and functions, and simple statistical techniques such as correlations. (Synonyms: calculate, add, subtract, multiply, divide)
        2.2 Mine: uncover, identify and recognize the web of interdependencies, interconnected mechanisms, complex processes, interwoven feedback loops, masked classes, clusters or typologies, and hidden trends and patterns within the physical system. (Synonyms: learn, recognize, identify, detect, AI, ML, BDA)
        2.3 Simulate: explore and discover the implications and possible emerging behaviours of a complex web of interacting variables.
           2.3.1 Scenario: find out the implications, impacts or consequences of implementing pre-defined scenarios (akin to non-destructive tests). (Synonyms: what-if, evaluate, assess)
           2.3.2 Stress-Test: identify the scenarios that may lead to failure or breakdown of the physical system (akin to destructive tests). (Synonyms: test, inspect, investigate)
        2.4 Predict: concerned with futures studies.
           2.4.1 Forecast: predict the most likely state of a real system in the future by projecting known current trends forward over a specified time horizon. (Synonym: foresee)
           2.4.2 Back-cast: question or prove, in a prospective manner, how the physical system is operating towards achieving pre-set aims and goals. (Synonyms: manage, confirm)
        2.5 Qualitize: enhance and improve the quality of the outcomes or deliverables produced by an intervention in the real world.
           2.5.1 Verify: verify the conformance and compliance of the physical system with standards, specifications and best practice. (Synonyms: validate, check, comply, conform)
           2.5.2 Improve: inform the future updating, modifying or enhancing of current standards, to bring them into better coherence and harmony with actual operational and usage behaviours and patterns. (Synonyms: update, upgrade, revise)

     03 Communicate: to exchange collected and analysed information amongst stakeholders. (Synonym: interact)
        3.1 Visualize: form a realistic representation or model of the current or predicted physical system. (Synonyms: review, visioning)
        3.2 Immerse: involve interested stakeholders in real-like experiences using immersive technologies such as VR, AR and MR. (Synonym: involve)
        3.3 Document: document and represent gathered and/or analysed data in a professional manner and in technical language, forms or symbols. (Synonym: present)
        3.4 Transform: modify, process or standardize information to be published and received by other DT(s) or other DT users (e.g. a National DT), or to overcome interoperability issues. (Synonyms: translate, map)
        3.5 Engage: involve citizens and large groups of people, including marginalized groups, in policy and decision-making processes. (Synonyms: empower, include)

     04 Control: to leverage the collected and analysed information to intervene back into the real world to achieve a desirable state. (Synonyms: implement, execute)
        4.1 Inform: support human decision-making throughout the implementation of interventions in the real world. (Synonyms: support, aid)
        4.2 Actuate: use CPS and actuators to implement changes to the physical system. (Synonyms: regulate, manipulate, direct, automate, self-govern)

     A standardized set of 'DT Uses' can help avoid miscommunication and confusion when sharing or publishing DT Use Case Scenarios and their content, explicitly explaining the know-how. It can also support the procurement of DT services by ensuring the use of one common language across the supply chain and stakeholders.

     Al-Sehrawy R., Kumar B. (@Bimal Kumar) and Watson R. (2021). Digital Twin Uses Classification System for Urban Planning & Infrastructure Program Management. In: Dawood N., Rahimian F., Seyedzadeh S., Sheikhkhoshkar M. (eds) Enabling the Development and Implementation of Digital Twins. Proceedings of the 20th International Conference on Construction Applications of Virtual Reality. Teesside University Press, UK.
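     Since the post suggests the taxonomy could become machine-readable, here is a minimal sketch (the structure and function names are my own assumptions, not from the cited paper) of the three hierarchical levels as a nested Python dictionary:

```python
# Assumed encoding of the three-level "DT uses" taxonomy described above:
# Included Use -> Specialized Use -> optional Specialized Sub-Uses.

DT_USES = {
    "Mirror": {
        "Capture": [], "Monitor": [], "Quantify": [], "Qualify": [],
    },
    "Analyse": {
        "Compute": [],
        "Mine": [],
        "Simulate": ["Scenario", "Stress-Test"],
        "Predict": ["Forecast", "Back-cast"],
        "Qualitize": ["Verify", "Improve"],
    },
    "Communicate": {
        "Visualize": [], "Immerse": [], "Document": [],
        "Transform": [], "Engage": [],
    },
    "Control": {
        "Inform": [], "Actuate": [],
    },
}

def flatten(taxonomy):
    """Yield (included_use, specialized_use, sub_use) triples, the standard
    'building blocks' a UCS document could be tagged with."""
    for included, specials in taxonomy.items():
        for special, subs in specials.items():
            if subs:
                for sub in subs:
                    yield (included, special, sub)
            else:
                yield (included, special, None)

# e.g. tag a use-case scenario with a standard triple:
print(next(flatten(DT_USES)))  # ('Mirror', 'Capture', None)
```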
  20. Good question! I consider that the development of a digital ecosystem or digital infrastructure can learn a lot from physical infrastructure, including the skillsets and career paths. Similar to physical infrastructure, we need CEOs/CTOs/CFOs, designers, architects, project management, testing, engineers, services, security, human resources, customer care... to name a few. Adding data/digital in front of existing roles can form a good starting point and reveal some gaps. To address the gaps, actions speak louder than words: we should break seemingly unachievable, ambitious targets into small and manageable steps; fail fast, learn fast, adapt fast. A visual explanation borrowed from the digital twin fan club's tweet. pic ref: https://twitter.com/thedigitaltwin/status/1337674430240186370?s=20
  21. I was recently introduced to the work on Digital Twins that the City of Wellington is involved in. I share some links with the DT Hub community. Unlocking the Value of Data: Managing New Zealand's Interconnected Infrastructure. Plus, check out these links too, which were shared with me by Sean Audain from Wellington City Council, who is leading the Digital Twin activity in the city: "We have been on this trip for a while - here is an article on our approach https://www.linkedin.com/pulse/towards-city-digital-twins-sean-audain/ - the main development since it was written was a split between the city twin and the organisational twin - something that will be formalised in the forthcoming digital strategy. To give you an idea of progress in the visualisation layer, this is what the original looked like https://www.youtube.com/watch?v=IGRBB-9jjik&feature=youtu.be back in 2017 - the new engines we are testing now look like this https://vimeo.com/427237377 - there are a bunch of improvements in the open data and in the shared data systems." I asked Sean about the impact of the DT on city leaders' decision making. This is his response: "In our system we are open unless otherwise stated. We have used it as a VR experience with about 7000 Wellingtonians in creating the City Resilience Strategy and Te Atakura, the Climate Change Response and Adaptation Plan. There are more discrete uses, such as the proposals for the Alcohol Bylaw - https://www.arcgis.com/apps/Cascade/index.html?appid=2c4280ab60fe4ec5aae49150a46315af - this was completed a couple of years ago and used part of the data sharing arrangements to make liquor crime data available for decisions. I have the advantage of being a part of local government in getting civic buy-in. Every time our councillors are presented with this kind of information, they want more."
  22. @David Willans of Anmut recently sent me this invitation and I thought I should share it here (with permission). On 24th February, 11am GMT, Anmut are running a webinar about data valuation. When we mention the term, people tend to think it's about setting a price for monetisation. That is one benefit of doing valuation, but it's a third-order benefit at best. The first- and second-order benefits are much more valuable, and best described with two words: translation and focus.

     Translation. Businesses are, in a simplified way, about choosing which assets and activities to allocate limited capital and resources to, to get the desired results. Data is just one of those assets, a powerful one because it enhances all the others by making decisions better, and can identify unseen problems and new opportunities. These allocation decisions are made using money as a measure, a language if you will: invest £XXX in product / advertising / a new team / training, to get £XXXX in return. Data doesn't fit with how a business allocates capital, which makes realising its value much harder. When you value it, 'it' being the different data assets in a business, data can be compared to other assets. It fits the way the business runs naturally. The second-order impact of this is culture change. Suddenly the business understands it has a sizeable portfolio of data assets (in our experience, approximately 20-30% of total business value) and, because businesses manage through money, the business starts to manage data naturally. One caveat, though: for the translation effect to happen, the way data is valued matters. If it's just a simple cost-based method, or linear, internal estimates of use-case value, the resulting valuation won't be accurate and people won't believe it, because the figures will be based on factors heavily influenced by internal politics and issues.

     Focus. Capital allocation is a game of constrained choices, of where to focus. When a business' portfolio of data assets is valued, it becomes very clear where to focus investment in data to move the needle: on the most valuable data assets. Again, this puts more pressure on the valuation method, because it has to be based on the ultimate source of value truth: the stakeholders for whom the organisation creates value.

     If you need to translate the value of data so the rest of the business gets it, or need clearer focus on how to create more measurable value from your data, this webinar will help. Find out more here or sign up.
  23. Worth a look at Digital twins, data quality and digital skills (pbctoday.co.uk)
  24. James Harris

    The National Digital Twin Legal Implications

    @James C, a very astute observation and excellent feedback, thank you. You are entirely right that there is a story that needs to be told on how the emergence of an NDT works outside of the technical narrative that we have chosen to lead off with; keeping fairly close to home in recent times, to ensure that the message has the right starting point. There are of course significant areas of impact on areas such as finance and underwriting, where the availability of better decision-making and risk-visibility mechanisms could do much to reduce WACC/hurdle rates for infrastructure investments, or to reduce project or through-life premiums. The potential impact on financial liquidity, investor confidence, project pipelines and therefore employment and, ultimately, national economic output should be made clearer. In fact, the contributors to the roundtable raised this point on several occasions, and we will certainly be expanding our engagement in these areas in the next phase. Similarly, the social value of this improved transparency, in terms of how it can create new or improved ecosystems of digital services for the general public, is an interesting angle. How we plan for and control this secondary market (or whether we should at all) is a multi-headed problem, but a necessary one to start talking about. There are parallels to be drawn in the way the energy sector is beginning to manage the emergence of its own secondary data market, enabled by technology such as the Smart Meters that many of us now have in our homes. We're working closely with Ofgem and others to learn how their own governance arrangements are maturing, to inform our thinking. The legal roundtable was the first step to pointing out where we had gaps. You are right that it addresses threat rather than opportunity as a priority. We were looking primarily for the big hairy problems which might have caught us by surprise. There were also many areas of opportunity raised that were redolent with potential to explore further, but we chose to focus on the potential blockers this time around. We've yet to solidify our activity plan for the legal sub-stream in the next phase, but I would be highly supportive of your recommendations. A long-winded response, sorry. @Sarah Rock, @Miranda Sharp, have I missed anything? James
  25. Matthew West

    Scientific publication on work to date

    Thanks for this, Ian. We've taken a look at it, and this is clearly a SemWeb publication. At the moment, most of the work we are doing is in the Logic/Philosophical Ontology area, and the other work on the Integration Architecture is not far enough advanced, given the considerable effort involved in putting together a full academic paper. Perhaps next year.
  26. We live in a world abundant in data and technology, and there are numerous ways to fake data of all kinds (think deepfakes). Envisioning a future where data outputs become as common as a PDF report, how do we build the critical-thinking skills that will allow data professionals to know when something doesn't look right, even though it may already have passed data quality and data audit checks? Just a thought at this point, but I would be interested in others' thoughts.
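     As one concrete illustration of such a plausibility check (my own toy example, not something proposed in the post): fabricated numeric data often fails a first-digit (Benford's law) test even when it passes schema-level quality rules:

```python
# A toy "does this look right?" check that goes beyond schema-level data
# quality rules: compare a numeric column's first-digit distribution with
# Benford's law, which many natural datasets follow and naive fakes do not.

import math
from collections import Counter

def benford_deviation(values):
    """Mean absolute deviation between observed first-digit frequencies
    and the Benford expectation. Larger values warrant a closer look."""
    digits = [int(str(abs(v)).lstrip("0.")[0]) for v in values if v]
    counts = Counter(digits)
    n = len(digits)
    deviation = 0.0
    for d in range(1, 10):
        expected = math.log10(1 + 1 / d)
        observed = counts.get(d, 0) / n
        deviation += abs(observed - expected)
    return deviation / 9

# Plausible invoice-like data versus uniformly fabricated figures.
natural = [13.2, 1.8, 27.0, 190.5, 2.4, 31.9, 11.0, 1.2, 45.3, 170.8]
fabricated = [512.0, 634.2, 788.1, 555.5, 621.7, 749.3, 802.4, 590.0]
print(benford_deviation(natural) < benford_deviation(fabricated))  # True here
```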