Showing results for tags 'Data sharing'.

Found 24 results

  1. Visual intelligence is the ability to capture, connect and communicate information about spaces in real time, and then to transform it instantly, through visualisation techniques, into accurate, accessible, actionable data usable by anyone who needs it. It is a process embedded in the simple digital twin but enabled by emerging technologies – specifically the digital integration between devices, enhanced by immersive technology and artificial intelligence. Think of visual intelligence as a compass. Without it, a vessel cannot make the most of its assets, and is uncertain where it is headed, how it will be affected by the environment, and how it can reach its destination with maximum efficiency and care for its crew. Businesses have to take certain actions to increase ROI, communicate with and manage disparate teams, automate with confidence, set out clear directions and grow faster. Connected and integrated data, translated into visual intelligence, enables these actions. It is the compass. Attached are some insights from a few companies who started with a simple digital twin – a connection of data – but who have embraced visual intelligence and what it means for them. National Digital Twin presentation (1.1).pdf
  2. The Climate Resilience Demonstrator (CReDo) project from the National Digital Twin programme is holding a webinar to launch the project to a global audience, in conjunction with the COP26 climate conference, on 2nd November at 10:30–12:00. This webinar replaces the weekly Gemini Call, and the DT Hub community are encouraged to sign up, as well as to invite their wider networks to attend. The climate emergency is here now, and connected digital twins are an important part of achieving net zero and climate resilience. The CReDo team will present how the project meets this urgent need, and will premiere two exciting outputs – a short film and an interactive visualisation showing how connected data across three infrastructure networks can provide better insights and lead to better resilience of the system-of-systems overall. Only if we come together to securely share data across sectors can we plan a smarter, greener, more resilient built environment. Book your spot today! Keep an eye on the DT Hub website for updates about the CReDo programme.
  3. RachelJudson

    Planning Golden Thread

    Click here for video

    As citizens and professionals we accept that the planning process is there to uphold safety, aesthetic, technical and social standards. However, the planning process has suffered from many years of tinkering and making good. We now have a planning process that depends on outdated approaches and is incompatible with the rest of the development industry. It is slow, which presents problems in the UK, where we need to build, a lot, quickly. Planning risks preventing this building from happening at pace and at a higher quality. This situation presents, of course, a golden opportunity for a fully digitised end-to-end process which could:
      • reduce the planning bottleneck
      • automate those parts of the process that can be automated
      • increase transparency of the process
      • open up new means of engaging stakeholders with the planning process, for example by visualising proposed developments and so increasing understanding
      • allow us to see projects in context, with other proposed developments, rather than in isolation
      • allow access to, and sharing of, crucial data (like structural and fire safety information)
      • facilitate the use of modern methods of construction
      • most importantly, give a more accurate understanding of build costs and timescales
    To bring this about, we have to standardise and digitise (as far as is possible and desirable) the rules under which designs are created, assessed and ultimately built. At the same time we have to find ways to generate and use interoperable data. This is the problem that the group from Bryden Wood, 3D Repo, the London Borough of Southwark and CDBB have been working on. We have developed a model which is open and based on the established BIM Collaboration Format (BCF). It presents the data associated with planning so that it can be queried and interrogated.
    You can see a summary in the video above and read more about it in the documents attached below:
      • Planning Golden Thread statement
      • 3D Repo technical write-up
      • Bryden Wood technical write-up
      • Bryden Wood schema
    We know that many of the barriers associated with a change like this will be cultural rather than technical, so we are seeking partners in the planning and development system who would like to test the model, and collaborators who would like to fund the next stage of development. Please get in touch! You can also hear more about this on the Gemini Call on Tuesday 18 May at 10:30 with Miranda Sharp and Jack Ricketts of Southwark Council. Link to DT Hub Calendar
  4. The building stock is a city’s most significant socio-cultural and economic resource and its largest capital asset. Buildings are also where we spend most of our lives and most of our money, and where enormous potential for energy and waste reduction lies. To help improve the quality, sustainability and resilience of building stocks, and to help reduce emissions from them, comprehensive information on their composition, operation and dynamic behaviour is required. However, in many countries relevant data are extremely difficult to obtain: often highly fragmented, restricted, missing or only available in aggregated form. Colouring Cities sets out to address this issue. The initiative develops open code to facilitate the construction and management of low-cost public databases, which double as knowledge-exchange platforms, providing open data on buildings at building level. These are designed to answer questions such as: How many buildings do we have? Which building types, uses, construction systems, ages, styles and sizes are located where? How repairable, adaptable and extendable are they? How long can they last if properly maintained? How energy efficient are they? Can they easily be retrofitted? Who built them, what is their ownership type, and how well do local communities think they work? Colouring Cities also looks to advance a more efficient, whole-of-society approach to knowledge sharing on buildings and cities, allowing permanent databases to be collaboratively maintained and enriched, year on year, by citizens, academia, government, industry and the voluntary sector. Colouring London (https://colouringlondon.org/), our live prototype, has been built and tested over the past five years using a step-by-step collaborative approach involving consultation with academia, government, industry, the voluntary sector and the community (working across science, the humanities and the arts).
It looks to test four approaches to data provision: collation of existing open uploads, computational generation, local crowdsourcing and live streaming. In 2020 the Colouring Cities Research Programme was set up at The Alan Turing Institute to support international research institutions wishing to reproduce and co-work on Colouring Cities code at city or country level. We are currently collaborating with academic partners in Lebanon, Bahrain, Australia, Germany, Greece and Switzerland. Watch the Hub Insight to learn more about the project and the opportunity to get involved. If you'd like to get involved, please do test our site and add any recommendations for features you would like in our discussion thread: https://discuss.colouring.london/. Or, if you are a public body or DT Hub industry member wishing to increase open access to your infrastructure datasets, and/or to digital twin visualisations relating to the building stock, please contact Polly Hudson at Turing. Find out more:
  5. The pandemic has highlighted the need to make better, data-driven decisions that are focused on creating better outcomes. It has shown how digital technologies and the data that drives them are key to putting the right information in the right hands at the right time to ensure that we make the right decision to achieve the right outcomes. Connected ecosystems of digital twins, part of the cyber physical fabric, will allow us to share data across sectors, in a secure and resilient fashion, to ensure that we can make those important decisions for the outcomes that we need. They provide us with a transformative tool to tackle the major issues of our time, such as climate change, global healthcare and food inequality. We must use digital twins for the public good, as set out in “Data for the Public Good”, and we must also use those digital twins to create a better future for people and the planet. The recent publication of the Vision for the Built Environment sets out a pioneering vision for the built environment, and we want to see that vision expanded further, to include other sectors, such as health, education, manufacturing and agriculture. As the UK considers what a national digital twin might look like, we draw on the experience of the past three years to add to the discussion. A UK national digital twin must have a purpose-built delivery vehicle that works through coordination, alignment and collaboration. It needs to bring together those working in the field, across sectors, across industries, and across government departments. It must balance the need for research, both within academic institutions and industry, with the industry implementation and adoption that is already underway. 
And it must ensure that the programme is socio-technical in nature; if we concentrate solely on the technical factors while failing to address the equally important social considerations, we risk creating a solution that cannot or will not be adopted – a beautiful, shiny, perfect piece of ‘tech’ that sits on a shelf gathering dust. There are many in the UK doing fantastic work in the digital twin space, and in the wider cyber-physical fabric of which connected digital twins are a part. We know from experience that we get much better outcomes when we work together as a diverse team, rather than in silos, which lead to fragmentation. Industry is already creating digital twins and connecting them to form ecosystems. If we are to avoid divergence, we have to act now. To start the discussion and allow the sharing of thoughts and experience, the Royal Academy of Engineering has convened an open summit, hosted by the DT Hub, on 19th July from 10:00 to 16:00. The day will start with an introduction laying out the opportunities and challenges we face as a nation and as a planet. This will be followed by four expert-led panels, each with a Q&A session: the first, chaired by Paul Clarke CBE, on the cyber-physical fabric; followed by a panel on data and technical interoperability chaired by Professor Dame Wendy Hall; after lunch, Professor David Lane CBE will chair a panel on research; followed by a panel on adoption chaired by Mark Enzer OBE. The four panel chairs will convene a final plenary session. I do hope you will join us, to hear the experiences of others and to add your own expertise and knowledge to the conversation. To register for the Summit, click here.
  6. Dave Murray

    Test Engineering and DTs

    I am considering starting a network for topics related to Lifecycle V&V (Verification and Validation), centred on Evaluation and Testing, and this message is to poll the level of potential interest. I imagine the network would offer the following:
      • A place for Test Engineers from different market sectors to share experiences and gain knowledge
      • Support for those areas where DT activity is low but growing (the Defence Sector is an example), to benefit from the experiences of other sectors
    Test Engineers have a mix of technical and customer skills that are central to successful project implementation. The DT concept provides a lifecycle project-thread that gives Test Engineers an unprecedented opportunity to exercise their skills. Maybe finding a way to maximise this opportunity might also attract more people to the career, and be a way to improve recruitment into the world of Engineering? If we launch this Network, would you consider joining it? Dave Murray
  7. Speaker: Mark Enzer, CDBB and CTO, Mott MacDonald. The National Digital Twin (NDT) is a huge idea with “data for the public good” at its heart. The NDT promises enormous value for the people of the UK, both in the delivery of new assets and in the performance of our existing infrastructure. The fundamental premise behind the NDT is: Better data + Better Analysis => Better Decisions => Better outcomes for people and society – which is the essential promise of the Information Age. The NDT is not envisaged as one massive model of everything, but as an ecosystem of connected digital twins. Connecting digital twins requires interoperability, to enable secure, resilient data to be shared across organisational and sector boundaries. However, interoperability requires a level of data quality and consistency that “the market” cannot achieve on its own; it requires government-level leadership to create the right conditions for “the market” to adopt and deliver to the standards required, and in doing so develop and thrive. This presentation will introduce the National Digital Twin, explain what it is and why we need it, and outline what is being done to deliver it. Register at the link below for Mark's presentation and others: Webinar: DMSG & DAMA collaboration event: Making data good for society | BCS
  8. https://gateway.newton.ac.uk/event/tgmw80 This one-day workshop presents an opportunity to get up to date on the state of the art in 4-dimensionalism and its application. It is a joint collaboration between the Newton Gateway to Mathematics, GCHQ, UCL STEaPP, Southampton University, Warwick University and Brunel University. Current and potential applications of 4D/digital twin data modelling are wide-ranging. In recent decades it has been used in both Oil and Gas and Defence/Security environments. Potential uses include the built environment and various engineering applications, including aircraft engines, wind turbines, buildings and large structures, and control systems. The Grenfell tragedy and subsequent inquiry have uncovered the failure of a complex ecosystem of organisations to use information effectively. At the same time, the challenges posed by responding to Covid have led the Royal Society DELVE group to state clearly that there is a lot to learn from the current shortcomings in the use of data. The integrated use of data to inform key decisions offers a lot of potential. However, integration of data turns out to be far harder than is generally assumed. This event will cover what it takes to address data integration and illustrate its grounding in both pure mathematics and philosophy. A programme of talks will outline recent advances in the 4-dimensional ecosystem, and how they are being taken up and applied within the National Digital Twin programme. This event should be of interest to Data Architects from multiple settings, including industry, business and the public sector. 4-Dimensionalism in large-scale data sharing and integration will also have broad appeal to the mathematical sciences, as it draws upon a surprising number of branches of pure mathematics in the construction of a formal model basis for data integration.
It should be applicable to a wide range of applied mathematics fields, where the application of models and data to increasingly complex areas is vital and supports improved, trusted, human-centred decision making. Presentations will touch on set theory, topology, geometry, combinatorics and formal logic, and explain why the need for consistency in data depends on harnessing them. A Provisional Programme is available here. Registration and Venue: To register and for further information, please follow the registration link in the left-hand panel. The workshop will be hosted virtually by the Newton Gateway to Mathematics and the joining instructions will be circulated prior to the event.
  9. David McK

    The value of, and from, Data

    For me, Digital Twins are for acquiring, maintaining and exploiting Data - as a means to an end. We need to shift the typical focus of many organisations away from technology and "IT" towards understanding this perspective. I think the real value comes from thinking about Data Flows and not just the Data (store / Lake / whatever). This is my perspective also in the context of Asset Management. I am not associated with Anmut, but I recommend this well-written Report. (They have collaborated with Highways England to do some extremely exciting and useful work re Gemini.) https://anmut.co.uk/insights/ https://www.linkedin.com/posts/guyjdavis96_data-research-datavalue-activity-6739116308098514944-l4Vo
  10. Data wrangling - importing 300+ datasets a quarter - YouTube. Is this making the case for bread-and-butter digital transformation?
  11. Hi IMF Community, You may find this workshop interesting: "4-Dimensionalism in Large Scale Data Sharing and Integration". Full details and registration can be found at: https://gateway.newton.ac.uk/event/tgmw80 . The workshop will feature six presentations on state-of-the-art research from experts on 4-Dimensionalism in large-scale data sharing and integration, followed by a chaired Presenters' Panel. Each presentation will cover aspects of 4-Dimensionalism, from the basics to Top Level Ontologies and Co-Constructional Ontology, with each answering the question posed by the previous presentation.
  12. Icebreaker One has won a major UK Research and Innovation competition for the Open Energy project, which aims to revolutionise the way data is shared across the energy sector to make sure the UK achieves its net-zero goals. It means the project will receive £750k in UK Government funding to continue developing a standard that all organisations in the energy data ecosystem can use to search, share and access data. It’s also developing a prototype governance platform to make sure data is shared securely. Icebreaker One hosted a webinar on 16 February at 10am to share more information about its progress so far and plans for the future. View launch webinar (16 February 2021) View project summary briefing Open Energy aims to transform the way organisations exchange the information they need to phase out fossil fuels and implement renewable energy technology. Icebreaker One is aiming to roll out the Open Energy standards, guides and recommendations across the energy sector over the next year. Open Energy has been guided by industry advisory groups across the UK which include representatives from Ofgem, Scottish Power and SSE. It’s led by Gavin Starks, one of the key figures behind the Open Banking Standard that has revolutionised the banking sector over the past five years. Icebreaker One worked with project partners Open Climate Fix, Raidiam and PassivSystems, to win the Modernising Energy Data Access (MEDA) competition, run by Innovate UK as part of the Industrial Strategy Prospering from the Energy Revolution programme. A summary of the MEDA Phase Two work is available here. Gavin Starks, founder and CEO at Icebreaker One, said, “We’re delighted to have this backing to continue developing the data infrastructure to help unlock access to data to deliver efficiency and innovation across the energy sector. 
This will have a material impact on the UK’s ability to make the most of decentralised energy supply and consumption, help address the coming challenges of the transition to electric vehicles and catalyse the delivery of our net-zero targets. Our work will help unlock data discovery by enabling energy data search and usage by delivering a trusted ecosystem for decentralised data sharing.” Rob Saunders, Challenge Director, Prospering from the Energy Revolution at UKRI, said: “The MEDA competition was designed to accelerate innovative ways for energy data to be open-sourced, organised and accessed, providing a platform for new technology, services and more agile regulation within the energy sector. “The Icebreaker One project showed exactly what can be achieved through collaborative thinking and will help create a framework for all stakeholders to share data further for the common benefit – and ultimately for the UK’s net-zero ambitions. We are looking forward to working with them closely as the project develops further.” David Manning, Head of Data Management at SSE plc, said: “At SSE we recognise that becoming a data driven organisation is critical to our role in helping achieve a net zero world.” “Readily accessible and trusted data will be essential to building the decarbonised energy system of the future; ensuring flexibility, customisation and personalisation for energy users, large and small. It’s exciting to see the progress being made in this space.” https://energydata.org.uk/2021/02/03/open-energy-gets-uk-government-backing/
  13. I was recently introduced to the work on digital twins that the City of Wellington is involved in, and I'd like to share some links with the DT Hub community. Unlocking the Value of Data: Managing New Zealand’s Interconnected Infrastructure Plus, check out these links too, which were shared with me by Sean Audain from Wellington City Council, who is leading the Digital Twin activity in the city: "We have been on this trip for a while - here is an article on our approach https://www.linkedin.com/pulse/towards-city-digital-twins-sean-audain/ - the main development since it was written was a split between the city twin and the organisational twin - something that will be formalised in the forthcoming digital strategy. To give you an idea of progress in the visualisation layer, this is what the original looked like https://www.youtube.com/watch?v=IGRBB-9jjik&feature=youtu.be back in 2017 - the new engines we are testing now look like this https://vimeo.com/427237377 - there are a bunch of improvements in the open data and in the shared data systems." I asked Sean about the impact of the DT on city leaders' decision making. This is his response: "In our system we are open unless otherwise stated. We have used it as a VR experience with about 7000 Wellingtonians in creating the City Resilience Strategy and Te Atakura – the Climate Change Response and Adaptation plan. There are more discrete uses, such as the proposals for the Alcohol Bylaw - https://www.arcgis.com/apps/Cascade/index.html?appid=2c4280ab60fe4ec5aae49150a46315af - this was completed a couple of years ago and used part of the data sharing arrangements to make liquor crime data available to make decisions. I have the advantage of being a part of local government in getting civic buy-in. Every time our councillors are presented with this kind of information they want more." Alcohol Control Bylaw – New
  14. There are two reports launched by the Geospatial Commission on 2020-11-24. They are keen to hear people's feedback. (The reports are relevant to digital twins, although digital twins are not mentioned in either report.) 1. Enhancing the UK's Geospatial Ecosystem (PDF, 3.47MB, 20 pages) 2. Frontier Economics Geospatial Data Market Study Report (PDF, 1.95MB, 122 pages) Reports download link, with HTML alternative formats: https://www.gov.uk/government/publications/enhancing-the-uks-geospatial-ecosystem Enhancing_the_UK_s_Geospatial_Ecosystem..pdf Frontier_Economics_-_Geospatial_Data_Market_Study.pdf
  15. RachelJudson

    Benefits Webinar - watch now!

    On 20th October the NDTp were delighted to be joined by 100 attendees at a webinar to discuss the benefits of digital twins and connected digital twins. Miranda Sharp, NDTp Commons Lead, chaired a panel of experts: Leigh Dodds, ODI; Herman Heyns, Anmut; Peter Vale, Thames Tideway; Paul Hodgson, Greater London Authority. The webinar aimed to understand how to capture the benefit of digital twins and connected digital twins, including:
      • creating new revenue through data-driven solutions
      • improved asset management
      • decision support and assurance
      • systems thinking: balancing the objectives of cost, safety, security and environmental sustainability
      • new value from data-driven solutions
    The panel covered wide-ranging subjects and responded to questions from the attendees, sharing their views on the benefits of the exchange of information.
  16. Enterprises creating digital twins need to understand the benefits their digital twins bring to their own operation, but also the benefits which accrue to their customers, supply chain, local community, industry network and relevant government bodies. An understanding and harnessing of these benefits has the potential to drive not only individual business cases but also to influence regional development spend, regulatory price controls and national policy. In response to this need, CDBB commissioned a piece of work to create a logic model: a consistent way to describe the benefits of connecting digital twins. That model has the potential to deliver both a forward view to guide investment decisions in connecting digital twins and a retrospective assessment of the benefits achieved by connecting them. Read the CDBB blog, What is the point of a National Digital Twin?, to find out more about the logic model. The NIC’s Data for the Public Good report and other publications have described benefits to the economy and enterprises from the sharing of data in a secure and resilient way. As such, the National Digital Twin programme was set up to create the Information Management Framework to enable that secure, resilient data sharing in the built environment and beyond. The vision for the National Digital Twin is not a central repository of all data; rather, it is a principles-based means to connect data or individual twins to create both public good and value. The challenge is to understand where the greatest value can be created from the connection of individual twins. The NDTp will be running a webinar on 20th October where we will discuss the challenges of valuing data assets, the good they deliver, and how connected digital twins may change the way we do business.
To receive the link to the webinar, register via Eventbrite: https://ndtbenefits.eventbrite.co.uk The webinar will be held 11:00–12:00, Tuesday 20th October, via Zoom Webinar. Hosting and chairing the webinar will be the National Digital Twin programme’s Commons Stream Lead, Miranda Sharp. Joining Miranda will be a panel of experts:
      • Leigh Dodds – ODI. Leigh is Director of Delivery at the Open Data Institute. You can read about the ODI’s work on data institutions here: https://theodi.org/article/designing-sustainable-data-institutions-paper/
      • Herman Heyns – Anmut. Herman is CEO at Anmut and a member of techUK's Digital Twins Working Group. Anmut is a consultancy that enables organisations to manage data like any other asset. You can read more about how Anmut value data on their website: https://anmut.co.uk/data-valuation-what-is-your-data-worth/ and https://anmut.co.uk/why-you-should-be-treating-your-data-as-an-asset/
      • Peter Vale – Thames Tideway. Peter has worked with a consortium at Tideway which has researched the benefits of digital transformation.
    We hope to see you there.
  17. A very sunny and warm hello to fellow enthusiasts! I have been reflecting on the quite unbelievable Digital Twin journey over the last two years, and having to pinch myself at the progress that has been made. Within Costain, Digital Twin is recognised in all corners as something our industry needs to do; in the big wide world the same can also be said, with what seems to be an ever-increasing global drive! Wanting to challenge not just myself but the community at large, here are two 'big ticket' items of reflection where it feels as though we haven't yet fully succeeded: 1. Collaboration - now don't get me wrong, the collaboration from DT has been truly exceptional and is, I believe, changing industry. However, are we still creating digital dots that are yet to be properly connected? For NDT to really be successful we need to reach a point where open conversations can take place between Government, academia and industry in a way that has not traditionally happened. Discussions that are 'warts and all' that would not normally happen between, say, an owner/operator and supplier, or between Government and industry. We risk restricting the value of DT, which will verge on the magical if truly transparent. 2. Engineering mindset - no surprise here that, as an engineering-led industry, the focus with DTs appears to be largely engineering-focused. In a recent piece of work looking at data trusts, the legal complexity of scaled data sharing has been eye-opening. What if a decision is made based on data from, say, 10 organisations, and that decision leads to an issue because of some low-quality or incorrect data - who is then liable? Would it be possible to identify the bad-quality data? Exposure to leading research has identified the complexity in privacy, ethics, trust, reliability, accountability and security in relation to collaborative data sharing.
It was also interesting to hear about work by the Financial Conduct Authority looking at what a Data Conduct Authority might look like, where data might be monetised. As we look past the engineering foundations, there is a lot to do. I hope people do not view any of the above negatively; this all screams opportunity and is only natural as we lead the world in the development of scaled, federated Digital Twins. What other big challenges do people think need some focus? Regards, Kevin
  18. On Monday 3rd August, Matthew West, the Technical Lead for the NDTp, gave a presentation to the extended CDBB staff about the first steps the NDTp is taking towards an Information Management Framework (IMF), the FDM Seed. We thought it might be helpful to share it here so you can see how the current work fits with our wider goals. https://youtu.be/NyWfbocL1es
  19. The Centre for Digital Built Britain’s National Digital Twin programme has launched an open consultation seeking feedback on the proposed approach to the development of an Information Management Framework for the built environment. A new report, The Pathway Towards an Information Management Framework: A Commons for a Digital Built Britain, sets out the technical approach for the development of an Information Management Framework (IMF) to enable secure, resilient data sharing across the built environment. The publication of the report by CDBB, in partnership with the Construction Innovation Hub, is a critical milestone towards a National Digital Twin. On the publication, Mark Enzer, Head of the National Digital Twin Programme said, “I would really like to thank everyone who has come together over the past 18 months to help develop this proposed pathway towards an Information Management Framework. It represents a huge amount of work and exemplifies the collaborative approach that will be needed as we seek to enable an ecosystem of connected digital twins – the National Digital Twin. “The challenge is sizeable, but the promise is huge: better outcomes coming from better decisions based on better connected data. And, working with industry, academia and Government all pulling together we can deliver it. So, I’d urge you to join with us on this journey and help us build consensus on the way forward.” The way that digital twins are connected is important to ensuring security and improving the resilience of assets and systems. The goal of the IMF is to establish a common language by which digital twins of the built and natural environment can communicate securely and effectively to support improved decision taking by those operating, maintaining and using built assets and the services they provide to society. 
Its development by CDBB was recommended by the National Infrastructure Commission in its 2017 Data for the Public Good report and HM Government’s Construction Sector Deal. As industry leaders, DT Hub members involved in planning, creating and managing the built environment are invited to provide feedback on the report here. The consultation questions are:

1. It has been proposed that the Information Management Framework (IMF) should essentially consist of a Foundation Data Model (FDM), a Reference Data Library (RDL) and an Integration Architecture (IA). Do you agree with this overall framework? In your view, are there any key elements missing from this framework?
2. In your view, is the proposed approach to the IMF consistent with the Gemini Principles? Are there any inconsistencies that should be addressed?
3. Section 3.4 lists the models and protocols that would form part of the IMF. Is there anything that you would like to suggest to improve this list?
4. Section 3.5 describes key concepts of a Foundation Data Model. Is there anything that you would like to suggest to improve this description?
5. Section 3.6 describes key concepts of the Reference Data Library. Is there anything that you would like to suggest to improve this description?
6. Section 3.7 describes key concepts of an Integration Architecture. Is there anything that you would like to suggest to improve this description?
7. Section 4 proposes a pathway for developing the IMF. Do you agree with the proposed overall approach? In your view, are there any key tasks missing from this pathway? Would you suggest any improvements to the order in which the tasks are undertaken to develop the IMF?
8. What do you see as the barriers to connecting digital twins within organisations and between different organisations/sectors? How can these barriers be overcome?
9. In your experience, what are the reasons why organisations invest in the creation of digital twins? Why would they invest in connecting digital twins?
10. Do you have any other comments on the proposed approach to developing the Information Management Framework? What opportunities do you see arising in your business from being able to connect Digital Twins and share and integrate data across them?

The consultation on the IMF is open until 31 August and responses can be submitted here. Read a summary of the report here. the_pathway_towards_an_imf.pdf
  20. When asked by a relatively senior member of staff here what the Digital Twin is all about, and why they should care, I pulled together some SmartArt (pictured) to try to explain the component parts of an infrastructure organisation's twin. Keen to get the wider community's thoughts on this approach. Digital Twins are having a bit of a moment here at Highways England, to the extent that our principal risk is not a lack of twins, but a surfeit of incompatible twins. I'm beginning to think that the ‘Digital Twin’ of a complex organisation such as HE will actually need to function as a hierarchical system of systems. We need to understand how our organisation functions and what it looks like from a conceptual data perspective (the Schema); we then need a single source of truth, preferably one structured as a graph to reflect the Ontology (the Data); and finally there will be the specific manifestations of the above for different parts of the business (e.g. BIM, digital product catalogues, design, portfolio management, etc.) which should be united by the common schema and data above.
  21. DRossiter87

    Connected Pathways

    Following input from DT Hub members into a community-driven document, we have reduced the number of use cases identified during the Pathway to Value Workshop from 28 down to 12:

- Open Sharing of Data
- Asset Registration
- Scenario Simulation
- Occupant/User Management
- Environmental Management
- Traffic Management
- Process Optimization
- Asset Management
- Carbon Management
- Resource Management
- Resilience Planning
- Risk Management

Using these use cases, we can begin to explore how the National Digital Twin (NDT) programme can support members of the DT Hub in realizing their value. One way of doing so is by identifying which parts of these use cases need to be developed via the Commons Stream as part of the Information Management Framework (IMF). The reasoning is that these 12 use cases are:

- Horizontal, meaning that they can be applied within several sectors and their respective industries; and
- High-value, meaning that they can achieve a return on investment.

Positively, these use cases have a strong synergy with a similar schedule presented by Bart Brink of Royal HaskoningDHV on a recent buildingSMART webinar on digital twins. By identifying DT Hub members' horizontal, high-value use cases, we hope that their associated tasks, key performance indicators and federation requirements can be recommended for prioritization as part of the development of the IMF.

At the beginning of June, CDBB released The Pathway Towards an Information Management Framework: A Commons for a Digital Built Britain, a report setting out the technical approach that will lead to the development of the National Digital Twin. The report focuses on three key facets that will enable secure, resilient data sharing across the built environment:

- Reference Data Library: a taxonomy describing a common set of classes to describe the built environment;
- Foundation Data Model: an ontology outlining the relations between these classes, or properties of these classes; and
- Integration Architecture: exchange protocols to facilitate the sharing of information between digital twins, using these defined classes and relations.

Rather than being released as a complete resource, these facets will likely be developed organically as the NDT programme continues to follow its mantra. As such, the key question isn’t “what should these facets include?” but “what should be included first?”. We hope to answer this question using these horizontal, high-value use cases.

EXAMPLE: Environmental Management. At the beginning of 2020, news reports focused on air pollution and its link with infrastructure. In addition, many building operators may wish to monitor air quality due to its known impact on occupant performance. As a use case that is associated with regulatory compliance and productivity, and applicable to a breadth of assets, Environmental Management may well be a horizontal, high-value use case. To support such a use case:

- The Reference Data Library may need to include classes such as Temperature, Wind Speed, Humidity, CO2 and PM2.5, as well as their associated units, to enable the consistent recording of this information.
- The Foundation Data Model may need an ontology describing acceptable ranges and the relationship of air-quality concepts to other classes such as Health and Productivity, depending on the function being monitored; and
- The Integration Architecture may need to facilitate the sharing of information from sources such as other digital twins, as well as datasets from the Met Office and local governments.

Simply put, by identifying these horizontal, high-value use cases, we may be able to accelerate the realization of their value by having the taxonomies, ontologies and protocols needed to facilitate them available at an earlier stage of the overall IMF development. And there we have it.
As DT Hub members begin to consider how the information management framework may support their digital twin development as well as the national digital twin, which use cases do you think are the most horizontal and high-value? How do you think these facets might support your ability to undertake these use cases? Please feel free to add your thoughts below, or, alternatively, comment directly on the draft community-driven document which is, and will continue to be, progressively developed as member views are shared.
  22. The National Digital Twin Programme hosted a webinar on Monday 8th June 2020 to discuss and answer questions about the recently published Pathway towards an Information Management Framework. We were delighted to receive many questions during the webinar, and hope that those the panel were able to answer helped deepen understanding and expand interest in the Information Management Framework and the National Digital Twin Programme. We have added those, and the questions we couldn't get to in the available time, as topics within this forum, collated by subject. We would like to invite you to add your suggestions and to take part in the discussion on the DT Hub around the development of the National Digital Twin. We will use the discussions here to complement the Open Consultation being run through the CDBB website on the IMF Pathway. As Mark Enzer, the Head of the NDT Programme, said in the webinar, we need to continue to build consensus through collaboration, and progress through sharing and learning together. For those who missed the webinar, a video of the webinar is now available and attached below is a transcript of the event. IMF Pathway Webinar 08062020 Transcript FINAL.pdf
  23. Development of common ontologies and taxonomies will be key to interoperability and data exchange between Digital Twins. So who's actively developing them? At Highways England we are actively developing a Domain Ontology to serve as the basis for our data modelling and information sharing within and outside our organisation. I'm keen to solicit as much input to this as possible, as well as learn from the efforts of other organizations. Protégé users (it's free) can review the Ontology as it develops by requesting access to: https://webprotege.stanford.edu/#projects/0b3be685-73bd-4d5a-b866-e70d0ac7169b/edit/Classes Let me know your username (you can reach me at ian.gordon@highwaysengland.co.uk) and I'll give you access. Feedback is much appreciated.
  24. During discovery interviews with DT Hub members, several barriers relating to the use of digital twins were identified. This blog post is one of a series which reflects on each barrier and considers related issues so that we can discuss how they may be addressed. As our members, and indeed other organisations active in the built environment, develop data and information about their assets, the ability to ensure that this data can be used within other tools is a priority. To do so, the data needs to be interoperable: in brief, if data can be shared between systems, it is considered interoperable. Typically, this can be achieved in one of two ways:

1. Both systems use the same formal description (schema) to structure the data; or
2. One system transforms its data using an intermediate formal description (schema) to structure the data.

The simplest solution appears to be (1): have all systems create, use and maintain information using the same schema. This would mean that information could be used in its default (native) format and there would be no risk of data being lost or corrupted during transformation. However, this isn’t practicable, as from a technical perspective it is unlikely that the broad range of information needed to support every possible purpose could be captured against a single schema. In addition, public procurement directives require performance-based technical specifications as opposed to naming specific software. This means that an organization may be challenged if it requires its supply chain to use a particular piece of software, as doing so would circumvent directives around competition and value for money. As it is not possible to guarantee that the same schema will be used throughout, it is far more practicable to identify which established industry schema is most suitable to accept data within (2), depending on the purpose of using this information.
In doing so, there is an added benefit: the information you receive may be open data. Though often misused as a synonym for interoperability, open data is important for sharing for a specific reason. Open data, in brief, is unrestricted data. With proprietary software and systems, the schema used to structure the data is hidden; as a user of that software, you are effectively given permission by the vendor to use that structure to view your own information. For built environment assets this can be a problem, as the physical asset can outlast the software used to design and manage it, meaning that in 50 years a tool that allows access to this information may not exist - or sooner, given the cannibalistic nature of the software industry. Consider SketchUp, for example. Since its release in 2000, it has been owned by three different organizations: @Last Software, Google, and Trimble. The permission to use the SKP schema has changed hands several times. Who will produce software to view these files in 30 years’ time? To ensure enduring access to asset information, either bespoke schemas need to be developed and maintained internally, or an established open schema needs to be used. While several open schemas are readily available (such as IFC, PDF, PNG, MQTT), they can raise concerns related to access, control and abuse of the data within. These concerns, thankfully, can be offset through control: using open data structures, it is possible to ensure that only the information you wish to exchange is delivered. Proprietary structures, by contrast, can carry hidden information that cannot be controlled, potentially posing a larger risk than their open counterparts. To produce a “need-to-know” dataset, an open data approach is, ironically, easier to control. When considering which methodologies to use, the benefits of open data typically outweigh the risks.
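A minimal sketch of approach (2) - transforming a record from a vendor-specific structure into an agreed open schema, exporting only the fields intended for sharing - might look like this. The field names and the "open schema" are invented purely for illustration:

```python
# Hypothetical sketch of schema transformation for data exchange.
# "AssetID"/"Temp_C" stand in for a vendor's proprietary field names;
# "asset_id"/"temperature_celsius" stand in for an agreed open schema.

def to_open_schema(proprietary_record: dict) -> dict:
    """Map vendor-specific keys onto the open schema, exporting only
    the fields we intend to share (a 'need-to-know' dataset)."""
    mapping = {"AssetID": "asset_id", "Temp_C": "temperature_celsius"}
    return {open_key: proprietary_record[vendor_key]
            for vendor_key, open_key in mapping.items()
            if vendor_key in proprietary_record}

record = {"AssetID": "BR-017", "Temp_C": 21.5, "InternalCost": 1e6}
print(to_open_schema(record))
# {'asset_id': 'BR-017', 'temperature_celsius': 21.5}
```

Note that "InternalCost" never leaves the organisation: because the mapping is explicit, nothing is exchanged except what the schema deliberately exposes, which is exactly the control argument made above for open data structures.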
The use of these open data structures will not only unlock interoperability between digital twins within an organization but will be the mechanism that enables a secure national digital twin. Access to appropriate data about our national infrastructure is currently held behind proprietary schemas. Let’s make Britain’s data open again! We hope you enjoyed this short piece on breaking down the barriers related to interoperability. What specific challenges have you faced relating to the implementation of interoperability? Do you consider open data in this context to be an opportunity or a threat? Would you prefer the National Digital Twin to be based on an open or a proprietary schema?