Showing results for tags 'Data value'.

Found 21 results

  1. Sensor technology has come a long way over the last 30 years, from the world’s first, bulky webcam at the University of Cambridge Computer Science Department to near-ubiquitous networks of sleek sensors that can provide data at an unprecedented volume, velocity and quality. Today, sensors can even talk to each other to combine single points of data into useful insights about complex events. The new webcomic ‘Coffee Time’ by Dave Sheppard, part of the Digital Twin Journeys series, tells the story of this evolution and what it means for what we can learn about our built environment through smart sensors. Starting with a simple problem – is there coffee in the lab’s kitchen? – researchers in the early 1990s set up the world’s first webcam to get the information they wanted. Today, people in the Computer Lab still want to know when the coffee is ready, but there are more ways to solve the problem, and new problems that can be solved, using smart sensors. Older sensors simply sent information from point A to point B, providing one type of data about one factor; that data then had to be collated and analysed to yield insights. Smart sensors, by contrast, can share data with each other and generate insights almost instantly. The West Cambridge Digital Twin team at the Computer Lab have looked at how specific sequences of sensor events can be combined into an insight that translates actions in the physical world into carefully defined digital events. When someone makes coffee, for example, they might turn on a machine to grind the coffee beans, triggering a smart sensor in the grinder. Then they’d lift the pot to fill it with water, triggering a weight sensor pad beneath to record a change in weight. Then they would switch the coffee machine on, triggering a sensor between the plug and the outlet that senses that the machine is drawing power. Those events in close succession, in that order, would tell the smart sensor network when the coffee is ready. 
These sequences of sensor triggers are known as complex events. Using this technique, smart sensors in the built environment can detect and react to events like changes in building occupancy, fires and security threats. One advantage of this approach is that expensive, specialist sensors may not be needed to detect rarer occurrences if existing sensors can be programmed to detect them. Another is that simple, off-the-shelf sensors can detect events they were never designed to. As the comic points out, however, it is important to programme the correct sequence, timing and location of sensor triggers, or you may draw the wrong conclusion from the available data. Something as simple as wanting to know if the coffee was ready led to the first implementation of the webcam. Digital twin journeys can have similarly simple beginnings: solving a simple problem with a solution that’s accessible to you can spark an evolution that scales up to solve a wide range of problems in the future. You can read and download the full webcomic here. You can read more from the West Cambridge Digital Twin project by visiting their research profile. This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
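The ordered-sequence logic described above can be sketched in a few lines of Python. This is a minimal, illustrative sketch only: the sensor names ("grinder", "scale", "plug"), the 120-second window and the single-pass matcher are assumptions for the example, not details of the West Cambridge implementation (a real system would also retry later start points and check sensor location).

```python
from dataclasses import dataclass

@dataclass
class SensorEvent:
    source: str       # which sensor fired, e.g. "grinder"
    timestamp: float  # seconds since some shared epoch

def detect_complex_event(events, pattern, max_window=120.0):
    """Return True if the sensor sources in `pattern` occur in that order,
    as a subsequence of `events`, all within `max_window` seconds of the
    first matching trigger."""
    idx = 0
    first_ts = None
    for e in sorted(events, key=lambda e: e.timestamp):
        if e.source == pattern[idx]:
            if first_ts is None:
                first_ts = e.timestamp
            elif e.timestamp - first_ts > max_window:
                return False  # sequence too spread out in time
            idx += 1
            if idx == len(pattern):
                return True  # full sequence observed in order
    return False

# The coffee sequence from the comic: grind beans, lift pot, switch machine on.
coffee = ["grinder", "scale", "plug"]
```

With this sketch, the coffee event fires only when all three triggers occur in order and close together; out-of-order or stale triggers are rejected, which is exactly the wrong-conclusion failure mode the comic warns about.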
  2. Next week’s Gemini Call will include a presentation by Jack Ostrofsky, Head of Quality and Design at Southern Housing Group and Chair of BIM for Housing Associations. BIM for Housing Associations (BIM4HAs) is a client-led and client-funded initiative set up in 2018 to accelerate the uptake of consistent and open standards-based BIM processes across the Housing Association sector. An urgent priority for this group is building and fire safety, particularly in the context of the development of a Golden Thread of Building Safety Information, which is part of the Building Safety Bill expected to receive Royal Assent in 2022. Understanding of BIM and Digital Twins in the residential housing sector is poor, yet as long-term owner-operators of built assets, housing associations are ideally placed to benefit from the efficiencies of BIM and Digital Twins. In June 2021 BIM4HAs published a Toolkit of resources for housing associations aimed at assisting them in the process of adopting ‘Better Information Management’. The toolkit, which is free to use, translates the requirements of the National BIM Framework into accessible language and practical tools for housing associations. Jack will describe an example of the challenge housing associations face in using structured data to manage their assets: the transfer of spatial information about buildings. Designers and contractors label buildings as ‘plots’, while development managers and asset managers in housing associations have their own naming conventions, which have evolved in a traditional and disjointed manner. As a result, the metadata links are severed at handover and a great deal of valuable, usable information is lost to the client. Jack’s employer Southern Housing Group has developed a spatial hierarchy and property reference numbering system which was published in the BIM4HAs Toolkit in June. 
The spatial hierarchy and naming system links to commonly understood asset management language and informs Asset Information Requirements that housing associations can use to instruct development and refurbishment projects. This process enables contractors to provide usable metadata to housing associations and will form an essential part of the implementation of a Golden Thread of Building Safety Information. In a further development, Southern Housing Group, working with members of the BIM4HAs community, have developed and are implementing an Asset Information Model based on the Gemini Principles and aligned with the other BIM4HAs work. This model will be published for free, for anyone to use, by BIM4HAs as part of an update to the BIM4HAs Toolkit in February. Please join us on the Gemini Call on 25th January at 10.30 to hear about the spatial hierarchy work and put your questions to Jack. Download the Spatial Hierarchy document and ‘The Business Case for BIM’ document from the links below. Both are part of the Toolkit. The whole Toolkit can be downloaded for free from the National Housing Federation website here: housing.org.uk/BIM4HAs BIM for Housing Associations Pt1 The Business Case for BIM.pdf SHG Spatial Hierarchy UPRN Procedures.pdf
  3. Digital twins are not just a useful resource for understanding the here-and-now of built assets. If an asset changes condition or position over its lifecycle, historical data from remote sensors can make this change visible to asset managers through a digital twin. However, this means retaining and managing a potentially much larger data set in order to capture value across the whole life of an asset. In this blog post, Dr Sakthy Selvakumaran, an expert in remote sensing and monitoring, tells us about the importance of curation in the processing of high-volume built environment data. There are many sources of data in the built environment, in increasing volumes and with increasing accessibility. They include sensors added to existing structures – such as wireless fatigue sensors mounted on ageing steel bridges – or sensors attached to vehicles that use the assets. Sources also include sensing systems, such as fibre optics embedded in new structures, used to understand their capacity over the whole life of the asset. Even data not intended for the built environment can provide useful information; social media posts, geo-tagged photos and GPS from mobile phones can tell us about dynamic behaviours of assets in use. Remote sensing: a high-volume data resource My research group works with another data source – remote sensing – which includes satellite acquisitions, drone surveys and laser monitoring. There have been dramatic improvements in the spatial, spectral, temporal and radiometric resolution of the data gathered by satellites, which is providing an increasing volume of data to study structures at a global scale. While these techniques have historically been prohibitively expensive, the cost of remote sensing is dropping. For example, we have been able to access optical, radar and other forms of satellite data to track the dynamic behaviour of assets for free through the open-access policy of the European Space Agency (ESA). 
The ESA Sentinel programme’s constellation of satellites fly over assets, bouncing radar off them and generating precise geospatial measurements every six days as they orbit the Earth. This growing data resource – not only of current data but of historical data – can help asset owners track changes in the position of their asset over its whole life. This process can even catch subsidence and other small positional shifts that may point to the need for maintenance, risk of structural instability, and other vital information, without the expense of embedding sensors in assets, particularly where they are difficult to access. Data curation One of the key insights I have gained in my work with the University of Cambridge’s Centre for Smart Infrastructure and Construction (CSIC) is that data curation is essential to capture the value from remote sensing and other data collection methods. High volumes of data are generated during the construction and operational management of assets. However, this data is often looked at only once before being deleted or archived, where it often becomes obsolete or inaccessible. This means that we are not getting the optimal financial return on our investment in that data, nor are we capturing its value in the broader sense. Combining data from different sources or compiling historical data can generate a lot of value, but the value is dependent on how it is stored and managed. Correct descriptions, security protocols and interoperability are important technical enablers. Social enablers include a culture of interdisciplinary collaboration, a common vision, and an understanding of the whole lifecycle of data. The crucial element that ensures we secure value from data is the consideration of how we store, structure and clean the data. 
We should be asking ourselves key questions as we develop data management processes, such as: ‘How will it stay up to date?’ ‘How will we ensure its quality?’ and ‘Who is responsible for managing it?’ Interoperability and standardisation The more high-volume data sources are used to monitor the built environment, the more important it is that we curate our data to common standards – without these, we won’t even be able to compare apples with apples. For example, sometimes when I have compared data from different satellite providers, the same assets have different co-ordinates depending on the source of the data. As with manual ground surveying, remote measurements can be made relative to different points, many of which are assumed (rightly or wrongly) to be stationary. Aligning our standards, especially for geospatial and time data, would enable researchers and practitioners to cross-check the accuracy of data from different sources, and give asset managers access to a broader picture of the performance of their assets. Automated processing The ever-increasing quantity of data prohibits manual analysis by human operators beyond the most basic tasks. Therefore, the only way to enable data processing at this large scale is automation, fusing together remote sensing data analysis with domain-specific contextual understanding. This is especially true when monitoring dynamic urban environments, and the potential risks and hazards in these contexts. Failure to react quickly is tantamount to not reacting at all, so automated processing enables asset owners to make timely changes to improve the resilience of their assets. Much more research and development is needed to increase the availability and reliability of automated data curation in this space. If we fail to curate and manage data about our assets, then we fail to recognise and extract value from it. 
Without good data curation, we won’t be able to develop digital twins that provide the added value of insights across the whole life of assets. Data management forms the basis for connected digital twins, big data analysis, models, data mining and other activities, which then provide the opportunity for further insights and better decisions, creating value for researchers, asset owners and the public alike. You can read more from the Satellites project by visiting their research profile. This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF). For more on the Digital Twin Journeys projects, visit the project's homepage on the CDBB website.
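The point above about measurements being made relative to different reference points can be made concrete with a small sketch. Everything here is illustrative: the function name, the idea of a shared "tie point" observed by both providers, and the sample displacement values are assumptions for the example, not ESA or CSIC practice.

```python
def align_to_common_datum(series_a, series_b, tie_point):
    """Shift displacement series_b (mm, keyed by acquisition date) onto
    the reference datum of series_a, using one date observed in both.
    If two providers measured relative to different 'stationary' points,
    their readings differ by a constant offset, visible at the tie point."""
    offset = series_a[tie_point] - series_b[tie_point]
    return {date: value + offset for date, value in series_b.items()}

# Two providers monitoring the same asset; provider B's datum sits 5 mm higher.
provider_a = {"2021-01": 0.0, "2021-02": -2.1}
provider_b = {"2021-01": 5.0, "2021-02": 2.9, "2021-03": 2.0}
aligned_b = align_to_common_datum(provider_a, provider_b, "2021-01")
```

After alignment the two series can be cross-checked directly: disagreement at a shared date now signals a genuine measurement discrepancy rather than a datum difference, which is the kind of comparability that common geospatial standards would provide by default.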
  4. You’re invited to a webinar on 2nd March to find out how collaboration through connected digital twins can help plan resilient cities and infrastructure. The National Digital Twin programme has developed a Climate Resilience Demonstrator (CReDo), a pioneering climate change adaptation digital twin project that provides a practical example of how connected data can improve climate adaptation and resilience across a system of systems. Watch the film Tomorrow Today, and try the interactive app to see what CReDo has been working towards. The CReDo team will use synthetic data developed through the project to show how it is possible to better understand infrastructure interdependencies and increase resilience. Join the webinar to hear from the CReDo team about the work that has happened behind the scenes of developing a connected digital twin. CReDo is the result of a first-of-its-kind collaboration between Anglian Water, BT and UK Power Networks, in partnership with several academic institutions. The project has been funded by Connected Places Catapult (CPC) and the University of Cambridge, and technical development was led by CMCL and the Hartree Centre. This collaboration produced a demonstrator that looks at the impact of flooding on energy, water and telecoms networks. CReDo demonstrates how owners and operators of these networks can use secure, resilient information sharing across sector boundaries to adapt to and mitigate the effect of flooding on network performance and service delivery. It also provides an important template that can be built on for other challenges, such as climate change mitigation and Net Zero. Hear from members of the CReDo team – including the asset owners, CPC, and the technical development team – about the demonstrator they have delivered and the lessons they learned. If you’re interested in using connected digital twins to forge the path to Net Zero, then this event is for you. 
Register for our end-of-project webinar on 2nd March, 10:30 – 12:00: https://www.eventbrite.co.uk/e/credo-collaborating-and-resilience-through-connected-digital-twins-tickets-228349628887
  5. Digital twins can help organisations achieve various goals. In some cases, the end goal is for buildings and infrastructure to last longer, use less energy, and be safer. In others, it is enhancing the lives of people who interact with the built environment and its services. As highlighted by the Gemini Principles, these are not mutually exclusive aims, so wherever you are on your digital twin journey, it is important to consider other perspectives on the hybrid digital and physical systems you create. How will your digital twin fit into a wider ecosystem that provides services to all kinds of people? How will your asset’s performance impact the wider built environment and those who need to navigate it? Whose lives will be better if you share data securely and purposefully? In the first output from the Digital Twin Journeys series, the team working on the Smart Hospital of the Future research project, enabled by the Construction Innovation Hub, shared case studies from two smart hospitals and reflected on the innovations they saw during the COVID-19 pandemic. In this two-video mini-series, the research team shares insights about how existing digital maturity enabled these hospitals to respond to the pandemic in agile ways, transforming to a hybrid physical and digital model of care distributed across multiple sites. They also explored how individual asset digital twins fit into a wider landscape of ecosystem services, guiding how we approach interoperability to achieve better outcomes. These insights inform the way we think about the role of digital twins in the smart built environments of the future. 
Dr Nirit Pilosof reflects that, ‘Digital twin as a concept can promote the design of the new system, the design process of the built environment and the technologies, but also really help operate… the hybrid models looking at the physical and virtual environments together.’ If health care is enabled by connected digital twins, how could the design of hospitals – and whole cities – change? In the videos, the team also discusses the limitations and ethics of services enabled by digital data and the use of digital technologies to improve staff safety, from isolated COVID wards to telemedicine. They frame service innovation as an iterative and collaborative process, informed by the needs of digital twin users, whether those are the asset owners and operators, or the people benefitting from the services they provide. According to project co-lead Dr Michael Barrett, ‘The people who need to drive the change are the people who are providing the service.' After the COVID crisis, we can better recognise what we have learned from implementing digital services at scale, as more people than ever have relied on them. The team reflect that having the right people in the right roles enabled the smart hospitals in these cases to transform their services rapidly in response to the need. The same human and organisational infrastructure that is creating the smart hospital of the future is also needed to create the flexible, responsive built environments of the future. Digital Twin Journeys can start from the perspective of available technology, from a problem-solving perspective, or from the perspective of users experiencing a service ecosystem. The smart hospitals project demonstrates the value of the latter two approaches. 
Hospital staff were instrumental in shaping the digitally-enabled service innovation to keep them safe and offer better services on and offsite, but project co-lead Dr Karl Prince points out how people accessing those services have to navigate a variety of different services in the built environment to get there. As we begin to connect digital twins together, we need to consider not just our own needs but the needs of others that digital twins can address. For more on this project, including links to their publications, see the team’s research profile on the CDBB website. Keep up with the Digital Twin Journeys series on the CDBB website or here on the Digital Twin Hub blog.
  6.
    Speaker: Mark Enzer, CDBB and CTO, Mott MacDonald The National Digital Twin (NDT) is a huge idea with “data for the public good” at its heart. The NDT promises enormous value for the people of the UK, both in the delivery of new assets and in the performance of our existing infrastructure. The fundamental premise behind the NDT is: Better data + Better Analysis => Better Decisions => Better outcomes for people and society – which is the essential promise of the Information Age. The NDT is not envisaged as one massive model of everything, but as an ecosystem of connected digital twins. Connecting digital twins requires interoperability to enable secure, resilient data to be shared across organisational and sector boundaries. However, interoperability requires a level of data quality and consistency that “the market” cannot achieve on its own; it requires government-level leadership to create the right conditions for “the market” to adopt and deliver to the standards required and, in doing so, develop and thrive. This presentation will: introduce the National Digital Twin, explain what it is and why we need it, and outline what is being done to deliver it. Register at the link below for Mark's presentation and others: Webinar: DMSG & DAMA collaboration event: Making data good for society | BCS
  7. Hi all, I received this Digital Construction Week article today about some Procore research and thought it included some interesting conclusions.
  8.
    Pre-register for the latest TwinTalks breakfast meeting by following the “Join me” link. This TwinTalks will feature an interview with Rachel Skinner, the 156th president of the Institution of Civil Engineers and executive director (transport) at global consultant WSP. Rachel is the youngest-ever president of the Institution of Civil Engineers and only the second woman to hold the post in its 203-year history. In 2016, she was named by the Daily Telegraph as one of the Top 50 Influential Women in Engineering. She was elected as a fellow of the Royal Academy of Engineering in 2019 and is a commissioner for the Infrastructure Commission for Scotland. This TwinTalks breakfast will pick up the major theme of Rachel’s presidential year and focus on the challenges and opportunities facing infrastructure professionals in meeting the global net-zero carbon target. It will include the steps that must be taken to understand what net zero means for infrastructure, what we can do to mitigate climate change, and what has to happen to enable our infrastructure to become more resilient to the inevitable change. The interview will also explore how the use of data and new digital systems can help the industry to accelerate its change towards a net-zero future and help the sector to better understand the impact of decisions taken across the planning, design, construction, and operation of assets. As usual, questions from delegates will be welcomed throughout the one-hour session. https://www.linkedin.com/events/twintalks-13avirtualbreakfastwi6774950486006095872/
  9. David McK

    The value of, and from, Data

    For me, Digital Twins are for acquiring, maintaining and exploiting Data - as a means to an end. We need to shift the typical focus of many organisations away from technology and "IT" towards understanding this perspective. I think the real value comes from thinking about Data Flows and not just the Data (store / Lake / whatever). This is my perspective also in the context of Asset Management. I am not associated with Anmut, but I recommend this well-written report. (They have collaborated with Highways England to do some extremely exciting and useful work on Gemini.) https://anmut.co.uk/insights/ https://www.linkedin.com/posts/guyjdavis96_data-research-datavalue-activity-6739116308098514944-l4Vo
  10. Data wrangling – importing 300+ datasets a quarter (YouTube). Is this making the case for bread-and-butter digital transformation?
  11. I was recently introduced to the work on Digital Twins that the City of Wellington is involved in, and I'd like to share some links with the DT Hub community. Unlocking the Value of Data: Managing New Zealand’s Interconnected Infrastructure Plus, check out these links too, which were shared with me by Sean Audain from Wellington City Council, who is leading the Digital Twin activity in the city. "We have been on this trip for a while - here is an article on our approach https://www.linkedin.com/pulse/towards-city-digital-twins-sean-audain/ - the main developments since it was written was a split between the city twin and the organisational twin - something that will be formalised in the forthcoming digital strategy. To give you an idea of progress in the visualisation layer, this is what the original looked like https://www.youtube.com/watch?v=IGRBB-9jjik&feature=youtu.be back in 2017 - the new engines we are testing now look like this https://vimeo.com/427237377 - there are a bunch of improvements in the open data and in the shared data systems." I asked Sean about the impact of the DT on city leaders' decision making. This is his response... "In our system we are open unless otherwise stated. We have used it as a VR experience with about 7,000 Wellingtonians in creating the City Resilience Strategy and Te Atakura – the Climate Change Response and Adaptation plan. There are more discrete uses, such as the proposals for the Alcohol Bylaw - https://www.arcgis.com/apps/Cascade/index.html?appid=2c4280ab60fe4ec5aae49150a46315af - this was completed a couple of years ago and used part of the data sharing arrangements to make liquor crime data available for decision making. I have the advantage of being a part of local government in getting civic buy-in. Every time our councillors are presented with this kind of information they want more." Alcohol Control Bylaw – New
  12. David Willans of Anmut recently sent me this invitation and I thought I should share it here (with permission). On 24th February, 11am GMT, Anmut are running a webinar about data valuation. When we mention the term, people tend to think it’s about setting a price for monetisation. That is one benefit of doing valuation, but it’s a third-order benefit at best. The first- and second-order benefits are much more valuable and best described with two words: translation and focus. Translation Businesses are, in a simplified way, about choosing which assets and activities to allocate limited capital and resources to, to get the desired results. Data is just one of those assets, a powerful one because it enhances all the others by making decisions better, and can identify unseen problems and new opportunities. These allocation decisions are made using money as a measure, a language if you will – invest £XXX in product / advertising / a new team / training, to get £XXXX in return. Data doesn’t fit with how a business allocates capital, which makes realising the value of it much harder. When you value it, ‘it’ being the different data assets in a business, data can be compared to other assets. It fits naturally with the way the business runs. The second-order impact of this is culture change. Suddenly the business understands it has a sizeable portfolio of data assets (in our experience this is approx. 20–30% of the total business value) and, because businesses manage through money, the business starts to naturally manage data. One caveat though: for the translation effect to happen, the way data's valued matters. If it’s just a simple cost-based method, or linear, internal estimates of use-case value, the resulting valuation won’t be accurate and people won't believe it, because the figures will be based on factors heavily influenced by internal politics and issues. Focus Capital allocation is a game of constrained choices, of where to focus. 
When a business’ portfolio of data assets is valued, it becomes very clear where to focus investment in data to move the needle – on the most valuable data assets. Again, this puts more pressure on the valuation method, because it has to be based on the ultimate source of truth about value – the stakeholders for whom the organisation creates value. If you need to translate the value of data so the rest of the business gets it, or need clearer focus on how to create more measurable value from your data, this webinar will help. Find out more here or sign up.
  13. @David Willans of Anmut recently sent me this invitation and I thought I should share it here (with permission). On 24th February, 11am GMT, Anmut are running a webinar about data valuation. When we mention the term, people tend to think it’s about setting a price for monetisation. That is one benefit of doing valuation, but it’s a third-order benefit at best. The first- and second-order benefits are much more valuable and best described with two words: translation and focus. Translation Businesses are, in a simplified way, about choosing which assets and activities to allocate limited capital and resources to, to get the desired results. Data is just one of those assets, a powerful one because it enhances all the others by making decisions better, and can identify unseen problems and new opportunities. These allocation decisions are made using money as a measure, a language if you will – invest £XXX in product / advertising / a new team / training, to get £XXXX in return. Data doesn’t fit with how a business allocates capital, which makes realising the value of it much harder. When you value it, ‘it’ being the different data assets in a business, data can be compared to other assets. It fits naturally with the way the business runs. The second-order impact of this is culture change. Suddenly the business understands it has a sizeable portfolio of data assets (in our experience this is approx. 20–30% of the total business value) and, because businesses manage through money, the business starts to naturally manage data. One caveat though: for the translation effect to happen, the way data's valued matters. If it’s just a simple cost-based method, or linear, internal estimates of use-case value, the resulting valuation won’t be accurate and people won't believe it, because the figures will be based on factors heavily influenced by internal politics and issues. Focus Capital allocation is a game of constrained choices, of where to focus. 
When a business’ portfolio of data assets is valued, it becomes very clear where to focus investment in data to move the needle – on the most valuable data assets. Again, this puts more pressure on the valuation method, because it has to be based on the ultimate source of truth about value – the stakeholders for whom the organisation creates value. If you need to translate the value of data so the rest of the business gets it, or need clearer focus on how to create more measurable value from your data, this webinar will help. Find out more here or sign up.
  14. “Data that is loved, tends to survive.” – Kurt Bollacker

In our quest to transition ourselves from a nation that simply creates data to one where we understand and exploit its value for the betterment of society, we still have much to learn about what constitutes ‘quality’ in data. The National Digital Twin programme wants to explore how quality can be defined, and how we can begin to build the tenets and processes for high-quality data into the way we operate in our daily lives, our corporate environments, and our national institutions. This network has been created as a place to focus discussion on how our collective approach to data governance, value and quality must evolve. It provides a central point for the storage of resources relevant to each topic, and a forum for the open sharing of ideas, research and case studies. We will explore case studies, debate how we have learned (or not) from the mistakes of the past, and try to build consensus over what constitutes best-in-class practices for governance, quality and, ultimately, value. To help guide and shape the work being done, the voices of broader stakeholder groups, expert communities and organisations are invaluable. To this end, the NDTp is establishing this new network through the DT Hub.

Who should join? This is an open group accessible to any member of the DT Hub. This is an actively developing area and broad participation is encouraged from individuals of all backgrounds.

Admin & Security: This community will be supported by CDBB and the National Digital Twin programme through a network manager (James Harris), backed by the core NDTp team. Please note that, due to the open nature of the DT Hub, the community is not suitable for the discussion of sensitive or commercial information.
  15. The Geospatial Commission launched two reports on 2020-11-24, and they are keen to hear people's feedback. (The reports are relevant to digital twins, although digital twins are not mentioned in either report.) 1. Enhancing the UK's Geospatial Ecosystem (PDF, 3.47MB, 20 pages) 2. Frontier Economics Geospatial Data Market Study Report (PDF, 1.95MB, 122 pages) Reports download link, with HTML alternative formats: https://www.gov.uk/government/publications/enhancing-the-uks-geospatial-ecosystem Enhancing_the_UK_s_Geospatial_Ecosystem..pdf Frontier_Economics_-_Geospatial_Data_Market_Study.pdf
  16. RachelJudson

    Benefits Webinar - watch now!

    On the 20th October the NDTp were delighted to be joined by 100 attendees at a webinar to discuss the benefits of digital twins and connected digital twins. Miranda Sharp, NDTp Commons Lead, chaired a panel of experts: Leigh Dodds, ODI; Herman Heyns, Anmut; Peter Vale, Thames Tideway; Paul Hodgson, Greater London Authority. The webinar aimed to understand how to capture the benefit of digital twins and connected digital twins, including: creating new revenue through data-driven solutions; improved asset management; decision support and assurance; systems thinking (balancing the objectives of cost, safety, security and environmental sustainability); and new value from data-driven solutions. The panel covered wide-ranging subjects and responded to questions from the attendees, sharing their views on the benefits of exchanging information.
  17. Enterprises creating digital twins need to understand the benefits their digital twins bring to their own operation, but also the benefits which accrue to their customers, supply chain, local community, industry network and relevant government bodies.  An understanding and harnessing of these benefits has the potential to drive not only individual business cases but also to influence regional development spend, regulatory price controls and national policy.  In response to this need, CDBB commissioned a piece of work to create a logic model: a consistent way to describe the benefits of connecting digital twins.  That model has the potential to deliver both a forward view to guide investment decisions in connecting digital twins and a retrospective assessment of the benefits achieved by connecting them. Read the CDBB blog, What is the point of a National Digital Twin?, to find out more about the logic model. The NIC’s Data for the Public Good report and other publications have described the benefits to the economy and to enterprises of sharing data in a secure and resilient way.  As such, the National Digital Twin programme was set up to create the Information Management Framework to enable that secure, resilient data sharing in the built environment and beyond.  The vision for the National Digital Twin is not a central repository of all data; rather, it is a principles-based means to connect data or individual twins to create both public good and value.  The challenge is to understand where the greatest value can be created from the connection of individual twins.  The NDTp will be running a webinar on 20th October where we will discuss the challenges of valuing data assets, the good they deliver, and how connected digital twins may change the way we do business.
To receive the link to the webinar, register via Eventbrite: https://ndtbenefits.eventbrite.co.uk The webinar will be held 11:00 – 12:00, Tuesday 20th October, via Zoom Webinar. Hosting and chairing the webinar will be the National Digital Twin programme’s Commons Stream Lead, Miranda Sharp. Joining Miranda will be a panel of experts: Leigh Dodds – ODI; Leigh is Director of Delivery at the Open Data Institute. You can read about the ODI’s work on data institutions here: https://theodi.org/article/designing-sustainable-data-institutions-paper/  Herman Heyns – Anmut; Herman is CEO at Anmut and a member of techUK's Digital Twins Working Group. Anmut is a consultancy that enables organisations to manage data like any other asset. You can read more about how Anmut value data on their website: https://anmut.co.uk/data-valuation-what-is-your-data-worth/ and https://anmut.co.uk/why-you-should-be-treating-your-data-as-an-asset/ Peter Vale – Thames Tideway; Peter has worked with a consortium at Tideway which has researched the benefits of digital transformation. We hope to see you there.
  18. A very sunny and warm hello to fellow enthusiasts! I have been reflecting on the quite unbelievable Digital Twin journey over the last two years and have to pinch myself at the progress that has been made. Within Costain, Digital Twin is recognised in all corners as something our industry needs to do; in the big wide world the same can be said, with what seems to be an ever-increasing global drive! Wanting to challenge not just myself but the community at large, here are two 'big ticket' items of reflection where it feels as though we haven't yet fully succeeded: 1 - Collaboration - now don't get me wrong, the collaboration from DT has been truly exceptional and is, I believe, changing industry. However, are we still creating digital dots that are yet to be properly connected? For NDT to really be successful we need to reach a point where open conversations can take place between government, academia and industry in a way that has not traditionally happened. Discussions that are 'warts and all', that would not normally happen between, say, an owner/operator and supplier, or between government and industry. Without that transparency we risk restricting the value of DT, which will verge on the magical if truly transparent. 2 - Engineering mindset - no surprise here that, as an engineering-led industry, the focus with DTs appears to be largely engineering-driven. In a recent piece of work looking at data trusts, the legal complexity of scaled data sharing has been eye-opening. What if a decision is made based on data from, say, 10 organisations, and that decision leads to an issue because of some low-quality or incorrect data: who is then liable? Would it even be possible to identify the bad-quality data? Exposure to leading research has identified the complexity of privacy, ethics, trust, reliability, accountability and security in relation to collaborative data sharing.
It was also interesting to hear about work by the Financial Conduct Authority looking at what a Data Conduct Authority might look like, where data might be monetised. As we look past the engineering foundations, there is a lot to do. I hope people do not view any of the above negatively, this all screams opportunity and is only natural as we lead the world in the development of scaled, federated Digital Twins. What other big challenges do people think need some focus? Regards Kevin
  19. One might argue that the foundation for any Digital Twin is understanding what information is required for the business to exist and deliver on its strategy and client needs. Without this, how do we know what information to include in our Digital Twin, or how our assets are performing in meeting this objective? I'm delivering a free three-hour webinar on 12th August to show a simple method for extracting OIRs from an executive document and specifying what is critical to understanding the business benefits of owning a Digital Twin. It would be great if you could join me! https://lnkd.in/dxF6BEN
  20. When asked by a relatively senior member of staff here what the Digital Twin is all about, and why they should care, I pulled together some SmartArt (pictured) to try to explain the component parts of an infrastructure organisation's twin. Keen to get the wider community's thoughts on this approach. Digital Twins are having a bit of a moment here at Highways England, to the extent that our principal risk is not a lack of twins, but a surfeit of incompatible twins. I'm beginning to think that the ‘Digital Twin’ of a complex organisation such as HE will actually need to function as a hierarchical system of systems. We need to understand how our organisation functions and what it looks like from a conceptual data perspective (the Schema); we then need a single source of truth, preferably one structured as a graph to reflect the Ontology (the Data); and finally there will be the specific manifestations of the above for different parts of the business (e.g. BIM, digital product catalogues, design, portfolio management, etc.), which should be united by the common schema and data above.
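The layered structure described in post 20 (a common schema, one graph-structured single source of truth, and business-specific manifestations derived from it) can be sketched in a few lines of code. This is purely a hypothetical illustration of the idea, not Highways England's actual design; all class, type and field names are invented for the sketch.

```python
# Hypothetical sketch: a shared schema, one graph as the single source
# of truth, and business-specific "manifestations" (views) of that graph.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Node:
    id: str
    type: str   # must be a type declared in the shared schema

@dataclass
class Graph:
    """Single source of truth: typed nodes plus labelled edges."""
    schema: set                                   # the conceptual data model
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)

    def add_node(self, node: Node):
        # The schema constrains everything that enters the graph.
        if node.type not in self.schema:
            raise ValueError(f"type {node.type!r} not in schema")
        self.nodes[node.id] = node

    def add_edge(self, src: str, label: str, dst: str):
        self.edges.append((src, label, dst))

    def view(self, node_types):
        """A business-specific manifestation (e.g. asset register, telemetry)."""
        return [n for n in self.nodes.values() if n.type in node_types]

# Usage: one graph, two different manifestations of the same data.
g = Graph(schema={"Road", "Bridge", "Sensor"})
g.add_node(Node("A1", "Road"))
g.add_node(Node("B7", "Bridge"))
g.add_node(Node("S3", "Sensor"))
g.add_edge("S3", "monitors", "B7")

asset_view = g.view({"Road", "Bridge"})   # asset-management manifestation
iot_view = g.view({"Sensor"})             # telemetry manifestation
```

The point of the sketch is that the two views never diverge, because both are derived on demand from the one schema-constrained graph, which is one way to avoid the "surfeit of incompatible twins" problem.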
  21. The National Digital Twin Programme hosted a webinar on Monday 8th June 2020 to discuss and answer questions about the recently published Pathway towards an Information Management Framework. We were delighted to receive many questions during the webinar, and hope that those the panel were able to answer helped deepen understanding and expand interest in the Information Management Framework and the National Digital Twin Programme. We have added those, and the questions we couldn't get to in the available time, as topics within this forum, collated by subject. We would like to invite you to add your suggestions and to take part in the discussion on the DT Hub around the development of the National Digital Twin. We will use the discussions here to complement the Open Consultation being run through the CDBB website on the IMF Pathway. As Mark Enzer, the Head of the NDT Programme, said in the webinar, we need to continue to build consensus through collaboration, and progress through sharing and learning together. For those who missed the webinar, a video is now available, and attached below is a transcript of the event. IMF Pathway Webinar 08062020 Transcript FINAL.pdf