
Search the Community

Showing results for tags 'Geographic Information Systems (GIS)'.




Found 5 results

  1. We all want the built environment to be safe and to last. However, minor movements over time, from forces such as subsidence, can affect how well our assets perform. Movement can also make connecting and modifying assets harder if they have shifted from the position in which they were built, and if the assets are remote or hard to access, tracking these small shifts is more difficult still. The latest instalment in the Digital Twin Journeys series is a video showing the construction and built environment sectors what they need to know about remote sensing and the use of satellite data, featuring the Construction Innovation Hub-funded research by the Satellites group based at the Universities of Cambridge and Leeds. Using satellite imaging, we may be able to detect some of the tell-tale signs of infrastructure failure before it happens, keeping services running smoothly and our built environment performing as it was designed over its whole life. You can read more from the Satellites project by visiting their research profile. This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
  2. Marek Suchocki

    Flexible Digital Twins

    A digital twin is a digital representation of something that exists in the physical world (be it a building, a factory, a power plant, or a city) that can, in addition, be dynamically linked to the real thing through sensors that collect real-time data. This dynamic link differentiates digital twins from the digital models created by BIM software, enhancing those models with live operational data. Because a digital twin is a dynamic digital reflection of its physical counterpart, it possesses operational and behavioral awareness. This enables the digital twin to be used in countless ways, such as tracking construction progress, monitoring operations, diagnosing problems, simulating performance, and optimizing processes. Structured data requirements from the investor are crucial for the development of a digital twin. Currently, project teams spend a lot of time putting data into files that is not useful during project development or, ultimately, to the owner: sometimes it is wrong, sometimes too sparse, and sometimes an overload of unnecessary detail. At the handover phase, unstructured data can leave owner/operators with siloed data and systems, inaccurate information, and poor insight into the performance of a facility. Data standards such as ISO 19650 directly target this problem: at a simple level, they require an appreciation of the asset data lifecycle, which starts with defining the need so that the right data can be prepared (a simple illustration of checking handover data against defined requirements appears after these results). Implementing a project common data environment (CDE) helps ensure that the prepared data and information are managed and flow easily between teams and project phases, through to completion and handover. An integrated, connected data environment can then leverage this approved project data alongside other asset information sources to provide the foundation of a valuable, usable digital twin. To develop this connected digital twin, investors and their supply chains can appear to be presented with two choices: an off-the-shelf proprietary solution tied to one vendor, or a one-off bespoke build that carries long-term support and maintenance risks. However, the choice is not truly binary if industry platforms and readily available existing integrations are leveraged to create a flexible custom digital twin. Autodesk has provided its customer base with solutions for developing custom data integrations over many years, starting with a reliable common data environment solution. Many of these project CDEs have subsequently grown into functional, beneficial digital twins because they rest on a structured data foundation. Using industry standards, open APIs and a plethora of partner integrations, Autodesk’s Forge Platform, Construction Cloud and, more recently, Tandem enable customers to build the digital twin they need without fear of near-term obsolescence or over-commitment to one technology approach. Furthermore, partnerships with key technology providers such as ESRI and Archibus extend the solution options and enhance long-term confidence in any developed digital twin. The promise of digital twins is certainly alluring. Data-rich digital twins have the potential to transform asset management and operations, giving owners new insights to inform their decision-making and planning.
Although digital twin technologies and industry practice are still in their infancy, it is clear that the ultimate success of digital twins relies on connected, common, and structured data sources based on current information management standards, coupled with the adoption of flexible technology platforms that permit modification, enhancement or component exchange as the digital twin evolves, rather than committing up front to one data standard or solution strategy.
  3. Digital twins are not just a useful resource for understanding the here-and-now of built assets. If an asset changes condition or position over its lifecycle, historical data from remote sensors can make this change visible to asset managers through a digital twin. However, this means retaining and managing a potentially much larger data set in order to capture value across the whole life of an asset. In this blog post, Dr Sakthy Selvakumaran, an expert in remote sensing and monitoring, tells us about the importance of curation in the processing of high-volume built environment data. There are many sources of data in the built environment, in increasing volumes and with increasing accessibility. They include sensors added to existing structures, such as wireless fatigue sensors mounted on ageing steel bridges, or sensors attached to the vehicles that use the assets. Sources also include sensing systems, such as fibre optics embedded in new structures, that help us understand their capacity over the whole life of the asset. Even data not intended for the built environment can provide useful information; social media posts, geo-tagged photos and GPS traces from mobile phones can tell us about the dynamic behaviours of assets in use. Remote sensing: a high-volume data resource. My research group works with another data source, remote sensing, which includes satellite acquisitions, drone surveys and laser monitoring. There have been dramatic improvements in the spatial, spectral, temporal and radiometric resolution of the data gathered by satellites, which is providing an increasing volume of data for studying structures at a global scale. While these techniques have historically been prohibitively expensive, the cost of remote sensing is dropping. For example, we have been able to access optical, radar and other forms of satellite data to track the dynamic behaviour of assets for free through the open-access policy of the European Space Agency (ESA). The ESA Sentinel programme’s constellation of satellites flies over assets, bouncing radar off them and generating precise geospatial measurements every six days as they orbit the Earth. This growing resource, of historical as well as current data, can help asset owners track changes in the position of their assets over their whole life. It can even catch subsidence and other small positional shifts that may indicate a need for maintenance, a risk of structural instability, or other vital information, without the expense of embedding sensors in assets, particularly where they are difficult to access. Data curation. One of the key insights I have gained in my work with the University of Cambridge’s Centre for Smart Infrastructure and Construction (CSIC) is that data curation is essential to capture the value from remote sensing and other data collection methods. High volumes of data are generated during the construction and operational management of assets. However, this data is often looked at only once before being deleted or archived, where it often becomes obsolete or inaccessible. This means that we are not getting the optimal financial return on our investment in that data, nor are we capturing its value in the broader sense. Combining data from different sources or compiling historical data can generate a lot of value, but that value depends on how the data is stored and managed. Correct descriptions, security protocols and interoperability are important technical enablers.
Social enablers include a culture of interdisciplinary collaboration, a common vision, and an understanding of the whole lifecycle of data. The crucial element that ensures we secure value from data is the consideration of how we store, structure and clean it. We should be asking ourselves key questions as we develop data management processes, such as: ‘How will it stay up to date?’ ‘How will we ensure its quality?’ and ‘Who is responsible for managing it?’ Interoperability and standardisation. The more high-volume data sources are used to monitor the built environment, the more important it is that we curate our data to common standards; without these, we won’t even be able to compare apples with apples. For example, when I have compared data from different satellite providers, the same assets have sometimes had different co-ordinates depending on the source of the data. Like manual ground surveying, remote measurements can be made relative to different reference points, many of which assume (rightly or wrongly) a stationary datum. Aligning our standards, especially for geospatial and time data, would enable researchers and practitioners to cross-check the accuracy of data from different sources, and give asset managers access to a broader picture of the performance of their assets (a short coordinate-alignment sketch appears after these results). Automated processing. The ever-increasing quantity of data prohibits manual analysis by human operators beyond the most basic tasks. Therefore, the only way to enable data processing at this scale is automation, fusing remote sensing data analysis with domain-specific contextual understanding. This is especially true when monitoring dynamic urban environments and the potential risks and hazards in those contexts. Failure to react quickly is tantamount to not reacting at all, so automated processing enables asset owners to make timely changes that improve the resilience of their assets (a simple trend-flagging sketch also appears after these results). Much more research and development is needed to increase the availability and reliability of automated data curation in this space. If we fail to curate and manage data about our assets, then we fail to recognise and extract value from it. Without good data curation, we won’t be able to develop digital twins that provide the added value of insights across the whole life of assets. Data management forms the basis for connected digital twins, big data analysis, models, data mining and other activities, which then provide the opportunity for further insights and better decisions, creating value for researchers, asset owners and the public alike. You can read more from the Satellites project by visiting their research profile. This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF). For more on the Digital Twin Journeys projects, visit the project's homepage on the CDBB website.
  4. Sheikh Fakhar Khalid of Sensat presents a Feature Focus on automation-enabled digital twins for the DT Hub's regular Gemini Call.
  5. The Open Geospatial Consortium, an open standards consortium with an experimental innovation arm, invites digital twin enthusiasts to evaluate the use of APIs and web services to connect to a variety of information resources in the built environment. https://www.ogc.org/projects/initiatives/idbepilot At this stage, we are after use cases, ideas and datasets, and are establishing the requirements for a four-month pilot. Use cases may include building condition assessments across larger portfolios and evaluating building occupancy under constraints such as social distancing (a minimal example of calling such an API appears below). The response period ends September 30th, 2021.
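
Illustrative sketches for the results above

Result 2 stresses structured data requirements and checking handover information against a defined need. The Python sketch below shows one loose way such a check might look; the field names, records and rules are hypothetical placeholders, not the ISO 19650 schema or any Autodesk API.

    # Hypothetical sketch: validating asset handover records against
    # investor-defined data requirements. Field names are illustrative only.
    from dataclasses import dataclass

    REQUIRED_FIELDS = {"asset_id", "classification", "location",
                       "install_date", "manufacturer"}

    @dataclass
    class HandoverRecord:
        data: dict

        def missing_fields(self) -> set:
            """Return required fields that are absent or empty."""
            return {f for f in REQUIRED_FIELDS if not self.data.get(f)}

    records = [
        HandoverRecord({"asset_id": "AHU-01", "classification": "ventilation",
                        "location": "Plant room 2", "install_date": "2021-03-15",
                        "manufacturer": "ExampleCo"}),
        HandoverRecord({"asset_id": "AHU-02", "location": "Roof"}),  # incomplete record
    ]

    for record in records:
        missing = record.missing_fields()
        status = "complete" if not missing else "missing: " + ", ".join(sorted(missing))
        print(record.data.get("asset_id", "<no id>"), "->", status)

In practice the requirement set would come from the investor's exchange information requirements rather than a hard-coded constant.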
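
Result 3 notes that the same asset can be reported at different co-ordinates by different satellite data providers. Below is a minimal sketch of aligning positions onto one agreed coordinate reference system with the pyproj library; the asset name, the coordinates and the choice of British National Grid (EPSG:27700) as the target CRS are assumptions made for illustration.

    # Minimal sketch: aligning asset positions from two providers onto one
    # agreed coordinate reference system (CRS) before comparing them.
    # Asset names, coordinates and CRS choices are illustrative assumptions.
    from pyproj import Transformer

    # Provider A reports WGS84 longitude/latitude; provider B reports
    # positions already in British National Grid (EPSG:27700).
    to_bng = Transformer.from_crs("EPSG:4326", "EPSG:27700", always_xy=True)

    provider_a = {"bridge_pier_3": (-0.1276, 51.5072)}     # (lon, lat)
    provider_b = {"bridge_pier_3": (530040.0, 180390.0)}   # (easting, northing)

    for asset, (lon, lat) in provider_a.items():
        easting, northing = to_bng.transform(lon, lat)
        e_b, n_b = provider_b[asset]
        # Only after reprojection is a like-for-like offset meaningful.
        offset_m = ((easting - e_b) ** 2 + (northing - n_b) ** 2) ** 0.5
        print(f"{asset}: offset between providers after alignment = {offset_m:.1f} m")

A real pipeline would also need to agree the geodetic datum, the measurement epoch and the reference point each provider measures against, as the blog post points out.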
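
Result 3 also argues that data volumes make manual inspection impractical and that automated processing is needed to flag emerging problems. The sketch below shows the simplest version of that idea: fitting a linear trend to a displacement time series (for example, one derived from repeat satellite radar passes) and flagging assets whose movement rate exceeds a threshold. The series, the six-day sampling interval and the alert threshold are all assumed values, not part of the original research.

    # Simple sketch of automated screening of displacement time series.
    # Data, sampling interval and alert threshold are illustrative assumptions.
    import numpy as np

    SAMPLING_DAYS = 6          # assumed revisit interval between measurements
    ALERT_MM_PER_YEAR = 10.0   # assumed movement-rate threshold

    series_mm = {
        "retaining_wall_A": [0.0, -0.4, -0.9, -1.3, -1.8, -2.2, -2.8],  # drifting downwards
        "bridge_deck_B":    [0.0,  0.2, -0.1,  0.1,  0.0, -0.2,  0.1],  # stable
    }

    for asset, displacements in series_mm.items():
        days = np.arange(len(displacements)) * SAMPLING_DAYS
        slope_mm_per_day, _ = np.polyfit(days, displacements, 1)  # linear trend
        rate_per_year = slope_mm_per_day * 365.25
        flag = "ALERT" if abs(rate_per_year) > ALERT_MM_PER_YEAR else "ok"
        print(f"{asset}: {rate_per_year:+.1f} mm/year -> {flag}")

Real InSAR-style screening involves far more careful statistics, but the shape of the automation is the same: many series in, a short list of assets needing attention out.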
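
Result 5 is an invitation to experiment with APIs and web services for built-environment data. As one hedged illustration, the snippet below queries a server that follows the OGC API - Features pattern (GET /collections/{collectionId}/items returning GeoJSON). The base URL and collection name are placeholders, since the pilot announcement does not specify an endpoint, and the "f=json" parameter is a common but implementation-specific format hint.

    # Hedged sketch: fetching features from a hypothetical server that follows
    # the OGC API - Features pattern. URL and collection name are placeholders.
    import requests

    BASE_URL = "https://example.org/ogcapi"   # placeholder, not a real service
    COLLECTION = "buildings"                  # placeholder collection id

    response = requests.get(
        f"{BASE_URL}/collections/{COLLECTION}/items",
        params={"limit": 10, "f": "json"},
        timeout=30,
    )
    response.raise_for_status()
    feature_collection = response.json()      # GeoJSON FeatureCollection

    for feature in feature_collection.get("features", []):
        properties = feature.get("properties", {})
        geometry = feature.get("geometry") or {}
        print(feature.get("id"), properties.get("name"), geometry.get("type"))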