Showing results for tags 'Infrastructure'.

Found 5 results

  1. Infrastructure projects inevitably impact our environment, so the pressure is on contractors to adhere to new policies using their own initiatives and tools. Beyond complying with the net gain policies due to be released with the upcoming Environment Bill, the other consequences of not considering our environment include project delays, increased costs, and damage to a contractor's social and sustainability reputation. Integrating GIS into current environmental planning methodologies improves on existing tools for evaluating and quantifying biodiversity and natural capital. By doing this we can visualise the areas that would potentially suffer the highest loss, and adapt the design to mitigate impacts and reduce those losses. Join the webinar on 11th Dec at 3.30pm for the discussion: https://attendee.gotowebinar.com/register/1029016851016993804 @iain miskimmin @KReevesDigi
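The overlay idea in the post above (intersect a proposed scheme footprint with habitat areas, weight by a biodiversity score, and flag where a design change would avoid the most loss) can be sketched in a few lines. The parcels, scores, and geometries below are invented for illustration; real GIS work would use polygon libraries such as shapely or geopandas, but axis-aligned rectangles keep the sketch dependency-free:

```python
# Sketch: quantify potential biodiversity loss by overlaying a proposed
# scheme footprint on scored habitat parcels (all data hypothetical).

def overlap_area(a, b):
    """Intersection area of two (xmin, ymin, xmax, ymax) rectangles."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0.0, w) * max(0.0, h)

# Proposed road corridor footprint, in arbitrary map units.
footprint = (0.0, 0.0, 10.0, 2.0)

# Habitat parcels with an illustrative biodiversity score per unit area.
habitats = [
    ("grassland", (2, 0, 5, 5), 3.0),
    ("woodland",  (6, -1, 8, 1), 5.0),
    ("wetland",   (20, 20, 25, 25), 4.0),  # lies outside the footprint
]

# Weighted loss = intersected area x score; the highest values flag where
# adapting the design would mitigate the most damage.
losses = {name: overlap_area(rect, footprint) * score
          for name, rect, score in habitats}
print(losses)  # grassland 18.0, woodland 10.0, wetland 0.0
```

Ranking parcels by this weighted loss is what lets the design team target mitigation where it matters most.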
  2. Dear DT Hub Community,

     Does it feel like getting started on a digital twin journey is too hard? Firstly, I agree with you, and I believe it won't stay this way for long (believe... he says!). After several years working in UK infrastructure, I now see more of the "realities" of digital twins: I've worked as the buyer, seller, and builder of them. I am trialling this theory with a major UK infrastructure operator (early stages); however, I would really like to gauge the interest of this group because I want to draft a proposal/action plan. To give you an idea, the organisations I have worked with, and who have influenced my thinking, are:

     Clients - Network Rail, Highways England, Connect Plus Services, National Grid, Thames Water, Northumbrian Water, EDF
     Contractors - Costain, Balfour Beatty, BAM Nuttall, Kier...
     Consultants - Mott MacDonald, Turner & Townsend, PwC, Deloitte...

     1. Digitisation of existing physical assets is seen as a laborious process that doesn't deliver immediate results. This is primarily due to a lack of awareness of the people, processes, and products involved. It is in fact becoming far cheaper and more reliable than ever before. The range and "sophistication" of mass data capture systems and semi/fully autonomous mapping hardware means that all visible assets can now be readily scanned, digitised, and mapped to a high degree of fidelity (see the wide range of ROVs, UAVs, autonomous submersibles, and network route scanning by land, air, and sea across rail, road, and utilities), not to mention the advances in Persistent Scatterer Interferometric Synthetic Aperture Radar. Creating an accurate digital replica of visible assets is becoming cheaper, safer, more available, and more intelligent every six months.

     2. Organisation and structuring of data are often viewed as highly necessary; however, due to their perceived complexity, risk, and cost, they are always approached hesitantly, using old thinking. There are many options for automatically structuring, re-structuring, and transforming data of most types. For fully autonomous data cleansing and structuring, there are now several information classifier systems (see NET CAD and many more). Even for semi-autonomous cleansing, the functionality of Excel, Power BI, and SQL databases is fit for most of these activities and can drive immediate near-term value using recently trained in-house staff. This is commonplace and has been happening in other industries for decades. We recently completed an internal activity using an auto data cleanser and classifier with a standard taxonomy; the lesson learned for us was that there is no point getting the tech right if you forget to train the people! But we all make mistakes.

     3. Analysis and use of data. Companies that don't do much of this believe that to do a lot of it they need one of three things: expert third-party support, extensively upskilled internal staff, or hired-in talent. Most organisations agonise between buying in supplier services to do the job quickly (and well!) and wanting to develop their in-house data analysis skills (because who wants to be beholden to suppliers?), either through training or hiring. There is a fourth option, though! Many organisations around the world now focus on building data analysis capability at client organisations through "standard algorithm marketplaces" - an online Ocado for algorithms. Sometimes you want to buy a ready meal (a fully finished algorithm) and sometimes you want to build a soup from several base ingredients. In the modern day, you don't need to know how to grow a carrot to make carrot soup; you just need to combine ingredients. The same thinking should be applied to analysis of data: you don't need to be able to write code, or even understand how code is written, to develop very powerful algorithms. Just like the farmer who cultivates the carrot for you, these new suppliers cultivate very powerful modular algorithms that do specific things. Take satellite imagery analysis, for example: there is no longer complex code, just drag-and-drop tools that do different things (pick out trees, trace the road markings, determine total blacktop surface). All you need to do is plug these functional algorithms (tools) together and you are up and running!

     4. Monetising the data is hard. This bit is actually very simple to do! This is the fun part: there are loads of management consulting and business start-up exercises that help show how to monetise data, information, and analytics. If you are worried only about this bit, then you haven't got much to worry about.

     5. People, people, people... people! People need more attention! I have been told that this community is eager to hear from suppliers who are already delivering digital twin solutions, already delivering value, and want a large partner to work alongside. It is not unusual for a rapidly emerging industry to see sudden drops in the barriers to entry. I believe we are entering one now, so it is a really good time to talk about what might and might not work for us.

     Much respect,
     Peter Slater MIET MEng - "Digital Maturity has no finish line"
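The semi-autonomous cleanse-and-classify step described in point 2 can be sketched with nothing more than a keyword-to-taxonomy mapping. The taxonomy terms and asset descriptions below are invented for illustration; the commercial classifier systems mentioned in the post would use trained models rather than keyword rules:

```python
# Sketch: rule-based cleansing and classification of raw asset records
# against a standard taxonomy (all terms hypothetical).

TAXONOMY = {
    "signal": "Rail > Signalling > Signal",
    "gantry": "Highways > Structures > Gantry",
    "culvert": "Drainage > Structures > Culvert",
    "transformer": "Power > Substation > Transformer",
}

def cleanse(record: str) -> str:
    """Normalise whitespace and case before matching."""
    return " ".join(record.strip().lower().split())

def classify(record: str) -> str:
    """Return the first taxonomy class whose keyword appears in the record."""
    text = cleanse(record)
    for keyword, taxonomy_class in TAXONOMY.items():
        if keyword in text:
            return taxonomy_class
    return "Unclassified"  # route to a human reviewer

raw_records = ["  SIGNAL box, mile 42 ", "Brick CULVERT under embankment", "???"]
print([classify(r) for r in raw_records])
```

The "Unclassified" fallback is where the post's lesson bites: the tooling is easy, but trained people are still needed to review what the rules cannot resolve.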
  3. Mark Coates

    Demonstrator project

    One for the calendar: we recently did a demonstrator project with Microsoft's Azure Digital Twins team for the Build Conference, which will be featured as a Deep Dive on Microsoft's Channel 9 IoT Show on August 3rd. https://channel9.msdn.com/Shows/Internet-of-Things-Show/Deep-Dive-Integrating-3D-Models-and-IoT-data-with-iTwin-and-Azure-Digital-Twins In this session we will demonstrate an application that combines 3D models, 2D maps, and reality mesh into a single environment for visualisation. Within that environment we will demonstrate a live, real-time, seamless visualisation of IoT data streams. Next we will walk through the architecture that enables the application. We will show how we have mapped the IoT data to the assets within the digital twin, and how to keep the digital twin in step with engineering changes, which is done by automating the generation of the Digital Twin Description Language (DTDL) JSON. Bentley's iTwin platform and iModel.js open-source programming library provide powerful capabilities for aggregating 3D, 2D, reality data, and other sources to link with IoT data for a "single pane of glass" for visualisation, analytics, and simulation, so users can make more effective decisions in a timely manner. By integrating the iTwin platform with Azure Digital Twins and other Azure IoT services, Bentley and Microsoft are making it easier for developers, integrators, and customers to build digital twins of their infrastructure assets. It would be great to get feedback from as many as possible once the show has aired.
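The "keep the twin in step with engineering changes" step above hinges on generating DTDL from engineering metadata. As a rough sketch of what such automation produces, the function below emits a minimal DTDL v2 Interface; the asset name, telemetry fields, and `dtmi:com:example` namespace are hypothetical, while the envelope (`@context`, `@id`, `Interface`, `Telemetry`) follows the published DTDL v2 format:

```python
# Sketch: auto-generate a minimal DTDL v2 interface from engineering
# asset metadata (asset fields and dtmi namespace are hypothetical).
import json

def asset_to_dtdl(asset_name: str, telemetry: dict) -> dict:
    """Map an asset and its sensor channels to a DTDL v2 Interface dict."""
    return {
        "@context": "dtmi:dtdl:context;2",
        "@id": f"dtmi:com:example:{asset_name};1",
        "@type": "Interface",
        "displayName": asset_name,
        "contents": [
            {"@type": "Telemetry", "name": name, "schema": schema}
            for name, schema in telemetry.items()
        ],
    }

model = asset_to_dtdl("PumpStation", {"flowRate": "double", "runHours": "integer"})
print(json.dumps(model, indent=2))
```

Regenerating this JSON from the engineering model on each change, then pushing it to Azure Digital Twins, is one plausible shape of the automation the session describes.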
  4. Ian Gordon

    IICC TIES project

    Attended a presentation from Costain / DfT today about their Intelligent Infrastructure Control Centre (architecture below). At first glance there is some substantial overlap with NDT, and apparently CDBB are involved. Is anyone here sighted on this work, and how best to align with the wider NDT IMF / Commons?
  5. I would like to know what the group thinks about creating stronger linkages between the built and natural environment. The article below is a quick read and should stimulate thinking: if we don't manage to link the two systems conceptually, semantically, and eventually programmatically, we are going to miss a big trick. https://www.researchgate.net/publication/328636876_The_Ground_Beneath_Our_Cities Attachment: ConferencePaper_54thISOCARPCongress2018_Venvik_G_TheGroundBeneathOurCities (1).pdf