Showing results for tags 'Building Information Modelling (BIM)'.

Found 13 results

  1. RachelJudson

    Planning Golden Thread

    Click here for video

    As citizens and professionals we accept that the planning process is there to uphold safety standards and aesthetic, technical and social requirements. However, the planning process has suffered from many years of tinkering and making good. We now have a planning process that is dependent on outdated approaches and incompatible with the rest of the development industry. It is also slow, which presents problems in the UK, where we need to build a lot, quickly. Planning risks preventing this building from happening at pace and to a higher quality. This situation presents, of course, a golden opportunity for a fully digitised end-to-end process which could:
    • reduce the planning bottleneck
    • automate those parts of the process that can be automated
    • increase transparency of the process
    • open up new means of engaging stakeholders with the planning process, for example by visualising proposed developments and so increasing understanding
    • allow us to see projects in context, alongside other proposed developments, rather than in isolation
    • allow access to, and sharing of, crucial data (like structural and fire safety information)
    • facilitate the use of modern methods of construction
    • most importantly, give a more accurate understanding of build costs and timescales

    In order to bring this about, we have to standardise and digitise (as far as is possible and desirable) the rules under which designs are created, assessed, and ultimately built. At the same time we have to find ways to generate and use interoperable data. This is the problem that the group from Bryden Wood, 3D Repo, the London Borough of Southwark and CDBB have been working on. We have developed a model which is open and based on the established BIM Collaboration Format (BCF). It presents the data associated with planning so that it can be queried and interrogated. You can see a summary in the video above and read more about it here:
    • Planning Golden Thread statement (attached below)
    • 3D Repo technical write-up
    • Bryden Wood technical write-up
    • Bryden Wood Schema

    We know that many of the barriers associated with a change like this will be cultural rather than technical, so we are seeking partners in the planning and development system who would like to test the model, and collaborators who would like to fund the next stage of development. Please get in touch! You can also hear more about this on the Gemini Call on Tuesday, 18 May at 10:30 with Miranda Sharp and Jack Ricketts of Southwark Council. Link to DT Hub Calendar
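To make the idea of digitised planning rules concrete, here is a minimal sketch of how a machine-readable rule could be evaluated against an interoperable design record. The field names, the rule and the threshold are all illustrative assumptions, not the group's actual BCF-based schema:

```python
# Hypothetical sketch: evaluating a digitised planning rule against a
# machine-readable design record. Field names and the rule itself are
# illustrative assumptions, not the Bryden Wood / 3D Repo schema.

from dataclasses import dataclass

@dataclass
class DesignRecord:
    site_id: str
    building_height_m: float
    residential_units: int
    fire_strategy_attached: bool

@dataclass
class RuleResult:
    rule_id: str
    passed: bool
    detail: str

def check_max_height(record: DesignRecord, max_height_m: float = 30.0) -> RuleResult:
    """A planning rule expressed as code: designs above the limit fail."""
    passed = record.building_height_m <= max_height_m
    return RuleResult(
        rule_id="MAX_HEIGHT",
        passed=passed,
        detail=f"{record.building_height_m} m vs limit {max_height_m} m",
    )

record = DesignRecord("SWK-001", building_height_m=24.5,
                      residential_units=120, fire_strategy_attached=True)
print(check_max_height(record))  # RuleResult(rule_id='MAX_HEIGHT', passed=True, ...)
```

Once rules are expressed this way, every proposal can be checked the same way, in context with other proposals, which is what makes the end-to-end digital process possible.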
  2. David McK

    The value of, and from, Data

    For me, Digital Twins are for acquiring, maintaining and exploiting Data - as a means to an end. We need to shift the typical focus of many organisations away from technology and "IT" towards understanding this perspective. I think the real value comes from thinking about Data Flows, not just the Data (store / lake / whatever). This is also my perspective in the context of Asset Management. I am not associated with Anmut, but I recommend this well-written report. (They have collaborated with Highways England on some extremely exciting and useful work regarding Gemini.) https://anmut.co.uk/insights/ https://www.linkedin.com/posts/guyjdavis96_data-research-datavalue-activity-6739116308098514944-l4Vo
  3. What are your views on creating DTs of existing buildings and/or refurb projects?
  4. https://www.nationalresourcesreview.com.au/mag/AUGUST2020.html#pdfflip-PDFF/31/ I spoke to Genene on a recent DTFC podcast and really took a shine to her. Her most contentious view is that a 3D model of an asset is a static digital twin. I'm with Michael Grieves on this: 'it's an analogy'. So does it even matter? Calling a 3D model a static digital twin in the engineering and built environment sectors gives customers a feeling of being on the journey towards a dynamic digital twin. Good article from a good egg.
  5. Vicki Reynolds

    Golden Thread Survey Responses

    Hi all, attached is the full set of survey responses from the Golden Thread Survey carried out by i3PT and the CIOB. There are some interesting responses to the questions about digital tools and capability. We’re only distributing this one to interested parties rather than marketing it more widely. That being said, it is in no way restricted, so please feel free to share with anyone that you think may be interested. I’ve also included the original summary report. Thanks COP1060 - Golden Thread - Complete Survey Res.pdf Golden-Thread-Review.pdf
  6. Hi guys, so my first post and an opening for discussion: should the production of the digital twin be tied strictly to the physical construction? My reasoning for bringing this up is that all too often digital assets are left hanging and incomplete at the end of a project during handover. For a digital twin to be truly effective, it needs to be commissioned just like the physical asset, with remediation planned in well into the operational phase. All too often the model authors and those intimate with the data have moved on to other projects, with updates and changes being difficult to implement, which means the model is already behind when it comes to operations. It's worth pointing out that my personal experience is in utilities and O&G, where assets are large and complex and take time to transfer data to the asset management systems; having a separate and deliberate project plan that takes its cues from the master plan could alleviate many of these handover issues.
  7. Morning All, I'm currently investigating a case for a Digital Twin concept. Is there any guidance relating to the structure of the data, so that it aligns with a more national structure? Any useful guidance to help the beginnings of this case move in the right direction? Thanks, Lewis
  8. I recently posed a question in this forum to clarify thoughts on the need for a digital twin ‘test’... a way of determining if a proposed digital twin is actually what everyone can agree upon and that matches expectations. A test would serve as an invaluable tool for educating and up-skilling, avoiding confusion and setting a direction for implementation. This is something particularly close to my heart as we’re currently (still) experiencing this in global BIM discussions. Whilst on the topic of BIM, the test could be a great way of identifying what a typical BIM process deliverable is and how a digital twin might differ. This is particularly pertinent as we’re currently observing digital twin negativity and the misconception that digital twins are ‘just BIM’. Take a look at the attached image, a snapshot of a Twitter poll... this may be just a small sample, but of the 113 people on Twitter who responded to this tweet by a Canadian colleague, just over HALF think digital twins are software vendors marketing vaporware - a product that doesn’t come to fruition. The other half are of the impression that digital twins are a ‘technology’. Clearly there’s work to be done... Personally, I think we need a mutually agreed distinction to engage and involve a wider group of professionals from within our sector and outside of it, to really progress and deliver the benefits outlined in The Gemini Principles. Comments you’ve provided so far suggest that a test could be helpful, although some of you share the concern that the time taken to form a test may be better spent developing a digital twin. Other comments have highlighted the need to avoid being short-sighted in the ‘boundaries’ of a test. If we are to develop a test, it will need to be flexible enough to cater for edge cases and to evolve over time as technologies and possibilities become more easily achievable - i.e. when the goalposts move! Do we need to define a baseline case, so that all proposed digital twins are measured against it? If so, what are the fundamentals? For example, which of the following might be considered a digital twin:
     • www.lightningmaps.com (near real-time data visualisation of weather systems);
     • https://www.tidetimes.org.uk/ (log of expected highs and lows of a tidal system); and
     • www.googlemap.com (periodically updated traffic system with patterns and disruptions)
     These are similar but constitute different fundamentals: LightningMaps uses weather station data, while TideTimes uses a database of pre-established tide peaks and troughs. Is the collection of (near) real-time data fundamental, or something that is only applicable to a specific use case? Once we have the fundamentals, which digital twins need to be tested? If we are ultimately aiming for a national digital twin, surely we need to test all of them to ensure compatibility and value if each is to be included in, or connected to, it? If this is the case, then I’m talking myself into the notion that a simple yes/no or pass/fail will never be enough... We need to find a way to identify and celebrate the (positive) extremes, to encourage the development of borderline cases to become true digital twins, and to seek new directions and measures of ‘what looks good’ as the sector integrates digital twins into its decision-making. It looks like we have a LOT to discuss in the proposed workshop on the 17th November to explore why, what and how we should be measuring. Outline agenda below, to be informed by the ongoing forum discussions.
     • The Why - discussing the pros and cons of a digital twin test
     • Objectives & activities for looking at intuitive tests for digital twins
     • Summary of initial industry feedback
     • A yes/no, pass/fail or a sliding scale?
     • Existing 'test' examples that could be leveraged from other industries
     • Discussing what elements make up a digital twin
     I hope you will continue the discussion on this thread, which will give us time to prepare the workshop materials and key discussion points. To do that, I have some questions to continue the discussion... 1. In YOUR role in either procuring, creating, maintaining or analysing/interacting with a digital twin, what should we be testing or measuring? Please let us know what your involvement (current or proposed) is and what we should be measuring/testing to help in that role. 2. What, in your opinion, makes a digital twin real? Let’s keep this short - give me your top 5! 3. How do we best differentiate between what we should typically deliver in a BIM process and a digital twin? Digital twins are a huge opportunity to better design, procurement and operation across the entire built environment, and to provide tangible benefits to society. What therefore can we do to promote the relationship (and the distinct difference) between BIM and digital twins? The workshop will take place on the 17th November from 14:00 – 16:00. Register on Eventbrite to receive joining instructions. See you there! C
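To seed the workshop discussion, here is a minimal sketch of what a sliding-scale test (rather than a yes/no or pass/fail) might look like. The criteria, weights and scores are purely illustrative assumptions, not an agreed rubric:

```python
# Illustrative sketch of a sliding-scale digital twin "test".
# Criteria, weights and scores are assumptions for discussion, not an agreed rubric.

CRITERIA = {
    "connected_to_physical": 0.30,  # does data flow from the physical asset?
    "data_freshness": 0.25,         # near real-time vs periodic vs static
    "supports_what_if": 0.25,       # can it answer decision-support queries?
    "interoperable": 0.20,          # can it connect to other twins?
}

def score_candidate(scores: dict[str, float]) -> float:
    """Weighted score in [0, 1]; each criterion is scored 0.0-1.0."""
    return sum(CRITERIA[name] * scores.get(name, 0.0) for name in CRITERIA)

# e.g. a static 3D model: geometry only, no live connection to the asset
static_model = {"connected_to_physical": 0.0, "data_freshness": 0.1,
                "supports_what_if": 0.2, "interoperable": 0.5}
print(f"static 3D model: {score_candidate(static_model):.2f}")  # low, but not zero
```

Even a static 3D model scores above zero on such a scale, which matches the intuition that it is a step on the journey rather than a failure; a pass/fail test would throw that nuance away.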
  9. I wanted to share some early thinking with you, and please consider this a consultation, not a formal announcement of direction. Following the latency post from @DRossiter87 and some conversations with people in different markets, I have found a useful framework to separate BIM from Digital Twins:

     BIM          | Twin
     What is      | What if
     Files        | Queries
     Physical     | Real
     Asset        | Function
     Time stamp   | Time graph
     Transaction  | Enterprise
     Outputs      | Outcomes
     As Designed? | As Intended? (for discussion)

     There is a caveat with the following: this is not a statement of which is better. Both BIM and Digital Twinning have key benefits. Much like a chef has a collection of knives for different use cases, the same is true for BIM and twins. BIM as defined in the available standards sets out how data can be procured in a transactional model: a client can set the information requirements for a supply chain to author and deliver information for a particular purpose. The table above sets out a series of differences, and I will work through them one by one to explain what they mean and how they differ.
     1. What is vs What if. A BIM will tell you what something is; it cannot answer the question "what if". The IMF sets out a pathway for asking questions of datasets, for example "What if I turned this valve off?".
     2. Files vs Queries. Very similar to the above, but with a view on functionality. The BIM sets out the container of the data and the files within; these files include CSV files or SQL databases, for example. The query in the twin space is an operation on the dataset or file.
     3. Physical vs Real. The BIM space treats physical elements as assets. Those assets would be on some form of register which lists 'tangible things', and they generally develop over time in line with the level of information need. In the twin space this representation of the physical is abstracted up into its function. The real aspect is how the object interacts with reality: this interaction is both physical within the system (a pump pumping water) and part of a broader service / organisational purpose (the pump provides a minimum pressure to supply water to customers and is linked to the revenue stream that, for example, is charged by the cubic metre of water).
     4. Asset vs Function. Related to the above, the asset focus is purely on the performance specification and range of the asset's performance in isolation; the twin considers the function the asset plays. @Simon Scott explained a great example of this: the function of a level crossing is to ensure two types of mobile infrastructure do not collide (please correct me if I'm wrong here, Simon). Asking for a level crossing (an asset) and asking for two infrastructures not to collide (a function) are fundamentally two different questions.
     5. Time stamp vs Time graph. Time in BIM is a time stamp against a transaction or digital snapshot of an asset. The twin aspect is the time graph: the status of an asset changes over time, and queries from the twin understand the historical elements of an asset. For example, when searching for an actor on Google it can piece together data about that person from a series of datasets, allowing a comprehensive history of that actor to be rendered.
     6. Transaction vs Enterprise. The BIM standards describe a process for multiple parties to transact data; they set out how data can be procured, authored and delivered as a series of transactions. The twin represents an enterprise view where data flows with purpose, aligned with agreed outcomes.
     7. Outputs vs Outcomes. Because BIM focuses on transactions and assets, it can only provide insight on outputs; because twins focus on functions and the enterprise, they can provide insight on outcomes.
     8. 3D rendition vs Abstracted. BIM requires a 3D rendition of an asset, as set out in the level of information need / requirements. For the digital twin, to use @DRossiter87's example of a BMS, there is no need for a full representation of the asset: all that is required is the data needed to execute a decision, whether by a machine or a human. Of course, if the "what if" statement includes a spatial requirement, a boundary condition for the geometry is required. A non-geometric example: the BMS wants to know which rooms to heat for the day in a school, so a key input could be the lesson plans from the teaching staff, to understand occupancy of a space. A geometric example: the AHU requires a filter replacement and the plantroom is tight for space, so there would be a need for a physical representation of the space.
     I welcome the discussion and feedback!
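To illustrate the "what is vs what if" distinction in code, here is a minimal hypothetical sketch; the asset data and the simplistic pressure model are invented for illustration:

```python
# Hypothetical sketch of "what is" (static BIM lookup) vs "what if" (twin query).
# Asset data and the pressure model are invented for illustration.

# "What is": a BIM-style asset record answers static attribute lookups.
pump_record = {"id": "PMP-17", "type": "centrifugal pump",
               "rated_flow_l_s": 40.0, "status": "in service"}
print(pump_record["rated_flow_l_s"])  # what is the rated flow? -> 40.0

# "What if": a twin-style query evaluates a scenario against connected state.
network_state = {"PMP-17": {"running": True, "contribution_bar": 1.2},
                 "PMP-18": {"running": True, "contribution_bar": 0.9}}
MIN_SERVICE_PRESSURE_BAR = 1.5

def what_if_pump_off(pump_id: str) -> bool:
    """Would customers still see minimum pressure if this pump stopped?"""
    remaining = sum(p["contribution_bar"]
                    for pid, p in network_state.items()
                    if pid != pump_id and p["running"])
    return remaining >= MIN_SERVICE_PRESSURE_BAR

print(what_if_pump_off("PMP-17"))  # False: supply would drop below minimum
```

The point is not the arithmetic but the shape of the question: the record can only be read back, while the twin-style query evaluates a scenario against the connected state of the system.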
  10. Dear DT Hub Community, does it feel like getting started on a Digital Twin journey is too hard? Firstly, I agree with you, and I believe that it won't stay this way for long (believe... he says!). After several years working in UK infra, I now see more of the "realities" of digital twins: I've worked as the buyer, seller, and builder of digital twins. I am trialling this theory with a major UK Infrastructure Operator (early stages); however, I would really like to gauge the interest of this group because I want to draft a proposal/action plan. To give you an idea, the organisations that I have worked with, and who have influenced my thinking, are: Clients - Network Rail, Highways England, Connect Plus Services, National Grid, Thames Water, Northumbrian Water, EDF. Contractors - Costain, Balfour Beatty, BAM Nuttall, Kier... Consultants - Mott MacDonald, Turner & Townsend, PwC, Deloitte...
      1. Digitisation of existing physical assets is seen as a laborious process that doesn't deliver immediate results. This is primarily due to a lack of awareness of the people, processes, and products involved. It is in fact becoming far cheaper and more reliable than ever before: the range and "sophistication" of mass data capture systems and semi/fully autonomous mapping hardware means that all visible assets can now be readily scanned/digitised/mapped to a high degree of fidelity (see the wide range of ROVs, UAVs, autonomous submersibles, and network route scanning by land, air and sea (rail, road, utilities)), not to mention the advances in Persistent Scatterer Interferometric Synthetic Aperture Radar. Creating an accurate digital replica of visible assets is becoming cheaper, safer, more available and more intelligent every 6 months.
      2. Organisation and structuring of data are often viewed as highly necessary; however, due to their perceived complexity/risk/cost they are usually approached hesitantly, using old thinking. There are many options for automatically structuring/re-structuring/transforming data of most types. For fully autonomous data cleansing and structuring, there are now several information classifier systems (see NET CAD and many more). Even for semi-autonomous cleansing, the functionality of Excel - Power BI - SQL databases is fit for most of these activities and can drive immediate near-term value using recently trained in-house staff. This is commonplace and has been happening in other industries for decades. We recently completed an internal activity using an auto data cleanser and classifier and a standard taxonomy; the lesson learned for us was that there is no point getting the tech right if you forget to train the people! But we all make mistakes.
      3. Analysis and use of data. Companies that don't do much of this believe that to do a lot of it they need 1 of 3 things: expert 3rd-party support, extensively upskilled internal staff, or hired-in talent. Most organisations agonise between buying in supplier services to do this job quickly (and well!) and wanting to develop their in-house data analysis skills (because who wants to be beholden to suppliers?), either through training or hiring. There is a 4th option though! Many, many organisations around the world now focus on building data analysis capability at client organisations through the provision of "standard algorithm marketplaces". This is an online Ocado for algorithms: sometimes you want to buy a ready meal (fully finished algorithm) and sometimes you want to build a soup from several base ingredients. In the modern day, you don't need to know how to grow a carrot to make carrot soup; you just need to combine ingredients. The same thinking should be applied to analysis of data: you don't need to be able to write code, or even understand how code is written, to be able to develop very powerful algorithms. Just like the farmer who cultivates the carrot for you, these new suppliers cultivate very powerful modular algorithms that do specific things. Let's take satellite imagery analysis, for example: there is no longer complex code, just drag-and-drop tools that do different things (pick out trees, trace out the road markings, determine total blacktop surface). So now all you need to do is plug these functional algorithms (tools) together and you are up and running! (See the sketch at the end of this post.)
      4. Monetising the data is hard. This bit is actually very simple to do! This is the fun part: there are loads of management consulting and business startup exercises that help show how to monetise data, information, and analytics. If you are worried only about this bit then you haven't got much to worry about.
      5. People, people, people........ people. People need more attention! I have been told that this community is eager to hear from suppliers who are already delivering digital twin solutions, already delivering value, and want a large partner to work alongside? It is not unusual for a rapidly emerging industry to see sudden drops in the "barriers to entry". I believe we are entering one now, so it is a really good time to talk about what might and might not work for us. Much respect, Peter Slater MIET MEng - "Digital Maturity has no finish line"
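As a miniature of the "plug tools together" idea in point 3, here is a sketch of composing modular, pre-built analysis steps into a pipeline. The step functions stand in for hypothetical vendor-supplied tools; their names, internals and outputs are invented:

```python
# Hypothetical sketch: composing modular, pre-built analysis steps into a
# pipeline, in the spirit of an "algorithm marketplace". The step functions
# stand in for vendor-supplied tools; their internals are out of scope here.

from functools import reduce
from typing import Callable

Step = Callable[[dict], dict]

def load_imagery(ctx: dict) -> dict:
    ctx["tiles"] = [f"tile_{i}" for i in range(4)]    # stand-in for real imagery
    return ctx

def detect_road_markings(ctx: dict) -> dict:
    ctx["road_markings"] = [f"markings in {t}" for t in ctx["tiles"]]
    return ctx

def measure_blacktop(ctx: dict) -> dict:
    ctx["blacktop_m2"] = 1250.0 * len(ctx["tiles"])   # invented figure
    return ctx

def run_pipeline(steps: list[Step]) -> dict:
    """Chain steps left to right, like plugging marketplace tools together."""
    return reduce(lambda ctx, step: step(ctx), steps, {})

result = run_pipeline([load_imagery, detect_road_markings, measure_blacktop])
print(result["blacktop_m2"])  # 5000.0
```

The user only chooses and orders the steps; the "farming" of each algorithm stays with the supplier, which is exactly the carrot-soup division of labour described above.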
  11. KReevesDigi

    Twin Tools

    Hello everybody!! First topic on the lovely new Twin Hub, and good to see fellow enthusiasts getting engaged. I would be interested to know the types of tools people are using within their Digital Twin journeys, and for what purpose; this is important as getting vendors included and supporting the IMF will be crucial to success at scale. A few tools we are using at Costain to get the discussion moving (non-exhaustive, only scratching the surface!):
    • ArchiMate - We use this as the starting point, aligned to the TOGAF architectural framework. This is really key as it starts with business outcomes / objectives and the required business architecture to support realising those outcomes. This then flows into the sexy technical architectures and associated data models, taking account of identity and access management, backups, archive etc.
    • Bentley iTwin - A familiar offering from Bentley. We are using iTwin on a couple of projects as a data platform, linking design data to other operational systems, with real-time data to come shortly!
    • Microsoft Azure - Perhaps not an obvious choice, though very useful for enterprise-scale data integration. Very effective for moving data around an organisation and driving insights; the same can be done with AWS, Bluemix or any of the other large platform providers. Modern cloud tools enable quick data integration from existing databases without any code or SQL queries, making this process very streamlined.
    • SCADA - Often gets overlooked, though supervisory control and data acquisition solutions (Rockwell, Iconics, Schneider, Siemens etc) have been delivering real-time digital twins (albeit in 2D) for decades. Modern variants allow full ERP integration so that operational information (alarms, events etc) can be linked directly to maintenance tools / schedules, for example (see the sketch after this post). Iconics and others have embraced BIM, where models and associated info can be imported into SCADA to give amazing O&M capability that is real-time by default.
    We are driven by client demands and typically look to use what they already have. The key to scaled Digital Twins is really in the data standards (IMF) but also the integration architecture. In the past I have worked on projects where standard visualisation objects, control code (function blocks), data models, design info and O&M info were all aligned to physical standard products. This is the absolute nirvana, where the design, build, operate and maintain costs are all significantly reduced, delivering both a standardised physical asset and a digital counterpart. What tools do you use, and for what purpose? Do you look to create standard libraries or go for the bespoke option? I'd be really interested to hear from others on this super exciting topic!! Regards, KevinReevesDigi
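As a sketch of the SCADA-to-ERP linkage described above, here is a minimal illustration of turning an alarm event into a maintenance work order. The event fields, severity scale and priority rules are assumptions; real integrations would use the vendors' own connectors:

```python
# Hypothetical sketch: linking a SCADA-style alarm event to a maintenance
# work order. Event fields, asset IDs and the priority rules are invented;
# real SCADA/ERP integrations would use the vendors' own connectors.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AlarmEvent:
    asset_id: str
    severity: int        # 1 (critical) .. 5 (informational), an assumed scale
    message: str
    raised_at: datetime

@dataclass
class WorkOrder:
    asset_id: str
    priority: str
    description: str

def alarm_to_work_order(event: AlarmEvent) -> WorkOrder | None:
    """Raise a work order for actionable alarms; ignore informational ones."""
    if event.severity >= 4:
        return None
    priority = "P1" if event.severity == 1 else "P2"
    return WorkOrder(event.asset_id, priority,
                     f"{event.message} (alarm at {event.raised_at.isoformat()})")

event = AlarmEvent("AHU-03", severity=2, message="Filter differential pressure high",
                   raised_at=datetime.now(timezone.utc))
print(alarm_to_work_order(event))  # WorkOrder(asset_id='AHU-03', priority='P2', ...)
```

The design point is the direction of flow: operational events drive the maintenance schedule automatically, rather than waiting for a periodic manual review.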
  12. DRossiter87

    BIM Interoperability Report

    Hi all, today CDBB released their BIM interoperability report. While BIM is distinct from Digital Twins, there will be a need to interoperate between them; as such, I think this report will be of use to members. The report is currently out for public consultation, available at the link provided and attached below ☺️ https://www.cdbb.cam.ac.uk/news/bim-interoperability-expert-group-report cih_bim_interoperability_expert_group_report_april_2020_final.pdf
  13. A new Construction Innovation Hub research programme explores procurement strategies to incentivise collaborative delivery and optimise whole-life outcomes. Lead author of the review, Professor David Mosey from King’s College London Centre of Construction Law, considers the current procurement landscape and the benefits of... View the full article