
Breaking Barriers: Interoperability

DRossiter87


Message added by Tammy Au,

Please be aware that these comments were copied here from another source and that the date and time shown for each comment may not be accurate.

During our research activities within the DT Hub, several barriers relating to the use of digital twins were identified.  This blog post is one of a series which reflects on each barrier and considers related issues so that we can discuss how they may be addressed.


As our members, and indeed other organisations active in the built environment, develop data and information about their assets, the ability to ensure that this data can be used within other tools is a priority.  To do so, the data needs to be interoperable. One definition of interoperability is:


interoperability
capability of two or more functional units to process data cooperatively.
[SOURCE: ISO 2382:2015, 2120585 – Information Technology - Vocabulary]

In brief, if data can be shared between systems it is considered interoperable.  Typically, this can be achieved in one of two ways:

  1. Both systems use the same formal description (schema) to structure the data; or
  2. One system transforms its data using an intermediate formal description (schema) to structure the data
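Approach (2) can be sketched in a few lines of Python. This is a minimal illustration only; the field names (`ref`, `asset_id`, and so on) and the intermediate schema itself are hypothetical, not taken from any real standard.

```python
# Approach (2): two systems with different native structures exchange
# data via an agreed intermediate schema. All field names are
# hypothetical, for illustration only.

def to_intermediate(native_record: dict) -> dict:
    """Map system A's native fields onto the intermediate schema."""
    return {
        "asset_id": native_record["ref"],
        "asset_name": native_record["title"],
        "installed": native_record["inst_date"],
    }

def from_intermediate(record: dict) -> dict:
    """Map the intermediate schema onto system B's native fields."""
    return {
        "id": record["asset_id"],
        "label": record["asset_name"],
        "commissioned": record["installed"],
    }

system_a_record = {"ref": "PMP-001", "title": "Pump 1", "inst_date": "2020-05-01"}
system_b_record = from_intermediate(to_intermediate(system_a_record))
print(system_b_record)
# {'id': 'PMP-001', 'label': 'Pump 1', 'commissioned': '2020-05-01'}
```

Note that neither system needs to know the other's native structure; each only needs a mapping to and from the shared intermediate schema.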

The simplest solution appears to be (1): have all systems create, use and maintain information using the same schema.  Information could then be used in its default (native) format, and there would be no risk of data being lost or corrupted during transformation.  However, this isn't practicable.  From a technical perspective, it is unlikely that the broad range of information needed to support every possible purpose could be captured against a single schema.  In addition, public procurement directives require performance-based technical specifications rather than naming specific software.  This means that an organisation may be challenged if it requires its supply chain to use a particular product, as doing so would circumvent directives on competition and value for money.

As it is not possible to guarantee that the same schema will be used throughout, it is far more practicable to follow (2): identify which established industry schema is most suitable for receiving the data, depending on the purpose the information is to serve.  In doing so, there is an added benefit: the information you receive may be open data.

Open data is often misused as a synonym for interoperability, but it is important for sharing for a specific reason of its own.


open data
data available/visible to others and that can be freely used, re-used, re-published and redistributed by anyone
[SOURCE: ISO 5127:2017, 3.1.10.13 – Information and Documentation – Foundation and Vocabulary]

Open data, in brief, is unrestricted data.  With proprietary software and systems, the schema used to structure the data is hidden.  As a user of that software, you are effectively given permission by the vendor to use that structure to view your own information.  For built environment assets this can be a problem, as the physical asset can outlast the software used to design and manage it.  This means that in 50 years a tool that allows access to this information may no longer exist; possibly sooner, given the cannibalistic nature of the software industry.  Consider SketchUp, for example.  Since its release in 2000, it has been owned by three different organisations: @Last Software, Google, and Trimble.  The permission to use the SKP schema has changed hands several times.  Who will produce software to view these files in 30 years' time?

To ensure enduring access to asset information, either bespoke schemas need to be developed and maintained internally, or an established open schema needs to be used.  However, while several open schemas and formats are readily available (such as IFC, PDF, PNG, MQTT), they can raise concerns related to access, control and abuse of the data within.

These concerns, thankfully, can be offset through control.  Using open data structures, it is possible to ensure that only the information you wish to exchange is delivered.  With proprietary structures, hidden information can also be exchanged without your knowledge, which cannot be controlled and potentially poses a larger risk than the open counterparts.  Ironically, when producing a "need-to-know" dataset, an open data approach is easier to control because you can see exactly what is included.
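The "need-to-know" point above can be made concrete with a short sketch: because an open schema makes every field visible, you can whitelist exactly which fields leave your system. The field names here are hypothetical.

```python
# With an open, visible schema you can whitelist exactly which fields
# are exchanged. Field names are hypothetical, for illustration only.

SHAREABLE_FIELDS = {"asset_id", "asset_name", "condition"}

def need_to_know(record: dict) -> dict:
    """Return only the fields approved for exchange."""
    return {k: v for k, v in record.items() if k in SHAREABLE_FIELDS}

full_record = {
    "asset_id": "PMP-001",
    "asset_name": "Pump 1",
    "condition": "good",
    "security_zone": "restricted",  # must not be shared
}
print(need_to_know(full_record))
# {'asset_id': 'PMP-001', 'asset_name': 'Pump 1', 'condition': 'good'}
```

With a hidden proprietary structure, no equivalent filter can be written with confidence, because you cannot enumerate what the file actually contains.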

When considering which methodology to use, the benefits of open data typically outweigh its risks.  The use of open data structures will not only unlock interoperability between digital twins within an organisation but will be the mechanism that enables a secure national digital twin.

Access to appropriate data about our national infrastructure is currently held behind proprietary schemas.  Let's make Britain's data open again!

 

We hope you enjoyed this short piece on breaking the barriers related to interoperability.  What specific challenges have you faced relating to the implementation of interoperability?  Do you consider open data in this context an opportunity or a threat?  Would you prefer the National Digital Twin to be based on an open or a proprietary schema?






Recommended Comments

One approach I am investigating is to adopt an Enterprise Service Bus type data broker.  

My take is that having a data broker as part of your DT solution appears to be a good fix for interoperability.  But I want to test this hypothesis.  I am coming from the world of the Enterprise Service Bus, where you can transform, translate, publish and subscribe all in one place.
 
Check out the DUET project.  They have implemented a data broker as part of their end-to-end solution.  https://www.digitalurbantwins.com
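The transform/publish/subscribe pattern described above can be sketched in miniature. This is not DUET's implementation or any real ESB product's API; the class, topic name and transform are all hypothetical, purely to illustrate the pattern.

```python
# A minimal sketch of an ESB-style data broker: producers publish to a
# topic, the broker optionally transforms the message, and subscribers
# receive it. All names here are hypothetical.
from collections import defaultdict
from typing import Callable

class DataBroker:
    def __init__(self):
        self.subscribers = defaultdict(list)
        self.transforms = {}

    def subscribe(self, topic: str, handler: Callable[[dict], None]):
        self.subscribers[topic].append(handler)

    def set_transform(self, topic: str, fn: Callable[[dict], dict]):
        self.transforms[topic] = fn

    def publish(self, topic: str, message: dict):
        # Apply the topic's transform (if any) before delivery.
        transform = self.transforms.get(topic, lambda m: m)
        for handler in self.subscribers[topic]:
            handler(transform(message))

broker = DataBroker()
received = []
broker.subscribe("sensor/readings", received.append)
broker.set_transform("sensor/readings", lambda m: {**m, "unit": "celsius"})
broker.publish("sensor/readings", {"sensor_id": "T1", "value": 21.5})
print(received)
# [{'sensor_id': 'T1', 'value': 21.5, 'unit': 'celsius'}]
```

The appeal for interoperability is that each twin only integrates with the broker, rather than with every other twin pairwise.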

 


I fully agree that the National Digital Twin must be based on an open schema to deliver interoperability and long term supportability. This doesn't mean that all data included in the National Digital Twin must be "Open Data", I'd fully expect some data sets to have restrictions on use cases and sharing, but the data schema used should be open so that anyone can design their own twins and applications to interface with it.

To begin with I'd expect almost all integrations to need transformation of most data fields to the common open schema, but as it matures and new customer twins are developed knowing that they'll rely on integration with the National Twin, these customer formats will start to converge (e.g. use of consistent date/time and geolocation formats) and the transformation will become simpler.
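The convergence point above can be sketched: today an integration must transform many local representations into one common form (ISO 8601 dates are assumed here as the "common" choice); as source twins adopt that form natively, the list of formats to handle shrinks away. The format list is hypothetical.

```python
# Normalising assorted local date formats to a single common
# representation (ISO 8601). The KNOWN_FORMATS list is hypothetical;
# as sources converge on ISO 8601, it shrinks to one entry.
from datetime import datetime

KNOWN_FORMATS = ["%d/%m/%Y", "%Y-%m-%d", "%d %b %Y"]

def to_iso8601(raw: str) -> str:
    """Try each known source format and return the ISO 8601 date."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognised date format: {raw!r}")

print(to_iso8601("19/01/2021"))  # "2021-01-19"
```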


I am hosting a Digital Twin interoperability discussion between the DUET (https://www.digitalurbantwins.com) and CRUNCH (http://www.fwe-nexus.eu) project.   DT Hub followers & members are welcome to participate.  

Sign up via the Doodle link below and drop me an email at chris.cooper@kn-i.com if you wish to be added to the distribution list for this discussion.

https://doodle.com/poll/7ddmxy2yi4mp4wd9?utm_campaign=poll_added_participant_admin&utm_medium=email&utm_source=poll_transactional&utm_content=gotopoll-cta

We are looking at requirements and potential common purpose.  To that end a Google doc is here if you want to add any of your own requirements:  https://docs.google.com/document/d/1pq5XgtYviUyPeywAGyv91XyvUQgBJPjF5NZymJjMXgQ/edit?usp=sharing

Thanks Chris


A follow up call on Interoperability is going to be hosted on the 19th January at 11am GMT.  All DT members with an interest in this topic are very welcome to participate.  

Agenda:

- Quick recap of the different projects

- Discussion of the interoperability requirements across DUET & CRUNCH

- Discussion of answers to those requirements: what options are available today?

- Wrap-up / next steps

Please find the call info attached:  

Join Zoom Meeting 
https://zoom.us/j/95223403993?pwd=OTAyY1FBTGtEUHlrSWJMUlRYQm5nQT09

Meeting ID: 952 2340 3993 
Passcode: 540322 
One tap mobile 
+442039017895,,95223403993#,,,,*540322# United Kingdom 
+442080806591,,95223403993#,,,,*540322# United Kingdom 

Dial by your location 
        +44 203 901 7895 United Kingdom 
        +44 208 080 6591 United Kingdom 
        +44 208 080 6592 United Kingdom 
        +44 330 088 5830 United Kingdom 
        +44 131 460 1196 United Kingdom 
        +44 203 481 5237 United Kingdom 
        +44 203 481 5240 United Kingdom 
Meeting ID: 952 2340 3993 
Passcode: 540322


Andy Parnell-Hopkinson


@DRossiter87 just found this post thanks to the link you posted in the DT call just now (love this site but it's not easy to monitor). 

When we talk about interoperability in relation to built environment assets, it's really important to identify when in the asset's lifecycle the data is to be shared, as the creators and consumers of the asset data have different priorities and needs during that lifecycle. Enforcing interoperability or open standards through the whole asset lifecycle can be counter-productive to each stakeholder's priorities. Interoperability and open standards need to be aligned against those specific stage-based needs, rather than lobotomising or shackling the specialist software tools that create the data more efficiently.

To summarise, don't throw the baby out with the bathwater. 🙂


