Bring out your Digital Twin Challenges!
Posted by Katie Walsh on September 8, 2021 at 1:37 pm
Interesting Question:
What is one difficulty that you’ve encountered while trying to create a Digital Twin?
Context:
We’ve heard that creating a Digital Twin can be a bumpy road. Various challenges can get in the way no matter what sort of Digital Twin you’re trying to set up or why. We’ve noticed in various conversations on the DT Hub that there is a wide range of these challenges, from technical or cultural to those related to resources or supply chains, and so many more. We’d like to hear about your experiences, so please share them with us here. Just a few guidelines before you start:
One example at a time please – no lists! However, multiple posts are welcome
Please cite the industry you’re talking about
Please keep your posts pithy:
· Give each post a title that sums up your blocker
· Limit each post to 100 words or so, or supply a short summary at the top if you can’t.
· Please include an image; it helps your post stand out
We encourage you to like, or vote on, each other’s posts if you agree with them. Your facilitator Joao, the DT Hub and 100%Open are looking forward to reading your input.
Thank you.
18 Replies
-
Specifying the key decisions / interventions that your digital twin will make and the information needed to support them
A challenge that, as the IMF team, we would like to bring to the forefront:
The starting point of a digital twin (DT) project is an original issue or purpose, a use case or use cases the project needs to address.
Once the original purpose for the DT project is outlined, we have found that a challenging step for organisations is defining the information requirements: ensuring that the DT (or DTs) resulting from the project collects the right information, of the right quality, to support the decisions and interventions it must make in order to be fit for purpose.
We believe that the following process-model based methodology provides an efficient route to specifying these information requirements:
- identify the core process(es), lifecycle processes (for instance periodic lifecycles like budgeting, asset lifecycles …) and common processes (procurement, recruitment …) involved in the use case. You will likely need a DT (or DTs) of these core and lifecycle processes (or phases of them) and/or of the assets involved.
- develop the models of these processes to at least the level where you can identify the key decisions/interventions
- specify the decisions/interventions that the DT will take/make
- develop and document the requirements for the information needed to support these decisions/interventions
- specify the processes that the DT will use to create/capture the required data
Processes across organisations within the same industry, and even across industries, bear many commonalities. We believe that organisations would greatly benefit from the provision of standard process models that could be tailored to their specific context, helping them to identify the right information requirements for their DT projects.
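By way of illustration only (the class and field names below are hypothetical, not a published IMF schema), a minimal sketch of how the outputs of this methodology could be recorded, so that each decision the DT must support is linked explicitly to the information, and information quality, it depends on:

```python
from dataclasses import dataclass, field

@dataclass
class InformationRequirement:
    """A piece of information a decision depends on, and the quality it must meet."""
    name: str            # e.g. "pump vibration level"
    quality: str         # e.g. "sampled hourly, +/- 5% accuracy"
    source_process: str  # the process expected to create/capture it

@dataclass
class Decision:
    """A key decision or intervention identified in the process model."""
    description: str
    requirements: list[InformationRequirement] = field(default_factory=list)

@dataclass
class ProcessModel:
    """A core, lifecycle or common process involved in the use case."""
    name: str
    decisions: list[Decision] = field(default_factory=list)

# Hypothetical example for an asset-maintenance use case
maintenance = ProcessModel(
    name="Asset maintenance lifecycle",
    decisions=[
        Decision(
            description="Schedule an intervention before predicted failure",
            requirements=[
                InformationRequirement(
                    name="pump vibration level",
                    quality="sampled hourly, +/- 5% accuracy",
                    source_process="Condition monitoring",
                ),
            ],
        ),
    ],
)

# The DT's information requirements are the union over all supported decisions
all_requirements = [r for d in maintenance.decisions for r in d.requirements]
```

Walking a standard, tailorable process model and filling in structures like these would yield exactly the specification of what information, at what quality, created by which process, that the post describes.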
-
Value doesn’t materialise where the effort is
This is a fairly general challenge when trying to make data resources useful for purposes other than those they were created for.
Data is often created for specific purposes, and there is typically additional effort in changing or optimising the data, worrying about IP, data protection and so on, and then publishing it. If data does get published, it’s likely that someone else will benefit. For data that is simple in structure, a by-product of other work and not sensitive, the barrier to making it Findable, Accessible, Interoperable and Reusable might be quite low; the more complex the datasets get, the higher the effort and therefore the bigger the barrier.
So, how can organisations and individuals be incentivised to publish digital twins, or data that contributes to digital twins, when others might reap the benefits? I think the answer is to build communities of data sharers, align them to a common goal and create a common understanding of future value from which everyone will benefit at some point.
This is possible but, drawing on experience with utility data, it is a really hard and lengthy process.
-
Will do and thank you @Katie Walsh.
I have just observed your learned contribution this morning on the Gemini Call [21.09.21]. Please note my recent post following my in-person contribution at Housing 2021, Manchester, a couple of weeks ago.
As a Chartered Surveyor, my principal focus is asset management [and legislative compliance] in a post-Grenfell world.
Principal blocker:
As with many things [to do with innovation / transformation and technology-led disruption] within and across the built environment & construction sector, the largest blocker is CULTURAL [along with the need, at an organisational level, to understand the collective ‘why’].
I have recently been commissioned to lead on a ‘Leadership Programme’ for the social housing sector; any thoughts on overcoming ‘cultural reluctance’ due principally to fear of the unknown in an inherently risk-averse sector [and one that continues to waste billions of £’s per annum]?
-
On 21/09/2021 at 10:56, JoaoF said:
Hi @CRT, thank you for your post. Are you saying that the effort involved in producing and sharing DT data, together with the perception of its value (benefiting others) is the roadblock you would like to highlight? Thanks
@JoaoF It’s more that the generation of value and the effort needed to make this happen are often disconnected. If you actually share data, there won’t necessarily be an immediate payback.
-
Here are some challenges that our researchers have brought up in developing digital twins, paraphrased by me, so if they are in error the fault is mine and I welcome corrections:
- The value of digital twins lies in providing the right information at the right time, so a key challenge is determining the frequency and timeliness of data collection to provide useful, valuable insights to asset owners.
- With satellites, InSAR and other earth observation technologies, a challenge is in processing the high volume of data needed to quality-check the measurements taken in a timely manner.
- In creating a digital twin of a building, existing asset management processes have been established to take advantage of the knowledge of human asset managers and the data provided by building management systems. A key challenge is to develop digital twins that are capable of complementing these existing sources of knowledge and data by adding new value.
- Computer vision can help identify events and behaviours in the built environment without capturing footage of people, making it more acceptable from a privacy perspective. One important challenge to address is giving machine learning algorithms a fully representative training dataset so that biases are not introduced into the resulting data.
- Each sensor in a building or asset may only be able to detect one factor or phenomenon in isolation, but if multiple sensors become networked together in ‘smart’ ways, they may be able to detect ‘complex events’, events characterised by multiple phenomena happening in a specific order, time frame or physical orientation. Understanding how to combine sensors and human understanding into truly ‘smart’ buildings that can detect complex events and respond appropriately is a challenge. A minimal sketch of this idea is given after this list.
- One promise of connected digital twins is seamless services provided to the public through digital technologies in the built environment. When designing a comprehensive service ecosystem enabled by connected digital twins, it is difficult to break down existing siloes: from a technical data sharing and interoperability standpoint; from a regulatory and geographical standpoint; and from the standpoint of existing processes and business models.
- When designing services based on connected digital twins, it is important to acknowledge the inequalities in access to digital technology based on socio-economic, geographic, age, education, ability and other factors. Exclusion from services or inequality of service provision based on these factors is a major issue to consider in the governance and development of connected digital twins for the public good.
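To illustrate the ‘complex events’ item above (a toy sketch, not taken from the researchers’ work; the sensor names and thresholds are invented), a complex event can be expressed as a rule over several sensor readings that must occur in a particular order and time window, something no single sensor can report on its own:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str       # e.g. "occupancy", "smoke", "temperature"
    value: float
    timestamp: float  # seconds since start of day, for simplicity

def detect_complex_event(readings, max_window=60.0):
    """Toy rule: smoke detected, then a temperature spike within `max_window`
    seconds, while the zone is occupied -- a pattern no single sensor can see."""
    smoke_time = None
    occupied = False
    for r in sorted(readings, key=lambda r: r.timestamp):
        if r.sensor == "occupancy":
            occupied = r.value > 0
        elif r.sensor == "smoke" and r.value > 0:
            smoke_time = r.timestamp
        elif (r.sensor == "temperature" and r.value > 60.0
              and smoke_time is not None
              and r.timestamp - smoke_time <= max_window
              and occupied):
            return True  # complex event: likely fire in an occupied zone
    return False

readings = [
    Reading("occupancy", 1.0, 100.0),
    Reading("smoke", 1.0, 110.0),
    Reading("temperature", 72.0, 140.0),
]
print(detect_complex_event(readings))  # True
```

The research challenge the list points to is scaling this kind of rule up: combining many sensors and human understanding so that buildings detect such events reliably and respond appropriately.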
-
On 21/09/2021 at 14:48, Kirsten Lamb said:
@Kirsten Many thanks for all of these. We are encouraging one thought on each post – but your list is fantastically clear and will help us a lot to kick-start the first Jam, so thank you for posting.
-
We need Integrated modelling of Resources and Infrastructure with Coupled Iteration. Especially for Energy. Coupled Digital Twins.
The IEA Smart Grid Network (ISGAN) has done some coupled modelling, and I have experience with Iterative Generation-Fuel optimisation models (albeit from the late 1970s).
Future and Fast Actions and Strategy papers are attached. These and associated documents are linked at http://www.eleceffic.com
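As a purely illustrative sketch of what ‘coupled iteration’ between two models means (the relationships and numbers below are invented, not from ISGAN or any real study), each model is run in turn, feeding its output to the other, until the shared quantities stop changing:

```python
def energy_demand(price):
    """Toy infrastructure/demand model: demand falls as price rises."""
    return 100.0 - 2.0 * price

def generation_price(demand):
    """Toy resource (generation-fuel) model: price rises with demand."""
    return 10.0 + 0.1 * demand

# Coupled iteration: alternate between the two models until they agree
price = 20.0  # initial guess
for _ in range(100):
    demand = energy_demand(price)
    new_price = generation_price(demand)
    if abs(new_price - price) < 1e-6:
        price = new_price
        break
    price = new_price

print(f"converged: price={price:.3f}, demand={demand:.3f}")
```

Coupled digital twins of resources and infrastructure would iterate in essentially this way, only with far richer models on each side.
-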
Fractured/Lack of communication between the Digital and Physical
Industry: Defence
Not sure how much of a Roadblock this is in the commercial world, but it is definitely a Roadblock within a defence operational environment.
Assets that are provided to the MOD do not always have the capability to transfer data in real time; the lack of logistic communication has always been an issue when dealing with the A2 Echelons (Frontline) and further back down the Forward and Reverse Supply Chain, until good communication is established.
As part of the Design & Manufacturing Phase of a project, these requirements are often traded out because of cost and the known lack of ability to transfer this data. So even when operating where communications are good, there is no ability on the asset to automatically connect and transfer the data to the Digital.
HUMS data is a prime example of this information transfer Roadblock. The platforms do have the ability to capture this data but getting it off the platform and dealing with the different Security Classifications and aggregation of the data is another problem.
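Purely as an illustration of the classification aspect of that roadblock (the levels, field names and rule below are invented, not MOD or LCIA definitions), a sketch of filtering HUMS-style records so that only data at or below a releasable level leaves the platform:

```python
from dataclasses import dataclass

# Invented ordering of classification levels, lowest to highest
LEVELS = ["OFFICIAL", "OFFICIAL-SENSITIVE", "SECRET"]

@dataclass
class HumsRecord:
    parameter: str       # e.g. "engine vibration"
    value: float
    classification: str  # one of LEVELS

def releasable(records, max_level):
    """Return only the records at or below `max_level` for offboard transfer."""
    limit = LEVELS.index(max_level)
    return [r for r in records if LEVELS.index(r.classification) <= limit]

records = [
    HumsRecord("engine vibration", 0.8, "OFFICIAL"),
    HumsRecord("mission profile load", 1.2, "SECRET"),
]
print(releasable(records, "OFFICIAL-SENSITIVE"))  # only the vibration record
```

The harder part, as noted above, is aggregation: combining many individually low-classification records can raise the classification of the set as a whole, which a simple per-record filter like this cannot capture.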
A lot of work has already been completed as part of the Logistic Coherence Information Architecture (LCIA) (Subject to Change) with regard to the data and what is required where.
Just a bit on the CADMID life cycle for Integrated Logistic Support (ILS) attached.
Hope this hits the mark and is an interesting discussion point 🙂
Regards
Rich
-
John Lewis Partnership – Retail (and other areas)
For me the biggest blocker is priority. There is very little money in retail and we are working with very lean teams to deliver just the day-to-day work. If I go to a manager to ask for permission to set up a Digital Twin I will be told no, we don’t have the time (FTE) or the money. I’ll also get the same response from our internal IT team: there isn’t the time (FTE, money) to spend on projects like this; we need to keep the wheels on the bus.
Plus I’m often told to stop talking about ‘Star Wars stuff’; we don’t need this.
-
Biggest blocker = quick fix
Let’s do it the ‘old’ way to solve one of the immediate problems because it’s quick and easy, despite the fact that it addresses none of the overarching or longer-term ambitions of a project and certainly does not allow any further growth in benefits.
This is our greatest challenge, and it can be the result of tech teams being poorly briefed or not bought in to the overall vision of a DT project, simply seeing it as an integration problem to be solved with a crowbar.
-
On 20/09/2021 at 12:15, JoaoF said:
Hi @Anne, thank you for sharing this great insight. It sounds like the process-model based methodology can work as an overall roadmap to identify and address roadblocks at different stages of the DT journey and help start off the DT project.
You mention the definition of the information requirements as a major challenging step for organisations – is this the blocker you want us to take to the workshop? Thank you!
Indeed @JoaoF, in the context of the first workshop we would like to raise the definition of information requirements as a key challenge. We believe that applying a process-model based methodology can help organisations overcome this challenge by offering a systematic approach to identifying the information requirements and when information is most cost-effectively created.
-
Roadblock #1
The value proposition. We still cannot (collectively) articulate the cold, hard cash value proposition to business leaders, in their language. If we had, the take-up would be universal. Too much academia and not enough business talk.
-
Roadblock #2
The information itself. The data contained in product data templates is only valuable to the manufacturers that populate it, and COBie has little or no value to the actual maintainer (detach yourself from the mantra and actually ask a spanner wielder rather than a manager or academic). Information costs money to gather, manage and disseminate, so to make this worthwhile each piece needs to be valuable to someone. A definitive data dictionary that defines what is valuable to the end users throughout the lifecycle does not exist. This needs to be rectified! (Otherwise everything else is pointless.)
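As a purely hypothetical sketch of what one entry in such a lifecycle data dictionary might look like (the field names are illustrative, not drawn from COBie or any standard), each piece of information is tied to the end users who actually find it valuable, the lifecycle stage where it is used, and who is responsible for providing it:

```python
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    attribute: str           # the piece of information, e.g. "valve seal material"
    definition: str          # unambiguous meaning (and units where relevant)
    valuable_to: list[str]   # the end users who actually need it
    lifecycle_stage: str     # where in the asset lifecycle it is used
    provided_by: str         # who is responsible for creating/maintaining it

entry = DictionaryEntry(
    attribute="valve seal material",
    definition="Material of the replaceable seal, using the manufacturer's part naming",
    valuable_to=["maintainer"],  # the 'spanner wielder', not just the manager
    lifecycle_stage="operation and maintenance",
    provided_by="manufacturer product data",
)
```

Only information whose entry names at least one end user who values it would be worth the cost of gathering, managing and disseminating.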
-
Many thanks @iain, we will add these in. And you are on the invite list for Jam 2.
-
On 05/10/2021 at 16:51, Katie Walsh said:
Many thanks @iain, we will add these in. And you are on the invite list for Jam 2.
Jam2?