
    Sensor technology has come a long way over the last 30 years, from the world’s first, bulky webcam at the University of Cambridge Computer Science Department to near-ubiquitous networks of sleek sensors that provide data at unprecedented volume, velocity and quality. Today, sensors can even talk to each other, combining single points of data into useful insights about complex events. The new webcomic ‘Coffee Time’ by Dave Sheppard, part of the Digital Twin Journeys series, tells the story of this evolution and what it means for what we can learn about our built environment through smart sensors.
    Starting with a simple problem – is there coffee in the lab’s kitchen? – researchers in the early 1990s set up the world’s first webcam to get the information they wanted. Today, people in the Computer Lab still want to know when the coffee is ready, but there are more ways to solve the problem, and new problems that can be solved, using smart sensors. Traditional sensors simply sent information from point A to point B, providing one type of data about one factor; that data then had to be collated and analysed to yield insights. Now sensors can share data with each other and generate insights in near real time.
    The West Cambridge Digital Twin team at the Computer Lab has looked at how specific sequences of sensor events can be combined into an insight that translates actions in the physical world into carefully defined digital events. When someone makes coffee, for example, they might turn on a machine to grind the coffee beans, triggering a smart sensor in the grinder. Then they’d lift the pot to fill it with water, triggering a weight sensor pad beneath to record a change in weight. Then they would switch the coffee machine on, triggering a sensor between the plug and the outlet that senses that the machine is drawing power. Those events in close succession, in that order, would tell the smart sensor network when the coffee is ready.
    These sequences of sensor triggers are known as complex events. Using this technique, smart sensors in the built environment can detect and react to events like changes in building occupancy, fires and security threats. One advantage of this approach is that expensive, specialist sensors may not be needed to detect rarer occurrences if existing sensors can be programmed to detect them. Another is that simple, off-the-shelf sensors can detect events they were never designed to detect. As the comic points out, however, it is important to program the correct sequence, timing and location of sensor triggers, or you may draw the wrong conclusion from the available data.
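    The coffee example can be sketched as a minimal complex-event detector: a rule fires only when events arrive in the expected order, all within a time window. The event names, the three-step pattern and the window below are invented for illustration; the researchers' actual system is not shown here.

```python
from dataclasses import dataclass

@dataclass
class SensorEvent:
    name: str         # e.g. "grinder_on" (hypothetical event names)
    timestamp: float  # seconds since some shared epoch

def detect_sequence(events, pattern, window_s):
    """Return True if the events named in `pattern` occur in that order,
    all within `window_s` seconds of the first matching event."""
    idx, start = 0, None
    for ev in sorted(events, key=lambda e: e.timestamp):
        if start is not None and ev.timestamp - start > window_s:
            break  # window expired before the full pattern appeared
        if ev.name == pattern[idx]:
            if idx == 0:
                start = ev.timestamp
            idx += 1
            if idx == len(pattern):
                return True  # full sequence matched, in order and in time
    return False

# Hypothetical trigger sequence for "the coffee is ready"
COFFEE_PATTERN = ["grinder_on", "pot_lifted", "machine_drawing_power"]

events = [
    SensorEvent("grinder_on", 0.0),
    SensorEvent("pot_lifted", 45.0),
    SensorEvent("machine_drawing_power", 80.0),
]
print(detect_sequence(events, COFFEE_PATTERN, window_s=300))  # prints True
```

    Note that both the order and the timing matter, as the comic warns: the same three triggers spread over an hour, or in a different order, would not (and should not) fire the rule.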
    Something as simple as wanting to know if the coffee is ready led to the first implementation of the webcam. Digital twin journeys can have similarly simple beginnings: solving a straightforward problem with a solution that’s accessible to you can spark an evolution that scales up to solve a wide range of problems in the future.
    You can read and download the full webcomic here.
    You can read more from the West Cambridge Digital Twin project by visiting their research profile. 
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF). 
    By 2050, an estimated 4.1 million people will be affected by sight loss in the UK, making up a portion of the 14.1 million disabled people in the UK. How might digital twins create opportunities for better accessibility and navigability of the built environment for blind and partially sighted people? A new infographic presents a conception of how this might work in the future.
    In their work with the Moorfields Eye Hospital in London, the Smart Hospitals of the Future research team have explored how user-focused services based on connected digital twins might work. Starting from a user perspective, the team have investigated ways in which digital technology can support better services, and their ideas for a more accessible, seamless experience are captured in a new infographic. 
    In the infographic, service user Suhani accesses assistive technology for blind people on her mobile phone to navigate her journey to an appointment at an eye hospital. On the way, she is aided by interoperable, live data from various digital twins that seamlessly respond to changing circumstances. The digital twins are undetectable to Suhani, but nevertheless they help her meet her goal of safely and comfortably getting to her appointment. They also help her doctors meet their goals of giving Suhani the best care possible. The doctors at the eye hospital are relying on a wider ecosystem of digital twins beyond their own building digital twin to make sure this happens, as Suhani’s successful journey to the hospital is vital to ensuring they can provide her with care. 
    Physical assets, such as buildings and transport networks, are not the only things represented in this hypothetical ecosystem of connected digital twins. Among the vital components pictured here are digital twins of patients based on their medical data, and the team raises questions about the social acceptability and security of digital twins of people, particularly vulnerable people.
    No community is a monolith, and disabled communities are no exception. The research team acknowledges that more research is needed with the user community of Moorfields to understand the variety of needs across the service pathway that digital twins could support. As such, developers need to consider the range of users with different abilities and work with those users to design a truly inclusive ecosystem of digital twins. The work by the Smart Hospitals research team raises wider questions about the role of digital technology both in creating more physical accessibility in the built environment and in potentially creating more barriers to digital accessibility. It is not enough to create assistive technologies if not everyone can – or wants to – access those technologies.
    ‘The role of digital technologies in exacerbating potentially digital inequalities is something that needs to be looked at from a policy perspective, both at the hospital level, but also more generally, from a government Department of Health perspective,’ says Dr Michael Barrett, the project’s principal investigator.  
    Dr Karl Prince, co-investigator, reflects: ‘The traditional questions when it comes to this type of technology are raised as to: do they have access to equipment, and do they have the technical ability?’ The lesson is that you can build digital twins that create a better experience for people if you design digital systems from the perspective of an ecosystem of services, with input from users of that ecosystem.
    Through exciting case studies, the project raises vital questions about digital ethics and the potentially transformative effects of digital twins on the physical built environment.
    To read the infographic in detail, click here.
    You can read more from the Smart Hospitals project by visiting their research profile page. 
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).  
    To join the conversation with others who are on their own digital twin journeys, join the Digital Twin Hub.

    Digital twins are not just a useful resource for understanding the here-and-now of built assets. If an asset changes condition or position over its lifecycle, historical data from remote sensors can make this change visible to asset managers through a digital twin. However, this means retaining and managing a potentially much larger data set in order to capture value across the whole life of an asset. In this blog post, Dr Sakthy Selvakumaran, an expert in remote sensing and monitoring, tells us about the importance of curation in the processing of high-volume built environment data.
    There are many sources of data in the built environment, in increasing volumes and with increasing accessibility. They include sensors added to existing structures – such as wireless fatigue sensors mounted on ageing steel bridges – or sensors attached to vehicles that use the assets. Sources also include sensing systems such as fibre optics embedded in new structures to understand their capacity over the whole life of the asset. Even data not intended for the built environment can provide useful information; social media posts, geo-tagged photos and GPS data from mobile phones can tell us about the dynamic behaviours of assets in use.
    Remote sensing: a high-volume data resource
    My research group works with another data source – remote sensing – which includes satellite acquisitions, drone surveys and laser monitoring. There have been dramatic improvements in the spatial, spectral, temporal and radiometric resolution of the data gathered by satellites, providing an increasing volume of data to study structures at a global scale. While these techniques have historically been prohibitively expensive, the cost of remote sensing is dropping. For example, we have been able to access optical, radar and other forms of satellite data to track the dynamic behaviour of assets for free through the open-access policy of the European Space Agency (ESA).
    The ESA Sentinel programme’s constellation of satellites flies over assets, bouncing radar off them and generating precise geospatial measurements every six days as they orbit the Earth. This growing resource – not only of current data but of historical data – can help asset owners track changes in the position of an asset over its whole life. It can even catch subsidence and other small positional shifts that may indicate a need for maintenance or a risk of structural instability, without the expense of embedding sensors in assets, which is particularly valuable where assets are difficult to access.
    Data curation
    One of the key insights I have gained in my work with the University of Cambridge’s Centre for Smart Infrastructure and Construction (CSIC) is that data curation is essential to capture the value from remote sensing and other data collection methods. High volumes of data are generated during the construction and operational management of assets. However, this data is often looked at only once before being deleted or archived, where it often becomes obsolete or inaccessible. This means that we are not getting the optimal financial return on our investment in that data, nor are we capturing its value in the broader sense.
    Combining data from different sources or compiling historical data can generate a lot of value, but the value is dependent on how it is stored and managed. Correct descriptions, security protocols and interoperability are important technical enablers. Social enablers include a culture of interdisciplinary collaboration, a common vision, and an understanding of the whole lifecycle of data. The crucial element that ensures we secure value from data is the consideration of how we store, structure and clean the data. We should be asking ourselves key questions as we develop data management processes, such as: ‘How will it stay up to date?’ ‘How will we ensure its quality?’ and ‘Who is responsible for managing it?’
    Interoperability and standardisation
    The more high-volume data sources are used to monitor the built environment, the more important it is that we curate our data to common standards – without these, we won’t even be able to compare apples with apples. For example, sometimes when I have compared data from different satellite providers, the same assets have had different co-ordinates depending on the source of the data. Like manual ground surveying, remote measurements can be made relative to different reference points, many of which are assumed (rightly or wrongly) to be stationary. Aligning our standards, especially for geospatial and time data, would enable researchers and practitioners to cross-check the accuracy of data from different sources, and give asset managers access to a broader picture of the performance of their assets.
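    The stationary-reference problem can be made concrete with a toy sketch: satellite-style displacement series are often reported relative to a reference point assumed to be fixed, so if a ground survey later shows that the reference itself moved, the series must be corrected before providers can be compared. All of the numbers below are invented for illustration.

```python
# Displacement series (mm) reported relative to a reference point that was
# assumed stationary. If the reference itself subsided, add its motion back
# to recover displacement against a common datum. Values are invented.

def to_common_datum(relative_series, reference_motion_series):
    # Absolute displacement = relative measurement + reference point's own motion
    return [r + m for r, m in zip(relative_series, reference_motion_series)]

bridge_rel = [0.0, -1.2, -2.5, -3.9]  # mm, relative to the reference point
ref_motion = [0.0, -0.5, -1.0, -1.5]  # mm, the reference point's own subsidence

corrected = [round(v, 1) for v in to_common_datum(bridge_rel, ref_motion)]
print(corrected)  # [0.0, -1.7, -3.5, -5.4]
```

    Two providers using different reference points would disagree on the raw series yet agree once both are expressed against the same datum, which is exactly what common geospatial standards make routine.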
    Automated processing
    The ever-increasing quantity of data prohibits manual analysis by human operators beyond the most basic tasks. The only way to enable data processing at this scale is therefore automation, fusing remote sensing data analysis with domain-specific contextual understanding. This is especially true when monitoring dynamic urban environments and the potential risks and hazards in these contexts. Failure to react quickly is tantamount to not reacting at all, so automated processing enables asset owners to make timely changes that improve the resilience of their assets. Much more research and development is needed to increase the availability and reliability of automated data curation in this space.
    If we fail to curate and manage data about our assets, then we fail to recognise and extract value from it. Without good data curation, we won’t be able to develop digital twins that provide the added value of insights across the whole life of assets. Data management forms the basis for connected digital twins, big data analysis, models, data mining and other activities, which then provide the opportunity for further insights and better decisions, creating value for researchers, asset owners and the public alike.
     
    You can read more from the Satellites project by visiting their research profile.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
    For more on the Digital Twin Journeys projects, visit the project's homepage on the CDBB website.

    Our latest output from the Digital Twin Journeys series is a webcomic by David Sheppard. 'Now We Know' tells the story of a fictional building manager, Hank, who isn't sure how a building digital twin can help him in his work when the existing building management system tells him what he thinks he needs to know. 
    This same tension plays out around real-world digital twin development, as advocates point to the promise of perfect, right-time information to make better decisions, while others remain unconvinced of the value that digital twins can add. As the West Cambridge Digital Twin research team developed a prototype digital twin, they encountered this barrier, and found that co-developing digital twin capability with the building manager as expert was the way forward. While they grounded iterations of the prototype in the building managers' present needs, they were also able to present the potential capability of the digital twin in ways that demonstrated its value. This is mirrored in the comic's fictional narrative, in the consultation between the Cambridge Digital Twin Team expert and the building manager, Hank.
    Involving end users, like building occupants and managers, in the design and development of digital twins will ensure that they meet real-world information needs. Both people and data bring value to the whole-life management of assets. Many uncertainties exist in the built environment, and in many cases when pure data-driven solutions get into trouble (e.g. through poor data curation or low data quality), expertise from asset managers can bolster automated and data-driven solutions. Therefore, incorporating the knowledge and expertise of the frontline managers is crucial to good decision-making in building operations. 
    The benefits of this hybrid approach work in the other direction as well. While the knowledge developed by building managers is often lost when people move on from the role, the digital twin enables the curation of data over time, making it possible to operate buildings beyond the tenure of individual staff members based on quality data.
    At present, the knowledge of experienced asset managers, in combination with existing building information, is greater than the insights that early-stage digital twins can offer. But that does not mean that the promise of digital twins is a false one. It simply means that there is still a long way to go to realise the vision of right-time, predictive information portrayed in the comic. Digital twin prototypes should be developed in partnership with these experienced stakeholders.
    You can read more from the West Cambridge Digital Twin project by visiting their research profile, and find out about more Digital Twin Journeys on the project's homepage.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
    When we travel by train, we expect to arrive at our destination safely and on time. The safety and performance of its service network are therefore key priorities for Network Rail. Our latest video in the Digital Twin Journeys series tells the story of how researchers have inherited two intensively instrumented bridges and are transforming that high volume and velocity of data into a digital twin showing the wear and pressures on the bridges, as well as other information that can help the asset owner predict when maintenance will be required and meet its key priorities.
    Remote monitoring has several benefits over using human inspectors alone. Sensors reduce the subjectivity of monitoring: factors such as light levels, weather and variations in alertness can change the subjective assessments made by human inspectors. By monitoring the stresses on the bridge, sensors may also identify issues before visual inspection can detect them. A human inspector will still be sent to site to follow up on what the remote sensing has indicated, and engineers will of course still need to perform maintenance. However, remote monitoring allows the asset owners to be smarter about how these human resources are deployed.
    One important insight for Network Rail depends on more accurate data about the loads the bridges are experiencing, and the research team has developed a combination of sensors to create a bridge weigh-in-motion (B-WIM) system. As shown in the video, a combination of tilt, bridge-deformation and axle-location sensors is used to calculate the weight of passing trains. Because the accuracy of weight predictions is affected by changes in ambient humidity and temperature, sensors were added to detect these factors as well. Accelerometers were added to calculate rotational restraints at the boundary conditions, further improving the accuracy of weight predictions, and cameras were installed so that passing trains can be categorised by analysing the video footage.
    The digital twin of the Staffordshire Bridges centres on a physics-based model for conducting structural analysis and load-carrying capacity assessments. The site-specific information, such as realistic loading conditions obtained by the sensors, will be fed into the physics-based model to simulate the real structure and provide the outputs of interest. A digital twin replica of the structure will be able to provide bridge engineers with any parameter of interest anywhere on the structure, including in non-instrumented locations.
    All of the sensors on these bridges produce a high volume of data at high velocity. Without data curation, we could easily be overwhelmed by the volume of data they produce, but the research team is learning to manage the right data in ways that provide the right insights at the right time. Working with Network Rail, this project will demonstrate the use of real-time data analytics integrated with digital twins to provide useful information that helps engineers and asset managers schedule proactive maintenance programmes and optimise future designs, increasing safety and reliability across a whole portfolio of assets.
    You can read more from the Staffordshire Bridges project by visiting their research profile.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
    To see more from the Digital Twin Journeys series, see the homepage on the CDBB website.

    A new infographic, enabled by the Construction Innovation Hub, is published today to bring to life a prototype digital twin of the Institute for Manufacturing (IfM) on the West Cambridge campus. Xiang Xie and Henry Fenby-Taylor discuss the infographic and lessons learned from the project.
    The research team for the West Cambridge Digital Twin project has developed a digital twin that allows various formats of building data to function interoperably, enabling better insights and optimisation for asset managers and better value per whole-life pound.
    The graphic centres the asset manager as a vital decision maker in this process, and illustrates that each iteration improves the classification and refinement of the data. It also highlights challenges and areas for future development, showing that digital twin development is an ongoing journey, not a finite destination.
    The process of drawing data from a variety of sources into a digital twin and transforming it into insights goes through an iterative cycle of:  
    Sense/Ingest – use sensor arrays to collect data, or draw on pre-existing static data, e.g. a geometric model of the building
    Classify – label, aggregate, sort and describe data
    Refine – select what data is useful to the decision-maker at what times and filter it into an interface designed to provide insights
    Decide – use insights to weigh up options and decide on further actions
    Act/Optimise – feed changes and developments to the physical and digital twins to optimise both building performance and the effectiveness of the digital twin at supporting organisational goals.
    Buildings can draw data from static building models, quasi-dynamic building management systems and smart sensors, all with different data types, frequencies and formats. This means that a significant amount of time and resources is needed to manually search, query, verify and analyse building data scattered across different databases, and this process can lead to errors.
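    The Sense/Ingest → Classify → Refine → Decide → Act cycle can be sketched as a toy pipeline of simple stages. The readings, labels and the 26 °C threshold below are invented purely to show the shape of the loop, not the project's actual implementation.

```python
# Illustrative sketch of the Sense -> Classify -> Refine -> Decide -> Act cycle.
# Sensor names, values and the 26 degC threshold are invented for illustration.

def sense():
    # Ingest raw readings from (hypothetical) sensors or static models
    return [{"sensor": "temp-zone-1", "value": 27.5},
            {"sensor": "co2-zone-1", "value": 650}]

def classify(readings):
    # Label and sort the data so later stages can filter by kind
    kinds = {"temp": "temperature", "co2": "air_quality"}
    return [{**r, "kind": kinds[r["sensor"].split("-")[0]]} for r in readings]

def refine(classified, kind_of_interest):
    # Keep only the data useful to this decision-maker right now
    return [r for r in classified if r["kind"] == kind_of_interest]

def decide(refined, threshold=26.0):
    # Turn refined data into insights and chosen actions
    return ["increase_cooling" if r["value"] > threshold else "hold"
            for r in refined]

def act(decisions):
    # Feed changes back to the physical building (and to the twin itself)
    return {"actions": decisions}

result = act(decide(refine(classify(sense()), "temperature")))
print(result)  # {'actions': ['increase_cooling']}
```

    In a real deployment each stage would be far richer, but the design point survives the simplification: the loop is iterative, and what Act changes in the building alters what Sense ingests on the next pass.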
    The aim of the West Cambridge Digital Twin research facility project is to integrate data from these various sources and automate the classification and refinement for easier, more timely decision-making. In their case study, the team has created a digital twin based on a common data environment (CDE) that is able to integrate data from a variety of sources. The Industry Foundation Classes (IFC) schema is used to capture the building geometry information, categorising building zones and the components they contain. Meanwhile, a domain vocabulary and taxonomy describe how the components function together as a system to provide building services. 
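    As a toy illustration of the second half of that idea – a domain vocabulary that describes how individually-located components function together as a service system – the sketch below links zone-tagged components (the kind of information an IFC model provides) to named building services. The component IDs, zones and taxonomy are invented; they are not taken from the project's actual data model.

```python
# Hypothetical components, as might be extracted from a geometric building
# model, each located in a zone. IDs, zones and types are invented.
components = [
    {"id": "AHU-01", "zone": "Level1.PlantRoom", "type": "AirHandlingUnit"},
    {"id": "VAV-12", "zone": "Level1.OfficeA",   "type": "VAVBox"},
    {"id": "LUM-77", "zone": "Level1.OfficeA",   "type": "Luminaire"},
]

# Domain taxonomy: which component types make up which building service
taxonomy = {"AirHandlingUnit": "ventilation",
            "VAVBox": "ventilation",
            "Luminaire": "lighting"}

def components_of_service(service):
    # Query the geometry-derived data through the service vocabulary
    return [c["id"] for c in components if taxonomy[c["type"]] == service]

print(components_of_service("ventilation"))  # ['AHU-01', 'VAV-12']
```

    The payoff of combining the two views is that a question posed in service terms ("what delivers ventilation to Office A?") can be answered automatically from data that was originally organised by geometry, without the manual cross-database searching the team describes.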
    The key to achieving this aim was understanding the need behind the building management processes already in place. This meant using the expertise and experience of the building manager to inform the design of a digital twin that was useful and usable within those processes. This points to digital twin development as a socio-technical project, involving culture change, collaboration and alignment with strategic aims, as well as technical problem solving.
    In the future, the team wants to develop twins that can enhance the environmental and economic performance of buildings. Further research is also needed to improve the automation at the Classify and Refine stages so they continue to get better at recognising what information is needed to achieve organisational goals. 
    You can read more from the West Cambridge Digital Twin project by visiting their research profile. 
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).  
    To see more from the Digital Twin Journeys series, see the homepage on the CDBB website.
    Digital twins enable asset owners to use better information at the right time to make better decisions. Exploring the early stages of a digital twin journey – understanding the information need – are Staffordshire Bridges researcher Dr Farhad Huseynov and Head of Information Management Henry Fenby-Taylor.
    Network Rail manages over 28,000 bridges, many of which are more than 150 years old. The primary means of evaluating the condition of the bridges is through two assessment programmes: visual examination and Strength Capability Assessment. Every conceivable form of bridge construction is represented across Network Rail’s portfolio of assets, from simple stone slabs to large estuary crossings such as the Forth Bridge. Managing a portfolio of this diversity with frequent and extensive assessments is a considerable challenge.
    Condition monitoring
    The current process for condition monitoring involves visual examination by engineers and takes place every year, along with a more detailed examination every six years. The visual inspection provides a qualitative outcome and does not directly predict the bridge strength; it is conducted to keep a detailed record of visible changes that may indicate deterioration. The load-carrying capacity of bridges is evaluated every five years through a Strength Capability Assessment, conducted in three levels of detail:
    Level 1 is the simplest, using safety assumptions known to be conservatively over-cautious (i.e. 1-dimensional structural idealisation).
    Level 2 involves refined analysis and better structural idealisation (i.e. a grillage model). This level may also include the use of data on material strength based on recent material tests.
    Level 3 is the most sophisticated level of assessment, requiring bridge-specific traffic loading information based on a statistical model of the known traffic.
    Understanding the information and insights that asset owners require helps shape what data is needed and how frequently it should be collected – two essential factors in creating infrastructure that is genuinely smart. During discussions with Network Rail, the research team found that Level 3 assessment is only used in exceptional circumstances. This is because there is no active live train load monitoring system on the network, so no site-specific traffic loading information is available for the majority of bridges. Instead, bridges failing Level 2 assessment are typically put under weight and/or speed restrictions, reducing their ability to contribute to the network. This means there is potentially huge value in providing Level 3 assessment at key sites with greater frequency.
    Digital twins for condition assessment
    The Stafford Area Improvement Programme was set up to remove a bottleneck in the West Coast Main Line that resulted in high-speed trains being impeded by slower local passenger and goods trains. To increase network capacity and efficiency, a major upgrade of the line was undertaken, including the construction of 10 new bridges. Working with Atkins, Laing O’Rourke, Volker Rail and Network Rail, a research team including the Centre for Smart Infrastructure and Construction (CSIC), the Centre for Digital Built Britain (CDBB) and the Laing O’Rourke (LOR) Centre for Construction Engineering and Technology at the University of Cambridge is collaborating to find a digital twin solution for effective condition monitoring.
    Two bridges in the scheme were built with a variety of different sensors to create a prototype that would enable the team to understand their condition, performance and utilisation. Both bridges were densely instrumented with fibre optic sensors during construction, enabling the creation of a digital twin of the bridges in use. The digital twin’s objective is to provide an effective condition monitoring tool for asset and route managers, using the sensor array to generate data and derive insights.
    Identifying challenges and solutions
    Meetings were held with key stakeholders including route managers and infrastructure engineers at Network Rail to learn the main challenges they face in maintaining their bridge stock, and to discover what information they would ideally like to obtain from an effective condition monitoring tool. The team liaised closely with the key stakeholders throughout to make sure that they were developing valuable insights.
    Through discussions with Network Rail about the team’s work on the two instrumented bridges in the Staffordshire Bridges project, the following fundamental issues and expected outcomes were identified:
    A better understanding of asset risks: how can these be predicted? What precursors can be measured and detected?
    A better understanding of individual asset behaviour
    Development of sensor technology with a lifespan and maintenance requirement congruent with the assets being monitored
    How structural capability can be calculated instantly on receipt of new data from the field
    Development of a holistic system for the overall health monitoring and prognosis of structural assets
    Realistic traffic population data for the UK railway network (can this be predicted with sufficient accuracy for freight control and monitoring purposes?)
    To address these issues, the team instrumented one of the bridges with the following additional sensors, which, combined, produce a rich dataset:
    Rangefinder sensors to obtain the axle locations.
    A humidity and temperature sensor to improve the accuracy of weight predictions against variations in ambient temperature.
    Accelerometers to calculate rotational restraints at the boundary conditions and therefore improve the accuracy of weight predictions.
    Cameras to categorise passing trains.
    Data from these sensors feeds into a finite element structural analysis model – the digital twin – which interprets the data and provides a range of insights about the performance of the bridge and the actual strain it has been put under.
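    The core of the weigh-in-motion calculation can be illustrated with a heavily simplified sketch of the standard B-WIM idea: fit axle weights so that a known influence line reproduces the measured strain record. The triangular influence line, the speed, the axle spacing and the weights below are all invented, and a real system would use the calibrated finite element model and many more sensors than this two-axle toy.

```python
# Very simplified bridge weigh-in-motion (B-WIM) sketch: back-calculate two
# axle weights from a midspan strain record via least squares. All numbers
# and the triangular influence line are invented for illustration.

def influence(x, span=20.0):
    # Triangular influence line: midspan strain per unit load at position x
    if x < 0 or x > span:
        return 0.0
    return x / (span / 2) if x <= span / 2 else (span - x) / (span / 2)

def predicted_strain(t, speed, axle_offsets, axle_weights):
    # Superpose each axle's contribution at time t
    return sum(w * influence(speed * t - d)
               for d, w in zip(axle_offsets, axle_weights))

def solve_two_axle_weights(times, measured, speed, axle_offsets):
    # Least-squares fit of two axle weights to the strain record:
    # build and solve the 2x2 normal equations by hand.
    a11 = a12 = a22 = b1 = b2 = 0.0
    for t, m in zip(times, measured):
        f1 = influence(speed * t - axle_offsets[0])
        f2 = influence(speed * t - axle_offsets[1])
        a11 += f1 * f1; a12 += f1 * f2; a22 += f2 * f2
        b1 += f1 * m;   b2 += f2 * m
    det = a11 * a22 - a12 * a12
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a12 * b1) / det)

# Generate synthetic "measurements" from known weights, then recover them
speed, offsets, true_w = 10.0, (0.0, -3.0), (90.0, 110.0)
times = [i * 0.05 for i in range(60)]
measured = [predicted_strain(t, speed, offsets, true_w) for t in times]
w1, w2 = solve_two_axle_weights(times, measured, speed, offsets)
print(round(w1), round(w2))  # recovers 90 and 110
```

    The humidity, temperature and accelerometer measurements listed above matter because in practice the influence line itself drifts with ambient conditions and boundary restraint, which is precisely the error the extra sensors let the team correct.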
    Applying insights to other bridges
    Significantly, information from the instrumented bridge sites is relevant to adjacent bridges on the same line. Having one bridge instrumented on a specific route would enable Level 3 assessment for other structures on that route – including retaining walls, culverts and other associated structures – in Network Rail’s portfolio and those of other asset owners. Just as the new bridges relieved a service bottleneck, digital twins can resolve procedural and resource bottlenecks by enabling insights to be drawn about the condition of other assets that weren’t instrumented.
    This is a valuable insight for those developing their own digital twins: where trains cannot divert course, any other bridge along the same stretch of track undergoes the same strain from the same trains as the instrumented bridge. This insight will enable teams to implement sensor networks efficiently across their own assets.
    One of the outcomes of the Staffordshire Bridges project is progress towards a holistic approach to the overall health monitoring and prognosis of bridge stocks. Such changes improve workforce safety by reducing the need for costly site visits while maintaining a healthy bridge network.
    You can read more from the Staffordshire Bridges project by visiting their research profile.
    This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF). 
    To keep up with the Digital Twin Journeys project, check out the Digital Twin Journeys home page.
    Read more...
    Digital twins can help organisations achieve various goals. In some cases, the end goal is for buildings and infrastructure to last longer, use less energy, and be safer. In others, it is enhancing the lives of people who interact with the built environment and its services. As highlighted by the Gemini Principles, these are not mutually exclusive aims, so wherever you are on your digital twin journey, it is important to consider other perspectives on the hybrid digital and physical systems you create. How will your digital twin fit into a wider ecosystem that provides services to all kinds of people? How will your asset's performance impact the wider built environment and those who need to navigate it? Whose lives will be better if you share data securely and purposefully?
    In the first output from the Digital Twin Journeys series, the team working on the Smart Hospital of the Future research project, enabled by the Construction Innovation Hub, shared case studies from two smart hospitals and reflected on the innovations they saw during the COVID-19 pandemic. In this two-video mini-series, the research team shares insights about how existing digital maturity enabled these hospitals to respond to the pandemic in agile ways, transforming to a hybrid physical and digital model of care distributed across multiple sites. They also explored how individual asset digital twins fit into a wider landscape of ecosystem services, guiding how we approach interoperability to achieve better outcomes.


    These insights inform the way we think about the role of digital twins in the smart built environments of the future. Dr Nirit Pilosof reflects that, ‘Digital twin as a concept can promote the design of the new system, the design process of the built environment and the technologies, but also really help operate… the hybrid models looking at the physical and virtual environments together.’ If health care is enabled by connected digital twins, how could the design of hospitals – and whole cities – change? 
    In the videos, the team also discusses the limitations and ethics of services enabled by digital data and the use of digital technologies to improve staff safety, from isolated COVID wards to telemedicine. They frame service innovation as an iterative and collaborative process, informed by the needs of digital twin users, whether those are the asset owners and operators, or the people benefitting from the services they provide. 
    According to project co-lead Dr Michael Barrett, ‘The people who need to drive the change are the people who are providing the service.' After the COVID crisis, we can better recognise what we have learned from implementing digital services at scale, as more people than ever have relied on them. The team reflect that having the right people in the right roles enabled the smart hospitals in these cases to transform their services rapidly in response to the need. The same human and organisational infrastructure that is creating the smart hospital of the future is also needed to create the flexible, responsive built environments of the future.
    Digital Twin Journeys can start from the perspective of available technology, from a problem-solving perspective, or from the perspective of users experiencing a service ecosystem. The smart hospitals project demonstrates the value of the latter two approaches. Hospital staff were instrumental in shaping the digitally-enabled service innovation to keep them safe and offer better services on and offsite, but project co-lead Dr Karl Prince points out how people accessing those services have to navigate a variety of different services in the built environment to get there. As we begin to connect digital twins together, we need to consider not just our own needs but the needs of others that digital twins can address. 
    For more on this project, including links to their publications, see the team’s research profile on the CDBB website. Keep up with the Digital Twin Journeys series on the CDBB website or here on the Digital Twin Hub blog.
    Read more...
    This week marks the one-year anniversary of the National Digital Twin programme’s (NDTp) weekly Gemini Call – an online progress update from the NDTp with a feature spot for members of the Digital Twin Hub to showcase projects and share digital twin knowledge and experiences. DT Hub Chair, Sam Chorlton, tells us about the call, its beginnings and the latest move to the DT Hub.
    There’s no doubt that the Gemini Call has been a game-changing addition to the NDTp. Brought about by CDBB CReDo Lead, Sarah Hayes, we launched it in September 2020 as part of the Gemini programme to inform our friends and followers about programme developments.
    In its early days, the call also played a major part in opening the dialogue for the creation and delivery of NDTp projects, notably the Digital Twin Toolkit project, which resulted in a report and template package to help organisations build a business case for a digital twin. (We're excited that the template has since been downloaded nearly 1,000 times.)
    We could not have achieved the Toolkit project without the input of supporters across 17 DT Hub member organisations, and it was the members’ pro bono contributions and willingness to collaborate in this venture that enabled us to open up opportunities for knowledge sharing and discussions about digital twin journeys.
    By the community, for the community
    Today, the half-hour Gemini Call brings in around 60 participants each week, and over the year nearly 300 members have attended at least once. This year we have changed the agenda to allow for a feature focus by DT Hub member organisations to present digital twin projects or research, followed by a forum for Q&A. To date, there have been 16 digital twin presentations given by organisations worldwide. It is this free exchange of knowledge and open discussion between members of the community that is pushing progress on an international scale.
    Sarah Hayes gives her take on the year: “We’re thrilled with what has happened with the call and we are telling everyone to come and get involved. We have over 2,000 members from government, public and private industry sectors and academia, and there is so much we can all learn from one another. Right now, there is a groundswell of connected digital twin development, and the DT Hub community can access this first hand.”
    Gemini Call chair and Digital Energy leader at Arup, Simon Evans, said, “The call has been an excellent forum to bring industry together, whatever the experience or involvement with digital twins, and provide that regular knowledge-share and update on leading international digital twin developments.”
    The Gemini Call sits centre stage within the DT Hub community as a member-focused exchange to help organisations increase their digital twin knowhow - it is a focal point for the community as we experience and drive digital transformation. Come and join the conversation!
    Progressing by sharing challenges
    One year on, we set this challenge to our members: invite a guest from your network to the next Gemini Call so we can expand the discussion and break down the barriers to sharing data.
    Become a DT Hub member
    Sign up to join the Gemini Call
     

    Read more...
    An update from the Information Management Framework Team of the National Digital Twin programme
    The mission of the National Digital Twin programme (NDTp) is to enable the National Digital Twin (NDT), an ecosystem of connected digital twins, where high quality data is shared securely and effectively between organisations and across sectors. By connecting digital twins, we can reap the additional value that comes from shared data as opposed to isolated data: better information, leading to better decisions through a systems thinking approach, which in turn enable better outcomes for society, the economy and our environment.
    The NDTp’s approach to data sharing is ambitious: we are aiming for a step change in data integration, one where meaning is captured sufficiently accurately that it can be shared unambiguously. Conscious that “data integration” may justifiably mean different things to different people, we would like to shed some light on our current thinking and present one of the tools we are currently developing to help us articulate the need for this step change. It is a scheme for assessing the level of digitalisation of data items based upon four classifiers: the extent of what is known, media, form, and semantics. The scheme entails the eight levels below, which are likely to be fine-tuned as we continue to apply it to real data sets:
    Levels of digitalisation: towards grounded semantics

    We trust that the first levels will resonate with your own experience of the subject:
    Extent: as it is not possible to represent what is unknown, the scheme starts by differentiating the known from the unknown. By looking into the information requirements of an organisation, “uncharted territories” may be uncovered, which will need to be mapped as part of the digitalisation journey.
      Media: information stored on paper (or only in brains) must be documented and stored in computer systems.
      Form: information held in electronic documents such as PDFs, Word documents, and most spreadsheets, needs to be made computer readable, i.e. moved to information being held as data, in databases and knowledge graphs for example.
      Semantics: the progression towards “grounded semantics”, and in particular the step from the “explicit” level to the “grounded” level, is where, we believe, the fundamental change of paradigm must occur. To set the context for this step, it is worth going back to some fundamental considerations about the foundational model for the Integration Architecture of the NDT.
    From a Point-to-Point model to a Hub and Spoke model empowered by grounded semantics

    A key challenge at present is how to share data effectively and efficiently. What tends to happen organically is that point-to-point interfaces are developed as requirements are identified between systems with different data models and, perhaps, different reference data. The problem is that this does not scale well: as more systems need to be connected, new interfaces are developed that share the same data with different systems, using different data models and reference data. There are also maintenance problems: when a system is updated, its interfaces are likely to need updating as well. This burden has been known to limit the effective sharing of data, as well as imposing high costs.
    The alternative is a hub-and-spoke architecture. In this approach, each system has just one interface to the hub, which is defined by a single data model and reference data that all systems translate into and out of. It is important to note that although the hub could be some central system, it does not need to be: the hub can be virtual, with data shared over a messaging system according to the hub data model and reference data. This reduces costs significantly and means that data sharing can be achieved more efficiently and effectively. Nor is this novel: the existing Industry Standard Data Models were developed to achieve exactly this model. What is new is the requirement to share data across sectors, not just within a single sector, and to meet more demanding requirements.
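A small sketch (illustrative only, not NDTp code) makes the scaling argument concrete: point-to-point integration needs on the order of n(n-1) directed interfaces between n systems, while a hub-and-spoke model needs only one per system:

```python
# Comparing integration effort as the number of connected systems grows.
# Point-to-point: every ordered pair of systems needs its own interface.
# Hub-and-spoke: each system needs one translation into/out of the
# shared data model, however many other systems exist.

def point_to_point_interfaces(n_systems: int) -> int:
    """Directed pairwise interfaces between n systems: n * (n - 1)."""
    return n_systems * (n_systems - 1)

def hub_and_spoke_interfaces(n_systems: int) -> int:
    """One interface per system, into and out of the common model."""
    return n_systems

for n in (5, 10, 50):
    print(n, point_to_point_interfaces(n), hub_and_spoke_interfaces(n))
```

At 50 systems the point-to-point approach requires 2,450 interfaces to build and maintain against 50 for the hub, which is why the burden of pairwise integration grows so quickly.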
    Thus, the National Digital Twin programme is developing a Foundation Data Model (a pan-industry, extensible data model), enabling information to be taken from any source and amendments to be made on a single node basis.
    But what would differentiate the NDT's common language - the Foundation Data Model - from existing industry data models?
    Our claim is that the missing piece in most existing industry data models, which have “explicit semantics”, is an ontological foundation, i.e. “grounded semantics”.
    Experience has shown us that although there is just one real world to model, there is more than one way to look at it, which gives rise to a variety of data models representing the same “things” differently and, eventually, to challenges for data integration. To tackle them, we recommend clarifying ontological commitments (see our first conclusions on the choice of a Top Level Ontology for the NDT’s Foundation Data Model) so that a clear, accurate and consistent view of “the things that exist and the rules that govern them” can be established. We believe that analysing datasets through this lens and semantically enriching them is a key step towards better data integration.
    As we begin to accompany organisations in their journey towards “grounded semantics”, we look forward to sharing more details of the learnings and emerging methodologies on the DT Hub. We hope this window into our current thinking, which is by no means definitive, has given you a good sense of where the positive disruption will come from. We are happy to see our claims challenged, so please do share your thoughts and ask questions.
    Read more...
    Creating Digital Twins can be like sailing in uncharted waters, so how do you handle it when unforeseen challenges rock the boat? Can you even predict what kinds of things will disrupt your journey? We’ve noticed in various conversations on the DT Hub that no matter what sort of Digital Twin you’re trying to set up or why, there is an incredibly wide range of potential disruptions. From technical to cultural, from resources to supply chains, almost every avenue is susceptible to producing a challenge somewhere. Many examples that we’ve already seen have only become apparent once the people developing Digital Twins are up against them in real time, so that’s why the DT Hub has launched this new activity, Defining Our Digital Twin Challenges!
     
    We would like to know about the challenges you’ve encountered on your DT journey in order to make the overall roadmap easier to follow. 
     
    The information you provide will help us to define our common challenges so we can start to solve them together. This series of thematic workshops, run by the DT Hub, will progress the conversation around the Digital Twin Journey and surface some of the challenges that organisations are still facing while embarking on their journey. Each Challenge will culminate in an Activity, where we will present to a select group the specific challenge areas you have brought to us, in order to gather constructive feedback. The outcome of these workshops will be to share insights from inside and outside the community for the benefit of the community as a whole.
     
    You can use this activity, Bring out your Digital Twin Challenges, to explore your challenges with others, and our crowd facilitator, Joao, will be interacting with you to make sure you get the best experience possible. Joao is a former market researcher and court interpreter who has been a brilliant member of our team for years as a 100%Open Associate. We look forward to your invaluable contributions, and in turn the exponential development of the DT journey.

     

    Read more...
    Since its creation in 2018, the National Digital Twin programme (NDTp) has had three objectives: 
    Enable a National Digital Twin – an ecosystem of connected digital twins to foster better outcomes from our built environment 
    Deliver an Information Management Framework – ensure secure resilient data sharing and effective information management 
    Align a Digital Framework Task Group to provide coordination and alignment among key players. 
    In 2021, with the Digital Framework Task Group of senior leaders from industry, academia and government overseeing progress, the programme is at a point where key projects are being realised and support for its work is gathering momentum. Here is a summary of the latest developments. 
    The Digital Twin Hub community is now in excess of 2,000 members and its remit to create technical foundations and to provide a co-ordinated community in which to share expertise and knowhow on digital twins is being met with enthusiasm and support from a diverse range of participants across the UK and beyond.  
    This year is proving pivotal in terms of active engagement with our members to better understand their digital maturity and needs, especially through surveys, community activities and international summits. In parallel, we are publishing key documents and resources, including the Digital Twin Toolkit and upcoming Collaborative Workshop to help companies make their business cases, and the Digital Twin Standards Roadmap, a culmination of work by the British Standards Institution (BSI), which enables a framework for information management and sets out our programme for the next few years.  
    Key to these activities is the willingness of members from both academic and industrial fields to share their own knowledge and experiences. The DT Hub is launching a new series, titled Digital Twin Journeys, sharing academic research and lessons learned from digital twin projects across construction: buildings, infrastructure, industrial and satellite applications. In parallel, we will engage with industry to run a consultation on our Flex 260 Standards as well as a second Smart Infrastructure Index (SII) Survey, which tracks, in the first instance, the digital and organisational maturity levels of asset owner and operator members.  
    At the end of August, we also announced the launch of three thematic workshops to address Digital Twin Roadblocks by progressing the conversation and surfacing the challenges faced by organisations while embarking on their digital twin journeys. The aim is for members to discuss experiences and to elicit the main challenges and blockers encountered in their programmes to date. These monthly workshops will commence at the end of September 2021.  
    Our work on the Information Management Framework, to allow the smooth adoption of digital twin technologies, has also gathered pace with the introduction of a methodology to divide the information management space into manageable segments. The 7 circles approach provides the building blocks for informed decision making and will deliver better information management and information sharing on a national scale. 
    The NDTp’s CReDo project will be running a webinar on 2 November 2021 to coincide with COP26 to give insight into our plans to develop a digital twin across water, energy and telecoms to improve resilience across the infrastructure system. CReDo – Climate Resilience Demonstrator – is applying an Information Management Framework approach to share data across water, energy and telecoms service providers, combined with hydrology and climate data from the Met Office, to help plan for and adapt to the cascading effects of increased flooding due to climate change. Registration for the webinar will be opening soon. 

    Read more...
    Matthew West, Technical Lead, National Digital Twin Programme, introduces a video on the 7 circles of Information Management and Process Model Information Requirements.
    Join Matthew and Al Cook, a member of the technical team of the NDTp and an expert in data integration activities and information security, as they take you through key elements of the Information Management Framework and detail a new approach to effective information management. 
    A video is available to view below, with a live Q&A session from 10:00 to 10:30 on Thursday 15 July 2021.
    Access to quality and well-managed information in organisations is key to support decision making and optimise outcomes at all levels. Decisions based on poor quality information, or no information at all, can significantly increase the risk of mistakes or even disasters.
    Systematically implementing information management ensures the ability to deliver the right information to the right decision-makers, at the right time. It is a critical success factor for the National Digital Twin (NDT), an ecosystem of connected Digital Twins where high-quality data is shared securely, on a massive scale to improve decision making across the UK.
    The “7 circles of Information Management”: developing the Information Management Framework
    The Information Management Framework (IMF), a collection of open, technical and non-technical standards, guidance and common resources, is intended to enable better information management and information sharing at a national scale and provide the building blocks to be (re)used by those wishing to be part of the NDT.

     
    The scope of the IMF is broad and the “7 circles diagram” that I introduce in the video below is a pragmatic way to divide the Information Management space into areas of concern that can be addressed separately as well as supporting each other. It is intended to help identify areas and associated NDTp deliverables that are of particular relevance to you.
    The technical aspects of the IMF may come to mind first. On top of “information transport” mechanisms, together with authorisation and security protocols to ensure that information can be accessed seamlessly, the NDT needs a language, an interlingua, so that data can be shared consistently and used to support decisions without requiring any further “data wrangling”. To develop this common language (the NDT’s ontology), the team is pursuing a principled approach, deeply grounded in mathematics and science, to ensure that it is as extensible and all-encompassing as possible. This is what the deepest circles of the 7 circles diagram are about.
    There is, however, much more to the Information Management Framework than the purely technical aspects, and as part of the highest circles of the 7 circles diagram, we are developing guidance on how to systematically improve information management so that producing data that meets the quality standards required to be part of the NDT becomes part of “business as usual”.
     
    A first step towards better information management: defining your information requirements
    This means that while the NDT’s ontology is being developed, steps can already be taken towards better information management. Organisations need to recognise the need to address data quality in a way that enables improved decisions within their own business and with those they have data-based relationships with. Defining Information Requirements (the second circle in the stack) is a key starting point.
    Process Model based Information Requirements
    Too often, information requirements are incomplete or even absent in organisations; if requirements are not identified and agreed, there is no reason to expect they will be met. As part of the second circle of the “7 circles diagram”, the team has released a paper outlining the proposed approach to developing information requirements, based on the analysis of process models. This is a novel approach, ensuring the systematic identification of the information needed (no more, no less) to support decisions, and identifying where it is captured.
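As a toy illustration of the idea (not the paper's actual notation or method), information requirements can be derived by walking a process model and collecting the items each decision needs; the activity and decision names below are hypothetical:

```python
# Toy illustration: deriving information requirements from a process
# model. Each activity contains decisions, and each decision names the
# information items it needs; the union of those items is the
# organisation's information requirement for the process.

process_model = {
    "Plan maintenance": {
        "Is intervention needed?": ["asset condition", "inspection date"],
        "Which team to assign?": ["team availability", "required skills"],
    },
}

def information_requirements(model: dict) -> set:
    """Collect the set of information items needed across all decisions."""
    items = set()
    for decisions in model.values():
        for needed in decisions.values():
            items.update(needed)
    return items

print(sorted(information_requirements(process_model)))
```

The point of anchoring requirements to decisions is that every item collected is demonstrably needed (no more) and every decision has its inputs identified (no less).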
    I encourage you to watch Al Cook’s presentation in the second part of the video to find out more about this approach.
    The team and I hope to share more detailed guidance on information management in the near future, helping you to assess your organisation’s current information management maturity, prioritise areas for obvious improvements in decision-making and start addressing them, so that mistakes can be avoided and better outcomes achieved.
    And as we continue to further develop the Information Management Framework, we look forward to accompanying you through the discovery of other circles among the 7 circles of Information Management.
    This video contains an introduction to the 7 circles of Information Management presented by Matthew West, followed by a presentation by Al Cook on a suggested approach to defining information requirements. Al and Matthew look forward to answering your questions and talking about next steps in a live Q&A session on the DT Hub on 15 July from 10:00 to 10:30.
     
     
    Read more...
    The vision of a National Digital Twin as an ecosystem of connected digital twins enabling better social and economic outcomes across the built environment continues to gain wide support. But to make it a reality, we need people with the right skills to put it into play.  
    “Collaborate on the rules and compete on the game” is a phrase we use to describe how we want connected digital twins to evolve. The sporting analogy carries over well into skills. We want the best teams to deliver on the National Digital Twin, not just a team of strikers or goalkeepers but diverse teams with a range of skillsets and capabilities. Diversity has to be at the heart of a skills strategy ensuring that the future workforce is more effective. 
    The skills and competency framework sets out the skills needed to manage information and work with data in the future. These aren't just hardcore technical skills such as data modelling and analytics, described as digital skills, but also business skills such as transformational leadership, which recognises the benefits of getting information management right. 
    The capability enhancement programme sets out pathways for individuals and organisations to get the right skills in place depending upon aspirations both at the personal level and the organisational level. Have a go at the self-assessment questionnaire to assess what training might be helpful to you and take a look at the training register to find a suitable course. 
    The National Digital Twin is a long term journey and there is time to get the right skills in place to reach our destination. 
    Read more...
    The DT toolkit is a simple guide to the things you need to think about on your digital twin journey. It arose from a request we’ve heard repeatedly from the DT Hub community: “How do I make the business case for a digital twin?” In response, through the Gemini programme of the National Digital Twin programme, we’ve been able to bring together people who have been through, or are going through, the digital twin journey from different perspectives: consulting, technology development, legal and academic, and who are willing to share their learning and expertise with the DT Hub community.
    The team first met back in September 2020 to discuss how we might put together a toolkit for making the business case. We discussed how it would need to focus on purpose, relevant examples, a roadmap and a usable business case template. We debated the use cases for articulating digital twin purpose and took this to a DT Hub workshop to garner community input to what is now a use case framework which is starting to resonate in meetings and presentations. We presented and discussed case studies of digital twins that have been developed or are being developed which can be found on the DT Hub. We spent a long time talking through the steps organisations need to go through to implement a digital twin and as a result produced the roadmap which you can find in the report. We talked about digital twin sophistication levels. And members of the team worked together to really think through what a business case template might look like and what you would need to put together to get sign off for your digital twin. This template is now freely available as a resource for you to download and use.
    This DT toolkit report is a true collaboration of diverse minds working in the field of digital twins who are open to challenge and debate. The result is a toolkit that you can use to set you and your team on your digital twin journey. As with all journeys, the toolkit is now at its first pit-stop and the toolkit team are going to use it with their clients and provide feedback on how to improve and fine tune it. We invite you to do the same, read the toolkit report, try it out and tell us what you think.  
    We are very grateful for the passion and dedication that the Toolkit Team have shown in putting the toolkit together. Working with limited resources, we have relied on our volunteers' goodwill and their conviction that the work of the National Digital Twin programme is something they want to be involved in and contribute to. Drawing from across disciplines and different organisations, we've been really boosted by the support we've received from a team of people going through the digital twin journey and enthusiastic about sharing their experience and ideas with the wider community.  
     If you would like to share your learning and experience with the community and take part in the next iteration of the Toolkit, reach out to us. We can all work together to make this a valuable community resource. 
     
    “The toolkit captures know-how and insights from people with experience of developing and using digital twins.  Steps are given that provide the reader with valuable guidance for justifying, building and exploiting twins, increasing value and reducing the risk of change.” Dr Peter van Manen, Frazer-Nash Consultancy
    “Working collaboratively with people from a variety of industries and experiences, has been not only invaluable to the construct of the Toolkit but also, fun, inspiring and wholesome to participate in.” Peter Curtis, Sopra Steria
    “Working on the DT toolkit has been an excellent way to socialise my thoughts and the Catapult’s work on Digital Twins, while increasing my understanding of DTs, through discussions with the other team members.” Ron Oren – Connected Places Catapult
    Read more...
    Standards make everyday life work. They may decide the size or shape of a product or system. They can specify terms so that there are no misunderstandings. They are crucial for instilling confidence and consistency for both providers and users. This is why we have made the development of a set of standards a crucial component of our journey towards building a National Digital Twin.
    In conversations we’ve had in the Digital Twin (DT) Hub and the wider Centre for Digital Built Britain (CDBB) community, there have been significant concerns about the costs involved in investing in a digital twin. We believe that, to mitigate the risk and avoid the need to make changes down the line, standards are of vital importance. We need a shared foundation and framework to support the end goal of secure data exchange and interoperability.

    We’ve made significant progress towards that goal and it’s exciting to be pioneers in establishing what will hopefully be a common language - guidelines that can be used, not just here in the UK, but globally.

    To start with, we’ve needed to gain a thorough understanding of what the current standards landscape looks like, and the CDBB commissioned the British Standards Institution (BSI) to do the research. Their initial scoping exercise is complete, and BSI and CDBB are now reviewing the results to identify if and where standards are needed to overcome a specific challenge or fulfil a purpose. We’ve also looked to other sectors to see if existing standards can be applied or modified to work in the built environment.
    We are now in the process of creating a clear roadmap that prioritises standards to be developed. The document will be accompanied by a report to include the narrative, justification and rationale behind the roadmap. It will be presented through a series of thematic areas: Digital Twins, Data, ICT, Application, and Outcomes as well as multiple sub-topic themes, to help enable users to locate key standards.
    The end goal is a very practical guide. It will cover everything from a shared vocabulary, to ensure consistent definitions of assets, to recommended data formats, use case methodology, a code of practice on information exchange and so on.

    A vital part of the process is involving stakeholders and we’re very grateful for all the feedback we’ve received so far. We have recently had the opportunity to share the latest review with DT Hub members as well as those within the wider digital twin community. Attendees of the recent workshop, hosted by BSI, had the opportunity to both critique and verify the findings as well as to share their views on some of the priorities for standards to support successful digital twins in the built environment.  This has been a valued opportunity to really shape the direction of these important developments as we can’t do it alone.

    A great example of the impact standards can make is one I came across from the early 1900s, when the BSI developed a standard for tram gauges at a time when, in the UK alone, there were 75 different widths of gauge! They succeeded in reducing these to five recommended widths. These became the standards going forward and greatly boosted the industry’s fortunes, increasing compatibility between networks and rolling stock. As the British standard was adopted abroad, the UK tram market enjoyed more opportunities to trade and business flourished.

    We hope to make a similar kind of impact – we want to see all developers of digital twins flourish and benefit from the advantages that sharing data and ideas can bring. But in order to do that successfully, the whole process needs to be underpinned by standards that have been formed out of thorough research and review and have the support and involvement of as many people as possible. We look forward to seeing you around the DT Hub!
    Samuel Chorlton, Chair of the Digital Twin Hub
    During our research activities within the DT Hub, several barriers relating to the use of digital twins were identified.  This blog post is one of a series which reflects on each barrier and considers related issues so that we can discuss how they may be addressed.

    As our members, and indeed other organisations active in the built environment, develop data and information about their assets, the ability to ensure that this data can be used within other tools is a priority.  To do so, the data needs to be interoperable. One definition of interoperability is:
    In brief, if data can be shared between systems it is considered interoperable.  Typically, this can be achieved in one of two ways:
    1. Both systems use the same formal description (schema) to structure the data; or
    2. One system transforms its data using an intermediate formal description (schema) to structure the data.

    The simplest solution appears to be (1): have all systems create, use and maintain information using the same schema.  This would mean that information could be used in its default (native) format and there would be no risk of data being lost or corrupted during transformation.  However, this isn’t practicable because, from a technical perspective, it is unlikely that the broad range of information needed to support every possible purpose could be captured against a single schema.  In addition, public procurement directives require performance-based technical specifications as opposed to naming specific software. This means that an organization may be challenged if it requires its supply chain to use a particular piece of software, as doing so would circumvent directives around competition and value for money.
    As it is not possible to guarantee that the same schema will be used throughout, it is far more practicable to identify which established industry schema is most suitable to accept data within (2) depending on the purpose of using this information.  In doing so, there is an added benefit that the information you receive may be open data.
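    Approach (2), exchanging data via an agreed intermediate schema, can be sketched in a few lines. This is an illustrative sketch only: the field names, records and mapping functions below are invented for the example, not drawn from any real industry standard.

    ```python
    # Hypothetical sketch: two systems with different native schemas exchange
    # asset data via a shared intermediate schema (approach 2 above).
    # All field names are illustrative.

    # System A's native record
    system_a_record = {"assetRef": "BR-042", "temp_c": 21.5}

    # System B's native record for the same kind of data
    system_b_record = {"asset_id": "BR-042", "temperatureCelsius": 21.5}

    def a_to_intermediate(record):
        """Map System A's fields onto the agreed intermediate schema."""
        return {
            "asset_identifier": record["assetRef"],
            "temperature_celsius": record["temp_c"],
        }

    def intermediate_to_b(record):
        """Map the intermediate schema onto System B's fields."""
        return {
            "asset_id": record["asset_identifier"],
            "temperatureCelsius": record["temperature_celsius"],
        }

    # Exchange: A -> intermediate -> B
    exchanged = intermediate_to_b(a_to_intermediate(system_a_record))
    assert exchanged == system_b_record
    ```

    Note that neither system needs to know the other's native format; each only maintains a mapping to and from the shared description, which is what makes the approach scale across many tools.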
    Open data is often misused as a synonym for interoperability, but it matters for sharing for a more specific reason.
    Open data, in brief, is unrestricted data.  When proprietary software and systems are used, the schema used to structure the data is hidden.  As a user of that software, you are effectively given permission by the vendor to use that structure to view your information.  For built environment assets this can be a problem, as the physical asset can outlast the software used to design and manage it, meaning that in 50 years a tool that allows access to this information may not exist (or sooner, given the cannibalistic nature of the software industry).  Consider SketchUp, for example.  Since its release in 2000, it has been owned by three different organizations: @Last Software, Google, and Trimble.  The permission to use the SKP schema has changed hands several times.  Who will produce software to view these files in 30 years’ time?
    To ensure enduring access to asset information, either bespoke schemas need to be developed and maintained internally, or an established open schema needs to be used.  However, while several open schemas are readily available (such as IFC, PDF, PNG, MQTT) they can raise concerns related to access, control and abuse of the data within. 
    These concerns, thankfully, can be offset through control.  Using open data structures, it is possible to ensure that only the information you wish to exchange is delivered.  Proprietary structures, by contrast, can carry hidden information whose exchange cannot be controlled, potentially posing a larger risk than their open counterparts.  Ironically, when producing a “need-to-know” dataset, an open data approach is easier to control.
    When considering which methodologies to use, open data benefits typically outweigh its risks.  The use of these open data structures will not only unlock interoperability between digital twins within an organization but will be the mechanism that enables a secure national digital twin. 
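    The “need-to-know” point above can be made concrete: with an open, fully documented structure, producing a restricted extract is just a transparent, auditable filter over known fields. The following is a minimal hypothetical sketch; all field names are invented.

    ```python
    # Hypothetical sketch: because an open schema documents every field,
    # a "need-to-know" extract is a simple, auditable filter.
    # Field names are illustrative only.

    full_record = {
        "asset_id": "PUMP-7",
        "location": "Plant room 3",        # sensitive: not for external sharing
        "maintenance_contact": "j.smith",  # sensitive: personal data
        "flow_rate_lps": 4.2,
        "last_service": "2020-03-01",
    }

    # The fields agreed for exchange
    SHAREABLE = {"asset_id", "flow_rate_lps", "last_service"}

    def need_to_know(record, shareable=SHAREABLE):
        """Return only the fields approved for exchange."""
        return {k: v for k, v in record.items() if k in shareable}

    shared = need_to_know(full_record)
    # 'location' and 'maintenance_contact' never leave the organization
    assert "location" not in shared
    ```

    With a hidden proprietary format, by contrast, there is no equivalent way to verify what a file actually carries before it is shared.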
    Access to appropriate data about our national infrastructure is currently held behind proprietary schemas.  Let’s make Britain’s data open again!
     
    We hope you enjoyed this short piece on breaking the barriers related to interoperability.  What specific challenges have you faced relating to the implementation of interoperability?  Do you consider open data in this context an opportunity or a threat? Would you prefer the National Digital Twin to be based on an open or a proprietary schema?

    Is it? Or is it not?
    For a few years now, parts of our sector, and indeed other sectors, have been researching, defining and promoting digital twins.  If we observe anything, it’s that chatter (including within the DT Hub) has been rife with the ‘what is/isn’t a digital twin’ debate.
    I’m no expert, and don’t yet claim to offer a resolution to clarify the topic, but I do think a discussion hosted within the DT Hub would be of use.  Such a discussion could provide greater clarity for those less involved in this definition process, who are nonetheless vitally important to the delivery of whatever a digital twin of the future is destined to be.
    Let’s learn from BIM implementation
    I wear many hats in my career and most of them are related to the implementation and ‘normalisation’ of BIM processes. As Vice Chair of the UK BIM Alliance and Chair of the UK & Ireland Chapter of buildingSMART International, I’m afforded a view of the sector from various levels of stakeholders and the challenges they face in an ever-changing world as they prepare to digitalise.  The silent majority are perhaps the key to unlocking the transformation to a digital sector, and it’s vital that the BIM message reaches them and connects in a meaningful way to each and every one of them. BIM in the UK has been ongoing for over a decade and my feeling is that there is at least another decade to go before we reach ‘business as usual’.  It’s exactly the same for Digital Twins.
    All vocal parties involved here in the DT Hub seem keen to navigate more smoothly through the same sectoral challenges, and one of those, in a similar vein to BIM, is: “is this a Digital Twin or not?”
    Acknowledging that BIM in the UK has already been through the same sector engagement, we can see similar issues appearing now, with the concept behind Digital Twins being taken over by technology providers rather than sector stakeholders, and subsequently being marketed in that way.  It’s by no means a UK-only challenge, with many similar discussions observed globally.
    Hence, we’re rapidly on the way to Digital Twins being defined by technologies rather than by their use and value to us as people.  A human-centric approach to any digital transformation will almost always achieve greater adoption, and ultimately ‘success’, than one led purely by technology. This is why the CDBB National Digital Twin Programme envisages the built environment as a system of systems, comprising economic infrastructure, social infrastructure and the natural environment.  The CDBB Gemini Principles neatly position Digital Twins in a way that forces one to consider the overall business need (the ‘why’) and all the potential societal benefits.
    Other DT Hub discussions have touched on the possibility of a Turing-type test.  The original Turing test was created by Alan Turing to determine whether or not a machine was discernible from a human.  Whilst the test is valuable for determining artificial intelligence, it is evaluated by humans, making it quite challenging to ensure all evaluators are equal. Perhaps a technology-driven test that provides both a score and a ‘time taken’, introducing a level of competition between creators of Digital Twin systems, might help adoption.
     
    So here’s the proposition... we hold a workshop (or two) to discuss and investigate the need for a test, the type of test, ‘what’ is being tested, what the thresholds might be, and anything else that’s relevant to the topic of ascertaining whether or not someone’s proposed Digital Twin is actually a Digital Twin.
    I have three questions to start the discussion here in this thread...
    1. Do you feel the need for a ‘test’ to determine whether or not a Digital Twin is a Digital Twin? Can we continue without a formal ‘test’, or should we actively seek to develop something absolute to filter out what we’ve been able to do for many years, focus on true Digital Twin solutions, and continue the search for the elusive Digital Twin unicorn?!
     
    2. If we do need a test, will a simple yes/no suffice? Or does a ‘score’ have more longevity? If you ever saw the HBO series Silicon Valley, you may be familiar with the Weissman score, a fictional test and score for online file compression.  It enabled the fictional tech companies to demonstrate the success of their software and algorithms by testing their performance at file compression.  Would a similar test be suitable for our purposes, with a threshold for determining whether a proposed Digital Twin is a Digital Twin, and would it then cater for future digital developments and emerging technologies?
     
    3. Finally, are you keen and able to join a virtual workshop?  
     

    Following input from DT Hub members into a community-driven document, we have proceeded to reduce the number of use cases identified during the Pathway to Value Workshop from 28 down to 12:
    Open Sharing of Data
    Asset Registration
    Scenario Simulation
    Occupant/User Management
    Environmental Management
    Traffic Management
    Process Optimization
    Asset Management
    Carbon Management
    Resource Management
    Resilience Planning
    Risk Management

    Using these use cases, we can begin to explore how the National Digital Twin (NDT) programme can support members of the DT Hub in realizing their value.  One way of doing so is by identifying which parts of these use cases need to be developed via the Commons Stream as part of the Information Management Framework (IMF).
    The reasoning being that these 12 use cases are:
    Horizontal, meaning that they can be applied within several sectors and their respective industries; and
    High-value, meaning that they can achieve a return on investment.

    Positively, these use cases have a strong synergy with a similar schedule presented by Bart Brink of Royal HaskoningDHV in a recent buildingSMART webinar on digital twins.

    By identifying DT Hub members’ horizontal, high-value use cases, we hope that their associated tasks, key performance indicators and federation requirements can be recommended for prioritization as part of the development of the Information Management Framework (IMF).
    At the beginning of June, CDBB released The Pathway Towards an Information Management Framework: A Commons for a Digital Built Britain, a report setting out the technical approach that will lead to the development of the National Digital Twin.  The report focuses on three key facets that will enable secure, resilient data sharing across the built environment:
    Reference Data Library: a taxonomy describing a common set of classes to describe the built environment;
    Foundation Data Model: an ontology outlining the relations between these classes or properties of these classes; and
    Integration Architecture: exchange protocols to facilitate the sharing of information between digital twins, using these defined classes and relations.
    As opposed to being released as a complete resource, we will likely see these facets developed organically as the NDT programme continues to follow its mantra of:
    As such, the key question isn’t “what should these facets include?” but “what should be included first?”.  We hope to answer this question using these horizontal, high-value, use cases. 
    EXAMPLE:
    “Environmental management”.  At the beginning of 2020, news reports focused on air pollution and its link with infrastructure.  In addition, many building assets may wish to monitor air quality due to its known impact on occupant performance.  As a use case that is associated with regulatory compliance and productivity, and applicable to a breadth of assets, Environmental Management may be a horizontal, high-value use case.
    To support such a use case, the:
    Reference Data Library may need to include classes such as Temperature, Wind Speed, Humidity, CO2 and PM2.5, as well as their associated units, to enable the consistent recording of this information;
    Foundation Data Model may need an ontology describing acceptable ranges and the relationship of air quality concepts to other classes such as Health and Productivity, depending on the function being monitored; and
    Integration Architecture may need to facilitate the sharing of information from sources such as other digital twins, as well as datasets from the Met Office and local governments.

    Simply put, by identifying these horizontal, high-priority use cases, we may be able to begin accelerating the realization of their value by having the taxonomies, ontologies and protocols needed to facilitate them available at an earlier stage of the overall IMF development.
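    To give a feel for how the three facets might fit together for environmental management, here is a deliberately miniature, hypothetical sketch. The class names, units, ranges and data sources are invented for illustration and are not taken from the actual IMF.

    ```python
    # Illustrative sketch only: the three IMF facets in miniature for the
    # environmental-management use case. All names, units and thresholds
    # below are invented.

    # 1. Reference Data Library: common classes and their units
    REFERENCE_DATA_LIBRARY = {
        "Temperature": "degC",
        "WindSpeed": "m/s",
        "Humidity": "%RH",
        "CO2": "ppm",
        "PM2.5": "ug/m3",
    }

    # 2. Foundation Data Model: acceptable ranges and related concepts
    #    (only a subset of classes is modelled in this toy example)
    FOUNDATION_MODEL = {
        "CO2": {"acceptable_range": (400, 1000), "affects": ["Health", "Productivity"]},
        "PM2.5": {"acceptable_range": (0, 25), "affects": ["Health"]},
    }

    # 3. Integration Architecture: accept readings from different sources
    def ingest(source, quantity, value):
        """Validate a shared reading against the common library and model."""
        if quantity not in REFERENCE_DATA_LIBRARY:
            raise ValueError(f"Unknown class: {quantity}")
        rng = FOUNDATION_MODEL.get(quantity, {}).get("acceptable_range")
        return {
            "source": source,
            "quantity": quantity,
            "value": value,
            "unit": REFERENCE_DATA_LIBRARY[quantity],
            "within_range": None if rng is None else (rng[0] <= value <= rng[1]),
        }

    reading = ingest("met_office_feed", "CO2", 1250)
    assert reading["within_range"] is False  # flags poor air quality
    ```

    The point of the sketch is the division of labour: the library fixes names and units, the model fixes meaning and relationships, and the integration layer only moves data that conforms to both.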
    And there we have it.  As DT Hub members begin to consider how the information management framework may support their digital twin development as well as the national digital twin, which use cases do you think are the most horizontal and high-value? How do you think these facets might support your ability to undertake these use cases?
    Please feel free to add your thoughts below, or, alternatively, comment directly on the draft community-driven document which is, and will continue to be, progressively developed as member views are shared.

    As the National Digital Twin (NDT) programme develops its thinking around the Commons, several resources to support the implementation of digital twins within the built environment will be developed.  The first of these, the Glossary, is already available for members to engage with.  Further resources will likely include ontologies, schemas and other key data infrastructure elements required to enable the NDT. 
    To ensure that these resources are fit for purpose, they need to align with the needs of DT Hub members by supporting their use cases.  As such, this article uses the output of the Theme 3 webinar to explore and begin to identify horizontal, high-value use cases for prioritization.   
    The outcome of this work will be a community-driven document (draft under development here) to inform the Commons on which use cases should be considered a priority when developing resources. 

    During the Theme 3 webinar, a total of 28 use cases were identified by members. 
    Open Sharing of Data 
    Data-sharing Hub 
    Health and Safety 
    Social Distancing 
    Customer Satisfaction 
    Behavioural Change 
    National Security 
    Traffic Management 
    Incident Management 
    Efficiency Monitoring 
    Condition Monitoring 
    Scenario Simulations 
    Rapid Prototyping 
    Asset Optimization 
    Investment Optimization 
    Preventative Maintenance 
    Carbon Management 
    Service Recovery 
    Decision Support 
    National Efficiency 
    ‘Live’ in-use Information 
    Logistic / Transit Tracing 
    Natural Environment Registration 
    Pollution Monitoring 
    Air Quality Monitoring 
     
    Resilience Planning 
    Resource Optimization 
    Service Electrification 
     
    This initial schedule demonstrates the breadth of value that a digital twin can facilitate.  However, this list can be refined, as some of these use cases: 
    Overlap, and can be consolidated through more careful terminology.  For example, both Pollution Monitoring and Air Quality Monitoring were identified.  However, it is likely that the system, the sequence of actions, and any associated key performance indicators would be shared between these use cases.  They could therefore be consolidated under a single use case: Environmental Monitoring. 
    May be specific to some members or some sectors.  For example, Customer Satisfaction Monitoring is a vital use case for DT Hub members who directly engage with a user base within a supplier market (for example, utility companies and universities).  However, many organizations manage assets and systems whose actors do not include a customer (for example, flood defence systems and natural assets).  Likewise, Service Electrification is a use case that is only applicable to assets and systems which rely on fossil fuels (such as roads and railways).  As such, while Customer Satisfaction Monitoring and Service Electrification are vital use cases which must remain within the scope of the overall programme, they may not be appropriate for prioritization. 
    Are aspects, as opposed to stand-alone use cases.  For example, ‘Live’ In-use Information may be a requirement of several use cases, such as Traffic Management and National Security, but does not in itself constitute a sequence of actions within a system.

    By identifying the use cases that are most common to DT Hub members, as well as eliminating duplicates, it is hoped that a refined schedule can be produced, limited to high-value, horizontal use cases.  Such a schedule will be valuable to: 
    The NDT programme, to understand which use cases the IMF Pathway will need to support; 
    Asset owner/operators, to identify and articulate the value case for implementing digital twins; and 
    Suppliers, to demonstrate the validity of their software in realizing these values. 

    Furthermore, once a streamlined schedule has been developed, further research can be undertaken to identify the typical key performance indicators used to measure and monitor systems that support these use cases. 
     
    And there we have it, useful use cases.  Of the 28 use cases identified which do you think are the most horizontal? Which do you think are high-value (priority) use cases? Which do you think could be aggregated together? 
    Please feel free to add your thoughts below, or, alternatively, comment directly on the draft community-driven document which will be progressively developed as member views are shared.  Feel free to comment on the content included and suggest how to refine the schedule. 


    Breaking Barriers: Skills

    During our research activities within the DT Hub, several barriers relating to the use of digital twins were identified.  This blog post, which looks at digital skills, reflects on skills as a barrier and considers related issues so that we can discuss how they may be addressed.

    As organizations develop a wide array of datasets and supporting tools, a key concern has been the capability of the people using these resources to make decisions and take action.  To do so, these people need to be sufficiently skilled.
    Industry reports, such as the Farmer Review, have consistently identified skills shortages as a key issue within the built environment.  The figure below, produced by the Construction Products Association (CPA), shows the proportion of firms that have had difficulties recruiting traditional trades.  For example, in the first quarter of 2017, over 60% of firms had difficulty recruiting bricklayers.
    A cause of this shortage is the lack of training provided by organizations within the built environment.  As shown in the figure below from the Farmer Review, workforces within the built environment are among the least trained.  While an obvious solution may be simply to provide more training, the issue is compounded by the fact that we need to inject a new set of skills into the sector, increasing the amount of training required.

    In 2018, the World Economic Forum produced its Future of Jobs Report, which considered which skills are emerging and which are declining as a result of digital transformation, automation and the fourth industrial revolution.  These are highlighted in the table below.

    Considering the results provided, the need for manual skills, as well as installation and maintenance skills, is declining rapidly.  As such, there is a risk that any immediate training to fill our skills gap may not be suitable for future employment needs.  As initiatives such as the Construction Innovation Hub and Active Building Centre consider Design for Manufacture and Assembly (DfMA) and other modern methods, perhaps the focus should be on which skills are needed for the future.
    Digital twins, as representations of physical assets, processes or systems, will need to be interfaced with built environment professionals of the future.  The question however, is in what capacity?  Let’s consider a scenario:
    Cardiff University has a digital twin of their campus.  Within this twin, they have included sensors to record the usage and occupancy of lecture halls to assess space optimization.
    For an estate manager to be able to use this twin, they may benefit from:
    Software skills, to interface with the incoming data.  This software may not be part of their core asset management system, requiring additional knowledge and skills to use.
    Analytical thinking, to allow them to test scenarios.  For example, to test what would happen to usage if a single building were changed to private rent for external customers, improving the university’s income generation.
    Creative thinking, to allow them to consider new ideas.  For example, to use the timetable to place lectures that straddle lunch across campus, increasing foot traffic past the university lunch hall.
    Intuitive thinking, to allow them to question outputs.  For example, to be able to identify when a faulty sensor may have led to data discrepancies, or when an analysis programme has, due to its correlative nature, identified impractical solutions such as starting lectures at 6am to free up more rooms for private rent.

    Ultimately, the reason for adopting digital twins will be to provide value for an organization and its wider ecosystem.  As such, problem-solving skills, critical thinking, systems analysis and analytical thinking will likely become core competencies.  For organizations with critical long-term planning requirements, future employees need to be taught these skills now so that they are appropriately competent for the future.
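    Part of the intuitive-thinking task above, spotting when a faulty sensor has led to data discrepancies, can itself be supported by simple tooling. The following is a hypothetical sketch: the sensor names, readings and tolerance threshold are invented, and a real estate-management system would use far more robust statistics.

    ```python
    # Hypothetical sketch: flag an occupancy sensor whose readings disagree
    # wildly with the other sensors in the same lecture hall.
    # Sensor names, readings and the tolerance are illustrative.

    def flag_suspect_sensors(readings, tolerance=0.5):
        """Flag sensors whose occupancy fraction deviates from the median
        of all sensors in the hall by more than `tolerance`."""
        values = sorted(readings.values())
        median = values[len(values) // 2]
        return [
            sensor for sensor, value in readings.items()
            if abs(value - median) > tolerance
        ]

    # Occupancy fraction reported by four sensors in one lecture hall
    hall_readings = {"s1": 0.62, "s2": 0.58, "s3": 0.60, "s4": 0.05}

    suspects = flag_suspect_sensors(hall_readings)
    assert suspects == ["s4"]  # s4 disagrees with its neighbours
    ```

    Tools like this do not replace the estate manager's judgement; they surface the anomaly so that a human can decide whether the sensor is broken or the reading is genuine.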
     
    And there we have it, breaking the barriers related to skills.  How relevant do you think the WEF top 10 growing skills will be for future consumers of digital twin content?  What skills do you consider to be core to future digital twin users?
     

    A lot of the early thinking on digital twins has been led by manufacturers. So, what do digital twins mean to them and what insights could this provide for the built environment?
    This blog is the second in a series that looks at what we can learn from the development of digital twins in other sectors. It draws on key findings from a report by the High Value Manufacturing Catapult. This includes industry perspectives on:
    The definition of digital twins
    Key components of digital twins
    Types of twin and related high-level applications and value

    The report, “Feasibility of an immersive digital twin: The definition of a digital twin and discussions around the benefit of immersion”, looks partly at the potential for the use of immersive environments. But, in the main, it asks a range of questions about digital twins that should be of interest to this community. The findings in the report were based on an industry workshop and an online survey with around 150 respondents.
    We’ve already seen that there are many views on what does or does not constitute a digital twin. Several options were given in the survey, and the most popular definition, resonating with 90% of respondents, was:
    A virtual replica of the physical asset which can be used to monitor and evaluate its performance
    When it comes to key components of digital twins, the report suggests that these should include:
    A model of the physical object or system, which provides context
    Connectivity between digital and physical assets, which transmits data in at least one direction
    The ability to monitor the physical system in real time.

    By contrast, in the built environment, digital twins may not always need to be “real-time”. However, looking at the overall document, the position appears to be more nuanced and dependent on the type of application. In which case, “real-time” could be interpreted as “right-time” or “timely”.
    In addition, analytics, control and simulation are seen as optional or value-added components. Interestingly, 3D representations are seen by many as “nice to have” – though this will vary according to the type of application.
    In a similar fashion to some of our discussions with DT Hub members, the report looks at several types of digital twin (it is difficult to think of all twins as being the same!). The types relate to the level of interactivity, control and prediction:
    Supervisory or observational twins, which have a monitoring role, receiving and analysing data but without direct feedback to the physical asset or system
    Interactive digital twins, which provide a degree of control over the physical things themselves
    Predictive digital twins, which use simulations along with data from the physical objects or systems, as well as wider contextual data, to predict performance and optimise operations (e.g. to increase output from a wind farm by optimising the pitch of the blades).

    These types of twin are presented as representing increasing levels of richness or complexity: interactive twins include all the elements of supervisory twins, and predictive twins incorporate the capabilities of all three types.
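    The nesting described above, where each type of twin builds on the capabilities of the one below, can be sketched as a simple class hierarchy. This is an illustrative sketch only: the class and method names are invented, and the “prediction” is a trivial stand-in for a real simulation.

    ```python
    # Illustrative sketch: the report's three twin types as increasing
    # capability levels. Names and methods are invented for illustration.

    class SupervisoryTwin:
        """Receives and analyses data; no feedback to the physical asset."""
        def __init__(self):
            self.readings = []

        def receive(self, reading):
            self.readings.append(reading)

        def latest(self):
            return self.readings[-1] if self.readings else None


    class InteractiveTwin(SupervisoryTwin):
        """Adds a degree of control over the physical asset."""
        def send_command(self, command):
            # In a real twin this would be dispatched to the physical system
            return f"sent: {command}"


    class PredictiveTwin(InteractiveTwin):
        """Adds simulation/prediction on top of monitoring and control."""
        def predict_next(self):
            # Trivial stand-in for a real simulation: extrapolate the trend
            if len(self.readings) < 2:
                return self.latest()
            return self.readings[-1] + (self.readings[-1] - self.readings[-2])


    twin = PredictiveTwin()
    twin.receive(10)
    twin.receive(12)
    assert twin.latest() == 12                            # supervisory
    assert twin.send_command("adjust") == "sent: adjust"  # interactive
    assert twin.predict_next() == 14                      # predictive
    ```

    The inheritance chain mirrors the report's framing: a predictive twin is not a different kind of thing, but a supervisory twin plus control plus simulation.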
    Not surprisingly, the range of feasible applications relates to the type of twin. Supervisory twins can be used to monitor processes and inform non-automated decisions. Interactive twins enable control, which can be remote from the shop floor or facility. Predictive twins support predictive maintenance approaches and can help reduce downtime and improve productivity. More sophisticated twins, potentially combining data across twins, can provide insight into the rapid introduction (and, I could imagine, customisation) of products or supply chains.
    Another way of looking at this is to think about which existing processes or business systems could be replaced or complemented by digital twins. This has also come up in some of our discussions with DT Hub members and other built environment stakeholders, in the sense that investments in digital twins should either improve a specific business process/system or mean that it is no longer needed (otherwise DT investments could just mean extra costs). From the survey:
    Over 80% of respondents felt that digital twins could complement or replace systems for monitoring or prediction (either simple models or discrete event simulation)
    Around two-thirds felt the same for aspects related to analysis and control (trend analysis, remote interaction and prescriptive maintenance), with over half seeing a similar opportunity for next-generation product design
    Remote monitoring and quality were seen as the areas with greatest potential value.

    Cost reduction in operations and New Product Development (NPD) also feature as areas of value generation, as well as costs related to warranty and servicing. The latter reflects increasing servitisation in manufacturing. This could also become more important in the built environment, with growing interest in gain-share type arrangements through asset lifecycles as well as increasing use of components that have been manufactured off-site.
    It would be great if you would like to share your views on any of the points raised above. For example, do you think built environment twins need the same or different components to those described above? And can digital twins for applications like remote monitoring and quality management also deliver significant value in the built environment?

    This blog was first produced following discussions with digital twin owners about the importance of learning more from other industries. It also relates to the first “theme” that we identified as a priority for the DT Hub, which looks at digital twin definitions and concepts. We hope you enjoy reading this piece and welcome your comments as well as your thoughts on other topics where you would like to hear more from us.
    The idea of digital twins in space may seem like science fiction – or at least a long way removed from the day-to-day challenges of the built environment. But, in fact, the aerospace industry has been at the forefront of many of the technology innovations that have transformed other areas. Before Michael Grieves coined the term digital twin in 2002, NASA was using pairing technology to operate and repair remote systems in space.
    In the aerospace sector, digital twins have since gone well beyond simulation, driven by the need to accurately reflect the actual condition of spacecraft and equipment and to predict potential future issues. While the crew of Apollo 13 may have relied on a physical double as well as digital data, future space stations and trips beyond our atmosphere will use digital twins to deliver the insights, decision support and automation needed to achieve their missions.
    Despite the great distances involved and the advanced nature of space technologies, there are valuable parallels with industries back on Earth. For example, digital twins of remote and autonomous vehicles (like the Mars Exploration Rover) could provide useful lessons for similar vehicles on Earth, from robots in nuclear facilities and sub-sea environments through to delivery vehicles in a logistics centre or drones on a building site.
    More specifically, a 2012 paper co-authored by NASA provided several insights into the approach to digital twins in aerospace, including the following definition:
    “A Digital Twin is an integrated multiphysics, multiscale, probabilistic simulation of an as-built vehicle or system that uses the best available physical models, sensor updates, fleet history, etc., to mirror the life of its corresponding flying twin.”
    Digital twins could represent a significant shift away from a heuristic (i.e. past-experience based) approach to one using sophisticated modelling combined with real-life data. This shift impacts design and build, certification and ongoing operation. The drivers behind this change include a need to withstand more extreme conditions, increased loads and extended service life. (Imagine a manned trip to Mars, or one of the new commercial space ventures that call for vehicles to be used again and again).
    The paper also looked at some of the needs and priority areas for digital twins, including more accurate prediction of potential material failures, as well as of the condition of other systems in space vehicles, by connecting multiple models with data from the physical twin. If digital twins can add value in the harshest environment imaginable, what applications could this have for the built environment? One example is the interesting parallel between assessing the risk of cracks and failures in long-life space vehicles and long-term structural monitoring of bridges and other infrastructure. The required level of fidelity (i.e. the level of detail and accuracy), as well as the extent to which real-time data is needed, may vary considerably – but many of the same principles could apply.
    More widely, the authors of the paper felt that the parallels and benefits from developing digital twins for aerospace could extend across manufacturing, infrastructure and nanotechnology.
    The ideas explored in the paper also go well beyond monitoring and towards automation. For complex space missions, vehicles may not be able to get external help and will need to be self-aware, with “real-time management of complex materials, structures and systems”. As the authors put it:
    “If various best-physics (i.e., the most accurate, physically realistic and robust) models can be integrated with one another and with on-board sensor suites, they will form a basis for certification of vehicles by simulation and for real-time, continuous, health management of those vehicles during their missions. They will form the foundation of a Digital Twin.”
    Such a digital twin could continuously forecast the health of vehicles and systems, predict system responses, and mitigate damage by activating self-healing mechanisms or by recommending in-flight changes to the mission profile.
    While the context may be very different, our discussions with DT Hub members and others in the market suggest that built environment infrastructure owners and operators are aiming to achieve many of the same aspirations as NASA – from better prediction of potential issues through to actuation and self-healing.
    Which space twin applications and ideas do you think we could apply to the built environment?
    We would welcome your comments on this piece as well as your thoughts on other topics where you would like to hear more from us.
    Our collective understanding of digital twins is rather nascent.  To ensure that we operate under the same base information there is a need to periodically reflect on the concepts and principles we have outlined.  This blog post is one in a series which reflects on previously published concepts to consider whether our collective thinking has advanced.

    As we develop the thinking, tools and resources relating to digital twins, a lot of discussion is taking place regarding their scope, scale and accuracy.  Within the Gemini Principles, it is stated that a digital twin is:
    “a realistic digital representation of assets, processes or systems in the built or natural environment”
    I want to reflect on this statement.  In particular, the use of “realistic”.
    For something to be realistic, according to the Oxford English Dictionary, it must represent something in a way that is accurate and true to life.  For example, for something to be “photo-realistic” it must appear as if it was a photograph.
    However, the Gemini Principles state that a digital twin must represent physical reality at the level of accuracy suited to its purpose. Interestingly, while undertaking discovery interviews with DT Hub members we saw this issue realized.
    Interview Insight
    "Several members commented on how people in their organizations would try to extend the use of their digital twins beyond their intended purposes."
    This was seen as both a positive and a negative outcome.  The positive was that members of these organizations saw the value in these digital twins and wanted to harness their insight.  The negative was that these digital twins did not have the information or, where it was available, did not have the level of accuracy required for these extended purposes.  For these extended needs, these digital twins were not realistic.
    Amongst DT Hub members there appears to be a shared view that digital twins are, fundamentally, purpose-driven.  Therefore, digital twins might not be “real” representations, but instead the “right” representation to support a purpose.
    Consider an example.  An air traffic control system utilizes a “digital twin” of runways, aircraft and their flight paths, along with sensor information (e.g. weather and radar), to assist with preventing collisions and with organizing and controlling the landing and departure of aircraft.  In this example, while real-time information and analytics are used, none of the physical elements (planes, control towers) have realistic representations; they instead use basic representations to support the air traffic controller.  Intuitively, an air traffic control system does everything we want a digital twin to do: it is a digital representation of physical assets that incorporates sensor information, with the physical assets linked back to their digital counterparts.  Given this, it should be fairly clear that an air traffic control system would be considered a digital twin.  However, this does not appear to be the case.
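    To make the purpose-driven idea concrete, here is a minimal sketch (purely illustrative, not drawn from any real air traffic system; the callsigns and separation minima are assumptions) in which each aircraft is represented by only the few fields a separation check needs, rather than by any realistic 3D model:

    ```python
    from dataclasses import dataclass
    from itertools import combinations
    import math

    @dataclass
    class Aircraft:
        """A deliberately non-realistic representation: just the fields
        a separation check needs, not a detailed model of the plane."""
        callsign: str
        x_km: float    # east-west position
        y_km: float    # north-south position
        alt_ft: float  # altitude

    def separation_conflicts(aircraft, min_lateral_km=9.26, min_vertical_ft=1000):
        """Return pairs of aircraft that violate both the lateral and the
        vertical separation minima (threshold values are assumptions)."""
        conflicts = []
        for a, b in combinations(aircraft, 2):
            lateral = math.hypot(a.x_km - b.x_km, a.y_km - b.y_km)
            vertical = abs(a.alt_ft - b.alt_ft)
            if lateral < min_lateral_km and vertical < min_vertical_ft:
                conflicts.append((a.callsign, b.callsign))
        return conflicts

    traffic = [
        Aircraft("BA117", 0.0, 0.0, 35000),
        Aircraft("KL642", 4.0, 3.0, 35500),    # close both laterally and vertically
        Aircraft("AF006", 50.0, 20.0, 36000),  # well clear of the others
    ]
    print(separation_conflicts(traffic))  # [('BA117', 'KL642')]
    ```

    The representation is “right” for its purpose – flagging loss of separation – while remaining entirely unrealistic as a depiction of the aircraft themselves.
    
    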

    A poll was placed on Twitter asking “would you consider an air traffic control system a digital twin?”.  After 62 votes were cast, the result was exactly 50:50.  What does this tell us?  Perhaps public messaging on what a digital twin is isn’t sufficiently clear?  Perhaps the question was poorly worded?  Or perhaps, for some, the lack of a realistic representation is the reason they said no?  Unfortunately, the context for each vote isn’t available.  At the very least, it shows that our shared view is not shared by everyone.
    In an age where many consider data to be the new oil, perhaps we should consider using our data sparingly.  So long as the data provided is sufficient for its intended purpose, a realistic representation may not always be required.
     
    And there we have it, realism and its place within Digital Twins.  Do you believe that a digital twin has to be realistic?  Can something be a digital twin without being a realistic representation?  Had you voted on this poll, would you have considered an air traffic control system a digital twin?