
Articles & Publications: Shared by the Community

    The climate emergency and the transition to a net zero economy mean that businesses, governments and individuals need access to new information to ensure that we can mitigate and adapt to the effects of environmental and climate change. Environmental Intelligence will be a critical tool in tackling the climate and ecological crises, and will support us as we move towards more sustainable interaction with the natural environment, and delivery of net zero.
    Environmental Intelligence is a fast-developing new field that brings together environmental data and knowledge with Artificial Intelligence to provide the meaningful insight needed to inform decision-making, improve risk management, and drive the technological innovation that will lead us towards a sustainable interaction with the natural environment. It is inherently inter-disciplinary and brings together research in environment, climate, society, economics, human health, complex eco-systems, data science and AI.
    The Joint Centre for Excellence in Environmental Intelligence (JCEEI) is a world-leading collaboration between the UK Met Office and the University of Exeter, together with The Alan Turing Institute and other strategic regional and national collaborators. This centre of excellence brings together internationally renowned expertise and assets in climate change and biodiversity, with data science, digital innovation, artificial intelligence and high-performance computing.
    The JCEEI’s Climate Impacts Mitigation, Adaption and Resilience (CLIMAR) framework uses Data Science and AI to integrate multiple sources of data to quantify and visualise the risks of climate change on populations, infrastructure and the economy in a form that will be accessible to a wide variety of audiences, including policy makers, businesses and the public.
    CLIMAR is based on the Intergovernmental Panel on Climate Change’s (IPCC; https://www.ipcc.ch) risk model, which conceptualises the risk of climate-related impacts as the result of the interaction of climate-related hazards (including hazardous events and trends) with the vulnerability and exposure of human and natural systems.  Hazards are defined as ‘the potential occurrence of a natural or human-induced physical event or trend or physical impact that may cause loss of life, injury, or other health impacts, as well as damage and loss to property, infrastructure, livelihoods, service provision, ecosystems, and environmental services’; exposure as ‘the presence of people, livelihoods, species or ecosystems, environmental functions, services, and resources, infrastructure, or economic, social or cultural assets in places and settings that could be adversely affected’; and vulnerability as ‘the propensity or predisposition to be adversely affected’, which encompasses sensitivity or susceptibility to harm and lack of capacity to cope and adapt.
    A mathematical model is used to express the risk of a climate-related impact, e.g. an adverse health outcome associated with increased temperatures or a building flooding in times of increased precipitation. Risk is defined as the probability that an event happens in a defined time period and location, and is a combination of the probability of the hazard occurring together with probability models for exposure and vulnerability. In the simplest case, the probabilities (of hazard, exposure and vulnerability) would be treated as independent, but in reality the situation is much more complex and the different components will often be dependent on each other, which requires conditional probability models to be used. For example, people’s exposure to environmental hazards (e.g. air pollution) may be dependent on their vulnerability (e.g. existing health conditions).
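The combination of probabilities described above can be sketched in a few lines of Python. This is an illustrative sketch only: the functions and numbers below are our own assumptions, not CLIMAR's actual models.

```python
def risk_independent(p_hazard, p_exposure, p_vulnerability):
    """Risk when hazard, exposure and vulnerability are treated as independent."""
    return p_hazard * p_exposure * p_vulnerability

def risk_conditional(p_hazard, p_exposure, p_vuln_given_exposure):
    """Risk when vulnerability is conditional on exposure, e.g. people with
    existing health conditions being more likely to be among the exposed."""
    return p_hazard * p_exposure * p_vuln_given_exposure

# Hypothetical example: probability of a heatwave in a given summer, the
# fraction of the population exposed, and their susceptibility to harm.
p_haz, p_exp = 0.2, 0.5
print(risk_independent(p_haz, p_exp, 0.1))   # independent vulnerability
print(risk_conditional(p_haz, p_exp, 0.3))   # higher vulnerability among the exposed
```

The conditional version gives a higher risk here because vulnerability concentrates in the exposed group, which is exactly why the independence assumption can mislead.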
    The UKCP18 high-resolution climate projections are used to inform models for hazards and provide information on how the climate of the UK may change over the 21st century (https://www.metoffice.gov.uk/research/approach/collaboration/ukcp/index). This enables the exploration of future changes in daily and hourly extremes (e.g. storms, summer downpours, severe wind gusts), hydrological impacts modelling (e.g. flash floods) and climate change for cities (e.g. urban extremes). The headline results from UKCP18 are a greater chance of warmer, wetter winters and hotter, drier summers, along with an increase in the frequency and intensity of extremes. By the end of the 21st century, all areas of the UK are projected to be warmer, and hot summers are expected to become more common. The projections also suggest significant increases in hourly precipitation extremes, with the rainfall associated with an event that occurs typically once every 2 years increasing by 25%, and the frequency of days with hourly rainfall > 30 mm/h almost doubling by the 2070s, increasing from the UK average of once every 10 years now to almost once every 5 years.
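The return-period figures quoted above follow from the standard relationship between a return period T and an annual exceedance probability p = 1/T. A small sketch, using the figures from the text (the 30 mm/h threshold applied to the 25% intensification is our illustrative choice):

```python
def annual_probability(return_period_years):
    """Annual exceedance probability for an event with the given return period."""
    return 1.0 / return_period_years

# Days with hourly rainfall > 30 mm/h: from once every 10 years now
# to almost once every 5 years by the 2070s.
p_now = annual_probability(10)     # 0.1 per year
p_2070s = annual_probability(5)    # 0.2 per year
print(p_2070s / p_now)             # frequency roughly doubles

# A "1-in-2-year" rainfall event intensified by 25%:
intensity_now = 30.0               # mm/h, illustrative value
intensity_future = intensity_now * 1.25
print(intensity_future)            # 37.5 mm/h
```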
    CLIMAR is currently being used in a range of real-world applications based on the UKCP18 projections across sectors that will be affected by changes in the climate, including energy system security, telecommunications, critical infrastructure, water and sewage networks, and health. Two examples are:
    working with Bristol City Council on the effects of climate change on urban heat, inequalities between population groups and the efficacy of methods for adapting building stock (e.g. improved ventilation, double glazing) to keep people cool, and safe, in periods of extreme heat;
      working with a consortium led by the National Digital Twin Programme and the Centre for Digital Built Britain to develop a Climate Resilience Demonstrator, integrating climate projections with asset information and operational models to develop a Digital Twin that can be used to assess the future risks of flooding on critical infrastructure including energy, communications and water and sewage networks. This will provide a step-change in our understanding of the potential effects of climate change on critical infrastructure and demonstrate the power of inter-disciplinary partnerships, spanning academia and industry, that will be crucial in unlocking the enormous potential for Digital Twins to enhance our resilience to climate change across a wide variety of sectors. For further information on CLIMAR and associated projects, please see https://jceei.org/projects/climar/ and, for information on the National Digital Twin Climate Resilience Demonstrator (CreDo), see https://digitaltwinhub.co.uk/projects/credo/
    The bigger and more complicated the engineering problem, the more likely it is to have a digital twin. Firms that build rockets, planes and ships, for example, have been creating digital twins since the early 2000s, seeing significant operational efficiencies and cost-savings as a result. To date, however, few firms have been able to realise the full potential of this technology by using it to develop new value-added services for their customers. This article describes a framework designed to help scale the value of digital twins beyond operational efficiency towards new revenue streams.
    In spite of the hype surrounding digital twins, there is little guidance for executives to help them make sense of the business opportunities the technology presents, beyond cost savings and operational efficiencies. 
    Many businesses are keen to get a greater return on their digital twin investment by capitalising on the innovation and revenue-generating opportunities that may arise from a deeper understanding of how customers use their products. However, because very few firms are making significant progress in this regard, there is no blueprint to follow. New business models are evolving, but the business opportunities for suppliers, technology partners and end-users are yet to be fully documented. 
    Most businesses will be familiar with the business model canvas as a tool to identify current and future business model opportunities. Our 4 Values (4Vs) framework for digital twins is a more concise version of the tool, developed to help executives better understand potential new business models. It was designed from a literature review and validated and modified through industry interviews. 
    The 4Vs framework covers: the value proposition for the product or service being offered; the value architecture, or the infrastructure that the firm creates and maintains in order to generate sustainable revenues; the value network, representing the firm’s infrastructure and network of partners needed to create value and to maintain good customer relationships; and value finance, such as cost and revenue structures. 
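As a rough illustration, the four dimensions can be captured in a simple data structure for comparing business model configurations side by side. The class and the example values below are our own sketch, not part of the published framework:

```python
from dataclasses import dataclass

@dataclass
class FourVs:
    """One business model configuration under the 4Vs framework."""
    value_proposition: str   # the product or service being offered
    value_architecture: str  # infrastructure that sustains revenues
    value_network: str       # partners needed to create value
    value_finance: str       # cost and revenue structures

# Hypothetical summary of the "uptime assurer" model described later.
uptime_assurer = FourVs(
    value_proposition="Maximise vehicle/aircraft uptime",
    value_architecture="Transitioning from closed to open ecosystems",
    value_network="Open at design/assembly, closed in sustainment",
    value_finance="Fee for guaranteed availability windows",
)
print(uptime_assurer.value_proposition)
```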
    Four types of digital twin business models
    From extensive interviews with middle and top management on services offered by digital twins, we identified four different types of business models and applied our 4Vs approach to understand how those models are configured and how they generate value. 
    The first type of business model was found in information, data and system services industries. Its value proposition is to provide a data marketplace that orchestrates the different players in the ecosystem and provides anonymised performance data from, for example, vehicle engines or heating systems for buildings. Value Finance consists of recurring monthly revenues levied through a platform, which itself takes a fee and allocates the rest according to the partnership arrangements.
    The second business model is prevalent in the world of complex assets, such as chemical processing plants and buildings. Its value proposition lies in providing additional insights to the customer on the maintenance of their assets to provide just-in-time services. What-if analysis and scenario planning are used to augment the services provided with the physical asset that is sold. Value Architecture is both open and closed, as these firms play in ecosystems but also create their own. They control the supply chain: how they design the asset, how they test it and deliver it. The Value Network consists of strategic partners in process modelling, 3D visualisation, CAD, infrastructure and telecommunications. Value Finance includes software and services, which provide a good margin within a subscription model. Clients are more likely to take add-on services that show significant cost savings.
    Uptime assurers 
    This business model tends to be found in the transport sector, where it’s important to maximise the uptime of the aircraft, train or vehicle. 
    The value proposition centres on keeping these vehicles operational, through predictive maintenance for vehicle/aircraft fleet management and, in the case of HGVs, route optimisation. Value Architecture is transitioning from closed to open ecosystems. There are fewer lock-in solutions as customers increasingly want an ecosystems approach. Typically, it is distributors, head offices and workshops that interact with the digital twin rather than the end-customer. The Value Network is open at the design and assembly lifecycle stages but becomes closed during sustainment phases. For direct customers, digital twins are built in-house and are therefore less reliant on third-party solutions. Value Finance is focused on customers paying a fee to maximise the uptime of the vehicle or aircraft, guaranteeing, for example, access five days a week between certain hours. 
    Mission assurers
    This business model focuses on delivering the necessary outcome to the customers. It tends to be found with government clients in the defence and aerospace sector. Value propositions are centred around improving the efficacy of support and maintenance/operator insight and guaranteeing mission success or completion. These business models suffer from a complex landscape of ownership for integrators of systems, as much of the data does not make it to sustainment stages. 
    Value Architecture is designed to deliver a series of digital threads in a decentralised manner. Immersive technologies are used for training purposes or improved operator experience. The Value Network is more closed than open, as these industries focus on critical missions of highly secure assets. Therefore, service providers are more security-minded and careful of relying on third-party platforms for digital twin services. Semi-open architecture is used to connect to different hierarchies of digital twins/digital threads. On Value Finance, our interviews revealed that existing pricing models, contracts and commercial models are not yet mature enough to transition into platform-based revenue models. Insights as a service is a future direction but challenging at the moment, with the market not yet mature for outcome-based pricing.
    For B2B service-providers who are looking to generate new revenue from their digital twins, it is important to consider how the business model should be configured and identify major barriers to their success. Our research found that the barriers most often cited were cost, cybersecurity, cultural acceptance of the technology, commercial or market needs and, perhaps most significantly, a lack of buy-in from business leaders. Our 4Vs framework has been designed to help those leaders arrive at a better understanding of the business opportunities digital twin services can provide. We hope this will drive innovation and help digital twins realise their full business potential.  
    Our research to date has been through in-depth qualitative interviews across industry but we wish to expand this research and gather quantitative information on specific business model outcomes from digital twins across industry. 
    If you would like to support this research and learn more about the business model outcomes from digital twins, then please participate in our survey! 
    Take part in our survey here: https://cambridge.eu.qualtrics.com/jfe/form/SV_0PXRkrDsXwtCnXg 
    Information sheet.pdf
    Like many companies, Atkins, a member of the SNC-Lavalin group, is investing heavily in digital transformation, and we all know that skills are a key enabler.  We wanted to be clear about the skills needed by our workforce to be able to deliver digitally.  The starting point was finding out what digital skills we have in the company.  Then we could identify gaps and how we might bridge them.
    But what are digital skills…and how do we measure them?
    As we pondered this, we realised that there were many, many challenges we would need to address.  Atkins is a ‘broad church’ comprising many professionals and technical specialisms.  Digital transformation is challenging the business in many different ways.  Articulating a single set of digital skills that reflects needs across the business is complicated by language, terminology and digital maturity.  Furthermore, unlike corporate engagement surveys, there is no established industry baseline that we can use to benchmark our corporate digital skills against.  To evaluate a skills gap would require an estimate of both the quantity and types of skills that will be required in the future – something that is far from certain given our industry’s vulnerability to disruption.
    We knew we were trying to grasp something universal and not sector or domain specific, so this is the definition we decided to use: Digital skills enable the individual to apply new/digital technology in their professional domain. 
    That left the question around how we measure Digital Skills.
    We did some research and explored several frameworks including Skills For the Information Age (SFIA), the EU Science Hub’s DigiComp and the DQ Institute’s framework.  As we were doing this, we became aware that the CDBB Skills and Competence Framework (SCF) was being launched and we immediately sensed it could be just what we were looking for. 
    Why?  Apart from being right up to date, it has a simple and straightforward structure and can be tailored for an organisation.  The proficiency levels are very recognisable - Awareness, Practitioner, Working and Expert - and it is in the public domain.  But most importantly, it seemed like a good fit because most of what we do at Atkins is in some way related to infrastructure and therefore is within the domain of digital twins. 
    But we needed to test that fit.  Our hypothesis was “…that the CDBB SCF had sufficient skills to represent the ability of our staff”.  We tested this with a survey, interviews, and a series of workshops.
    In the survey we looked at how individuals from different groups in the company (differentiated by their use of technology) understood the 12 skill definitions and the extent to which they needed and used each skill in their day-to-day role.  We also explored whether there were other digital skills that respondents felt were not recognised by the framework. We followed up the survey with interviews to clarify some of the responses and then used workshops to play back our findings and sense-check the conclusions.
    Our overall conclusion was that we had good evidence to support our hypothesis, i.e. that the CDBB SCF was a good fit for our workforce.  However, we realised we would need to bring the indicators to life so that users could relate them to their roles, particularly with people at the Awareness and Working levels.  This is not unexpected.  Generally, people with lower levels of competence don’t know what they don’t know. 
    Another conclusion was that we needed a fifth, null competency indicator to recognise that not everyone needs to know about everything.
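One way to picture the scale with its null indicator is as an ordered enumeration. This is a hypothetical sketch: the level ordering and the gap logic below are our own assumptions, not part of the CDBB SCF.

```python
from enum import IntEnum

class Proficiency(IntEnum):
    """Illustrative proficiency scale, with a null level for skills a role
    does not require (the fifth indicator described above)."""
    NOT_REQUIRED = 0
    AWARENESS = 1
    WORKING = 2
    PRACTITIONER = 3
    EXPERT = 4

def needs_training(current: Proficiency, target: Proficiency) -> bool:
    """A skill gap exists only when the role actually requires the skill."""
    return target > Proficiency.NOT_REQUIRED and current < target

print(needs_training(Proficiency.AWARENESS, Proficiency.PRACTITIONER))    # True
print(needs_training(Proficiency.NOT_REQUIRED, Proficiency.NOT_REQUIRED)) # False
```

The null level keeps "not everyone needs to know about everything" explicit in the data, rather than conflating "no skill" with "no requirement".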
    In terms of next steps, we are working with a supplier to develop an online assessment tool so that we can apply the framework at scale.  We have rewritten the skills definitions and competence indicators to omit any reference to the NDT programme etc. although these were very minor changes.
    We are working on ways to bring the skill level indicators to life for our employees e.g. through guidance materials, FAQs etc.  We’re also developing an initial ‘rough and ready’ set of learning materials related to each of the digital indicators at Awareness and Practitioner levels.  We expect the CDBB’s Training Skills Register to feature prominently in this!
    Some of the things we have parked for the moment are: (1) moderation and accreditation of the assessments (our first wave will be self-assessed only); and (2) integrating the skills into role definitions.
    We’re very grateful to CDBB for the timely creation of the SCF and I look forward to sharing our onward journey with the DT Hub community.
    Anglian Water is an early adopter of digital twins within the water sector, working closely with the Centre for Digital Built Britain to help develop the market and showcase how digital twins can support an organisation’s strategic outcomes.
    Anglian Water has a 15-year vision to develop a digital twin to sit alongside its physical assets.

    From an Anglian Water perspective, the Digital Twin is essentially an accurate digital representation of their physical assets, enabling insight, supporting decision making and leading to better outcomes. Aligning the digital twin objectives to Anglian Water’s regulated outcomes, as defined by the regulator OFWAT, has been a key step in developing the business case.
    With the initial vision and roadmap outlined, the next step was to implement a proof of concept to explore the value created from digital twins. Anglian Water undertook a discovery phase and a proof of concept with Black and Veatch for a Digital Twin back in 2019, and started to define how a Digital Twin would benefit the delivery and management of physical assets.
    The discovery phase looked to understand the current landscape, further enhance the vision and roadmap, and establish persona requirements. It proved vital to really understand the organisation and the impact on people during this early exploratory work.
    The proof of concept looked at delivering three main outputs, focused on a pumping station to keep the scope focused and value measurable:
    To demonstrate an asset intelligence capability
    To demonstrate a visualisation capability
    To examine the asset data and architecture
    Alongside the proof of concept, other initiatives were kick-started to consider how other elements of digital twin might add value, with a focus on more enhanced use of hydraulic models to explore how water networks could be further optimised.  Anglian Water recognised early on that by integrating and enhancing many of the existing enterprise systems, existing investments could be leveraged and technology gaps identified.
    Learning from the proof of concept and other early works, Anglian Water looked to the next step of the roadmap: a scaled demonstrator on the Strategic Pipeline Alliance (SPA). The Strategic Pipeline Alliance was set up to deliver up to 500km of large-scale pipeline and, alongside this, to start defining and delivering the first phase of the digital twin. SPA’s 2025 vision is to deliver a large-scale, holistically linked water transfer resilience system. This will be operated, performance managed and maintained using advanced digital technology.
    The SPA team set about developing a digital twin strategy which is based on the wider corporate vision and enhances the proof of concept work. The basic premise of the SPA digital twin is to integrate traditionally siloed business functions and systems, to deliver enhanced capability across the asset lifecycle.
    As with Anglian Water, the SPA strategy is focused on using the technology available and developing a robust enterprise, integration, and data architecture to create a foundation for digital twin. Taking this a step further, it was decided to adopt a product-based approach, thinking about the development of digital twin products aligned to physical assets that could be re-used across the wider Anglian Water enterprise.
    This whole-life, product-based approach threw up some interesting challenges, namely how to build a business case that delivered benefit to SPA and also enabled Anglian Water’s future ambitions, taking a lifecycle view of the value delivered.
    To achieve this meant considering and assessing the value to both SPA during the capital delivery phase and Anglian Water during the operational phases. This process also highlighted that certain elements of the digital twin deliver value to both SPA and Anglian Water equally and could therefore be considered as a shared benefit.
    The resulting benefits register helped to identify the value delivered to the alliance partners, which was vital to securing delivery board sign-off. As Anglian Water is a partner in the alliance, the ability to demonstrate value in the operational phase, with SPA developing the technical foundation, was another key element in securing the investment.
    As part of the overall process the SPA board were keen to see how the investment would be allocated, therefore the strategy evolved to incorporate the capabilities to be developed within SPA to enable digital twin. This helped to inform and validate the team for digital twin delivery.
    With the capabilities and organisational chart resolved, a governance framework was put into place to allow the digital twin evolution to be managed effectively, putting in place the right checks and balances. This has included input and oversight from the wider Anglian Water team as ultimately, they will be responsible for managing the various digital twins long term.
    To validate the digital twin against the SPA outcomes and objectives, the various elements of the digital twin were incorporated into the overall enterprise architecture. This has proved to be an important part of the process to ensure alignment to the wider capabilities and, importantly, ensure the right technology is in place. The enterprise architecture continues to evolve to include information objects below the application layer, again building on the product-based approach, so that the enterprise architecture can be utilised in the wider Anglian Water Alliances.
    In total, the development of the strategy, business case and capabilities has taken 6 months; however, it is important to note that this builds on the earlier proof of concept and ideation during the initial mobilisation of SPA. Given the approach, a key next step is to work with Anglian Water to explore accelerated deployment of SPA digital twins on other major schemes, to put the product-based approach to the test and maximise the investment made.
    We have learnt from the early developments on SPA that articulating a whole life view of value is vital and that focusing on capital / operational stages is equally important, so that appropriate budget holders can see the value being delivered. We have also learnt the importance of having a bold vision which must be matched by clear definition of the first few steps, showing a long term roadmap for achieving an enterprise digital twin.
    What is certainly clear is that we still have a lot to learn; however, by following good architectural best practice and placing people and our environment at the heart of digital twin, we have put in place a good foundation from which to build.
    If you would like to know more, please get in touch through the DT Hub.
    How manufacturers can structure and share data safely and sustainably. 
    Manufacturers of construction products produce a significant part of the information required to bring about a safer construction industry, but currently, this information isn’t structured or shared in a consistent way.
    If UK construction is to meet the challenges of a digital future and respond to the requirements of a new building safety regulatory system, it needs manufacturers to structure and share their data safely and sustainably.
    There’s no need to wait to digitise your product information. Making the correct changes now will bring immediate benefits to your business and long-term competitive advantage. This guide will help you identify what those changes are.
    Our guide helps decision-makers in manufacturing identify why supplying structured data is important, how to avoid poor investment decisions, safe ways to share information about products across the supply chain, and more.
    The Guide: https://www.theiet.org/media/8843/digitisation-for-construction-product-manufacturers-main-guide.pdf
    8-page summary: https://www.theiet.org/media/8856/digitisation-for-construction-product-manufacturers-summary.pdf
    2-page key facts and summary: https://www.theiet.org/media/8856/digitisation-for-construction-product-manufacturers-summary.pdf
    The UK’s approach to delivering complex infrastructure projects is obsolete, leading to far too many projects failing to meet the expectations of their sponsors and the public. That was the conclusion reached following a detailed review commissioned by the Institution of Civil Engineers. I was lucky enough to work on the review, which was published as A Systems Approach to Infrastructure Delivery (SAID) in December 2020.
    At the centre of our findings is a call for a fundamental change of mindset. The review team, led by ex ICE Vice President Andrew McNaughton, concluded that even relatively small projects are now best seen as interventions into existing complex systems, made up of a mix of physical, human and digital components. 
    In this world, traditional civil engineering works, while still a large capital cost, only exist to support (or perhaps just keep dry!) these systems. It is easy to see that the system – not the civils – provides the infrastructure services on which people rely. More importantly, as Crossrail has shown us so clearly, the greatest sources of risk to a project now lie not in managing tunnelling or any other piece of heroic construction but in integrating and commissioning a fully functioning system – trains, stations, tracks, digital signalling, safety and communications, driver behaviour.
    SAID proposes 8 principles for better projects that can shift the infrastructure industry in this direction.  Principle 8, ‘Data Oils your Project’, was built on detailed interviews with leading practitioners from inside and outside the infrastructure sector. Again and again we heard about the importance of all the project participants having access to consistent, timely and reliable information. Client and owner organisations recognised that it is their responsibility to fix the data plumbing.  This means having the capability to define what information is needed to deliver and operate the asset. It also means ensuring that the project’s systems convert raw data into meaningful information that flows to team members as and when they need it to make decisions.
    Much of this is I think a no-brainer. What was really interesting was to see how thinking about data and digital is helping to generate a shift away from a traditional project mindset and towards a systems approach grounded in an understanding of the importance of what is already there.
    Every asset owner we spoke to recognised that we are now firmly in a world where projects deliver a cyber-physical asset. They also get that their digital twin can be the basis for a robust delivery and commissioning plan that integrates the project’s physical outputs into the existing network.
    What was really interesting was to hear about the challenge of how a single project’s digital outputs can be effectively integrated into the existing cyber-physical system to create the kind of golden loop of information described in CDBB’s Flourishing Systems report of 2020. This would be real systems thinking, putting projects in their proper place in relation to the systems and human needs they are meant to be serving.
    The response to SAID has been overwhelmingly positive. In response, ICE has commissioned a second phase of work in which we are exploring the SAID principles with a series of live projects and infrastructure sector organisations. Later in 2021, ICE will be publishing practical advice on implementing the 8 principles, based on the insight generated by these discussions.  I hope that this blog can start a discussion with the CDBB network that will generate insight we can include in this advice and help the infrastructure world embrace a Systems Approach to Infrastructure Delivery.
    In November 2020 DNV published the energy industry’s first recommended practice (RP) on how to quality-assure digital twins.  Our new RP, which we developed in collaboration with TechnipFMC, aims to set a benchmark for the sector’s various approaches to building and operating the technology. It guides industry professionals through:
    ·         assessing whether a digital twin will deliver to stakeholders’ expectations from the inception of a project 
    ·         establishing confidence in the data and computational models that a digital twin runs on 
    ·         evaluating an organization’s readiness to work with and evolve alongside a digital twin.  
    DNV’s RP intends to provide valuable guidance for digital twin developers; introduces a contractual reference between suppliers and users; and acts as a framework for verification and validation of the technology. It builds upon the principles of DNV’s Recommended Practices for the qualification of novel hardware technology and assurance of data and data-driven models.
    Making digital twins a real asset
    Physical assets are built to perform to the highest standards and undergo rigorous assurance processes throughout their life. However, there has been no requirement for their digital counterparts to go through the same procedures. Our new recommended practice seeks to remedy this issue as the technology begins a path of significant scaling across the sector. We believe it is time to prove that twins can be trusted and that the investments made in them give the right return!
    The methodology behind DNV’s new RP has been piloted on 10 projects with companies including Aker BP and Kongsberg Digital. It has also been through an extensive external hearing process involving the industry at large. In addition, TechnipFMC’s deep domain knowledge and expertise in digital technologies and oil and gas infrastructures has made an essential contribution to jointly developing the RP.
    A framework to handle complex requirements
    The framework provides clarity on the definition of a digital twin; required data quality and algorithm performance; and requirements on the interaction between the digital twin and the operating system. It addresses three distinct parts: the physical asset, the virtual representation, and the connection between the two. This connection amounts to the data streams that flow from the physical asset to the digital twin, and the information the digital twin makes available to the asset and the operator for decision making.
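    As a rough illustration of that three-part decomposition (physical asset, virtual representation, and the two-way connection between them), here is a minimal sketch in Python. It is not DNV's formalism, and all class and field names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class PhysicalAsset:
    """The real-world asset being twinned."""
    asset_id: str
    sensors: dict = field(default_factory=dict)  # latest sensor readings

@dataclass
class VirtualRepresentation:
    """The computational models and data describing the asset."""
    model_version: str
    state: dict = field(default_factory=dict)

@dataclass
class DigitalTwin:
    """Couples asset and virtual representation via two data flows."""
    asset: PhysicalAsset
    virtual: VirtualRepresentation

    def ingest(self) -> None:
        # Data stream: physical asset -> digital twin
        self.virtual.state.update(self.asset.sensors)

    def advise(self) -> dict:
        # Information flow: digital twin -> asset/operator, for decisions
        return dict(self.virtual.state)

twin = DigitalTwin(PhysicalAsset("pump-01", {"temp_c": 61.5}),
                   VirtualRepresentation("v1"))
twin.ingest()
print(twin.advise())
```

    The point of separating `ingest` from `advise` is that the RP treats the two directions of the connection as distinct concerns, each needing its own assurance of data quality.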
    A preview copy of our recommended practice can be downloaded from our website: https://www.dnv.com/oilgas/digital-twins/preview-DNVGL-RP-A204-qualification-and-assurance-of-digital-twins.html
    We’d love to get your comments and feedback on our work – and look forward to giving a short overview of our methodology at the Gemini call on 3rd August 2021.
    Graham Faiz
    Head of Growth and Innovation UK & Ireland – Energy Systems
    Footnote:  Who are DNV?
    We’re an independent assurance and risk management company; part of our service offering includes the provision of software, platforms, cyber and other digital solutions to the energy sector. We have a specific focus on helping our customers manage the risk and complexity linked to the energy transition, specifically their ongoing decarbonization and digitalization journeys.
    Company website: www.dnv.com
    Link to digital twin services: https://www.dnv.com/oilgas/digital-twins/services.html
    The building stock is a city’s most significant socio-cultural and economic resource and its largest capital asset. Buildings are also where we spend most of our lives and most of our money, and where enormous potential for energy and waste reduction lies. 
    To help improve the quality, sustainability and resilience of building stocks, and to help reduce emissions from them, comprehensive information on their composition, operation and dynamic behaviour is required. However, in many countries relevant data are extremely difficult to obtain, often highly fragmented, restricted, missing or only available in aggregated form. 
    Colouring Cities sets out to address this issue. The initiative develops open code to facilitate the construction and management of low cost public databases, which double as knowledge exchange platforms, providing open data on buildings, at building level. These are provided to answer questions such as: How many buildings do we have? Which building types, uses, construction systems, ages, styles and sizes are located where? How repairable, adaptable and extendable are they? How long can they last if properly maintained? How energy efficient are they? Can they easily be retrofitted?  Who built them and what is their ownership type, and how well do local communities think they work? 
    Colouring Cities also looks to advance a more efficient, whole-of-society approach to knowledge sharing on buildings and cities, allowing permanent databases to be collaboratively maintained and enriched, year on year, by citizens, academia, government, industry and the voluntary sector. Colouring London https://colouringlondon.org/, our live prototype, has been built and tested over the past five years using a step-by-step collaborative approach which has involved consultation with academia, government, industry, the voluntary sector and the community (working across science, the humanities and the arts). It looks to test four approaches to data provision: collation of existing open uploads, computational generation, local crowdsourcing and live streaming.
    In 2020 the Colouring Cities Research Programme was set up at The Alan Turing Institute to support international research institutions wishing to reproduce and co-work on Colouring Cities code at city or country level. We are currently collaborating with academic partners in Lebanon, Bahrain, Australia, Germany, Greece and Switzerland.
    Watch the Hub Insight to learn more about the project and the opportunity to get involved.
    If you'd like to get involved, please do test our site and add any recommendations for features you would like in our discussion thread https://discuss.colouring.london/. Or, if you are a public body or DT Hub industry member wishing to increase open access to your infrastructure datasets and/or digital twin visualisations relating to the building stock, please contact Polly Hudson at Turing.
    Find out more:
    Who are we
    Game engine technology is heralding a new age of content creation, immersive storytelling, design-driven development, and business process innovation. These tools are now being used alongside your data to create a visual front-end digital twin, allowing for a more immersive, controllable and completely customisable digital twin application.
    Unreal Engine is a game engine created by Epic Games to allow developers to create their own games and immersive 3D worlds. This technology has seen fast adoption across a number of industries, including Manufacturing, Automotive, Film and Media, and Architecture, Engineering and Construction [AEC]. As the need to collaborate virtually with stakeholders and end-users has increased, and the need to customise unique applications and visualise our 3D models and data has become more important, game engines are making their mark in AEC. Unreal Engine is a free-to-use tool, with full source code access, for creators to develop their own custom real-time experiences.
    Unreal Engine and Digital twins
    Data alone can often be confusing and hard to understand; it is not until the data is contextualised that it can be turned into information that benefits the project. This is where Unreal Engine can support the digital twin communities, with its unique ability to aggregate data sources, from 3D geometry, BIM metadata and 4D construction data to IoT hubs. Users have a centralised location to contextualise the data in its native environment and can build custom applications around it.
    Getting involved in our future roadmap...
    As we see more and more companies developing large-scale digital twin applications, here at Epic Games we want to make sure we are providing everything you need to make your own digital twin applications with Unreal Engine, allowing you to integrate your existing data, geometry and IoT hub information into a visual platform for sharing with the world.
    We'd love to hear from you about how you see the world of digital twins evolving. Going forward, which tools and features will you find most valuable in creating digital twins? What kinds of training and support would you like to have access to from Epic Games on this?
    To help us serve you better, please take our survey about the current state of digital twins, and share your ideas or what you would like to see happen.
    Take the survey here
    Results of this survey will be shared with the community for wider awareness. In the meantime you can check out a recent article we shared with one of our customers in China:
    Good day to you!
    I am a member of the BSI e-committee tasked with producing the attached draft of BS 99001:2021 Quality management systems.
    It has been produced with the intention of being used alongside BS EN ISO 9001:2015 in the UK construction sector, as it contains specific requirements for the built environment sector.
    It is out for public consultation until the 24th July 2021. Thereafter BSI shall hold comment resolution meeting(s) to address and resolve comments received.
    The aim of this new quality management standard is to ensure that, in the wake of the Grenfell Tower Inquiry, BS EN ISO 9001:2015 remains relevant to the UK construction industry. 
    Because the NDTp is such an important element of the ever-changing landscape of the UK construction industry, the BSI e-committee would very much appreciate feedback on the draft version of BS 99001:2021 from those who are heavily involved in digitalization of the built environment in general, and those who are committed to the NDTp in particular. Specifically, feedback on this question would be very gratefully received: 
    Will this new, built environment centric quality management system, actually help the NDTp achieve its vision, by not only supporting that vision, but actually being a key enabler of that vision?
    Please do make comments using the online SDP system. Please note comments need to be saved and submitted individually.
    Obviously if you have any questions, please do contact me.
    Sincere and grateful thanks in advance everyone,
    BSI 99001.pdf
    The attached guide was put together from discussions and knowledge share through the Infrastructure Asset Data Dictionary for the UK (IADD4UK) group between 2013 and 2018. Updated where appropriate to include the most recent standards and some additional thought leadership.
    The IADD4UK initiative was formed of the foremost owners, major projects, delivery partners and interested parties under the chairmanship of the COMIT innovations group. A list of participants can be found at the rear of this guide.
    Early in our BIM journey it was recognised that data, and its slightly more refined form, information, would be key. We had standards for how to classify it, manage it, secure it, procure it and exchange it, but nothing about what “it” actually was.
    It was also understood that this required information would have an impact on everything we do with our assets, across the entirety of their lifecycle. That impact had a relationship with the outcomes delivered to their respective clients, whether that was an end user, a consumer, a member of the public, a shareholder or the country itself. The delivery of these outcomes ensured that there was value in the information, without which its upkeep would not be possible.
    The IADD4UK group was put together with an agreement to research and document the best way to create information requirements, not to write them; it was agreed that if organisations came together when writing them, the costs and risks could be shared and the benefits multiplied.
    The reason for the increased benefits was that when assets were transferred from one owner to another, or between delivery partners, they would be described in the same way, negating the risks of translating and converting information from one system to another. Key assets in infrastructure are basically the same, whether they are owned by a transport, communications, energy or water company. They will have the same questions, tasks and decisions during their lifecycle. The answers will be different, but the basic information requirement will be largely the same. This commonality across owners could help reduce the procurement costs and the risks of generating, managing and exchanging each information set, with the side effect of reducing interoperability issues between software packages.
    In 2017 the IADD4UK organisation was put on hold for various reasons, chiefly a lack of funding to both create and curate a common information requirements dictionary. This meant that the participants in the initiative dispersed to create their own data dictionaries, utilising some of the methods and processes shared with you in this guide.
    Writing information requriements by IADD4UK.pdf
    Something related to digital twins:
    "Delta Sharing is the industry’s first open protocol for secure data sharing, making it simple to share data with other organizations regardless of which computing platforms they use." -https://delta.io/sharing/
    More information:
    ‘Introducing Delta Sharing: An Open Protocol for Secure Data Sharing’: https://databricks.com/blog/2021/05/26/introducing-delta-sharing-an-open-protocol-for-secure-data-sharing.html
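    For a flavour of how the protocol is used in practice: each shared table is addressed by a profile file (a JSON credentials file issued by the data provider) plus a share/schema/table path, combined into a single URL of the form `<profile>#<share>.<schema>.<table>`. The sketch below builds that URL in Python; the share, schema and table names are illustrative, and actually loading data requires the open-source `delta-sharing` client and a real profile file:

```python
def table_url(profile_path: str, share: str, schema: str, table: str) -> str:
    """Build the '<profile>#<share>.<schema>.<table>' URL that
    Delta Sharing clients use to address a shared table."""
    return f"{profile_path}#{share}.{schema}.{table}"

# With the client installed (pip install delta-sharing), loading a shared
# table into a pandas DataFrame looks like this (names are illustrative):
#
#   import delta_sharing
#   url = table_url("config.share", "energy_share", "metering", "readings")
#   df = delta_sharing.load_as_pandas(url)

print(table_url("config.share", "energy_share", "metering", "readings"))
```

    Because the protocol is open, the recipient only needs the profile file and a client, not the provider's computing platform, which is the point being made in the quote above.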
    Last month, on Thursday 25 February, techUK released a landmark report ‘Unlocking Value Across the UK’s Digital Twin Ecosystem’, alongside the much anticipated publication of the CDBB’s ‘Digital Twin Toolkit’ report. Please see here for the full recording of the session: 
    To kick off, Tom Henderson (Programme Manager, Smart Infrastructure & Systems, techUK) thanked members of the Digital Twins Working Group (DTWG) for their deep insight and hard work, welcoming the publication before running through the different parameters of techUK's report, highlighting the core strategic conclusions and recommendations (2:57), which focus on the need to:
    Develop a cross-cutting, interdisciplinary coordinating body to drive forward digital twin adoption and diffusion in the UK 
    Demonstrate value from (and explore barriers to) the adoption and diffusion of digital twins via a series of strategic demonstrators 
    Trigger the adoption of digital twins across the UK by exploring the development of an online digital twin procurement portal 
    Work with industry to identify talent pipeline requirements and anticipate levels of future demand for skills across the UK’s digital twin ecosystem 
    Fund a Net Zero 2050 digital twin demonstrator to establish the UK as a global leader in leveraging digital twins for decarbonisation 
    Following the release of the techUK report, Sarah Hayes (Change Stream Lead, National Digital Twin Programme) provided an insightful overview of the NDTP and ran through the significance and findings of the newly released DT Toolkit (9:05), which looks at: 
    What is a digital twin? 
    What can a digital twin be used for? 
    Key case studies 
    How to build a business case template?
    How to develop a digital twin roadmap? 
    Thanking the Toolkit team for their hard work and deep technical expertise, Sarah signposted the opportunity to continue engaging in the development and application of the DT Toolkit via the Digital Twin Hub – an online resource where you can learn more about emerging digital twin initiatives and share insights across the UK’s digital twin ecosystem. techUK looks forward to continuing work with the CDBB and encourages techUK members of all shapes and sizes to sign up for the DT Hub moving forward! 
    Subsequently (23:30), delegates heard from the Parliamentary Under Secretary of State for Science, Research and Innovation – Amanda Solloway MP, who took the time to welcome the publication of the reports and expressed optimism around the role that digital twin technologies can play in enabling the UK to become a world-leading scientific superpower. 
    In particular, the Minister discussed the link between digital twins and possibilities to drive prosperity, create new products, services, and jobs, and to transform public services. techUK would like to thank Minister Solloway for taking the time, and welcomes the Government’s recognition that digital twins are critical – not only for our recovery from the pandemic, but also to our long-term growth and productivity.  
    Download and read the full report here.
    Icebreaker One has won a major UK Research and Innovation competition for the Open Energy project, which aims to revolutionise the way data is shared across the energy sector to make sure the UK achieves its net-zero goals.
    It means the project will receive £750k in UK Government funding to continue developing a standard that all organisations in the energy data ecosystem can use to search, share and access data. It’s also developing a prototype governance platform to make sure data is shared securely. 
    Icebreaker One hosted a webinar on 16 February at 10am to share more information about its progress so far and plans for the future.
    View launch webinar (16 February 2021)
    View project summary briefing
    Open Energy aims to transform the way organisations exchange the information they need to phase out fossil fuels and implement renewable energy technology. Icebreaker One is aiming to roll out the Open Energy standards, guides and recommendations across the energy sector over the next year.
    Open Energy has been guided by industry advisory groups across the UK which include representatives from Ofgem, Scottish Power and SSE. It’s led by Gavin Starks, one of the key figures behind the Open Banking Standard that has revolutionised the banking sector over the past five years.
    Icebreaker One worked with project partners Open Climate Fix, Raidiam and PassivSystems, to win the Modernising Energy Data Access (MEDA) competition, run by Innovate UK as part of the Industrial Strategy Prospering from the Energy Revolution programme. 
    A summary of the MEDA Phase Two work is available here.
    Gavin Starks, founder and CEO at Icebreaker One, said,
    “We’re delighted to have this backing to continue developing the data infrastructure to help unlock access to data to deliver efficiency and innovation across the energy sector.

    This will have a material impact on the UK’s ability to make the most of decentralised energy supply and consumption, help address the coming challenges of the transition to electric vehicles and catalyse the delivery of our net-zero targets.

    Our work will help unlock data discovery by enabling energy data search and usage by delivering a trusted ecosystem for decentralised data sharing.”
    Rob Saunders, Challenge Director, Prospering from the Energy Revolution at UKRI, said:
    “The MEDA competition was designed to accelerate innovative ways for energy data to be open-sourced, organised and accessed, providing a platform for new technology, services and more agile regulation within the energy sector. 
    “The Icebreaker One project showed exactly what can be achieved through collaborative thinking and will help create a framework for all stakeholders to share data further for the common benefit – and ultimately for the UK’s net-zero ambitions. We are looking forward to working with them closely as the project develops further.”
    David Manning, Head of Data Management at SSE plc, said: “At SSE we recognise that becoming a data driven organisation is critical to our role in helping achieve a net zero world.”
    “Readily accessible and trusted data will be essential to building the decarbonised energy system of the future; ensuring flexibility, customisation and personalisation for energy users, large and small. It’s exciting to see the progress being made in this space.”
    “There are two things in life for which we are never truly prepared: twins.”
    Josh Billings
    We have thought a lot about Digital Twins in recent times and heard an awful lot more. But there is always room for new thoughts on any subject, hence this short series of articles. We want to share fresh views with the experts and with the uninitiated. And we’ll include a hidden gem each week.
    We’ll speak in plain English. We won't talk about taxonomies, ontologies or system of systems. Instead we will look to the wisdom of Rumsfeld, Einstein, Gandhi and others to explore the wonderful world of twinning. And we’ll keep the number of words below 400 for most of the time. That’s just one page of your valuable time. We’ll post one every week for the next few weeks, starting today, and then stop (or maybe start talking about something else when we are done).
    Here are the different episodes in the series:
    1.  Known unknowns. Unlocking awareness, knowledge and action.
    2.  Time and space. The relativity of structure, behaviour and certainty.
    3.  Trusted friends. Authority, assurance and agency.
    4.  A puppy isn’t just for Christmas. Long-term value.
    5.  Greeks bearing gifts. Giving context.
    6.  Back to the future. History, science and maths.
    7.  Wisdom of the crowds. People matter.
    So, settle back and read the first in the series. It shows us how Donald Rumsfeld has helped us unlock some of the hidden secrets of Digital Twins. And why we should seriously consider using them more.
    Peter van Manen & Mark Stevens, Frazer-Nash Consultancy