Motion sensors, CO₂ sensors and the like are considered to be benign forms of monitoring, since they don’t capture images or personal data about us as we move through the buildings we visit. Or at least, that’s what we want to believe. Guest blogger Professor Matthew Chalmers (University of Glasgow) helped develop a mobile game called About Us as part of the CDBB-funded Project OAK. The game takes players through a mission in which they use information from building sensors to achieve their aims — with a twist at the end. He writes about why we all need to engage with the ethics of data collection in smart built environments.
Mobile games are more than just entertainment. They can also teach powerful lessons by giving the player the ability to make decisions, and then showing them the consequences of those decisions. About Us features a simulated twin of a building in Cambridge, with strategically placed CO₂ sensors in public spaces (such as corridors), and raises ethical questions about the Internet of Things (IoT) in buildings.
The premise of the game is simple. While you complete a series of tasks around the building, you must avoid characters you don’t want to interact with (as they will lower your game score), and you should contact your helpers — characters who will boost your score. You can view a map of the building and plan your avatar’s route through your tasks based on which path you think is safest. On the map, you can watch the building’s sensors being triggered. By combining this anonymous sensor data with map details of which offices are located where, you can gather intelligence about the movements of particular characters. In this way, you can find your helpers and avoid annoying interactions. If you’ve avoided the bad characters and interacted with the good characters while completing your tasks, you win the game.
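The kind of inference the game invites can be sketched in a few lines of code. This is an illustrative toy, not anything from Project OAK itself: the sensor IDs, office assignments and timings are all invented. The point is simply that an "anonymous" log of sensor triggers, combined with local knowledge of the floor plan, yields a readable route.

```python
# Illustrative sketch (not Project OAK code): how anonymous sensor
# triggers plus floor-plan knowledge can reveal someone's likely route.
# All sensor IDs, locations and timings below are invented.

# Local knowledge: which location each corridor CO2 sensor sits outside.
SENSOR_NEAR_LOCATION = {
    "co2-01": "Office 1",
    "co2-02": "Office 2",
    "co2-03": "Kitchen",
}

# Anonymous sensor log: (timestamp in minutes, sensor ID).
triggers = [(5, "co2-02"), (0, "co2-01"), (2, "co2-03")]

def infer_route(triggers, sensor_map):
    """Order the triggers by time and translate each into a location."""
    return [sensor_map[sensor] for _, sensor in sorted(triggers)]

print(" -> ".join(infer_route(triggers, SENSOR_NEAR_LOCATION)))
# Someone left Office 1, paused at the kitchen, then went to Office 2 --
# a personal trajectory recovered from "impersonal" CO2 readings.
```

No identities are in the data itself; they come from the watcher's outside knowledge of who occupies which office, which is exactly the combination the game makes players exploit.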
However, a twist comes after you have finished: the game shows you how much could be inferred about your game character, from the exact same sensors that you had been using to make inferences about other characters. Every task in the game exposes some sensitive data about the player’s avatar, and reinforces the player’s uncomfortable realisation that they have exploited apparently neutral data to find and avoid others.
What does this tell us about the ethics of digital twins? Our journeys through the built environment can reveal more than we intend them to: our movements, our routines, where we congregate, and where we go to avoid others. All of this could inadvertently be revealed by a building digital twin, even though the data used seems (at first glance) to be anonymous and impersonal. The game used CO₂ levels as an example of apparently impersonal data that, when combined with other information (local knowledge, in this case), becomes more personal. More generally, data might be low risk when isolated within its originating context, but the risk rises once that data can be combined with other systems and other (possibly non-digital) sources of information.
The Gemini Principles set out the need for digital twins to be ethical and secure, but About Us demonstrates that this can be surprisingly difficult to ensure. Collecting data through digital twins provides aggregate insights — that’s why they’re so useful — but it also creates risks that need ongoing governance. It’s vitally important that citizens understand this double-edged nature of digital twins, so that they are better able to advocate for how they want the technology to be used, and not used, and for how governance should be implemented.
Gamification is now a well-established technique for understanding and changing user attitudes toward digital technology. About Us was designed to create a safe but challenging environment, in which players can explore an example of data that could be collected in distributed computing environments, the uses to which such data can be put, and the intelligence that can be gathered from resulting inferences. The ultimate purpose of Project OAK is to enable anyone concerned with how data is managed (e.g., data processors, data subjects, governance bodies) to build appropriate levels of trust in the data and in its processing. Only if we recognise the ethical and legal issues represented by digital twins can we start to give meaningful answers to questions about what good system design and good system governance look like in this domain.
Information about this project is available on the project’s GitHub page.
This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
To join the conversation with others who are on their own digital twin journeys, join the Digital Twin Hub.