For asset owners and managers, understanding how people move through and use the built environment is a high priority: it enables better, more user-focused decisions. However, many of the methods for gathering these insights can feel invasive to users. The latest output from Digital Twin Journeys looks at how a researcher at the University of Cambridge has addressed this problem by teaching a computer to see. Watch the video to learn more.
Working from the University of Cambridge Computer Laboratory, Matthew Danish is developing an innovative, low-cost sensor that tracks the movement of people through the built environment. DeepDish is based on open-source software and low-cost hardware, including a webcam and a Raspberry Pi. Using machine learning, Matthew first taught DeepDish to recognise pedestrians and track their journeys through a space, and then began training it to distinguish pedestrians from Cambridge’s many cyclists.
One of the key innovations in Matthew’s technique is that no images of people are stored or processed outside the camera device itself. Instead, the device counts and tracks people on board, without capturing any identifying information or images. This means that DeepDish can map the paths of individuals using different mobility modes through a space, without violating anyone’s privacy.
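The privacy-preserving idea described above, keeping only anonymous track points and discarding the imagery, can be illustrated with a minimal sketch. The code below is not DeepDish’s actual algorithm: it assumes some on-device detector has already produced bounding boxes for each frame, and shows a simple nearest-centroid tracker whose only output is `(track_id, x, y)` points, with no frames or identifying data retained.

```python
import math
from itertools import count

class AnonymousTracker:
    """Track object centroids across frames without retaining imagery.

    Illustrative sketch only: only (track_id, x, y) tuples leave the
    tracker, mirroring the edge-processing approach described above.
    """

    def __init__(self, max_distance=50.0):
        self.max_distance = max_distance  # max centroid jump between frames
        self.tracks = {}                  # track_id -> last known centroid
        self._ids = count(1)              # fresh anonymous track IDs

    @staticmethod
    def _centroid(box):
        # box is (x1, y1, x2, y2) from some detector (assumed, not DeepDish's API)
        x1, y1, x2, y2 = box
        return ((x1 + x2) / 2, (y1 + y2) / 2)

    def update(self, boxes):
        """Match this frame's detections to existing tracks by nearest
        centroid; unmatched detections start new tracks. Returns a list
        of (track_id, x, y) tuples -- the only data that is kept."""
        detections = [self._centroid(b) for b in boxes]
        results = []
        unmatched = set(range(len(detections)))
        for tid, last in list(self.tracks.items()):
            best, best_d = None, self.max_distance
            for i in unmatched:
                d = math.dist(last, detections[i])
                if d < best_d:
                    best, best_d = i, d
            if best is not None:
                unmatched.discard(best)
                self.tracks[tid] = detections[best]
                results.append((tid, *detections[best]))
        for i in unmatched:
            tid = next(self._ids)
            self.tracks[tid] = detections[i]
            results.append((tid, *detections[i]))
        return results
```

A real deployment would also need to retire stale tracks and handle crossing paths more robustly, but the essential point survives even in this sketch: the stored state is a handful of anonymous coordinates, never an image.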
Matthew’s digital twin journey teaches us that technological solutions need not be expensive to tick multiple boxes, and that a security- and privacy-minded approach to asset sensing can still deliver useful insights.
To find out more about DeepDish, read about it here.
This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).