Showing results for tags 'Sensors'.

Found 11 results

  1. Talk Title: Aerogeophysics at the British Antarctic Survey: science, sensors and future systems
Speaker(s): Tom Jordan & Carl Robinson
Date & Time: 19th May 2022, 17:00
Where: Lecture Theatre 1, Department of Chemical Engineering and Biotechnology, University of Cambridge
The event will be followed by a drinks reception, food and networking opportunities for students and supervisors. Please register to attend here: https://www.eventbrite.co.uk/e/industry-lecture-tom-jordan-carl-robinson-british-antarctic-survey-tickets-319647984727
  2. To asset owners and managers, understanding how people move through and use the built environment is a high priority, enabling better, more user-focused decisions. However, many of the methods for getting these insights can feel invasive to users. The latest output from Digital Twin Journeys looks at how a researcher at the University of Cambridge has solved this problem by teaching a computer to see. Watch the video to learn more. Working from the University of Cambridge Computer Laboratory, Matthew Danish is developing an innovative, low-cost sensor that tracks the movement of people through the built environment. DeepDish is based on open-source software and low-cost hardware, including a webcam and a Raspberry Pi. Using machine learning, Matthew first taught DeepDish to recognise pedestrians and track their journeys through the space, and then began training it to distinguish pedestrians from Cambridge’s many cyclists. One of the key innovations in Matthew’s technique is that no images of people are stored or processed outside of the camera. Instead, it is programmed to count and track people without capturing any identifying information or images. This means that DeepDish can map the paths of individuals using different mobility modes through space, without violating anyone’s privacy. Matthew’s digital twin journey teaches us that technological solutions need not be expensive to tick multiple boxes, and a security- and privacy-minded approach to asset sensing can still deliver useful insights. To find out more about DeepDish, read about it here. This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
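The result above describes the approach at a high level. The snippet below is not DeepDish's code, just a minimal sketch of the same privacy-preserving pattern using OpenCV's stock HOG person detector: each frame is processed in memory on the device, and only anonymous counts are kept.

```python
# Illustrative sketch only -- not the DeepDish implementation.
# Counts people in webcam frames on-device; no images are stored or transmitted.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)  # e.g. a USB webcam on a Raspberry Pi
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Detect pedestrians in the current frame.
        boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        # Keep only anonymous, aggregate information (a count per frame);
        # the frame itself is overwritten on the next iteration, never saved.
        print(f"people in view: {len(boxes)}")
finally:
    cap.release()
```

In DeepDish's case the detector is a trained machine-learning model and the output is an anonymous track per person rather than a simple count, but the principle of discarding the pixels and keeping only the derived data is the same.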
  3. 3Cell is currently going through one of the largest transformations in the company’s history. So far, 3Cell has mainly been an expert/engineering services provider to multinationals, including mobile operators and mobile network equipment manufacturers. However, we are now working on our own exciting range of products and solutions. 3Cell has prototyped a fully autonomous drone offering that will significantly simplify and digitalise telco site surveys. Our drone offering can operate autonomously to complete surveying missions and use the data collected to generate 3D site models, report on signal strength measurements, manage digital asset data and identify any structural site issues. 3Cell’s grant funding application to Innovate UK has been submitted and is awaiting the result. 3Cell has conducted initial research into providing a full end-to-end autonomous drone telecommunications survey solution (elipptic), developing the base architecture for the software platform, and identifying the necessary hardware and cloud engine specifications. This project enables 3Cell to develop a novel AI engine, utilising 3D modelling of the tower and surrounding areas (identifying blocking points) to provide a full survey report and equipment inventory with insights (e.g. antenna positioning/direction) to maximise ROI and 5G network performance, removing the reliance on end-user experience. However, this solution could also apply to other industries where 3D modelling and the associated data acquisition are required. We have already partnered with a range of innovative organisations to turn the prototype into an industrial offering. These include Brunel University, IUK, Cambridge Wireless, UK5G, BT and the University of Westminster. We are getting a range of technical, financial and commercial support from these partners. Following these collaborative projects, we expect the first version of the industrialised solution to be working at client sites by late 2022. As part of our solution development and transformation effort, we are calling for organisations or individuals who have an interest in running pilot projects, 5G/drone trials and asset digitisation; those that can help us increase awareness and build a network in the industry; as well as those that are working on similar drone-based (telco/non-telco) solutions.
  4. Sensor technology has come a long way over the last 30 years, from the world’s first, bulky webcam at the University of Cambridge Computer Science Department to near ubiquitous networks of sleek sensors that can provide data at an unprecedented volume, velocity and quality. Today, sensors can even talk to each other to combine single points of data into useful insights about complex events. The new webcomic ‘Coffee Time’ by Dave Sheppard, part of the Digital Twin Journeys series, tells the story of this evolution and what it means for what we can learn about our built environment through smart sensors. Starting with a simple problem – is there coffee in the lab’s kitchen? – researchers in the early 1990s set up the world’s first webcam to get the information they wanted. Today, people in the Computer Lab still want to know when the coffee is ready, but there are more ways to solve the problem, and new problems that can be solved, using smart sensors. Smart sensors don’t just send one type of data about one factor from point A to point B, to be collated and analysed later for insights. Now sensors can share data with each other and generate insights almost instantaneously. The West Cambridge Digital Twin team at the Computer Lab have looked at how specific sequences of sensor events can be combined into an insight that translates actions in the physical world into carefully defined digital events. When someone makes coffee, for example, they might turn on a machine to grind the coffee beans, triggering a smart sensor in the grinder. Then they’d lift the pot to fill it with water, triggering a weight sensor pad beneath to record a change in weight. Then they would switch the coffee machine on, triggering a sensor between the plug and the outlet that senses that the machine is drawing power. Those events in close succession, in that order, would tell the smart sensor network when the coffee is ready. These sequences of sensor triggers are known as complex events. Using this technique, smart sensors in the built environment can detect and react to events like changes in building occupancy, fires and security threats. One advantage of this approach is that expensive, specialist sensors may not be needed to detect rarer occurrences if existing sensors can be programmed to detect them. Another is that simple, off-the-shelf sensors can detect events they were never designed to. As the comic points out, however, it is important to programme the correct sequence, timing and location of sensor triggers, or you may draw the wrong conclusion from the data that’s available. Something as simple as wanting to know if the coffee is ready led to the first implementation of the webcam. Digital twin journeys can have similarly simple beginnings: solving a small problem with a solution that’s accessible to you can spark an evolution that scales up to address a wide range of problems in the future. You can read and download the full webcomic here. You can read more from the West Cambridge Digital Twin project by visiting their research profile. This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
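The coffee example above is essentially complex event processing: an ordered sequence of simple sensor triggers, occurring within a time window, is treated as one higher-level event. The sketch below illustrates the idea; the event names, the ordering and the two-minute window are illustrative assumptions, not the West Cambridge team's implementation.

```python
# Minimal sketch of complex event detection: a specific ordered sequence of
# simple sensor events within a time window is treated as one higher-level event.
# Event names and the time window are illustrative assumptions.
from collections import deque

COFFEE_PATTERN = ["grinder_on", "pot_weight_changed", "machine_drawing_power"]
WINDOW_SECONDS = 120  # the whole sequence must happen within two minutes

recent = deque()  # (timestamp, event_name) pairs, oldest first

def on_sensor_event(timestamp: float, name: str) -> bool:
    """Record a simple event and return True if the complex event is detected."""
    recent.append((timestamp, name))
    # Drop events that fall outside the time window.
    while recent and timestamp - recent[0][0] > WINDOW_SECONDS:
        recent.popleft()
    # Check whether the pattern occurs, in order, among the recent events.
    names = [n for _, n in recent]
    i = 0
    for n in names:
        if n == COFFEE_PATTERN[i]:
            i += 1
            if i == len(COFFEE_PATTERN):
                recent.clear()
                return True  # "coffee is brewing" complex event
    return False

# Example: the three triggers arrive in order, within the window.
assert on_sensor_event(0.0, "grinder_on") is False
assert on_sensor_event(30.0, "pot_weight_changed") is False
assert on_sensor_event(60.0, "machine_drawing_power") is True
```

The same ordered triggers arriving out of sequence, or spread over too long a period, would not fire the event, which is exactly the point the comic makes about getting sequence, timing and location right.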
  5. Digital twins are not just a useful resource for understanding the here-and-now of built assets. If an asset changes condition or position over its lifecycle, historical data from remote sensors can make this change visible to asset managers through a digital twin. However, this means retaining and managing a potentially much larger data set in order to capture value across the whole life of an asset. In this blog post, Dr Sakthy Selvakumaran, an expert in remote sensing and monitoring, tells us about the importance of curation in the processing of high-volume built environment data. There are many sources of data in the built environment, in increasing volumes and with increasing accessibility. They include sensors added to existing structures – such as wireless fatigue sensors mounted on ageing steel bridges – or sensors attached to vehicles that use the assets. Sources also include sensing systems, such as fibre optics embedded in new structures, that help us understand capacity over the whole life of the asset. Even data not intended for the built environment can provide useful information; social media posts, geo-tagged photos and GPS from mobile phones can tell us about dynamic behaviours of assets in use.
Remote sensing: a high-volume data resource
My research group works with another data source – remote sensing – which includes satellite acquisitions, drone surveys and laser monitoring. There have been dramatic improvements in the spatial, spectral, temporal and radiometric resolution of the data gathered by satellites, which are providing an increasing volume of data for studying structures at a global scale. While these techniques have historically been prohibitively expensive, the cost of remote sensing is dropping. For example, we have been able to access optical, radar and other forms of satellite data to track the dynamic behaviour of assets for free through the open access policy of the European Space Agency (ESA). The ESA Sentinel programme’s constellation of satellites fly over assets, bouncing radar off them and generating precise geospatial measurements every six days as they orbit the Earth. This growing data resource – not only of current data but of historical data – can help asset owners track changes in the position of their asset over its whole life. This process can even catch subsidence and other small positional shifts that may point to the need for maintenance, risk of structural instability, and other vital information, without the expense of embedding sensors in assets, particularly where they are difficult to access.
Data curation
One of the key insights I have gained in my work with the University of Cambridge’s Centre for Smart Infrastructure and Construction (CSIC) is that data curation is essential to capture the value from remote sensing and other data collection methods. High volumes of data are generated during the construction and operational management of assets. However, this data is often looked at only once before being deleted or archived, where it often becomes obsolete or inaccessible. This means that we are not getting the optimal financial return on our investment in that data, nor are we capturing its value in the broader sense. Combining data from different sources or compiling historical data can generate a lot of value, but the value is dependent on how it is stored and managed. Correct descriptions, security protocols and interoperability are important technical enablers.
Social enablers include a culture of interdisciplinary collaboration, a common vision, and an understanding of the whole lifecycle of data. The crucial element that ensures we secure value from data is the consideration of how we store, structure and clean the data. We should be asking ourselves key questions as we develop data management processes, such as: ‘How will it stay up to date?’ ‘How will we ensure its quality?’ and ‘Who is responsible for managing it?’
Interoperability and standardisation
The more high-volume data sources are used to monitor the built environment, the more important it is that we curate our data to common standards – without these, we won’t even be able to compare apples with apples. For example, sometimes when I have compared data from different satellite providers, the same assets have different co-ordinates depending on the source of the data. As with manual ground surveying, remote measurements can be made relative to different points, many of which assume (rightly or wrongly) a stationary reference point. Aligning our standards, especially for geospatial and time data, would enable researchers and practitioners to cross-check the accuracy of data from different sources, and give asset managers access to a broader picture of the performance of their assets.
Automated processing
The ever-increasing quantity of data prohibits manual analysis by human operators beyond the most basic tasks. Therefore, the only way to enable data processing at this large scale is automation, fusing remote sensing data analysis with domain-specific contextual understanding. This is especially true when monitoring dynamic urban environments, and the potential risks and hazards in these contexts. Failure to react quickly is tantamount to not reacting at all, so automated processing enables asset owners to make timely changes to improve the resilience of their assets. Much more research and development is needed to increase the availability and reliability of automated data curation in this space. If we fail to curate and manage data about our assets, then we fail to recognise and extract value from it. Without good data curation, we won’t be able to develop digital twins that provide the added value of insights across the whole life of assets. Data management forms the basis for connected digital twins, big data analysis, models, data mining and other activities, which then provide the opportunity for further insights and better decisions, creating value for researchers, asset owners and the public alike. You can read more from the Satellites project by visiting their research profile. This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF). For more on the Digital Twin Journeys projects, visit the project's homepage on the CDBB website.
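As an illustration of the curation questions raised above (how will it stay up to date, how will we ensure its quality, who is responsible for it), a dataset record might carry that information alongside the data itself. The sketch below is hypothetical; the field names and values are assumptions, not a CSIC or ESA schema.

```python
# A sketch of the kind of curation metadata the post argues should accompany
# every high-volume dataset. Field names and values are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CuratedDataset:
    name: str
    source: str                    # e.g. "ESA Sentinel-1 radar acquisitions"
    spatial_reference: str         # a common standard, so sources can be compared
    time_reference: str
    steward: str                   # who is responsible for managing it?
    update_cadence_days: int       # how will it stay up to date?
    quality_checks: list[str] = field(default_factory=list)  # how is quality assured?
    last_validated: date | None = None

bridge_displacements = CuratedDataset(
    name="bridge_displacement_timeseries",
    source="ESA Sentinel-1 radar acquisitions",
    spatial_reference="EPSG:4326",
    time_reference="UTC",
    steward="asset-data@example.org",
    update_cadence_days=6,          # Sentinel revisit interval mentioned in the post
    quality_checks=["coherence threshold", "reference-point stability"],
)
```

Recording the spatial and time reference explicitly is what makes it possible to cross-check measurements of the same asset from different providers, the "apples with apples" problem described above.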
  6. Our latest output from the Digital Twin Journeys series is a webcomic by David Sheppard. 'Now We Know' tells the story of a fictional building manager, Hank, who isn't sure how a building digital twin can help him in his work when the existing building management system tells him what he thinks he needs to know. This same tension plays out around real-world digital twin development, as advocates point to the promise of perfect, right-time information to make better decisions, while others remain unconvinced of the value that digital twins can add. As the West Cambridge Digital Twin research team developed a prototype digital twin, they encountered this barrier, and found that working with the building manager as an expert partner to co-develop digital twin capability was the most effective approach. While they grounded iterations of the prototype in the building managers' present needs, they were also able to present the potential capability of the digital twin in ways that demonstrated its value. This is mirrored in the fictional narrative of the comic in the consultation between the Cambridge Digital Twin Team expert and the building manager, Hank. Involving end users, like building occupants and managers, in the design and development of digital twins will ensure that they meet real-world information needs. Both people and data bring value to the whole-life management of assets. Many uncertainties exist in the built environment, and in many cases where purely data-driven solutions get into trouble (e.g. through poor data curation or low data quality), expertise from asset managers can bolster automated and data-driven solutions. Therefore, incorporating the knowledge and expertise of frontline managers is crucial to good decision-making in building operations. The benefits of this hybrid approach work in the other direction as well. While the knowledge developed by building managers is often lost when people move on from the role, the digital twin enables the curation of data over time, making it possible to operate buildings beyond the tenure of individual staff members based on quality data. At present, the knowledge of experienced asset managers, in combination with existing building information, is greater than the insights that early-stage digital twins can offer. But that does not mean that the promise of digital twins is a false one. It simply means that there is still a long way to go to realise the vision of right-time, predictive information portrayed in the comic. Digital twin prototypes should be developed in partnership with these experienced stakeholders. You can read more from the West Cambridge Digital Twin project by visiting their research profile, and find out about more Digital Twin Journeys on the project's homepage. This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
  7. Digital twins enable asset owners to use better information at the right time to make better decisions. Exploring the early stages of a digital twin journey – understanding the information need – are Staffordshire Bridges researcher Dr Farhad Huseynov and Head of Information Management Henry Fenby-Taylor. Network Rail manages over 28,000 bridges, with many being more than 150 years old. The primary means of evaluating the condition of the bridges is through two assessment programmes: visual examination and Strength Capability Assessment. Every conceivable form of bridge construction is represented across Network Rail’s portfolio of assets, from simple stone slabs to large estuary crossings, such as the Forth Bridge. Managing a portfolio of this diversity with frequent and extensive assessments is a considerable challenge.
Condition monitoring
The current process for condition monitoring involves visual examination by engineers and takes place every year, along with a more detailed examination every six years. The visual inspection provides a qualitative outcome and does not directly predict the bridge strength; it is conducted to keep a detailed record of visible changes that may indicate deterioration. The load-carrying capacity of bridges is evaluated every five years through a Strength Capability Assessment, conducted at three levels of detail:
  • Level 1 is the simplest, using safety assumptions known to be conservatively over-cautious (i.e. 1-dimensional structural idealisation).
  • Level 2 involves refined analysis and better structural idealisation (i.e. a grillage model). This level may also include the use of data on material strength based on recent material tests.
  • Level 3 is the most sophisticated level of assessment, requiring bridge-specific traffic loading information based on a statistical model of the known traffic.
Understanding the information and insights that asset owners require helps shape what data is needed and how frequently it should be collected – two essential factors in creating infrastructure that is genuinely smart. During discussions with Network Rail, the research team found that Level 3 assessment is only used in exceptional circumstances. This is because there is no active live train load monitoring system on the network; hence there is no site-specific traffic loading information available for the majority of bridges. Instead, bridges failing Level 2 assessment are typically put under weight and/or speed restrictions, reducing their ability to contribute to the network. This means that there is potentially huge value in providing Level 3 assessment at key sites with greater frequency.
Digital twins for condition assessment
The Stafford Area Improvement Programme was set up to remove a bottleneck in the West Coast Main Line that resulted in high-speed trains being impeded by slower local passenger and goods trains. To increase network capacity and efficiency, a major upgrade of the line was undertaken, including the construction of 10 new bridges. Working with Atkins, Laing O’Rourke, Volker Rail and Network Rail, a research team including the Centre for Smart Infrastructure and Construction (CSIC), the Centre for Digital Built Britain (CDBB) and the Laing O’Rourke (LOR) Centre for Construction Engineering and Technology at the University of Cambridge is collaborating to find a digital twin solution for effective condition monitoring.
Two bridges in the scheme were built with a variety of different sensors to create a prototype that would enable the team to understand their condition, performance and utilisation. Both bridges were densely instrumented with fibre optic sensors during construction, enabling the creation of a digital twin of the bridges in use. The digital twin’s objective is to provide an effective condition monitoring tool for asset and route managers, using the sensor array to generate data and derive insights.
Identifying challenges and solutions
Meetings were held with key stakeholders, including route managers and infrastructure engineers at Network Rail, to learn the main challenges they face in maintaining their bridge stock, and to discover what information they would ideally like to obtain from an effective condition monitoring tool. The team liaised closely with these key stakeholders throughout to make sure that they were developing valuable insights. Through discussions with Network Rail about the team’s work on the two instrumented bridges in the Staffordshire Bridges project, the following fundamental issues and expected outcomes were identified:
  • A better understanding of asset risks: how can these be predicted, and what precursors can be measured and detected?
  • A better understanding of individual asset behaviour
  • Development of sensor technology with a lifespan and maintenance requirement congruent with the assets that they are monitoring
  • How structural capability can be calculated instantly on the receipt of new data from the field
  • Development of a holistic system for the overall health monitoring and prognosis of structural assets
  • Realistic traffic population data in the UK railway network (can this be predicted with sufficient accuracy for freight control and monitoring purposes?)
To address these issues, the team instrumented one of the bridges with the following additional sensors, which, combined, produce a rich dataset:
  • Rangefinder sensors to obtain the axle locations
  • A humidity and temperature sensor to improve the accuracy of weight predictions against variations in ambient temperature
  • Accelerometers to calculate rotational restraints at the boundary conditions and therefore improve the accuracy of weight predictions
  • Cameras to categorise passing trains
Data from these sensors feeds into a finite element structural analysis digital twin that interprets the data and provides a range of insights about the performance of the bridge and the actual strain it has been put under (a simplified sketch of how such sensor streams might be combined appears after this result).
Applying insights to other bridges
Significantly, information from the instrumented bridge sites is relevant to adjacent bridges on the same line. Having one bridge instrumented on a specific route would enable Level 3 assessment for other structures in Network Rail’s portfolio and those of other asset owners, including retaining walls, culverts and other associated structures. Just as the new bridges relieved a service bottleneck, digital twins can resolve procedural and resource bottlenecks by enabling insights to be drawn about the condition of other assets that weren’t instrumented. This is a valuable insight for those developing their own digital twins: if one bridge on a stretch of track is instrumented and trains cannot divert course, then every other bridge along that same stretch experiences the same loading from the same trains. This insight will enable teams implementing sensors to roll out a sensor network efficiently across their own assets.
One of the outcomes of the Staffordshire Bridges project is development towards a holistic approach for the overall health monitoring and prognosis of bridge stocks. Such changes improve workforce safety by reducing the requirement for costly site visits while maintaining a healthy bridge network. You can read more from the Staffordshire Bridges project by visiting their research profile. This research forms part of the Centre for Digital Built Britain’s (CDBB) work at the University of Cambridge. It was enabled by the Construction Innovation Hub, of which CDBB is a core partner, and funded by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF). To keep up with the Digital Twin Journeys project, check out the Digital Twin Journeys home page.
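The Staffordshire Bridges twin interprets its sensor data through a finite element model. The toy sketch below only illustrates, in simplified form, how readings from the additional sensors listed above might be combined into a derived quantity such as an axle load estimate; the calibration constants and the correction formula are made-up assumptions, not the project's method.

```python
# Illustrative sketch only: a toy combination of the sensor readings described
# above (strain peaks, axle positions, temperature, train category) into
# temperature-compensated axle load estimates. Constants are hypothetical.
from dataclasses import dataclass

@dataclass
class TrainPassing:
    peak_strains: list[float]      # fibre optic strain peaks, one per axle (microstrain)
    axle_positions_m: list[float]  # from the rangefinder sensors
    deck_temperature_c: float      # from the humidity/temperature sensor
    train_category: str            # from the camera classification

STRAIN_TO_TONNES = 0.85   # hypothetical calibration factor
TEMP_COEFF = 0.002        # hypothetical strain drift per degree C
REFERENCE_TEMP_C = 10.0

def estimate_axle_loads(p: TrainPassing) -> list[float]:
    """Temperature-compensate each strain peak and convert it to an axle load."""
    correction = 1.0 + TEMP_COEFF * (p.deck_temperature_c - REFERENCE_TEMP_C)
    return [STRAIN_TO_TONNES * s / correction for s in p.peak_strains]

passing = TrainPassing(
    peak_strains=[21.0, 20.5, 19.8, 20.1],
    axle_positions_m=[0.0, 2.6, 14.2, 16.8],
    deck_temperature_c=18.0,
    train_category="passenger",
)
print(estimate_axle_loads(passing))   # rough per-axle loads in tonnes
```

Estimates like these, accumulated over every passing train, are what would give the site-specific traffic loading statistics that a Level 3 assessment needs.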
  8. The Open Geospatial Consortium, an open standards consortium with an experimental innovation arm, invites digital twin enthusiasts to evaluate the use of APIs and web services to connect to a variety of information resources in the built environment. https://www.ogc.org/projects/initiatives/idbepilot At this stage, we are gathering use cases, ideas and datasets, and establishing the requirements for a four-month pilot. Use cases may include building condition assessments across larger portfolios and evaluating building occupancy under certain constraints such as social distancing. The response period ends 30 September 2021.
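For anyone considering a response, connecting to such resources typically means a few HTTP calls against a standard interface. The sketch below assumes a hypothetical server and collection name and follows the OGC API - Features pattern of requesting /collections/{id}/items as GeoJSON; the property names in the loop are also assumptions.

```python
# Minimal sketch of querying a built environment data service through a
# standard web API, in the spirit of the OGC pilot described above.
# The server URL, collection name and property names are hypothetical.
import requests

BASE_URL = "https://example.org/ogcapi"           # hypothetical endpoint
COLLECTION = "building-condition-assessments"     # hypothetical collection

resp = requests.get(
    f"{BASE_URL}/collections/{COLLECTION}/items",
    params={"limit": 10, "f": "json"},
    timeout=30,
)
resp.raise_for_status()

for feature in resp.json().get("features", []):
    props = feature.get("properties", {})
    print(props.get("building_id"), props.get("condition_rating"))
```

Because the interface is standardised, the same client code can be pointed at any conformant server, which is the interoperability argument behind using open APIs to connect digital twins to information resources.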
  9. Digital Twins are a way of getting better insights about the assets you commission, design, construct or manage, enhancing their performance and the outcomes for people who use them. But how do you get started creating one? What are the questions you need to ask yourself and potential challenges you’ll face? What lessons have been learned that may have slipped through the cracks of academic papers and published case studies? ‘Digital Twin Journeys’ will present lessons learned by our researchers in digital twin projects enabled by the Construction Innovation Hub. Culminating in a report for industry professionals who are involved with developing their first digital twins (March 2022), this series of outputs will highlight in various engaging formats many of the processes, decisions and insights our researchers have explored during their own digital twin journeys. Hear from the researchers themselves about how they have developed digital twin processes and tools, and the key themes that run through their projects. The outputs will be shared on the DT Hub blog, and will be collated on a dedicated page on the CDBB website. But first, we want to hear from you! Let us know in the comments what you still want to know about the process of developing digital twins.
  10. A4I round 6 launches tomorrow, 29/07/2021. This funding is aimed at SMEs looking to solve analysis or measurement problems. Below are some example ideas which might be eligible for A4I funding and relevant to Digital Twin development:
  • Collection of real-time data
  • Accessing new sensing technologies, analytical tools & methodologies for input into Digital Twins
  • Data analysis techniques
  • Developing new analytical techniques or systems to improve existing Digital Twins, e.g. data quality verification, or generating new insights using AI
  • Measurement of Digital Twin performance
Note that this is a fast-tracked funding round, so please pay close attention to the closing dates. Link to the full information on the A4I funding: https://apply-for-innovation-funding.service.gov.uk/competition/975/overview For projects requiring Hartree Centre capabilities (AI, Data Science, HPC) you can also contact me directly to discuss the project and funding submission process. Examples of previous A4I projects: https://www.a4i.info/a4i_case_studies/data-performance-consultancy-limited/?bpage=1 https://www.a4i.info/a4i_case_studies/riskaware/?bpage=1
  11. Dave Murray

    Test Engineering and DTs

    I am considering starting a network for topics related to Lifecycle V&V (Validation and Verification) centred on Evaluation and Testing, and this message is to poll the level of potential interest. I imagine the network would offer the following:
  • A place for Test Engineers from different market sectors to share experiences and gain knowledge
  • Support for areas where DT activity is low but growing (the Defence Sector is an example), so they can benefit from the experiences of other sectors
    Test Engineers have a mix of technical and customer skills that are central to successful project implementation. The DT concept provides a lifecycle project thread that gives Test Engineers an unprecedented opportunity to exercise their skills. Perhaps finding a way to maximise this opportunity might also attract more people to the career, and be a way to improve recruitment into the world of Engineering? If we launch this network, would you consider joining it? Dave Murray