

AI in the Street City Observatories

Starting with a simple question, "what does responsible AI look like from the street?", AI-in-the-street teams are undertaking creative participatory research in five cities in the UK and Australia: London, Edinburgh, Coventry, Cambridge and Logan. These research-based interventions take the form of diagramming workshops, sensing walks and street-based activities, and will inform the scoping of a prototype "street-level observatory" for everyday AI: a digital showcase and protocol for rendering the presence, role and effects of AI-based technologies visible and/or tangible for everyday publics in the street.

Edinburgh


Amazon pick-up point, Edinburgh

The Edinburgh observatory is concerned with the role that place plays in people's views of AI. It will focus on public understandings and needs tied to AI in a specific area/neighbourhood in Edinburgh, namely Leith Walk. It is anticipated that the Edinburgh observatory will be particularly concerned with technologies that observe and enumerate citizenship and that use this data in civic planning and decision making.

The Edinburgh observatory intends to trial participatory methods to engage a public audience and gather their views of AI in the public realm. This work will offer insights into the efficacy and value of the participatory methods and provide some preliminary insights into public understandings of AI.

Cambridge


Level crossing on High Street, Cherry Hinton, Cambridge by J. Thomas

The Cambridge Street Observatory identifies the crossing as a physical and metaphysical site of inquiry for analysing how the technical sensing infrastructures of connected and automated mobility systems make sense of human mobilities. Our observatory identifies 'sitpoints' at crossings to analyse the different kinds of data that delivery robots, scooters, cyclists, autonomous vehicles, and people moving at different speeds require; this analysis will inform how we assess whether 'AI' constitutes a responsible application of technology in shared public spaces.

In May 2024, the Cambridge team will run three Access Data Walks with members of the greater Cambridgeshire partnership, a network of local councils; the disabled community; and people who work in delivering connected and automated mobility systems and services. The walks will engage with a network of low-resolution cameras and data-sensing infrastructure that collects data to build models predicting traffic flows towards a number of outcomes including, of relevance to us, pedestrians crossing the street. The purpose of this AI is to generate a 'data corridor' in which lights at the crossing change on demand, in response to pedestrian needs.

London


3D rendering of The Science Gallery, London

The London Observatory will be delivered by the artist studio Ambient Information Systems, building on their smartphone-based audio-guided walk around Westminster [Here, Hear To See, Manu Luksch & Mukul Patel 2022], which narrated the role of algorithms in UK society and politics. For this street AI observatory activity, Luksch and Patel, in collaboration with Yasmine Boudiaf, will continue their enquiry into existing and potential deployment of ‘AI’ and algorithmic decision-making systems in public space. Their focus will be on mobility and street infrastructure (including computational and distributed systems) at two locations – The Science Gallery (KCL) and Broadway Market, Hackney. There will be public workshops at each site, in which participants will be led through a series of exploratory, creative exercises to interrogate how they, as street users, interact with each other and street infrastructure, and to envision how these interactions might be differently designed. Research findings and workshop documentation will be shared with AI in the street project partners, and a report and analytic summary will be published.

Coventry


Automated mobility test environment featured in West Midlands Future Mobility documentation (2020)

In Coventry, we will build on a previous City of Culture engagement project to curate participatory "listening walks" that explore what composes intelligent mobility test environments from the standpoint of the street. We will use the locative Unheard City app (Emsley, 2020), which listens for machine signals – the Wi-Fi and Bluetooth signals that digital networked devices emit – in the street to create place-based engagement. Working in collaboration with the Coventry-based company of artists, Talking Birds, and in partnership with the regional mobility innovation leads, Transport for West Midlands, the intervention will be structured around specific questions to make intelligent mobility infrastructures in the city visible to local participants: who/what is observed by who/what? What is and could be the role of AI-based technologies in the transformation of the city into a space for mobility innovation and sustainability transitions? The aim is to render explorable for both everyday publics and regional partners how AI innovation creates transformative relations between engineering, governance and everyday life.

Logan


Flargo GmbH VTOL Heavy-Lift Drone

The international trial in Logan, Australia, will focus on developing an observatory for "streets in the sky." It is part of a wider project led by Thao Phan investigating the social and environmental impacts of commercial drone delivery testing in Australian communities, and will extend the concept of the street upward to include the spaces above (and not just below) the built environment. Working with community-based visual artist and filmmaker Sari Braithwaite, Phan will invite everyday publics who live under drone flight paths to submit images, videos, and anecdotes of drone activity in their neighbourhoods, which will then be curated into online exhibitions of everyday encounters with drones-in-the-wild. By utilising creative and participatory methods, our aim is to create a sensory archive that captures the experience of everyday life with these new technologies, documenting the diverse and unexpected encounters between drones and people, animals (pets and wildlife), city infrastructure, and the wider social and natural environment. In doing so, it will make visible the contest over urban skies, shedding light on negotiations between local councils, national regulators, and the corporate entities introducing street-based AI.