Concurrently, technologies that augment human sensation, such as virtual reality and augmented reality, are becoming mainstream, and as they do, they extend and focus the perceptual practice of everyday urban life. We see these vectors — machine sensing and augmented sensation — as correspondent to and convergent with one another. Through machine sensing, the surfaces of the city are made more vital, responding to light, touch, and motion in new ways; through augmented sensation, the living inhabitants’ sensory apparatuses are infused with new layers of hot and cool stimuli. There is an urbanism to be found in the hatched membrane between the two.
We will be working with biosensing, 360° video, 3D scanning, virtual reality, and augmented reality, studying how these sensory systems “read” the world. At stake is both how we sense the city and how the city senses us, and itself, and thus how each “makes sense” of the other in turn. For each of these technologies, our interest is not urban administration per se, but in extending what can be experienced and modeled.