
Pulse V2 (2020)

article⁄Pulse V2 (2020)
contributor⁄
abstract⁄Pulse v2 is an interactive installation designed to investigate how real-time lidar data can be used to develop new spatial relationships between people and an autonomous digital agent through dynamic visual expressions. The first iteration of this research, Pulse v1, used a single-point lidar with a 160° FOV in conjunction with 240 servo-actuated antennas that visualized the position and movement of visitors via their vibrations. This second iteration blends digital and physical materiality to create a synthetic organism that fully integrates sensing, computation, and response into its form. Simultaneously, the raw data feed it ‘sees’ is projected onto the wall in real time, allowing visitors to experience both the response and the logic behind it. The data feed is supplied by a 360° FOV, 2D lidar scanner. This type of scanner is typically used by small autonomous robots to map and navigate their environments. In this installation, however, the relationship is inverted to allow a stationary agent to respond to a dynamically changing environment. The sensor is mounted under the displays and provides a real-time slice of the space at a height of 20 cm. An algorithm filters this data stream into trackable blobs by recognizing people via their ankles. The agent analyzes this stream of data and filters it through a series of micro- and macro-expressions that play out on the screen in the form of a digital microorganism.
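The ankle-tracking step described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the installation's actual code: the function names, the range and gap thresholds, and the clustering strategy are all assumptions. A 2D lidar returns an ordered sweep of (angle, distance) readings; converting these to Cartesian points and splitting the sweep wherever consecutive points jump apart yields blobs, whose centroids can serve as trackable ankle positions.

```python
import math

def lidar_to_points(scan, max_range=6.0):
    """Convert (angle_deg, distance_m) lidar readings to (x, y) points,
    discarding zero/out-of-range returns. max_range is an assumed cutoff."""
    return [
        (d * math.cos(math.radians(a)), d * math.sin(math.radians(a)))
        for a, d in scan
        if 0.0 < d < max_range
    ]

def cluster_blobs(points, gap=0.15):
    """Group consecutive sweep points into blobs: a new blob starts whenever
    the distance to the previous point exceeds the gap threshold (metres)."""
    blobs = []
    for p in points:
        if blobs and math.dist(blobs[-1][-1], p) <= gap:
            blobs[-1].append(p)
        else:
            blobs.append([p])
    return blobs

def centroids(blobs, min_points=3):
    """Reduce each sufficiently large blob (e.g. an ankle-sized cluster of
    returns) to its centroid, dropping single-point noise."""
    out = []
    for b in blobs:
        if len(b) >= min_points:
            xs, ys = zip(*b)
            out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return out
```

Feeding each new sweep through this pipeline gives a per-frame list of blob centroids; associating centroids across frames (e.g. by nearest neighbour) would turn them into the continuous tracks the agent responds to.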
keywords⁄2020, archive-note-no-tags
Year 2020
Authors Puckett, Nick.
Issue ACADIA 2020: Distributed Proximities / Volume II: Projects
Pages 188-191
Library link N/A
Entry filename pulse-v2