
Digital Derive. Reconstructing urban environments based on human experience (2018)

article⁄Digital Derive. Reconstructing urban environments based on human experience (2018)
abstract⁄This paper describes a novel method for reconstructing urban environments based on individual occupant experience. The method relies on a low-cost, off-the-shelf 360-degree camera to capture video and audio data from a natural walk through the city. It then uses a custom workflow based on an open-source Structure from Motion (SfM) library to reconstruct a dense point cloud from images extracted from the 360-degree video. The point cloud and audio data are then represented within a virtual reality (VR) model, creating a multisensory environment that immerses the viewer in the subjective experience of the occupant. This work questions the role of precision and fidelity in our experience and representation of a ‘real’ physical environment. On the one hand, the resulting VR environment is less complete and has lower fidelity than digital environments created through traditional modeling and rendering workflows. On the other hand, because each point in the point cloud is literally sampled from the actual environment, the resulting model also captures more of the noise and imprecision that characterizes our world. The result is an uncanny immersive experience that is less precise than traditional digital environments, yet represents many more of the unique physical characteristics that define our urban experiences.
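The workflow summarized above begins by extracting still frames from the 360-degree walk video before handing them to the SfM library. A minimal sketch of that subsampling step is shown below; the function name, parameters, and target sampling rate are illustrative assumptions, not the authors' actual code. Dense video (e.g. 30 fps) provides far more image overlap than SfM needs, so a walk is typically thinned to a few frames per second.

```python
def sample_frame_indices(video_fps: float, duration_s: float,
                         target_fps: float) -> list[int]:
    """Pick evenly spaced frame indices from a video for SfM input.

    video_fps  -- capture rate of the source video (e.g. 30.0)
    duration_s -- length of the walk video in seconds
    target_fps -- desired sampling rate for reconstruction (e.g. 2.0);
                  this value is an assumption for illustration only
    """
    # Step between kept frames, never less than 1 (keep every frame).
    step = max(1, round(video_fps / target_fps))
    total_frames = int(video_fps * duration_s)
    return list(range(0, total_frames, step))


# Example: a 10-second clip at 30 fps, thinned to ~2 frames per second.
indices = sample_frame_indices(30.0, 10.0, 2.0)
```

The selected indices would then be used to seek and export individual frames (e.g. with a video-decoding library) as the image set for point-cloud reconstruction.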
keywords⁄full paper, urban design-analysis, representation + perception, interactive simulations, virtual reality, 2018
Year 2018
Authors Nagy, Danil; Stoddart, Jim; Villaggi, Lorenzo; Burger, Shane; Benjamin, David.
Issue ACADIA 2018: Recalibration. On imprecision and infidelity.
Pages 72-81
Library link N/A
Entry filename digital-derive-reconstructing-urban-environmentsbased-on