2014
DOI: 10.52842/conf.acadia.2014.389
This is not a Glitch: Algorithms and Anomalies in Google Architecture

Abstract: Big Data is increasingly deployed for the capturing, mapping, and analysis of the built environment. These layers of information are used to produce highly convincing representations of the built environment. This paper explores how these processes, specifically those deployed by Google Earth, are used to translate data into simulated environments, with the goal of better understanding how designers might begin to produce physical constructs that provide resistance to accurate image capture and simulation. We f…

Cited by 2 publications (2 citation statements) · References: 0 publications
“…Despite its best efforts to "iron out the creases," turbulence is nowhere more evident than in the visual anomalies of Google Earth (figure 2). As Johnson and Parker (2014) demonstrate, the visual anomalies of Google Earth are not glitches but the algorithms exposing themselves from behind a veil of anonymity. These digital artifacts are produced through architecture's attempt to present its outside (façade) to a class of algorithmic observers that have been tasked with sensing and making sense of the built environment.…”
Section: Le Corbusier's Techno-dematerialization
confidence: 99%
“…Here, we propose an approach that leverages computational tools for vision, sensing and recombining the data associated with architectural production, towards a workflow that might begin to engage the age of Ubiquitous Simultaneity (Johnson and Parker 2016), which is allowing for an unprecedented amount of visual, environmental, and regulatory data to be gathered and processed. This research builds on previous explorations into the potential for computer vision and Scale-Invariant Feature Transform (SIFT) algorithms to be deployed for the production of two-dimensional images and introduces into the workflow techniques for three-dimensional form finding and the embedding of bias for the purpose of innovation (Johnson and Parker 2014). It builds out an approach that speculates about the nature of design as a profession that must begin to leverage massive repositories of historical and speculative design strategies and artifacts in order to better understand both past tendencies and future directions.…”
Section: Introduction
confidence: 99%
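The citation statement above references Scale-Invariant Feature Transform (SIFT) algorithms in the cited workflow. As a hedged illustration only, and not the authors' actual implementation, the following sketches the difference-of-Gaussians stage that underlies SIFT-style keypoint detection: an image is blurred at several scales, adjacent blurs are subtracted, and local extrema of the responses become candidate keypoints. The function name `dog_keypoints`, the sigma ladder, and the threshold value are illustrative assumptions; only NumPy and SciPy are used.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def dog_keypoints(image, sigmas=(1.0, 1.6, 2.6, 4.1), threshold=0.05):
    """Detect candidate keypoints as local extrema of difference-of-
    Gaussians responses -- the first stage of SIFT-style detection.
    (Illustrative sketch; sigma ladder and threshold are assumptions.)
    """
    # Blur the image at each scale in the ladder.
    blurred = [gaussian_filter(image.astype(float), s) for s in sigmas]
    # Subtract adjacent scales to approximate a band-pass response.
    dogs = [b2 - b1 for b1, b2 in zip(blurred, blurred[1:])]

    keypoints = []
    for i, dog in enumerate(dogs):
        # Keep pixels that are strong local maxima or minima of the
        # response in their 3x3 neighborhood (border excluded).
        for y in range(1, dog.shape[0] - 1):
            for x in range(1, dog.shape[1] - 1):
                v = dog[y, x]
                if abs(v) < threshold:
                    continue
                patch = dog[y - 1:y + 2, x - 1:x + 2]
                if v == patch.max() or v == patch.min():
                    keypoints.append((y, x, sigmas[i]))
    return keypoints


# Usage sketch: a synthetic bright square yields keypoints near its center.
img = np.zeros((64, 64))
img[28:37, 28:37] = 1.0
kps = dog_keypoints(img)
```

A full SIFT pipeline would add orientation assignment and descriptor extraction on top of this detection stage; the sketch stops at candidate keypoints, which is the part relevant to "sensing" the built environment from imagery.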