From global search engines to local smart cities, from public health monitoring to personal self-tracking technologies, digital technologies continuously capture, process, and archive social, material, and affective information in the form of big data. Although the use of big data emerged from the human desire to acquire knowledge, master information, and eliminate human error in large-scale information management, it has become clear in recent years that big data technologies, and the archives of data they accrue, bring with them new and important uncertainties in the form of new biases, systemic errors, and, as a result, new ethical challenges that require urgent attention and analysis. This collaboratively written article outlines the conceptual framework of the Uncertain Archives research collective to show how cultural theories of the archive can be meaningfully applied to the empirical field of big data. More specifically, the article argues that this approach grounded in cultural theory can help future research attune to and address the uncertainties present in the storage and analysis of large amounts of information. By focusing on the notions of the unknown, error, and vulnerability, we reveal a set of distinct, albeit intertwined, configurations of archival uncertainty that emerge along with the phenomenon of big data use. We regard these configurations as central to understanding the conditions of the digitally networked data archives that are a crucial component of today's cultures of surveillance and governmentality.
Geolocation, an increasingly common technique in dating apps, is often portrayed as a way of configuring uncertainty that facilitates playful interaction with unknown strangers while avoiding subjecting the user to unwanted risks. Geolocation features are used in these apps on the one hand as matching techniques that create links between the user and potential partners through geographical location, and on the other as warranting techniques that can help a user determine whether to trust a given profile. Tracing a trajectory from Georg Simmel's figure of the stranger as intrinsic to modern urban culture, through Stanley Milgram's familiar stranger as an inspiration for the infrastructure of social networking sites, to a consideration of the double perspective of overview and embedment inherent in geolocation's ability to map, we identify the stalker as an emblematic figure that appears not as a threatening Other, but rather as our own doubling.
With slogans such as 'Tell the stories hidden in your data' (www.narrativescience.com) and 'From data to clear, insightful content: Wordsmith automatically generates narratives on a massive scale that sound like a person crafted each one' (www.automatedinsights.com), a series of companies currently market themselves on the ability to turn data into stories through Natural Language Generation (NLG) techniques. These services automate the process of data interpretation and knowledge production while at the same time hailing narrativity as a fundamental human ability of meaning-making. Reading both the marketing rhetoric and the functionality of the automated narrative services through narrative theory allows for a contextualization of the rhetoric flourishing in Big Data discourse. Building upon case material obtained from companies such as Arria NLG, Automated Insights, Narrativa, Narrative Science, and Yseop, this article argues that what might be seen as a 're-turn' of narrative as a form of knowledge production that can make sense of large data sets inscribes itself in, but also rearticulates, an ongoing debate about what narrative entails. Methodological considerations are thus raised on the one hand about the insights to be gained for critical data studies by turning to literary theory, and on the other hand about how automated technologies may inform our understanding of narrative as a faculty of human meaning-making.
At a lecture on the history of the book in May 2003 at the University of Cambridge, Jerome McGann, a vehement spokesman for the cultural significance of information technology, was asked why it is at all worthwhile for literary scholars to occupy themselves with digital technologies. Why this marveling at the possibilities of the new media? His answer emphasized the view that literary scholars have an obligation to apply their abilities of aesthetic analysis to new phenomena such as the interface, and to learn to take advantage of the possibilities that the new media offer their profession. Only by doing so is it possible to take part in the ongoing definition of what purposes these technologies serve. In contrast to this optimistic and affirmative attitude, we find polemical positions that are less eager to embrace the new media. In his most recent novella, Im Krebsgang, the Nobel Prize-winning author Günter Grass points to the dangers of the apparently ethically neutral spatial freedom of the internet. Grass claims that this neutrality blurs the sense of history and continuity compared to the more ethically coherent realm of literature in the traditional sense. Information technology and the ever greater impact it has on our everyday lives, right down to the metaphors we use, are thus greeted with an equal amount of optimism and pessimism, when they are taken seriously at all as an object of study outside small dedicated circles. As such, the reception of digital media today resembles the way in which photography and film were initially addressed with an equal amount of hope and fear, but above all conceptualized as less-profound expressions of popular culture well into the twentieth century.
Much of the distrust of information technology and digital narratives as an object of study for the humanities originates in the emergence of these new media coinciding with the prevalence of poststructuralist theory, which saw in the technology a practical proof of the assumption of a play of signifiers, fragmentation, multilinearity, and the death of the author. This meant that the future possibilities of the technology, rather than its actual realities, were highlighted; and once the first wave of euphoria had subsided, a humanistic approach to cyberspace and information technology was to a certain extent stigmatized as immaterial and lofty speculation. McGann's answer to the skeptical member of the audience in 2003 emphasizes the necessity of a serious attempt to make use of the actual possibilities that digital technologies provide. One should neither overemphasize the revolutionary potential of the new media nor shut one's mind to new possibilities. If successful, this balancing act renders it possible to create a coherent continuity in the study of information technology that aims at mapping out the ways in which media take part in shaping our lives, our values, and the way we relate to the world. To provide a starting point for this approach, I shall examine the way in which we relate to and navigate the space that com...