Monday, 16 December 2013

Antoinette Rouvroy: Algorithmic Governmentalities and the End(s) of Critique @ INC - October 2013

Antoinette Rouvroy, in her presentation titled “Algorithmic Governmentalities and the End(s) of Critique,” discussed issues surrounding search engines’ current focus on relationships between sites rather than content.


She began by explaining how modern academic knowledge goes through a series of critiques and peer reviews, while algorithmic knowledge is focused on more predictive aspects, never challenging people or content. An alternative definition of these changes could be summarized as “knowledge without truth.”
Rouvroy provided three examples of this paradigm shift by highlighting changes in knowledge production, modes of power, and human subjectivity. Knowledge production is a constantly accelerating and evolving process. Today, the vast amounts of raw data available make it difficult, if not impossible, to understand fully.
More and more of our knowledge production is being controlled by, or at least accessed through, machines and search engines. This flow of “signals without signs,” attempting to represent reality, falls short and creates an atmosphere of “significance without symbolism.” The idea that knowledge is no longer constructed, but merely found by Google and similar engines, has real consequences for humanity.
In this new world, quality is determined by relational infrastructures such as hyperlinks and keywords. When ranking pages, these relations are weighted much more heavily than the content itself, or the truth it may represent. While this system may seem extremely democratic, Rouvroy warns us of the implications of having ephemeral, algorithmic programs determine what we view as knowledge. One implication is the dwindling importance of the subject: as big data continues to grow and be analyzed at ever increasing rates, individuals lose forms of identity in order to be included in this knowledge system.
For Rouvroy, this loss of individuation and the weakening of critique are closely related. She argues there is worth in how older systems of knowledge, such as physical archives, allowed ideas to be categorized and then subsequently tested for accuracy. Today these checks on truth are ever more difficult to execute. Rouvroy ends by arguing that these new paradigms are “maybe” emancipatory and democratic, but are certainly multifaceted. All of this has produced the current state of human/digital interaction as a “multitude without alterity”: finding knowledge through search algorithms and engines that are difficult to fully understand.
  1. Antoinette Rouvroy says: 
    November 12th, 2013 at 12:09 pm
    Thank you very much for this. But, please read “multitude without alterity” rather than “multitude without authority”. Here are some (additional) words about what I mean by this.
    The new “truth regime”, evolving in real time, may appear “emancipatory” and “democratic” (with regard to “old” authorities, hierarchies and suspiciously rigid categories and measures), but the “subjects” it produces are “multitudes without alterity” [not without authority]. A profile being an aggregation of data originating from a multitude of separate individuals, one could easily believe that profiling would generate new processes of trans-individual, relational individuation, and new “objective” and democratic knowledge, in which alterity, otherness and disparities are, at last, acknowledged as fundamental resources both in knowledge production and in processes of individuation through relationships*.
    But as “users” (that is, essentially, as potential consumers or potential criminals), we are categorised according to what algorithms allow to be detected as our intentions, desires, propensities, risks, etc. This often happens at a pre-conscious stage (subliminal, Geert Lovink would say), before we have even had time to build, understand, reflect on, or revise what we consider to be our own intentions, desires, propensities, risks, etc. Knowledge about our potentialities or virtualities (as I call them**) is generated directly from raw data (at least in the hypothesis of machine learning), without our being allowed to speak for ourselves, or to interpret and revise the potentialities and virtualities that are automatically designed for us in advance. This is personalization without subjects: algorithms do not care about individual subjects; they merely care about “data” – which are nothing more than infra-individual, exploded fragments of daily existences – and about “patterns” or “profiles”, which are supra-individual aggregations of data collected in temporally and spatially heterogeneous contexts. By the way, this is one reason why legal data protection regimes, focusing on “personal data” (data relating to individually identified or identifiable persons), completely miss what is at stake with data mining and profiling.
    Yet, despite the fashionable complaints about the disappearance of the private sphere, I argue that profiling and personalisation result in a hypertrophy of individual private spheres, a privatization of the public space, and an immunisation of atomistic individuals against events of the world, including the encounter with alterity (each individual walks in the streets surrounded by his/her own “augmented” reality, where iPhones, iPods etc. function as personal(ized) immune systems against the unpredictability of events and encounters of the physical world). It is not merely that public space (defined as the time and place for collective encounters and deliberations) is gradually vanishing; it is that for algorithmic governmentality, which purports to model and design “the social” from within “the (digitized) social” itself, in “real time”, rather than from the outside, keeping a public space is no longer a necessity. “Algorithmic design” takes over the role of human coordination. No need for time-consuming encounters and discussions anymore. In any case, users’ available brain time must not be wasted on political discussions but rather directly oriented towards consumption through new digital market manipulation (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2309703).
    All this to say that algorithmic personalisation is neither a new way to encounter others (with the same tastes, interests, problems, etc.) nor a way to give due consideration to alterity in a socially and politically emancipatory way. Because it (pretends to) “guess us in advance”, it spares us not only the “burden”, but, more importantly, the chance, the occasion, and the pleasure of speaking to each other, of arguing against each other, of deciding under circumstances of uncertainty (this, together with assuming responsibility for our decisions, is what makes the “dignity” of a decision) or, as Michel Foucault would say, of exercising the courage of truth (http://pages.uoregon.edu/koopman/events_readings/cgc/foucault_CdF_84_CT_complete.pdf).
    Thank you everyone for the enjoyable colloquium, for the lively discussions, and for your patience in reading this all the way through! Hope to see you soon.
    Antoinette Rouvroy
    ——
    *http://works.bepress.com/antoinette_rouvroy/47/
    **A. Rouvroy, “Technology, Virtuality and Utopia: Governmentality in an Age of Autonomic Computing,” in Autonomic Computing and Transformations of Human Agency, ed. Mireille Hildebrandt and Antoinette Rouvroy. Routledge, 2011.
  2. Antoinette Rouvroy says: 
    November 13th, 2013 at 3:28 pm
    Well, it’s also “signals without signs” rather than “signs without signals”. If you invert the terms, the meaning is completely changed. A signal is quantifiable despite being meaningless, whereas a sign carries meaning (by resemblance to the thing it represents, by keeping the shape or physical trace of the thing it represents, or through convention). Could you please correct this? Thanks.

Gouvernementalité algorithmique et perspectives d'émancipation : le disparate comme condition d'individuation par la relation? 

Antoinette Rouvroy, National Fund for Scientific Research (FNRS) and Information Technology & Law Research Centre, University of Namur (CRID)
Thomas Berns, PHI / ULB

Abstract

Algorithmic governmentality is characterized in particular by the following double movement: a) the abandonment of any form of “scale”, “standard” or hierarchy, in favour of an immanent normativity evolving in real time, from which a “statistical double” of the world emerges and which seems to sweep away the old hierarchies drawn around the normal man or the average man; b) the avoidance of any confrontation with individuals, whose opportunities for subjectivation become rarefied. This double movement appears to us to be the result of contemporary statistics’ focus on relations. We attempt to assess the extent to which these two aspects of “algorithmic governmentality” thus outlined, relying as it does on relations alone, might be favourable, on the one hand, to processes of individuation through relation (Simondon) and, on the other, to the emergence of new forms of life through the overcoming of the plane of organization by the plane of immanence (Deleuze-Guattari). Through this confrontation with the main contemporary philosophies of relation, it then appears that any thinking of becoming and of processes of individuation through relation necessarily requires the “disparate” (a heterogeneity of orders of magnitude, a multiplicity of regimes of existence), which algorithmic governmentality precisely never ceases to stifle by closing the (digitized) real in on itself. Algorithmic governmentality tends instead to foreclose such prospects of emancipation by folding processes of individuation back onto the subjective monad.

Suggested Citation

Antoinette Rouvroy and Thomas Berns. “Gouvernementalité algorithmique et perspectives d'émancipation : le disparate comme condition d'individuation par la relation?” Politique des algorithmes. Les métriques du web, Réseaux, Vol. 31, no. 177, pp. 163-196. Ed. D. Cardon. La Découverte, 2013.
Available at: http://works.bepress.com/antoinette_rouvroy/47
