Making Sense of the Data-driven: SETUP’s Algorithmic History Museum and Its Relevance for Contemporary Reflection

Maranke Wieringa


What do experiments with electoral vote differentiation, the physical ‘well-being’ of slaves, and Catholics looking for a spouse have in common? According to SETUP’s Algoritmisch Historisch Museum, or ‘Algorithmic History Museum’, they all have a basis in algorithms.

While we tend to associate algorithms today with high-tech systems, quants, and machine learning, the term’s origins are much more modest. It derives from the Latinised name of the Persian mathematician الخوارزمی موسى بن محمد (Muḥammad ibn Mūsā al-Khwārizmī), ‘Algoritmi’, and the Greek αριθμός (arithmós), meaning ‘number’. Originally, Al-Khwārizmī’s treatises suggest, it referred to the solving of mathematical problems in a step-by-step fashion.1 The numbers he was concerned with at the time still matter today, as we now understand algorithms as sets of instructions fed into computers in order to solve particular problems; for instance, by performing calculations in specific ways.2
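
To make this older sense concrete, consider a minimal sketch – my own illustration, not part of the exhibition – of the completing-the-square procedure Al-Khwārizmī described for equations of the form x² + bx = c, written out as explicit steps in Python:

```python
import math

def solve_quadratic(b: float, c: float) -> float:
    """Solve x^2 + b*x = c by completing the square,
    following Al-Khwarizmi's step-by-step recipe;
    returns the positive root he was interested in."""
    half_b = b / 2              # step 1: halve the coefficient of x
    square = half_b ** 2        # step 2: square that half
    total = square + c          # step 3: add it to the constant term
    root = math.sqrt(total)     # step 4: take the square root
    return root - half_b        # step 5: subtract the half again

# Al-Khwarizmi's own worked example: x^2 + 10x = 39 yields x = 3.
print(solve_quadratic(10, 39))  # 3.0
```

Each line is one fixed, repeatable step – precisely the sense of ‘algorithm’ that predates the computer.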

The Algorithmic History Museum is a pop-up exhibition curated by Ellen Bijsterbosch and Casper de Jong, created in collaboration with artists Milou Backx, Sosha de Jong, and Sophie Pluim. It explores three fictional solutions to historical ‘problems’ – of the eighteenth, nineteenth, and twentieth centuries, respectively – that could have materialised, had mathematics been enlisted to solve them at the time. In doing so, it exposes some fundamentally flawed assumptions embedded within the fictitious algorithms. This in turn provides a much-needed impetus for critical reflection on their contemporary, non-fictional equivalents, within the context of our increasingly ‘datafied’ society.3 In what follows, I take a brief look at each of the three scenarios presented in the exhibit and the particular questions they raise about contemporary algorithmic practices.


A Few Exercises a Day Keep a Loss of Projected Revenue at Bay


The first of the ‘what-if’ scenarios the exhibit proposes is inspired by a narrative from the history of slavery. A major concern for slavers of the Dutch Republic was that a large portion of the enslaved people they captured died before arrival at their selling location, due to the terrible conditions aboard the vessels in which they were transported. In his thesis Noodige Onderrichtingen voor de Slaafhandelaaren (Necessary Instructions for Slave Traders), David Gallandat, a medical professional, made some proposals for remedying this ‘unfortunate loss of projected revenue.’4 In this work, he argued among other things that the enslaved would benefit from fresh air and regular exercise. Inspired by this work, SETUP introduces the fictional character of Marcus Ruysch, who builds on Gallandat’s hypotheses by creating an algorithm and a prototype bracelet that measures how much exercise an enslaved person gets in a day. This way, the slavers can maximise their profit, aided by a tool that helps monitor health and thus (indirectly) achieves higher survival rates.


Figure 1. Overview of the Algorithmic History Museum. Source: SETUP.

Although the exhibition depicts the fictional Ruysch device as an eighteenth-century version of the ‘Fitbit’, it does not directly address the wide range of parallels with our contemporary society this suggests. Equivalent uses of one’s personal health data, however, readily present themselves. Today, health insurance agencies – for instance – are already ‘encouraging’ their clients to wear health-tracking devices, allowing them to earn rewards if they share their data.5 The use of such activity trackers of course harbours a variety of dangers: there is the threat of data leaks, which raises privacy concerns; it enables data-driven discrimination, which raises another set of ethical concerns; and it even invites certain kinds of criminal activity, such as identity theft.6 On a more fundamental level, one can also question the ethics, specifically, of the kinds of financial coercion such encouragement involves.7 Research shows that many users are currently oblivious to the dangers of sharing activity data.8 While the enslaved in SETUP’s fictional narrative presumably had no choice in the matter, users today may willingly subject themselves to very similar regimes of control, furthering the profit of companies in exchange for a reduced insurance fee.


Figure 2. In the foreground, the Ruysch device is depicted. Source: SETUP.

All Voters Are Created Equal, But Some Are More Equal than Others


In its second section, the exhibit zooms in on universal women’s suffrage, as it played out in the Netherlands. In the nineteenth century, the expansion of voting rights was an important political issue. At the time, the basic electoral principle was that ‘those who pay, get to decide’. By implication, women were excluded from participation, as their economic standing was negligible; likewise, only a select portion of the male populace was granted the right to vote. SETUP tells the fictional story of a small experiment, conducted in the voting district of Appingedam. Here, votes are weighted on the basis of the voters’ occupations – or rather, the relative economic standing of their particular occupation (broadly conceived) at the time. This means that a notary’s vote, for instance, is worth 5.25 times a woman’s. This way, the exhibition hypothesises what could have happened had votes been weighted according to one’s economic standing.
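
To see how such a scheme would operate in practice, consider a minimal sketch of a weighted ballot count. Only the notary’s factor of 5.25 comes from the exhibition; the other occupations and weights are invented here purely for illustration:

```python
# Fictional weighted ballot, after the exhibit's Appingedam experiment.
# Only the notary's 5.25 is taken from the exhibition; the other
# occupations and weights are invented here for illustration.
VOTE_WEIGHTS = {
    "notary": 5.25,
    "shopkeeper": 2.0,   # hypothetical weight
    "labourer": 1.5,     # hypothetical weight
    "woman": 1.0,        # the baseline against which others are scaled
}

def tally(ballots: list[tuple[str, str]]) -> dict[str, float]:
    """Count weighted votes; each ballot is (occupation, candidate)."""
    totals: dict[str, float] = {}
    for occupation, candidate in ballots:
        weight = VOTE_WEIGHTS.get(occupation, 1.0)
        totals[candidate] = totals.get(candidate, 0.0) + weight
    return totals

# Three women voting for A are outweighed by one notary voting for B.
print(tally([("woman", "A"), ("woman", "A"), ("woman", "A"),
             ("notary", "B")]))   # {'A': 3.0, 'B': 5.25}
```

The toy example makes the flawed premise tangible: three women’s votes together are still outweighed by a single notary’s.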

The exhibition ponders whether women would have equal voting rights today, had the fictional experiment been widely implemented. While this starting point is interesting, the exhibit ignores that there are actually contemporary parallels to this historical situation which perhaps illustrate the case in a more pressing manner. Today, too, some voters – to borrow from Orwell – are ‘more equal’ than others, as a result of similar kinds of (implicit) value judgements. Consider for instance the American voting system, which makes it increasingly difficult to vote, especially for minorities.9 In Alabama, for instance, voters are required to register using a photo ID. However, at the time the relevant regulation came into effect, the state closed down 31 driver’s license-issuing locations.10 In an opinion piece on an Alabama news website, John Archibald noted that the counties in which the driver’s license offices were closed were predominantly populated by people of colour – a group already less likely to own a photo ID.11 And in countries where photo identification is mandatory – such as the Netherlands – some minorities are less likely to own one.12 Murillo writes that in North Carolina, legislative actions similar to those in Alabama point to ‘a clear racial bias’.13 Here, in the process of drafting a new voting law, racial data were sought out and then used to amend the law – which ultimately resulted in a discriminatory thwarting of ‘many African-American voters at the polls.’14 So, while all voters may in principle be equal, they encounter different entry barriers to voting procedures – barriers informed by demographic and voting data analysis and fuelled by racial prejudice.


Figure 3. Overview of the universal women’s suffrage portion of the exhibition. Source: SETUP.

… So What Does Your Father Do?


The final story that SETUP zooms in on centres on mid-twentieth-century Catholics unable to find adequate spouses in their own communities. In 1948, the Contactbureau Huwelijksbemiddeling Katholieken (Agency for Marriage Mediation for Catholics) was founded to address this problem. In the exhibit, the agency is renamed Rekenkundig Bureau voor Huwelijksbemiddeling (Mathematical Agency for Marriage Mediation). In the fictional situation sketched, the agency’s foundation capitalises on the success of the so-called ‘rekenbezegelaar’ (mathematical ratifier), a device that algorithmically matches Catholics looking for spouses based, among other things, on their favourite hymns, their fathers’ occupations, the dates of their first communions, and their stances on ‘onanism’ (in lay terms, masturbation) and ‘neo-Malthusianism’ (the limiting of population growth – a practice opposed by the Catholic church).
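
How might such a card-based matcher have worked? The following sketch is emphatically hypothetical: the fields echo the criteria just listed, but the weights and the scoring rule are my own assumptions, not the exhibit’s actual procedure:

```python
# Hypothetical reconstruction of the fictional 'rekenbezegelaar':
# data cards are compared field by field, and agreement on a field
# adds that field's weight to the pair's score. The fields follow
# the exhibit's criteria; the weights are invented assumptions.
FIELD_WEIGHTS = {
    "favourite_hymn": 1.0,
    "fathers_occupation": 2.0,
    "first_communion_date": 0.5,
    "stance_on_onanism": 3.0,
    "stance_on_neo_malthusianism": 3.0,
}

def match_score(card_a: dict, card_b: dict) -> float:
    """Sum the weights of all fields on which two cards agree."""
    return sum(weight
               for field, weight in FIELD_WEIGHTS.items()
               if field in card_a and card_a[field] == card_b.get(field))

def best_match(candidate: dict, pool: list[dict]) -> dict:
    """Return the card in the pool that agrees most with the candidate."""
    return max(pool, key=lambda other: match_score(candidate, other))
```

Even this toy version shows how the choice of fields and weights – why should a father’s occupation count double? – silently encodes a worldview, which is precisely the exhibit’s point.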


Figure 4. The data cards (front) of the candidates of the Rekenkundig Bureau voor Huwelijksbemiddeling, and their pictures (back). Source: SETUP.

While SETUP makes a cursory allusion to Tinder, the exhibition does not explicitly address contemporary dating applications. The matching criteria used by the rekenbezegelaar may seem ludicrous from a contemporary perspective. Yet at the same time, the exhibit’s proposition here puts into relief that today, we often do not know on which grounds our own online matches are made; arguably, however, SETUP could have underlined this more. Dating apps and platforms such as OkCupid, Tinder, Grindr, and the like collect a wealth of data about their users, but we do not know precisely how their matching algorithms work, or which variables feed them.15 In other words, the algorithms that inform the matches based on our data inputs are ‘black boxed’; the same also applies to the particular ‘data ecosystems’ in which they are embedded, that is, the connections and dependencies between the particular parties involved (for example, logging into Tinder via Facebook).16 Furthermore, we need to consider the experiments such platforms regularly conduct, in which particular metrics may be manipulated to measure their effects – which means that the metrics or information one is presented with may not even be ‘accurate’.17 The rekenbezegelaar, by opening the black box, makes us reflect on the choices our current dating apps might make – and how little we in fact know about them. All things considered, then, this fictitious device may just be much more straightforward about its matching algorithm than its present-day counterparts.


Reflecting on Algorithmic Bias


The Algorithmic History Museum deftly exposes the biases embedded in the three fictional algorithmic systems it presents, by embedding values that were taken for granted at the time into made-up systems designed to solve historical ‘problems’. While some of the examples SETUP draws upon may seem rather outdated, similar forms of bias and discrimination can be found in contemporary systems. Through its historicised, fictional analogies, SETUP provides a foundation for much-needed reflection in contemporary discourse. While the exhibit invites the visitor to think along similar lines when looking at contemporary phenomena, the parallels sketched could have been presented in a more overt manner, to be more accessible to those less literate in data practices.

Nevertheless, the most fundamental arguments of the exhibit do come across – for a broad audience as well as for those more versed in the complexities of data and its infrastructures. Algorithms, lying at the heart of much data processing, are permeated with norms and values that need not correspond with our own, and that hold the risk of privileging specific perspectives over others – in the process also perpetuating the unequal relations between them.18 The exhibit exposes such values as social constructions, heavily informed by racist, sexist, and other discriminatory discourses. As platforms, and their algorithms, increasingly quantify our every action, we need to critically assess whose values are embedded in them. The exhibit brilliantly provokes outrage in its audience at the fictitious historical systems; arguably, however, the implicit call to action in our contemporary reality could have been made more overt.

SETUP’s installation provides a foundation from which to start questioning the values and perspectives which are now seen as ‘natural’ and which inform contemporary algorithms – by exposing the distant social norms which could equally have been embedded in historical socio-technical systems. Thus, it attempts to bring awareness to the risk of algorithms becoming ‘weapons of math destruction’, which camouflage ‘poisonous assumptions […] and go largely untested and unquestioned.’19 The fictional eighteenth-century Ruysch device does not allow space for questioning whether the enslaved are people deserving of human rights: the device simply operates on the assumption that they belong to the category of ‘property’ – and can therefore be reduced to numbers that are crunched in order to establish the most cost-effective way to transport them. In the device, these perspectives are so integrated into the system – or ‘black-boxed’ – that it becomes increasingly difficult to interrogate its fundamental premises.

Dating apps, insurance companies, and governments are not the only ones employing various types of algorithms; such systems are in use throughout our social fabric. This lends urgency to the matter of understanding the potential ‘fatal flaws’ embedded in them. In order to think through contemporary bias, injustice, and unfairness in algorithms, however, we first need to create awareness of these issues. SETUP paves the way for the beginnings of such an awareness in its exhibition. Although some visitors confused fictional with actual narratives, the curator claims the exhibit achieved its goal. Visitors were shocked by the (potential) power of these algorithms, and a fictional demonstration can go a long way in convincing people of the need for ‘algorithmic accountability’. For if ‘code is law’ – as Lawrence Lessig famously stated, when noting that (judicial) values are embedded in socio-technical systems – then algorithms may just end up being judge, jury, and executioner.20 As SETUP underlines, it is time to critically interrogate algorithms, and to bring their creators and subjects both to the witness box, so they can attest to the profound influence that these algorithms have.


The Algorithmic History Museum was an installation created by SETUP. It was on display at the Dutch Design Week 2017 (21–29 October 2017, Eindhoven, the Netherlands).



Acknowledgements


The author wishes to thank SETUP for generously sharing its time to discuss the installation.

Notes



1.     Douglas Harper, “Algorithm,” Online Etymology Dictionary, 2017, https://www.etymonline.com/word/algorithm.

2.     Andrew Goffey, “Algorithm,” in Software Studies: A Lexicon, ed. Matthew Fuller (Cambridge/London: MIT Press, 2008), 16.

3.     Karin Van Es and Mirko Tobias Schäfer ed., The Datafied Society (Amsterdam: Amsterdam University Press, 2017).

4.     David Henri Gallandat, Noodige Onderrichtingen Voor de Slaafhandelaaren (Middelburg: Pieter Gillissen, 1769), online in digitised form at http://slavernij.eloweb.nl/literatuurlijst/bibliotheek/gallandat.pdf.

5.     For instance, Vitality, “Activity Tracking,” Vitality, 2017, https://www.vitality.co.uk/rewards/partners/activity-tracking/.

6.     Mario Barcena, Candid Wueest and Hon Lau, “How Safe Is Your Quantified Self?” (Mountain View, 2014).

7.     Deborah Lupton, “Health Promotion in the Digital Era: A Critical Commentary,” Health Promotion International 30, no. 1 (2015): 174–83, https://doi.org/10.1093/heapro/dau091; Parmy Olson and Aaron Tilley, “The Quantified Other: Nest And Fitbit Chase A Lucrative Side Business,” Forbes, 2014, https://www.forbes.com/sites/parmyolson/2014/04/17/the-quantified-other-nest-and-fitbit-chase-a-lucrative-side-business/#1b6e79282c8a.

8.     Florian Rheingans, Burhan Cikit and Claus-Peter H. Ernst, “The Potential Influence of Privacy Risk on Activity Tracker Usage: A Study,” in The Drivers of Wearable Device Usage, 2016, 25–35, https://doi.org/10.1007/978-3-319-30376-5.

9.     German Lopez, “7 Specific Ways States Made It Harder for Americans to Vote in 2016,” Vox, 2016, https://www.vox.com/policy-and-politics/2016/11/7/13545718/voter-suppression-early-voting-2016.

10.     Amy Erickson, “Selma to Selma: Modern Day Voter Discrimination in Alabama,” Law & Inequality 35, no. 75 (2017): 94.

11.     John Archibald, “Alabama Sends Message: We Are Too Broke to Care about Right and Wrong,” Al.com, 2015, http://www.al.com/opinion/index.ssf/2015/09/alabama_sends_message_we_are_t.html; Brennan Center for Justice, “Citizens Without Proof” (New York, 2006), http://www.brennancenter.org/sites/default/files/legacy/d/download_file_39242.pdf.

12.     Hayke Everwijn, Wouter Jongebreur, and Pieter Lolkema, “Het Functioneren van de WUID in de Praktijk” (Barneveld, 2009), 17.

13.     Matthew Murillo, “Did Voter Suppression Win President Trump the Election?: The Decimation of the Voting Rights Act and the Importance of Section 5,” University of San Francisco Law Review 51 (2017): 607.

14.     Murillo, “Did Voter Suppression”.

15.     On the topic of this collecting, see Judith Duportail, “I Asked Tinder for My Data. It Sent Me 800 Pages of My Deepest, Darkest Secrets,” The Guardian, 2017, https://www.theguardian.com/technology/2017/sep/26/tinder-personal-data-dating-app-messages-hacked-sold.

16.     On black-boxing, see Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information (Cambridge/London: Harvard University Press, 2015), 3; for these ecosystems, consult José Van Dijck, The Culture of Connectivity: A Critical History of Social Media (Oxford: Oxford University Press, 2013); José Van Dijck, Thomas Poell and Martijn De Waal, De Platformsamenleving: Strijd Om Publieke Waarden in Een Online Wereld (Amsterdam: Amsterdam University Press, 2016), 20–24, https://doi.org/10.5117/9789462984615.

17.     OkCupid, “We Experiment On Human Beings! (So Does Everyone Else.),” OkCupid, 2014, https://theblog.okcupid.com/we-experiment-on-human-beings-5dd9fe280cd5.

18.     See for instance Astrid Mager, “Algorithmic Ideology,” Information, Communication & Society 15, no. 5 (2012): 769–87, https://doi.org/10.1080/1369118X.2012.676056; Batya Friedman and Helen Nissenbaum, “Bias in Computer Systems,” ACM Transactions on Information Systems 14, no. 3 (1996): 330–347, https://doi.org/10.1145/249170.249184; Deborah G. Johnson and Helen Nissenbaum, Computers, Ethics & Social Values (Upper Saddle River: Prentice-Hall, 1995); Van Dijck, Poell, and De Waal, De Platformsamenleving.

19.     Cathy O’Neil, Weapons of Math Destruction (New York: Crown Publishing Group, 2016), 16.

20.     Lawrence Lessig, Code 2.0 (New York: Basic Books, 2006).


Biography


Maranke Wieringa (1992) is a PhD candidate at Utrecht University. She has a background in Cultural Studies and Media Studies, with a specialisation in software and data. Her dissertation investigates algorithmic accountability in Dutch municipalities and, following from the insights obtained, develops a toolkit to empower municipal data professionals to give testimony on the (decisions around the) algorithms used. At Utrecht University, Maranke is part of the Datafied Society research platform, and she teaches courses on (scholarly) data analysis and new media studies. Her academic interests lie at the intersection of software and data (for example, tool criticism and algorithms).


Creative Commons License
This work is licensed under a Creative Commons Attribution 3.0 License.




ISSN: 2213-7653