How are advertisers addressing new PhD students through algorithms?

Suddenly, a tablet of €599 was shoved in my face, a hi-tech device specifically designed to replace my usual pen-and-paper note-taking. Not the real device, unfortunately, but an online advert that kept popping up on my smartphone. ‘Why don’t you install an ad blocker?’ one might ask. That’s a fair question. I do use one on my laptop and, for a moment, I regretted not installing one on my phone too. But only for a moment, because it made me newly aware of the many ads shown on my social media accounts. This became even more apparent when I changed my job description to ‘PhD student’ on both LinkedIn and Facebook, where posts, images, videos, and even personal messages started addressing me differently than before.

I had just started my new job and decided to inform friends, family and acquaintances through social media, which in itself isn’t very special. Only a few minutes later, however, it occurred to me that both platforms had stopped displaying the usual advertisements for clothing brands, video games and travel deals, and were instead offering products for students and researchers. Some of them weren’t very surprising: a bicycle, for instance, or a collection of flash cards. But others were. ‘Do students actually use this?’ I wondered, looking at the pricey diamond-shaped device for optimising time management.

Three interrelated aspects of this incident caused me to start a small-scale investigation: the specific content of the adverts, which diverged from what I was used to; the timing of the ads, right after I changed my job description; and the price of the products on offer. It made me wonder: how are advertisers addressing this new part of my identity? I found a tentative answer in recent research on algorithms.

[Screenshot – Article 1: Social media status]

The algorithmic beat

In the book If… Then: Algorithmic Power and Politics, Taina Bucher addresses the issue of algorithms on social media platforms and provides not only a lens through which to look at this phenomenon, but also a research method for analysing the effects of algorithmic power. A few of her concepts may help us understand what I – and probably many of you – experienced.

First, she defines ‘algorithms’ not as something merely technical, as if they were recipes for cooking. Generally speaking, an algorithm is perceived as something that attempts to reach a specific goal by applying predetermined instructions to relevant input. It behaves much like a juice blender that produces the right drink if it’s given the correct set of fruits to mix. Along that same traditional line of thought, developers program algorithmic code to solve a problem: for example, finding a suitable buyer for a company’s product by using personal data as input (e.g. ‘age’ or ‘sex’) and matching it to the description of the product’s target group (e.g. the keywords ‘games’ or ‘men’). Bucher, however, doesn’t look at the code itself, but instead at what the algorithm does to the people who encounter it. Algorithms consequently become something with an effect beyond the goal prescribed by developers and marketers.
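To make this traditional view concrete, here is a minimal sketch of such keyword matching in Python. The function name, profiles, catalogue and keywords are my own invented illustrations; no actual platform works this simply.

```python
# A hypothetical sketch of the 'traditional' view described above:
# predetermined instructions matching personal data (input) against each
# product's target-group keywords. Everything here is invented for
# illustration, not the code of any real platform.

def match_adverts(profile: set[str], catalogue: list[dict]) -> list[str]:
    """Return the names of products whose target keywords overlap the profile."""
    return [
        product["name"]
        for product in catalogue
        if profile & product["target_keywords"]  # any shared keyword is a match
    ]

catalogue = [
    {"name": "€599 note-taking tablet", "target_keywords": {"phd student", "researcher"}},
    {"name": "video game console", "target_keywords": {"games", "men"}},
]

# Changing the job description effectively changes the input:
print(match_adverts({"games", "men"}, catalogue))  # ['video game console']
print(match_adverts({"phd student"}, catalogue))   # ['€599 note-taking tablet']
```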

Second, she embraces Michel Foucault’s perspective on power as something that isn’t owned, but exerted through relations, with a productive effect. In other words, the algorithm and the power that comes with it shape individuals’ worlds in specific ways. With this in mind, Bucher argues that “the power and politics of algorithms stems from how algorithmic systems shape people’s encounters and orientations in the world” (Bucher, 2018, p. 8). Behind the scenes of social media, profiling machines “seek to produce a sense of identity through detailed consumer profiles, which are geared toward anticipating future needs” (Bucher, 2018, p. 102).

Finally, if Bucher’s claim is accurate, then I didn’t encounter the tablet or the diamond-shaped device by accident. On the contrary, I argue that it was an encounter with an algorithm that not only shows adverts based on my personal data, selecting and presenting specific commercial goods I might buy, but also prevents me from seeing alternatives, thereby shaping the world I relate to. Following authors on discourse analysis, both the language and the images I was exposed to may contribute to shaping identity (Rogers, 2004). For example, I might come to regard the tablet as the only suitable option for note-taking during seminars and no longer consider a regular notebook. The ‘PhD student’ then becomes someone who uses this instead of that. I had found an algorithmic heartbeat, but what was being played and who was the drummer?

Reverse engineering

As mentioned above, I set out to examine the phenomenon and looked for a fitting method to analyse the workings of the algorithm I had encountered. Again following Bucher’s approach, I tried ‘reverse engineering’, a concept originating in cybernetic studies that examines the effects of algorithms through their input and output, instead of trying to understand the complex code itself. Drawing upon Bruno Latour (2005), she introduces ‘technography’ as a way of “describing and observing the workings of technology in order to examine the interplay between a diverse set of actors (human and nonhuman)” (Bucher, 2018, p. 60).

In the first phase, describing and observing was what I ended up doing. Specifically, I took screenshots of the displayed adverts to examine the way the algorithm acts in relation to the user and to make my endeavour more transparent. I did this every day for one month, beginning on the day I changed my job description. I only captured the content and refrained from clicking on the hyperlinks provided, to avoid reinforcing the algorithm. Most of the images are presented below; due to privacy concerns, personal messages are not included.
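A rough sketch of how each captured advert could be logged as a structured observation; the field names and values below are assumptions of mine, not part of Bucher’s method or my actual records.

```python
# A sketch, under assumed field names, of one way to log each daily
# screenshot as a structured observation for later analysis.

from dataclasses import dataclass
from datetime import date

@dataclass
class AdObservation:
    captured_on: date
    platform: str         # 'Facebook' or 'LinkedIn'
    advertiser: str
    product: str
    screenshot_file: str  # content only; hyperlinks were deliberately not followed

log: list[AdObservation] = [
    AdObservation(date(2018, 10, 7), "Facebook", "unknown",
                  "€599 note-taking tablet", "screenshot_day01.png"),
]
```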

In a second phase, I examined the content by identifying the company, reading the description of the advertised product, and following the suggested link. This small-scale, exploratory research was a tentative way of practising reverse engineering and by no means a thorough empirical analysis. Conclusions are therefore limited and apply only to this particular case.
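Equally roughly, this second phase could be summarised as tallying each examined advert under a category; the labels follow the findings reported below, while the counts are placeholders, not the study’s actual numbers.

```python
# A sketch of the second phase: tallying examined adverts per category.
# The category labels mirror those reported in the results; the data
# here are placeholders for illustration only.

from collections import Counter

categories_seen = [
    "product", "course/training", "product",
    "article/infomercial", "app/software", "event",
]

tally = Counter(categories_seen)
for category, count in tally.most_common():
    print(f"{category:>20}: {count}")
```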

The PhD defibrillator?

Some of the results were quite striking. The adverts typically promoted articles and infomercials, educational courses or training, events (and registration for them), and various products, services, apps or software. The latter were often tied to particular devices that run the software or app. If the output of algorithms shapes identity (e.g. an advert generated by matching input to a recipe), and if one were to accept all the offers, the figure of ‘the new PhD student’ would have some of the following characteristics.

For example, he or she would receive tips on “how to survive a PhD”, while another advert promotes an AED package with a defibrillator. The student would write on a tablet, use LED lights to light the room, and drink coffee from a popular brand whose television commercials star a famous actor. The use of MS Office on various devices is also encouraged, and the need to share a screen with others in a classroom or auditorium is mentioned. Furthermore, according to two different ads, the student should drive a car – arguably the BMW presented – and order a new bike online. Various adverts offer courses, training and e-books on topics such as artificial intelligence, data analytics, deep learning and blockchain technology. Finally, some adverts promote data protection and IT security, while others encourage the development of entrepreneurial or career skills. The most striking aspect of all these messages is that the user is assumed to be a consumer. Advertising for other ends, a non-profit cause perhaps, did not occur.

The commercial sound of the algorithm

In this brief article I have attempted to illustrate how advertisers address a new PhD student through algorithms. This is not meant as a thorough analysis, but rather as a tentative way of practising reverse engineering of algorithms on social media. Starting from deliberate input (a new job description) and collecting the resulting output (adverts), an unusual hypothetical identity took shape: a consumer PhD student defined in relation to digital and analogue products. Although not all the adverts may have been aimed at (PhD) students, they only appeared after I changed my status. After triggering the algorithm, the user was exposed to discourse (including language, images, videos and hyperlinks) with multiple calls to action on social media platforms. I argue that these messages aren’t generated coincidentally by algorithms, but that (some) refer to articles and infomercials, educational courses, events, products, services, apps or software on purpose. Some of the products may sound like music to my ears, but I must stress that none of the companies attempted to influence me in any other way. Hopefully this brief experiment serves as an interesting example of how reverse engineering could be used in educational research, and of how it casts the issue of identity shaping in higher education in a different light.

Did you like this article? Would you have approached it differently? Feel free to leave a like or comment below.

Also check out Taina Bucher’s blog at http://tainabucher.com

Reference list

Bucher, T. (2018). If… Then: Algorithmic Power and Politics. New York, NY: Oxford University Press.

Latour, B. (2005). Reassembling the Social: An Introduction to Actor-Network-Theory. New York, NY: Oxford University Press.

Rogers, R. (Ed.). (2004). An Introduction to Critical Discourse Analysis in Education. New York, NY: Routledge.

Collected images
