Businesses oblivious to their need for ethnographic enquiry

Written by Amalie Blixt, MSc in Digital Innovation & Management and Junior Researcher at ETHOS Lab

From the internet, from scholars, managers, and acquaintances, we are told that this is the age of algorithms, of machine learning, of big data. So naturally, when I faced the question of what the focus of my Master's thesis should be, it had to be a machine learning algorithm. I am also a big fan of qualitative approaches to data (hence my affiliation with ETHOS Lab). Thus, from January to June 2020, I immersed myself in an ethnographic study of an algorithm integrated into a large financial company to assist operational decision processes. I had worked in a similar company prior to beginning my fieldwork and was fascinated by the particular atmosphere of anticipation that surrounded machine learning and algorithms. What I gained from this study was a real-life confirmation of Tricia Wang's statement that “big data needs thick data” (Wang, 2013). This opinion, however, was not widely shared in a business environment dominated by numbers.

For this company, machine learning was unknown territory: this was the first time an algorithm would be integrated into the operational environment. The algorithm was trained on past decisions about the phenomenon in question, and its output was shown as a small color-changing icon in the corner of the common case handling interface.

Data scientist: Well, it was the first time [the Machine Learning department] put something into production that had to be interpreted by people.
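
The company did not disclose the system's internals to me in technical detail, but to picture the setup concretely, one can think of a classifier trained on historical case decisions whose score is binned into an icon color. The following sketch is purely illustrative – the features, the model choice (scikit-learn's gradient boosting), and the thresholds are invented for the example, not details from the company's system.

```python
# Purely illustrative: the study does not disclose the model, features, or
# thresholds. All names and values here are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Stand-in for historical cases and the decisions handlers made on them
# (1 = flagged for attention, 0 = not flagged).
past_case_features = rng.normal(size=(1000, 5))
past_decisions = (past_case_features[:, 0] + rng.normal(size=1000) > 0).astype(int)

model = GradientBoostingClassifier().fit(past_case_features, past_decisions)

def icon_color(case_features, low=0.3, high=0.7):
    """Bin the model's score for one case into an icon color (assumed thresholds)."""
    score = model.predict_proba(case_features.reshape(1, -1))[0, 1]
    if score >= high:
        return "red"     # strong signal: the algorithm suggests attention
    if score >= low:
        return "yellow"  # ambiguous: left to the case handler's judgment
    return "green"       # weak signal

print(icon_color(past_case_features[0]))
```

Notice that a design like this surfaces only a color, not an explanation: the interpretation, and the final decision, stays with the case handler – which is exactly where the tensions described below arose.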

No connection between culture and technology?

Going into the field, I took special notice of two conditions. First, informants kept telling me that this company was extremely relations-driven, meaning that personal relations across departments and teams were very important for coordinating and executing work. Second, there was a lot of talk about the algorithm's performance in terms of hit rate, economic impact, and statistical significance. Inspired by science and technology scholars, I set out to investigate how this relations-driven culture and the algorithm were related. How did one affect – alter, reinforce, cause, or something else – the other? However, every informant I asked about the relation between the two looked puzzled and said that they were not related. Even the idea of focusing on social and cultural conditions in the study of an algorithm was regarded as highly unconventional. The company did have a team of anthropologists in place, but their objective was to study the customers. The company even had a change management team, but the implementation of this algorithm was not considered disruptive enough to involve them, because the users would not be introduced to a new interface. Ultimately, I was puzzled by the lack of attention directed towards the people interacting with this algorithm, when so much attention was directed towards the algorithm's numbers.

This algorithm was not simply dropped into development and implementation without planning or preparation. For more than two years, data and IT professionals worked tirelessly on business cases and a comprehensive proof of concept to convince executive managers of the algorithm's technical and economic utility. Quantitative estimates were fine-tuned to anticipate its effect as precisely as possible. But how can anyone really quantify unknown territory?

In this case, the quantified estimate of the algorithm’s economic impact was very high. So high that it got executive management off their chairs with excitement. So high that the people involved in its development found it unrealistic.

Project manager: I have a lot of reservations (…) We are nowhere near the amount that the initial POC anticipated we could make from machine learning. I’ve never believed it was realistic, but that’s because there are all these false negatives, and it’s very hard to get it all down in a box on how to predict [this phenomenon].

What I saw through my ethnographic glasses was that the anticipation so confidently directed at this algorithm was slowly starting to crumble in the eyes of everyone but top management. Data scientists started pointing out that the algorithm was “just software” and should not be elevated to a higher status. Middle managers started questioning whether it was the hype created around this particular effort, more than the algorithm itself, that gave rise to the better economic performance.


Figure 1: Comics on the door to the Machine Learning department, reminding visitors not to elevate algorithms and their developers to a higher status than others.

Most importantly, the intended users of this algorithm – operational staff – did not understand what the “algorithm fuss” was about. In the end, they were still the deciding force, making decisions based on expertise they had perfected over decades. This expertise was widely understood as “gut feelings”, and despite the algorithm's glorious reputation among managers, it did not change the decision-making hierarchy in the eyes of operational employees.

Operational employee: The algorithm doesn’t provide a gut feeling, it provides concrete information. But if I have a gut feeling, then you should not bypass it. It is the absolute best parameter.

Social and organizational context matters

Ultimately, there was a great deal of social tension surrounding this algorithm, which was not visible through traditional project management methods and quantification. A qualitative approach was required to uncover these ongoing negotiations. In my case, I found it necessary to take an explorative approach to the situation, as opposed to the very goal-oriented project management methods. Thus, when investigating how this algorithm interacted with its network, I drew on concepts from actor-network theory (Callon, 1984; Latour, 1987, 1994; Latour & Porter, 1996), situational analysis (Clarke, 2005), and critical algorithm studies (Ananny & Crawford, 2018; Gillespie, 2014; Introna, 2016; Seaver, 2014, 2017, 2018; Zarsky, 2016; Ziewitz, 2016).

I will argue that the social negotiations I observed through ethnographic methods are extremely important to consider, even if all you care about are economic predictions. The social and organizational context is paramount to the success of a machine learning algorithm, especially when this type of technology is unknown territory in a well-established organizational environment. If the users of an exciting new decision-making algorithm do not find it particularly exciting or useful, it will merely come to represent untapped potential and wasted development effort. The quantitative and the qualitative, the economic and the social, the technical and the cultural – they are inevitably intertwined.

For more than a decade, scholars have pointed out the limitations of algorithms and defended the importance of social context. However, the myth of algorithms (Ziewitz, 2016) is so strong that these critical arguments have not made their way into business environments. There is a strong and outspoken preference for numbers in these environments, leaving advocates of qualitative approaches silenced and overruled.

Calling all ethnographers

Ethnographers, you are desperately needed in innovation projects where established companies seek to increase the efficiency and accuracy of decision processes through machine learning algorithms. Qualitative conditions need to be prioritized as much as quantitative predictions, because the two cannot and should not be separated. The real challenge, however, is making business managers realize this as well.

 

References

Ananny, M., & Crawford, K. (2018). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973–989.

Callon, M. (1984). Some elements of a sociology of translation: Domestication of the scallops and the fishermen of St Brieuc Bay. The Sociological Review, 32(S1), 196–233.

Clarke, A. (2005). Situational analysis: Grounded theory after the postmodern turn. SAGE Publications.

Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies (pp. 167–194). The MIT Press. https://doi.org/10.7551/mitpress/9780262525374.003.0009

Introna, L. D. (2016). Algorithms, governance, and governmentality: On governing academic writing. Science, Technology, & Human Values, 41(1), 17–49.

Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Harvard University Press.

Latour, B. (1994). On technical mediation: Philosophy, sociology, genealogy. Common Knowledge, 3(2), 29–64.

Latour, B., & Porter, C. (1996). Aramis, or the love of technology. Harvard University Press.

Seaver, N. (2014). Knowing algorithms. Media in Transition 8, Cambridge, MA.

Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2), 1–12. https://doi.org/10.1177/2053951717738104

Seaver, N. (2018). What should an anthropology of algorithms do? Cultural Anthropology, 33(3), 375–385.

Wang, T. (2013, May 13). Big data needs thick data. Ethnography Matters. http://ethnographymatters.net/blog/2013/05/13/big-data-needs-thick-data/

Zarsky, T. (2016). The trouble with algorithmic decisions: An analytic road map to examine efficiency and fairness in automated and opaque decision making. Science, Technology, & Human Values, 41(1), 118–132.

Ziewitz, M. (2016). Governing algorithms: Myth, mess, and methods. Science, Technology, & Human Values, 41(1), 3–16.