Studying Facebook's News Feed Algorithm with a Grounded Theory approach

This blog post is based on a research paper written to present the findings of my research at EthosLab. The paper was also handed in as an exam assignment for the course Innovation and Technology in Society at ITU. The research started out with a general interest in the way algorithms enact politics. The initial area of study was inspired by my own curiosity as well as my academic and professional background in Data Science, Big Data processing practices, and Science and Technology Studies (STS). For my Master's thesis, I am conducting Action Design Research (see Bryman, 2012, pg. 709) involving the development and implementation of an analytical tool based on Machine Learning algorithms for a large manufacturing company. The literature on these topics appears to me rather uncritical, starting from a perception of technology as a neutral, apolitical force whose effects depend solely on its users' intentions. Recent debates about algorithms having influenced the US election results, in particular, open up further discussions about accountability in the development of algorithm-based technology. Due to its current relevance, its performativity and my interest in it, I chose Facebook's News Feed as a case study for algorithmic and specifically Machine Learning technologies. This was the starting point for the development of further theory and questions throughout the research, applying a Grounded Theory inspired research methodology (see Bryman, 2012, pg. 712).

An STS approach to Grounded Theory
I do not try to be neutral in this research, because I believe neutrality in a positivist sense is not possible. For this reason, I try to avoid passive language where I think it is misleading, and use ”I” instead to emphasise my situatedness. I used Carol Bacchi's ’What's the problem represented to be?’ approach towards problematisations as a method to re-complexify developed theories, which are always simplifications of complex realities (Bacchi, 2012). With this method, I wanted to challenge theories that emerge grounded in data after a process of coding and categorisation. In particular, I asked critical questions in an attempt to understand how a problem was and is constructed. The methodological approach used in this research is summarised in figure 1. It was first developed in an earlier, unpublished research project that I was part of.



Facebook's News Feed
Facebook presents on its press relations page, ”newsroom”, an algorithm-based news feed ranking technology as a product called ”News Feed”. News Feed, introduced in 2006, is described as follows (Facebook, 2016d):
”News Feed is a regularly updating list of stories from friends, Pages, and other connections, like groups and events. People can like or comment on what they see. Each person's News Feed is personalized based on their interests and the sharing activity of their friends.”

Analysis of News Feed

Since 2013, the development team of News Feed has published changes and improvements in a dedicated news channel called ”News Feed FYI” (Facebook, 2013):
”We are continually working to improve News Feed and from time to time we make updates to the algorithm that determines which stories appear first. Today we announced a new series of blog posts called News Feed FYI that will highlight major updates to News Feed and explain the thinking behind them.”
I analysed the ca. 35 press releases (Facebook, 2016b). Most of them are 1-2 pages long and start with a paragraph about the goals of News Feed, ending with a sentence pointing to an update. This is followed by a section reporting findings from internal research activities, used as an argument for implementing the update. Updates are often the incorporation of new factors into the algorithms that News Feed is built on. The documents mostly close by stating what those updates mean for stakeholders like People, Page Owners and Marketers.

Summarising, Coding, Categorising and Visualising
I started by writing notes and creating tags for each press release; these tags represent people, keywords, concepts and, eventually, categories. In the notes, I summarised different sections, commented on phrases and paragraphs, selected interesting quotes and interpreted certain arguments and statements. Throughout the analysis, I tried to find relationships and patterns between the press releases and developed first categories. I visualised many of my thoughts on whiteboards in order to see the data in new ways and from new perspectives. Out of this, theories emerged and changed during the analysis: some initial theories were rejected after collecting more data, others complemented each other and finally became one. Along with the emerging theories, my opinions and beliefs changed, closely bound to the collection of data and the construction of my knowledge space. This process resulted in a visualisation representing the assemblage of practices, actants and relations that went into its production. (see also Turnbull, 2003)

Emerging Theory

The overall goal of News Feed is to show ”the right content to the right people at the right time”. Facebook also calls this ”relevant” content. To decide whether something is relevant, News Feed detects patterns of use that are measured via Engagement and Qualitative Feedback. Engagement is detected via Machine Learning algorithms based on factors such as:

  • Clicks, Likes, Comments and Shares
  • Time-viewing (a user stops scrolling for a significant time)
  • Time-reading (the time spent on an external link before returning to Facebook)
  • Actions on videos (turning on the volume, enabling HD, expanding the window)
  • Reactions (emoticons with different meanings instead of a Like)
  • Unfollow (stay connected but do not follow posts in News Feed)
  • Prioritise (select friends and Pages that you want to see more posts from)
  • Hide (hide a specific post or ad)
  • Report (hide a post or ad and state reasons)

Qualitative Feedback (since 2016) is mostly gathered via an ongoing online survey of about 10,000 users per day, asking for feedback and questions such as which of two presented posts a user would prefer. This is used to test the accuracy of the algorithms as well as to directly influence what is shown in News Feed. In addition, an ongoing panel of about 1,000 users, the Quality Feedback Panel, is asked detailed questions about News Feed daily. Once a pattern has been detected via engagement, the next step is to figure out why the pattern exists and whether it should be supported or hindered; this is mostly done via qualitative measures and, since 2016, via the Qualitative Feedback Program as well. Next, factors intended to support or hinder the pattern are added to the ranking algorithm and tested on a sample of around 21 million users. If this leads to the intended development of the pattern (increase or decrease), the changes are deployed step by step for all users and platforms.
Because the results of the Machine Learning algorithms were often too general and boosted certain unwanted phenomena like hoax stories and click- and like-baiting, the Qualitative Feedback program was introduced to increase the accuracy of the predictions, especially regarding the quality of content. In addition, News Feed publishes recommendations on how Pages and marketers should design their posts to create more engagement, which further accelerates the initial patterns. These relations are shown in figure 2.
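The process described above can be sketched as a weighted scoring function over engagement signals. To be clear, this is a minimal illustration of the general idea, not Facebook's actual model: the signal names, weights and scoring function below are my own assumptions, since the real factors and their weighting are not public.

```python
# Illustrative sketch of engagement-based ranking as described above.
# All signal names and weights are assumptions for illustration only.

ASSUMED_WEIGHTS = {
    "click": 1.0,
    "like": 2.0,
    "comment": 3.0,
    "share": 4.0,
    "time_viewing": 0.5,   # user stopped scrolling on the story
    "hide": -5.0,          # negative feedback ranks a story down
    "report": -10.0,
}

def engagement_score(signals: dict) -> float:
    """Combine observed engagement signals into a single ranking score."""
    return sum(ASSUMED_WEIGHTS.get(name, 0.0) * count
               for name, count in signals.items())

def rank_feed(stories: list) -> list:
    """Order candidate stories by descending engagement score."""
    return sorted(stories,
                  key=lambda s: engagement_score(s["signals"]),
                  reverse=True)

feed = rank_feed([
    {"id": "a", "signals": {"like": 10, "hide": 3}},
    {"id": "b", "signals": {"comment": 5, "share": 2}},
])
# Story "b" (23.0) ranks above story "a" (5.0), because hides
# pull "a" down despite its likes.
```

Adding a new factor to the ranking, as the press releases describe, would here simply mean adding a new entry to the weight table, which hints at how incremental such updates can be.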

This means that things that users and their friends like, know and interact with (patterns of use shown in engagement and qualitative feedback) are generally ranked higher and shown more prominently in News Feed than other content. This can lead to states of self-referentiality, where opinions are built and actions are taken on granted facts that eventually become reality for those constructing such unchallenged, isolating and excluding social worlds. Out of this, the theory emerged that ”News Feed enacts uncritical thinking”, meaning uncritical thinking not so much towards others but towards truth, reality and, especially, one's own responsibility and accountability. This theory, however, is clearly a simplification of a complex issue and needs to be seen as an intervention constructed by me and my situatedness. In order to zoom out again, I try to problematise this theory by asking the following critical questions:

  • ”For whom is it a problem?”
  • ”In which interest is it?”
  • ”Which actants are involved and how are they related?”
  • ”How did it become a problem?”

Enacted uncritical thinking

For whom is it a problem?
I see News Feed as a representation of its inscribed network of socio-technical relations between the actants constructing it. I have attempted to understand how News Feed has come to be by analysing its development (at least based on front stage information; see Goffman, 1990) and developing a controversial theory grounded in the data I analysed. I have even re-assembled it by constructively intervening in it; as a result, I can to some extent study News Feed in the making, no longer taking it for granted. This, however, is my subjective perspective, based on my situation and notion of reality. (see also Smith and Marx, 1994; Winner, 1980; Bijker et al., 1987; Woolgar and Cooper, 1999; Latour, 1988; Law and Urry, 2004)

However, other actants in News Feed's meta-network (with a focus on News Feed as an actant in a network, not the network it represents) take it for granted, thus translating agreement to Facebook as one of the main actants. Such actants can be users of Facebook who are not aware of News Feed enacting uncritical thinking. This is problematic for the users: enacted to be uncritical and unaware of News Feed's epistemology, they use it black-boxed and, by doing so, construct their own realities and affect other realities based on unchallenged views, concepts and assumptions. Those social worlds do have agency, agency to enact, for instance, discrimination, populism and inequality. Eventually, it is problematic for people who do not conform with such more or less closed social worlds, people who do not seem to fit in.

In which interest is it?

Facebook describes News Feed on its ”values” page (Facebook, 2016c):
”Our success is built on getting people the stories that matter to them most. If you could look through thousands of stories every day and choose the 10 that were most important to you, which would they be? The answer should be your News Feed. It is subjective, personal, and unique and defines the spirit of what we hope to achieve.”
The problem is that even if News Feed is somewhat individual, due to the Qualitative Feedback and options to set up your feed preferences, it uses many different forms of generalisation and prediction that are built on grouping and classifying people into target and interest groups. Moreover, what you already like and whom and what you connect to has a big effect on what you will see in News Feed, which in turn can influence what you like and connect with. This again can lead to somewhat isolated social groups that are, to an extent, self-referential:
“Our top priority is keeping you connected to the people, places and things you want to be connected to starting with the people you are friends with on Facebook.”

Facebook is finally honest about whose interest News Feed ultimately serves: Facebook's own:
“We don’t favor specific kinds of sources or ideas. Our aim is to deliver the types of stories we’ve gotten feedback that an individual person most wants to see. We do this not only because we believe it’s the right thing but also because it’s good for our business. When people see content they are interested in, they are more likely to spend time on News Feed and enjoy their experience.”
So for Facebook, it does not matter too much how people interact, as long as they do interact, so that the platform remains attractive for marketers and other business opportunities. It is, however, important for them that users act according to certain ”community standards” and have ”authentic communication”, which means that News Feed tries to rank down hoaxes, click- and like-baiting and other forms of communication that, eventually, might not be good for their business:
”We are in the business of connecting people and ideas and matching people with the stories they find most meaningful. Our integrity depends on being inclusive of all perspectives and view points, and using ranking to connect people with the stories and sources they find the most meaningful and engaging.”
One aspect is that ranking and inclusion seem generally hard to combine well. The other issue is that while Facebook may be open to all kinds of perspectives and viewpoints overall, News Feed enacts those views and perspectives to remain separated and unchallenged in barely connected digital social worlds.

Which actants are involved and how are they related?
One of the most powerful actants involved is Facebook as a company, because it translates its interests directly to News Feed, which in turn translates those interests, dictated by the goals Facebook defines, via its underlying algorithms to people. People in this network are mainly users and marketers. By taking News Feed for granted, users translate their agreement to News Feed and Facebook. Marketers in turn translate their interests to Facebook (e.g., targeting potential customers with advertisements), and Facebook to marketers (e.g., offering new ways to reach a target group). I see the human actants behind Facebook (e.g., management) and News Feed as less powerful than the non-human, socio-technically constructed actants (Facebook, News Feed), because their agency appears to me to be constructed in a more manifold way than merely through the social translations of a few individuals behind it, such as the managers of Facebook or the developers of News Feed, who seem to have lost control of and direct accountability for the technology. I see users who take News Feed for granted as a rather weak actant in this network of assemblages, because by using it uncritically they translate their agreement to News Feed and Facebook, and therefore have little influence. They are rather animated towards engagement and classified into groups in order to work in agreement with News Feed, Facebook and Marketers. (see also Bijker et al., 1987; Winner, 1980; Woolgar and Cooper, 1999; Latour, 1988; Law and Urry, 2004)

How did it become a problem?
Facebook argues that the user controls what to see (Facebook, 2016c):
”Ultimately, you know what’s most meaningful to you and that’s why we’ve developed controls so you can customize what you see. Features such as unfollow, hide and see first help you design your own experience and when you use them, we take your actions as feedback to help us better understand what content is most important to you. For example, if you hide a story from someone, that signals that you’re less interested in hearing from that person in the future. As News Feed evolves, we’ll continue building easy-to-use and powerful tools to give you the most personalized experience.”

What Facebook describes is not just giving users control over what they want to see in their News Feed; it is more about looking at certain patterns of use, interpreting the reasons for them and, based on that, making generalised assumptions about what a user perceives as relevant and what not. This is closely related to classifying and categorising users based on the same patterns. Those classifications are used, again, to target users with specific content. This can eventually mean that just a few of people's characteristics are used as assumptions about who a person is overall. Based on this information, a person is confronted with content targeted towards that assumption, which again leads to a state where News Feed constructs its users towards its own assumptions about them. For me, it has also become a problem due to the uncritical thinking of the people working for Facebook and the developers of News Feed itself, especially their somewhat denied responsibility and their claims that users are in full control and thus fully accountable for how they act on Facebook. In addition, what they are not aware of and uncritical towards is the agency that News Feed has, due to its inscribed socio-technical relations, practices and interests.
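The self-reinforcing loop described above, where what is shown reinforces the profile that decides what is shown, can be made concrete in a minimal simulation. The topics, starting weights and update rule below are my own illustrative assumptions; nothing here is based on Facebook's actual personalisation mechanics.

```python
# Minimal simulation (my own illustration) of the self-referential loop:
# a user's interest profile determines what is shown, and what is shown
# then reinforces the profile, narrowing exposure over iterations.

def normalise(profile: dict) -> dict:
    """Rescale topic weights so they sum to 1 (a share of attention)."""
    total = sum(profile.values())
    return {topic: w / total for topic, w in profile.items()}

def step(profile: dict, reinforcement: float = 0.5) -> dict:
    """Show the currently dominant topic and reinforce it,
    mimicking engagement-driven personalisation."""
    shown = max(profile, key=profile.get)
    updated = dict(profile)
    updated[shown] += reinforcement
    return normalise(updated)

# A slight initial lead for one topic...
profile = normalise({"politics": 1.2, "sports": 1.0, "science": 1.0})
for _ in range(20):
    profile = step(profile)

# ...ends up crowding out the other topics almost entirely:
# after 20 iterations, "politics" holds well over 90% of the profile.
```

Even this toy loop shows the dynamic at issue: a marginal initial preference, fed back through ranking, converges towards an almost exclusive one, which is what the ”uncritical thinking” theory problematises.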

Based on my situatedness, this research started out from a general curiosity about how algorithms enact politics, which became further focused towards the analysis of a specific case of algorithm-based technology. Using a composition of methods derived from Grounded Theory and Science and Technology Studies, the research was led by my interests rather than by fixed research questions. This allowed me to inductively develop and reject theories grounded in data and formalised through problematisations. The statement ”News Feed enacts uncritical thinking” is not supposed to represent a certain kind of truth but rather to open up debate and, by that, to be performative. This explorative approach of data collection, summarising, coding, categorising, connecting and visualising was eventually used to ground and open up a meta-level discussion that feeds into general questions about the technical, the social, the human and the non-human.



References

Adamic, L., Bakshy, E., Messing, S., 2015. Exposure to Diverse Information on Facebook.

Bacchi, C., Apr. 2012. Why Study Problematizations? Making Politics Visible. Open Journal of Political Science 02 (01), 1.

Bijker, W. E., Hughes, T. P., Pinch, T. J., 1987. The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. MIT Press.

Bryman, A., Mar. 2012. Social Research Methods, 4th Edition. Oxford University Press, Oxford; New York.

Facebook, 2013. Announcing News Feed FYI: A Series of Blogs on News Feed Ranking | Facebook Newsroom.

Facebook, 2016a. Company Info | Facebook Newsroom.

Facebook, 2016b. News Feed FYI | Facebook Newsroom.

Facebook, Jun. 2016c. News Feed Values.

Facebook, 2016d. Products | Facebook Newsroom.

Goffman, E., 1990. The Presentation of Self in Everyday Life, repr. Edition. Penguin, London.

Latour, B., Oct. 1988. Science in Action: How to Follow Scientists and Engineers Through Society, revised Edition. Harvard University Press, Cambridge, Mass.

Law, J., Urry, J., Aug. 2004. Enacting the Social. Economy and Society 33 (3).

Sismondo, S., 2010. An Introduction to Science and Technology Studies, 2nd Edition. Wiley-Blackwell, Chichester, West Sussex, U.K.; Malden, MA.

Smith, M. R., Marx, L. (Eds.), Jun. 1994. Does Technology Drive History? The Dilemma of Technological Determinism. The MIT Press, Cambridge, Mass.

Turnbull, D., Sep. 2003. Masons, Tricksters and Cartographers: Comparative Studies in the Sociology of Scientific and Indigenous Knowledge. Taylor & Francis.

Winner, L., 1980. Do Artifacts Have Politics? Daedalus 109 (1), 121–136.

Woolgar, S., Cooper, G., 1999. Do Artefacts Have Ambivalence? Moses' Bridges, Winner's Bridges and Other Urban Legends in S&TS. Social Studies of Science 29 (3), 433–449.