ETHOS Lab works with experimental digital methods to explore what can be done to “create value with IT”. But what constitutes value? We believe methods are a key ingredient to mapping and creating value, and in more ways than one might expect. This article describes a recent project by Lab Manager Michael Hockenhull that sought to explore how methods might come to matter differently for value.

Michael Hockenhull | 2015-05-11 | Copenhagen

A famous adage popularized by Mark Twain claims: “There are three kinds of lies: lies, damned lies and statistics.” While most of us smile at that sort of literary claim, this mistrust of statistics was the underlying motivation for the MA thesis I have just finished for my degree in Philosophy at the University of Copenhagen. In particular, I was distrustful of the way in which statistics and metrics are used to measure and control the disciplines that we, with a catch-all term, call ‘the humanities’.

Popular consensus says that there is a crisis of or in the humanities. Governments cut back on funding, pundits call them ‘luxuries’ and students are increasingly sceptical of their ability to get a job after finishing their degrees – and of the worth of such a degree if they cannot! This is as much the case here in Denmark as it is in the U.S., UK or other Western countries.

The above clip from the HBO series Silicon Valley showcases a pervasive attitude towards the liberal arts, or what we in Denmark simply call the humanities: they are viewed as a luxury or even, in this extreme case, as “snake-oil”.

The argument of my thesis is that the humanities do indeed produce value for society. Humanistic research and the knowledge it produces are, far from being a luxury, sorely needed when one considers that the “grand challenges” of the 21st century all share one common factor: human beings. Whether it is climate change, terrorism, mass migration or other challenges, human beings are at the core of the problem, for better or worse.

If this is the case, why are the humanities not valued more highly? What I would like to suggest is that the methods we use to detect and understand the value of academic activities are not up to the task we ask of them. We use statistics, impact factors and citation analysis to determine what research is valuable and how. The impact factor is a good example: journals are awarded a rating based on how well cited they are, and researchers are encouraged to publish in journals with higher ratings. Good research is thereby equated with a higher score on a fairly arbitrary numeric scale. This is admittedly a simplistic rendition, but I still claim that it represents a gross simplification of the impact and value of what we broadly call science.

Innovations in the evaluation of science are, luckily, appearing. Article-level metrics (ALMs), for example, are being implemented: these can trace the citing of individual paragraphs rather than of a whole paper, and they track not just how papers cite other papers, but also how they are referenced in outlets such as Twitter or blogs.
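To make the idea concrete, here is a minimal sketch of what an ALM-style tally might look like in Python. Everything in it is invented for illustration – the paper IDs and outlet labels are not drawn from any real ALM service:

```python
from collections import Counter

# Illustrative sample: one (paper, outlet) record per mention.
# The DOIs and outlet names are made up for this sketch.
mentions = [
    ("10.1000/paper-a", "journal-citation"),
    ("10.1000/paper-a", "twitter"),
    ("10.1000/paper-a", "blog"),
    ("10.1000/paper-b", "journal-citation"),
    ("10.1000/paper-b", "journal-citation"),
]

def article_level_metrics(mentions):
    """Tally mentions per paper, broken down by outlet type."""
    metrics = {}
    for paper, outlet in mentions:
        metrics.setdefault(paper, Counter())[outlet] += 1
    return metrics

alms = article_level_metrics(mentions)
print(alms["10.1000/paper-a"]["twitter"])  # 1
```

The point of the breakdown per outlet is exactly the ALM insight: a paper with one journal citation but heavy Twitter and blog traffic has a different kind of impact than one cited only in journals.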

ALMs are a great step forward, but for humanistic research there are still challenges. Recent research has suggested that the value generated by humanistic research is hard to pin down because it is diverse, diffuse, social and cultural*. It does not conform to the standard narrative of value as something economic and countable. In my thesis, I propose one possible method that might help chart the value of humanistic knowledge.

I suggest that one way in which a method could do this is by mapping the spread of ideas. This is based on a model** put forward by impact researcher Paul Benneworth, in which humanistic research creates value for society in three tiers: research is first disseminated by humanistic scholars, then debated by a small group of dedicated laypeople, before becoming part of general consciousness. Because of digitalization, most research is now indexable and searchable down to the last footnote, as are most debate forums and public news outlets. My suggested method, the “idea-mapper”, is designed to make use of this digitalization to crawl and scrape full-text articles, analyze them for semantic patterns and then display these patterns to the researcher. The idea is that texts represent ideas, and since texts are now searchable on a large scale, we can use that scale to map how ideas move from one context to another.
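A toy version of the idea-mapper's core loop can be sketched in a few lines. Everything here is illustrative: the corpus is invented, and the crude word-token extraction merely stands in for the far more sophisticated semantic analysis a real implementation would need:

```python
import re
from collections import defaultdict

# Toy corpus standing in for scraped full-text articles; the texts
# and source labels are invented for this sketch.
corpus = {
    ("journal", "j-article-1"): "Migration and climate change reshape European policy debates.",
    ("news", "nyt-piece-1"): "Climate change drives migration, policy experts warn.",
    ("blog", "blog-post-1"): "A note on gardening in small apartments.",
}

STOPWORDS = {"and", "in", "on", "a", "the"}

def extract_terms(text):
    """Crude stand-in for semantic-entity extraction: lowercase word tokens."""
    return {t for t in re.findall(r"[a-z]+", text.lower()) if t not in STOPWORDS}

def term_index(corpus):
    """Map each term to the set of contexts (journal/news/blog) it appears in."""
    index = defaultdict(set)
    for (context, _doc_id), text in corpus.items():
        for term in extract_terms(text):
            index[term].add(context)
    return index

# Terms shared between the journal and news contexts: candidate
# "travelling ideas" that moved from research into popular media.
index = term_index(corpus)
shared = {t for t, contexts in index.items() if {"journal", "news"} <= contexts}
print(sorted(shared))  # ['change', 'climate', 'migration', 'policy']
```

Scaled up to thousands of scraped texts, it is this kind of cross-context overlap that the idea-mapper would surface for the researcher to interpret.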

The key point of my thesis was that this “method assemblage”, to use a term from social scientist John Law, itself requires humanistic competencies to actually draw out the value it maps. Value, I claim, is not self-evident, and so you need the skills of interpretation and critique, and intimate knowledge of what you are studying, to actually make claims about the value of what you are seeing. My suggestion is not that digital methods can magically make the value of the humanities apparent, but that by building digital method assemblages that require the input of humanistic researchers, it is perhaps possible to affect the existing practices of valuing science from the inside out.

The flip side of this is also true, and is one of the reasons why economic and quantitative ways of valuing have come to dominate research valuation. Because economic methods were easy to use***, they were employed and have now created realities in which anything that does not conform is treated as not valuable. To remedy this situation we need different methods that create different realities of valuation.

The idea-mapper is a suggestion that can be built with currently existing technology, and I am sure a more advanced version of it already exists somewhere in Google’s or the NSA’s basement. In my own thesis, I used a potpourri of existing methods to mimic the functions such a program would have, but unfortunately ran into large technical challenges. The data I was able to analyze was only a small part of the full data I collected, and it was insufficient to draw conclusions from. The visualization below is based on the data I was able to analyze, and shows the semantic overlap between humanistic journals, debate pieces from the website The Conversation, and news and blogs from the New York Times.



The network was produced in the network visualization program Gephi, and structured using the ForceAtlas2 layout. It shows the semantic overlap between humanistic journals and more popular media.
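ForceAtlas2 itself is built into Gephi, but the basic principle of a force-directed layout – connected nodes pull together, all nodes push apart – can be sketched in plain Python. This is a deliberately simplified toy, not ForceAtlas2’s actual algorithm, and the tiny graph is invented:

```python
import random

# Tiny graph standing in for the semantic network; edges are invented.
edges = [("climate", "doc1"), ("migration", "doc1"), ("climate", "doc2")]
nodes = {n for e in edges for n in e}

random.seed(0)
pos = {n: (random.random(), random.random()) for n in nodes}

def step(pos, edges, attract=0.05, repel=0.01):
    """One iteration of a force-directed layout: edges pull their
    endpoints together, every pair of nodes pushes apart."""
    force = {n: [0.0, 0.0] for n in pos}
    for a, b in edges:  # attraction along edges
        (xa, ya), (xb, yb) = pos[a], pos[b]
        dx, dy = xb - xa, yb - ya
        force[a][0] += attract * dx; force[a][1] += attract * dy
        force[b][0] -= attract * dx; force[b][1] -= attract * dy
    names = list(pos)
    for i, a in enumerate(names):  # repulsion between every pair
        for b in names[i + 1:]:
            (xa, ya), (xb, yb) = pos[a], pos[b]
            dx, dy = xb - xa, yb - ya
            d2 = dx * dx + dy * dy or 1e-9
            force[a][0] -= repel * dx / d2; force[a][1] -= repel * dy / d2
            force[b][0] += repel * dx / d2; force[b][1] += repel * dy / d2
    return {n: (x + fx, y + fy)
            for (n, (x, y)), (fx, fy) in zip(pos.items(), force.values())}
```

Iterating `step` many times lets densely connected nodes settle into clusters – the same mechanism that produces the “tendrils” and clumps visible in the Gephi map.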

I used a custom API-building tool called KimonoAPI. With it, I was able to scrape the full-text articles of the sources that can be seen on the above map. The results were categorized and fed into the program Actor Network Text Analyser (ANTA), where semantic analysis technology was used to find connections between the contents of different texts. The map included here shows those connections: the purple nodes represent so-called semantic entities present in the texts, i.e. words representing real-life things such as ‘United Nations’, ‘immigration’, ‘Barack Obama’ etc. The bigger the node, the more prevalent the term. The other type of node cannot be seen at this resolution, but these nodes represent the different texts from the NYT, The Conversation and the various journals. The edges – the connections between the nodes – are visible, however, and are coloured based on their node of departure. Unsurprisingly, the journals are grouped together in “tendrils”, each with its own colour. The news, blogs and debate articles can be seen clumped together in one big mass on the left side of the visualization.
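A rough sketch of how such a text–entity network can be assembled is shown below. The mini-dataset, the text IDs and the colour palette are all invented for illustration; ANTA’s actual extraction pipeline is of course far richer:

```python
from collections import defaultdict

# Invented mini-dataset: which semantic entities each text mentions.
texts = {
    "journal:ethics-2014": {"United Nations", "immigration"},
    "nyt:news-001": {"immigration", "Barack Obama"},
    "conversation:debate-07": {"Barack Obama"},
}

# Colour each edge by its node of departure (the text's outlet),
# mirroring how the Gephi map was styled. Colours are arbitrary.
PALETTE = {"journal": "purple", "nyt": "orange", "conversation": "green"}

def build_edges(texts):
    """One (text, entity, colour) edge per entity a text mentions."""
    edges = []
    for text_id, entities in texts.items():
        outlet = text_id.split(":", 1)[0]
        for entity in entities:
            edges.append((text_id, entity, PALETTE[outlet]))
    return edges

# Node size ~ prevalence: how many texts mention each entity.
prevalence = defaultdict(int)
for _text, entity, _colour in build_edges(texts):
    prevalence[entity] += 1
print(prevalence["Barack Obama"])  # 2
```

An entity mentioned by both a journal and a news outlet sits between the two clusters – exactly the kind of bridge the idea-mapper is meant to surface.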

In this visualization, the closer a node is to another, the more strongly the two are related. Research articles from the same journals naturally cluster together as they share the same topics, and likewise with many of the news articles.

While the above map shows overlap, it turned out that most of it was incidental. It did not point to concepts or ideas that could be further analyzed to show ideas being generated in the journals and then reported on or transmitted to the more popular news channels. Does this mean that humanistic research is not valuable after all? That there is no contact between research done in the humanities and the popular sphere? On the basis of a pilot study, it is hard to make that sort of claim. Rather, my experience with trying to put together an experimental method is that doing so is very difficult! There are all sorts of pitfalls that I managed to wade into, and they showed me that there is a very good reason why politicians and others rely on statistics: precisely because they are dependable and stable.

Does this mean that we should not try to create new methods? Of course not. Contrary to the popular saying, sometimes you do need to fix what is not broken, simply because, while it may still be working, the job it was made for is no longer relevant. Digitalization has created a world in which data is everywhere and a single MA thesis can couple together a primitive way of mapping ideas. That in itself indicates the possibilities of new methods, of which the idea-mapper is but a single example. In ETHOS Lab we work with citation data, of course – but we also work with data from Twitter, Facebook, web-crawlers, Google and other sources to come. Digitalization has only begun, and we can no longer rely solely on the old methods of understanding the world. At its core, digitalization creates worlds of data that represent different aspects of reality. Methods are what we use to stitch that data together into impressions of how the world looks. My experience is that methods are exactly that – impressionistic rather than photographically accurate. Statistics claim to be the latter, but with digital methods it becomes apparent that the former is more fitting. How, then, do we create value with an impressionistic approach? I have no easy answer, and this is a topic we are continually exploring at ETHOS.

This sort of talk about method and data may sound unsettling, but in order to look at methods in a new light we have to be prepared to let our normal conceptions be rattled. For me, the thesis has raised as many questions as it has answered: about the nature of value, about developing methods, and about the nature of methods themselves.

One thing seems clear to me: the methods we use define what we come to value. It should not be a stretch of the imagination to see that what gets counted is what gets valued, while what is harder to count is left as intangible. At ETHOS Lab we are trying to use digital methods to expand what we can make visible and valuable, and to figure out how visibility can translate into value, especially when one or the other, or both, are impressionistic, so to speak.

These are not questions we can answer in any other way than by collaborating, which is also why we are actively looking to foster relationships with students, researchers and organisations. We need partners to exchange ideas with, data to analyze and explore and people to help us do so. Therefore, if you find any of these questions interesting please feel free to write us at


*Molas-Gallart, “Research Evaluation and the Assessment of Public Value”; O’Brien, “Cultural Value, Measurement and Policy Making”; Benneworth, “Tracing How Arts and Humanities Research Translates, Circulates and Consolidates in Society. How Have Scholars Been Reacting to Diverse Impact and Public Value Agendas?”

**Benneworth, “Tracing How Arts and Humanities Research Translates, Circulates and Consolidates in Society. How Have Scholars Been Reacting to Diverse Impact and Public Value Agendas?”

***Godin and Doré, “Measuring the Impacts of Science: Beyond the Economic Dimension.”