Measuring the Societal Impact of Research with Altmetrics: An Experiment

Ursula Oberst

Library African Studies Centre, Leiden


All research at Dutch universities is assessed on a regular basis following the Standard Evaluation Protocol (SEP). From 2015 onwards, one of the protocol’s criteria for measuring research success is the societal impact of the research. As traditional metrics do not provide an indication of public reach and influence, the African Studies Centre Leiden (ASCL) decided to experiment with the new suite of alternative metrics – altmetrics – that measure the number of times a research output is viewed, downloaded or mentioned online. I analyzed the presence of altmetric indicators in 148 publications using Altmetric.com and evaluated the content that it tracked. This paper describes the ASCL altmetrics experiment and reports on its results.

1 Introduction

All research at Dutch universities is assessed on a regular basis following the Standard Evaluation Protocol (SEP). [1] From 2015 onwards, one of the protocol’s criteria for measuring research success is the societal impact of the research. But how can a research institute demonstrate that its research outputs have had societal impact? How can it prove that its outputs have not only been made publicly available, but that the public is using them? This question had to be tackled by the African Studies Centre Leiden (ASCL) [2] in early 2017, when it was evaluated by an external independent committee according to the SEP.

As traditional metrics do not provide an indication of public reach and influence, I decided to experiment with the new suite of alternative metrics – altmetrics – that were first proposed in 2010 (Priem et al. 2010). Altmetrics measure the number of times a research output is cited, tweeted about, liked, shared, bookmarked, viewed, downloaded, mentioned, favourited, reviewed, or discussed. The data is retrieved from a variety of (social) web services such as Facebook, Twitter, blogs, news media, or online reference management tools. Altmetrics analyse the impact of any research output (e.g. books, blogs, publications for the ’general public’, etc.) as opposed to the traditional way of assessing the impact of scientific output, which is primarily based on the analysis of journal articles.

There are a number of tools that capture altmetrics, such as Plum Analytics, [3] Impactstory, [4] and Altmetric.com. [5] I chose Altmetric.com for this experiment as it offers a free Altmetric Explorer account to academic librarians. [6] Furthermore, Altmetric scores – visualized in the colourful Altmetric doughnut [7] – increasingly appear on publishers’ websites, in institutional repositories, and even in library catalogues.

2 The ASCL altmetrics experiment

2.1 Data and method

In May 2017, I analyzed the presence of altmetric indicators for 148 publications. The dataset consisted of the five most cited publications per ASCL researcher in the last five years (2012-2016) and was based on the data collected in Google Scholar for a benchmarking exercise, which also had to be carried out for the ASCL evaluation. As Altmetric.com scans a curated list of sources for mentions of academic research output by looking for DOIs [8] or other unique identifiers (such as Handle identifiers, which are often used in institutional repositories), the ASCL first gathered each publication’s unique identifier. The Altmetric score and the Altmetric sources were then determined and the tracked content was evaluated.
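For readers who want to reproduce the lookup step, Altmetric.com offers a free public REST endpoint keyed on a publication’s DOI. The sketch below assumes the v1 details endpoint (api.altmetric.com/v1/doi/<doi>, where an HTTP 404 means no attention has been tracked) and a handful of response field names as I understand the v1 API; both the endpoint details and the parsing are illustrative, not the procedure used in the experiment itself:

```python
import json
import urllib.request

API_BASE = "https://api.altmetric.com/v1/doi/"  # free endpoint, no key required


def fetch_attention(doi):
    """Retrieve the raw v1 JSON for a DOI (HTTP 404 means: no attention tracked)."""
    with urllib.request.urlopen(API_BASE + doi) as resp:
        return resp.read().decode("utf-8")


def parse_attention(raw):
    """Pull the headline indicators out of a v1 response."""
    data = json.loads(raw)
    return {
        "score": data.get("score"),
        "tweets": data.get("cited_by_tweeters_count", 0),
        "facebook": data.get("cited_by_fbwalls_count", 0),
        "blogs": data.get("cited_by_feeds_count", 0),
    }


# A response fragment shaped like the dataset's highest scoring article:
sample = '{"score": 15, "cited_by_tweeters_count": 11, "cited_by_feeds_count": 3}'
print(parse_attention(sample))
# → {'score': 15, 'tweets': 11, 'facebook': 0, 'blogs': 3}
```

With a list of DOIs, one call per publication yields the same score/source breakdown that the Altmetric Explorer account provides in bulk.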

2.2 Results

  • 14 out of 148 publications (all of them journal articles) showed some altmetric activity (i.e. 9.5% of all publications).
  • The source that provided the most Altmetric scores was Twitter (11), followed at a distance by Facebook (4), mentions in blogs (3), mentions in Google+ (1) and mentions in policy documents (1). [9]
  • Roughly one third of all ASCL researchers (11 of 35) had published a research output with at least one Altmetric mention. One ASCL researcher had published four articles with at least one mention.
  • The highest scoring article had an Altmetric attention score of 15.

The absence of any Altmetric score for 90.5% of the publications in the ASCL dataset was striking and raised questions about Altmetric.com’s method, its reliability, and the quality of its data. What are the conditions under which Altmetric.com picks up mentions of online research output? Why were ASCL’s in-house publications not found by Altmetric.com, even though they had a Leiden University Repository handle and had been tweeted about? Why were publications written by ASCL researchers not captured, even though they had a DOI or an ISBN and were being discussed on social media?

A closer look at Altmetric.com’s support websites (Altmetric 2017) revealed that a couple of conditions have to be met before Altmetric.com is able to capture altmetric activity. The prerequisites that enable Altmetric.com to pick up mentions vary per source. Take Wikipedia as an example: Altmetric.com tracks only articles on the English, Finnish, and Swedish-language Wikipedias. In order for Altmetric.com to find a mention, the Wikipedia article must cite the research output using a properly formatted Wikipedia citation tag. If the Wikipedia article only contains a link to the research output and not a valid citation tag, the mentioned research output will not be picked up.
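The difference can be illustrated with a toy check: a citation template exposes the identifier in a machine-readable field, a bare link does not. The regular expression and the wikitext samples below are my own sketch, not Altmetric.com’s actual parser:

```python
import re

# A {{cite ...}} template names the identifier explicitly (doi=...);
# a bare external link leaves nothing for an identifier-based tracker to match on.
CITE_TEMPLATE = re.compile(
    r"\{\{\s*cite\s+\w+[^}]*\|\s*doi\s*=\s*([^|}\s]+)", re.IGNORECASE
)


def doi_from_wikitext(wikitext):
    """Return the DOI from a citation template, or None for a bare link."""
    m = CITE_TEMPLATE.search(wikitext)
    return m.group(1) if m else None


tagged = "{{cite journal |title=Example |doi=10.1080/02589001.2017.1324620}}"
bare = "[https://doi.org/10.1080/02589001.2017.1324620 the article]"

print(doi_from_wikitext(tagged))  # the DOI is recoverable from the template
print(doi_from_wikitext(bare))    # None: only a link, no citation tag
```

The same publication can thus be visible or invisible to the tracker depending purely on how the Wikipedia editor formatted the reference.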

To enable Altmetric.com to track content on a publisher’s website, at least two supported metadata tags must be present in the source code of the site, one of which must be an identifier. If these metadata tags are absent, Altmetric.com is not able to link a publication to its social media attention. [10]
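A minimal sketch of such a check, assuming the Highwire-style citation_* meta tags (citation_title, citation_doi) that many scholarly pages carry; the exact tag set Altmetric.com supports is described on its support site, so treat the tag names here as an assumption:

```python
from html.parser import HTMLParser


class CitationMetaParser(HTMLParser):
    """Collect <meta name="citation_*"> tags from a publisher page."""

    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = d.get("name", "")
            if name.startswith("citation_"):
                self.meta[name] = d.get("content")


def is_trackable(html):
    """True if the page carries an identifier tag plus at least one other meta tag."""
    p = CitationMetaParser()
    p.feed(html)
    return "citation_doi" in p.meta and len(p.meta) >= 2


page = """<html><head>
<meta name="citation_title" content="An Example Article">
<meta name="citation_doi" content="10.1080/02589001.2017.1324620">
</head><body></body></html>"""

print(is_trackable(page))  # True
```

A page that omits the identifier tag fails this check, which mirrors the situation described above: the social media attention exists, but cannot be linked back to the publication.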

In the ASCL case, tweets by the institute’s social media staff about new publications, which were expected to be traced by Altmetric.com, were not captured because these tweets linked to a news item on the ASCL website. They did not contain a DOI and did not link to the original research output page, i.e. the publisher’s website. As many researchers, policymakers, and members of the general public tweet about a publication without using a DOI, it is quite possible that more tweets about ASCL research output have been missed.

Not a single ASCL in-house publication had been captured by Altmetric.com. The reason for the absence of any social media attention for these publications was simple yet surprising: Leiden University’s repository had not been ’whitelisted’. This means that it was not on Altmetric.com’s curated list of sources for mentions of academic research output and had thus not been tracked — a fact the ASCL did not anticipate. Even if the ASCL had wanted to, it could not have checked Altmetric.com’s curated list of sources, as this list is not publicly available. [11]

When measuring the impact of a research institution, not only the reliability of the data for each publication (and other research output) matters, but also whether the entire research output is taken into account. Looking at the results of the ASCL data set, it is notable that Altmetric.com only captured journal articles and no other publication types. The ASCL’s (scholarly) output does not exclusively consist of journal articles; on the contrary, roughly two thirds of its research output comprises books, book chapters, research reports, working papers, etc. Even though altmetrics are set up to track all research output, in practice the likelihood that Altmetric.com captures output other than journal articles seems small. This is problematic for an area studies institute that focuses on the humanities and social sciences and whose researchers publish in a wide variety of media.

The highest scoring article in the test set had an attention score of 15 (in May 2017). Is this a ’good’ result? Altmetric.com (Altmetric 2017) rightly emphasizes that, much as with traditional citation scores, one cannot really say a score is ’good’, as it measures attention, which can be positive or negative. Furthermore, the average score varies by discipline and journal: an article in a natural science or medical journal such as Science, Nature, or The Lancet will typically score much higher than one in a smaller journal in African Studies. [12]

In order to get a good idea of the meaning of the Altmetric score, I also considered the content and context of the information that is shared about ASCL research output.

Twitter contributed the most to the ASCL Altmetric scores. The majority of the tweets were mere announcements by the author, the publisher, or a third party that a new article had been published, and did not contain any judgement, comment, or question. Less than an estimated 20% of the tweets included some kind of appraisal, namely a recommendation to read the article. Even though the articles were widely shared on Twitter – in one case among more than 100,000 followers – this does not automatically mean that the articles were also read by many people or that they had an impact on science and/or society. I found no indication that any of the tweets resulted in, for example, a newspaper article, or attracted the attention of policymakers.

Two of the three blog posts captured by Altmetric.com appeared on distinguished blogs; the third was more of a false positive. The article on the national security of a West African country was, according to Altmetric.com, cited in a blog post (apparently) about computational biology entitled ’Article_Sep.2 -ß’. As the blog no longer exists, it is impossible to determine exactly why Altmetric.com picked up this post. However, thanks to the Internet Archive it is possible to learn more about the blog and its content. [13] The blog, with a German motto, published posts on natural science news and music in English and Japanese. It included a category ’Oxford Journals’ which may have contained the article’s DOI. This ’false’ hit contributes substantially to the article’s Altmetric score, as blog posts count for much more than Facebook posts or tweets. It derived from a blog dedicated to science communication in a very broad sense (covering ’anything and everything’), which raises questions about Altmetric.com’s criteria for adding blogs to its curated list.

To adjust the score of tweets, Altmetric.com applies ‘modifiers’ based on three principles (Altmetric 2017): reach (how many people are likely to see the tweet); promiscuity (how often a person tweets about research output); and bias (whether an account tweets about lots of papers from the same journal or domain). It might be an idea to also introduce modifiers to the score of blogs, to differentiate between posts that are mere announcements and ’real’ (scientific) articles.
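Altmetric.com’s actual weights are not public, so the following sketch uses invented thresholds and multipliers purely to illustrate how the three principles could combine into a single modifier applied to a tweet’s contribution:

```python
def tweet_modifier(followers, research_tweets_per_day, same_journal_share):
    """Combine the three principles into one multiplier (toy values).

    reach: larger audiences count somewhat more;
    promiscuity: accounts that tweet about papers constantly count less;
    bias: accounts pushing papers from a single journal count far less.
    """
    reach = 1.0 if followers < 1000 else 1.1            # modest boost for big audiences
    promiscuity = 1.0 if research_tweets_per_day < 10 else 0.5
    bias = 1.0 if same_journal_share < 0.5 else 0.25    # heavy discount for journal feeds
    return reach * promiscuity * bias


# An engaged reader vs. a publisher's automated feed:
print(tweet_modifier(800, 2, 0.1))      # 1.0 (full weight)
print(tweet_modifier(50_000, 40, 0.9))  # 1.1 * 0.5 * 0.25 = 0.1375
```

The same idea carried over to blogs could, for instance, discount posts that merely relay a table of contents while leaving substantive discussions at full weight.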

3 Conclusion

Altmetric.com discovered valuable mentions of ASCL research that the institute would otherwise not have noticed. But the data captured by Altmetric.com was, in general, incomplete and too promotional to build an evaluation report on the societal impact of ASCL research output upon. Altmetric.com is an instrument that can help researchers and research institutes discover a pearl in an oyster, but in African Studies, at present, one needs a good portion of luck to spot a gem. In order to write a report on the societal impact of ASCL research output, novel approaches in research evaluation are needed. The ASCL looks forward to the outcomes of the Centre for Science and Technology Studies’ new research programme ’Valuing science and scholarship: Meaningful relations between quality, impact and indicators’ (Centre for Science and Technology Studies 2017), which aims to develop the knowledge and tools to overcome current deficiencies in research assessments. Europeana’s recently published Impact Playbook (Fallon and Verwayen 2017) – developed primarily for the cultural heritage sector – might also be a useful instrument for assessing the societal impact of research, but this remains to be tested. [14]

[1] For more information on the SEP, see:

[2] The African Studies Centre Leiden is a knowledge institute in the Netherlands that undertakes research on and teaching about Africa, and aims to promote a better understanding of and insight into historical, current, and future developments in Africa. As of 1 January 2016, the institute is part of Leiden University. For more information, see:

[3] Plum Analytics ( gathers metrics about research from dozens of scholarly sources, media channels and social media tools, and categorizes them into Usage, Captures, Mentions, Social Media and Citations. The company is owned by Elsevier.

[4] Impactstory ( is a nonprofit organisation launched by Jason Priem and Heather Piwowar with the intention to make scholarly research more open, accessible, and reusable. The organisation creates and supports free services including Impactstory Profiles that allows researchers to create their own impact profiles online (including altmetrics; see for example

[5] Altmetric.com, or Altmetric (, is a London-based commercial company that tracks mentions of online research output and provides tools and services to institutions, publishers, researchers, funders, and other organizations to monitor this activity. Some tools are available free of charge.

[6] The librarian version of the Explorer can be used to browse and filter all of the research outputs in Altmetric.com’s database, but does not include any institutional views, functionality, or reporting. A great advantage of this version for librarians is the possibility to search up to 50 unique identifiers in a single query. For more information, see:

[7] The colors of the Altmetric doughnut represent the sources in which the publication was mentioned (red for news outlets, blue for Twitter, and so on), and the score in the centre indicates the overall attention an item has received. The Altmetric score is a weighted count and not just a sum of all the mentions. It reflects both the quantity (higher attention, higher score) and quality (weighting according to different sources) of the attention received by each item; for example, a newspaper article has a higher weighting than a tweet.

[8] A digital object identifier (DOI) is a unique alphanumeric string (e.g. 10.1080/02589001.2017.1324620) assigned by a registration agency (the International DOI Foundation, to identify content and provide a persistent link to its location on the Internet.

[9] The low coverage of Altmetric sources, except for Twitter, is in line with Costas, Zahedi, and Wouters (2015:2007) and other studies.

[10] Kramer points out that altmetrics data for the same publication can vary widely depending on which altmetrics provider tracked the data (Kramer 2015:18). Moreover, providers can differ in the data they collect from the same source, for example Altmetric traces Facebook posts whereas Plum Analytics captures Facebook likes and comments.

[11] Leiden University’s repository is whitelisted now, thanks to the friendly and quick-to-respond helpdesk of Altmetric.com, which the ASCL contacted when it noticed the absence of any social media attention for the ASCL in-house publications.

[12] The highest scoring paper in 2016 was an article by former US president Barack Obama about US health care reform. It received the highest Altmetric attention score ever tracked, 8,063, compared with a score of 4,912 for the second-most popular article, on medical errors being the third leading cause of death in the US (see

[13] See*/

[14] ASCL finally ’measured’ the societal impact of its research output in a ’traditional’ way, by counting the number of outputs produced for the ’general public’ and by comparing the numbers with those counted in earlier evaluation reports.


Altmetric (2017). Altmetric data. URL:

Centre for Science and Technology Studies (2017). Valuing science and scholarship: meaningful relations between quality, impact and indicators. URL:

Costas, R., Zahedi, Z., and Wouters, P. (2015). Do "altmetrics" correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. In: Journal of the Association for Information Science and Technology 66.10, pp. 2003–2019. DOI: 10.1002/asi.23309.

Fallon, J. and Verwayen, H. (2017). Introducing the Impact Playbook: the cultural heritage professionals’ guide to assessing your impact. URL:

Kramer, B. (2015). Altmetrics. Previously published in Handboek Informatiewetenschap. DOI: 10.5281/zenodo.31899.

Priem, J., Taraborelli, D., Groth, P., and Neylon, C. (2010). Altmetrics: A manifesto. URL: