Measuring societal impact of science in figures
Measuring the impact of universities is extremely complicated. A measure has to apply to all scientific disciplines, and it should capture both direct impact and impact that may only materialise over a timespan of a hundred years or more. Once you realise that it may not be possible to measure the full impact of universities, it makes sense to focus instead on the “road to societal impact”, which may be defined as “knowledge exchange” or, as we tend to call it in the Netherlands, “valorisatie” (valorisation). What we mean to define is the process by which the societal impact of scientific knowledge is realised.
ScienceWorks started measuring the interactions between science and society in 2011, on a bi-annual basis. We began with the traditional output indicators in three categories:
– The most entrepreneurial university, measuring the size and number of university spin-off companies, the jobs created in science parks, and the size of the university’s investment funds. We also included the number of patent applications.
– The most cooperative university, which measured all contract income, license income, public-private consortia and public-private publications.
– The best communicating university, which measured the appearances of universities in the news on radio, TV and other popular media, excluding the scientific literature.
After correcting for the number of researchers employed by a university, technical universities appeared to score much better in the first two categories. The social sciences, however, seemed to be far more popular in the media. Overall, the technical sciences scored better on the outcomes related to financial investments and job creation.
For the 2017 ranking, some knowledge-transfer experts from classic universities asked ScienceWorks to pay more specific attention to measuring the societal impact of the social sciences and humanities. We therefore created a new category, “Policy Impact”, which measured references to Dutch universities in European Parliament documents, Dutch national Parliament documents and municipal council documents, as well as the advisory functions of scientists on national government advisory boards. Not surprisingly, the social sciences, more than the humanities, scored higher here than the technical and life sciences.
Balancing the outcomes
When combining all the subcategories, one may argue that the overall picture gives a broad indication of the “highest societal impact at large”. In doing so, we had to decide whether the four categories should carry equal weight. As mentioned above, the indicators for “entrepreneurship” and “cooperation” are more focused on financial investments, job creation and the development of start-up companies, while the other two are more strongly focused on appearances in the media or in the political arena. Should we value these equally? It was clear that the classic universities (covering all scientific disciplines) would score higher if we did. Since we still had to rely on public figures that are strongly based on contract income, we reasoned as follows. If a scientist publishes research outcomes, there are four different channels for doing so:
– Publications in scientific literature
– Publication in a research or advisory report which is paid for
– Publication in a popular format, available on the market (book, lecture)
– Publication of a new finding in the media
For the first category, we have no means to measure the impact in society. For the other three, we decided that publications for which the consumer has paid may indicate a higher societal value than a (free) internet publication. That is why we gave the outcomes for “entrepreneurship” and “cooperation” – both measuring more business-like factors – a weight of 30% each, and the outcomes for “communication” and “policy impact” a weight of 20% each. Nevertheless, the overall winner of this national impact ranking in the Netherlands – Twente University – would also have won if we had weighted the four categories equally.
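The 30/30/20/20 weighting described above amounts to a simple weighted sum of the four category scores. The following sketch illustrates the arithmetic; the category scores are invented for illustration and are not real ranking data.

```python
# Weights as described in the text: 30% each for the two business-like
# categories, 20% each for communication and policy impact.
WEIGHTS = {
    "entrepreneurship": 0.30,
    "cooperation": 0.30,
    "communication": 0.20,
    "policy_impact": 0.20,
}

def overall_score(scores: dict[str, float]) -> float:
    """Weighted sum of the four per-category scores."""
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

# Hypothetical per-category scores (0-100 scale), purely illustrative.
example = {"entrepreneurship": 80, "cooperation": 75,
           "communication": 60, "policy_impact": 55}
print(round(overall_score(example), 1))  # → 69.5
```

With equal weights (25% each), the same university would score 67.5 here, which shows how the weighting shifts the ranking towards universities that are strong in the business-like categories.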
Knowledge transfer versus Impact
Looking at our findings, we may conclude that the “impact of science” can only be professionally measured on a quantitative basis when it is related to some kind of “direct impact”: attention in the media or the political arena, or contract-related investments from external parties who want something back for their investment in research or education. This also implies that “contract income”, which in many countries accounts for almost 25% of the funding of university research, is a meaningful indicator of impact. It comes from contract parties (business, charities, government users, the European Union) who trust in a concrete kind of “payback” in the near future.
This implies that financial investment in contract research, contract education, patents or start-ups is still the best indicator of impact, even though there is often no societal impact yet. It is an investment in knowledge transfer, and the investor itself is responsible for the impact on the end user: the consumer of a new product, a patient waiting for a new medicine, a policymaker who needs evidence, or a better understanding of society or of ourselves. The idea that financial investments in science from external parties are “just commercial” and not “societal” does not make sense. In the Netherlands, the EU investments per researcher went up from 17% to 22%, and the business investments per researcher went down from €9,800 to €7,900 (19.5% of all contract income). All of these parties want to see a return on their investments, although the final impact on society is much harder to measure in figures.
We need to professionalise the transfer of science to society, and we need to stimulate the quality of these interactions and measure it. That is why we strongly support the Knowledge Exchange Framework now being developed in the UK.
Even if we cannot measure the final societal impact of science, we should warmly embrace insights into how the roads towards impact are substantiated, as concretely as possible.
For more insight into the impact ranking of the Dutch universities, please check the Dutch press release here (English translation available here).
Frank Zwetsloot is the CEO of ScienceWorks and founder of ASTP and AESIS. Anika Duut van Goor is the General Manager of AESIS.