Being a responsible data provider

We’re committed to following three simple rules:

1. We’ll be open about the scope of our data, and transparent about its limitations and what we’re doing to address them.

2. We’ll support the principle that quantitative data should be used to support, evidence or challenge qualitative assessments, not replace them.

3. We’ll work with the bibliometrics and research assessment communities to better understand the data we collect, and to develop research-backed approaches to using it.

What this means in practice

1. Being open about the data

We believe that being a responsible data provider means sharing detailed information about our data and its limitations with our users and the wider community.

Where we’re extracting names, citations or keyword matches from policy documents, we’ll provide an audit trail so you can see exactly which paragraph, page and document your results come from.
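To make this concrete, here is a minimal sketch of the kind of information such an audit-trail entry could carry. The field names and values below are purely illustrative assumptions, not Overton’s actual schema or API.

# Hypothetical illustration only: these field names are not Overton's real schema.
# An audit-trail entry for a policy citation might record where the match was found.
audit_trail_entry = {
    "policy_document_title": "Example national climate strategy",
    "policy_document_url": "https://example.org/climate-strategy.pdf",
    "page": 14,
    "paragraph": 3,
    "matched_text": "...as demonstrated by Smith et al. (2021)...",
    "match_type": "citation",  # could also be an author name or keyword match
    "cited_output_doi": "10.1234/example",
}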

We hold a lot of other data used for filtering and reporting. Some of it is compiled or created in-house (e.g. policy document metadata, policy citations and topics), and some comes from external sources with their own known limitations (e.g. data on funding, affiliations and publishers). In both cases, we explain how things work and describe any relevant limitations in detailed, publicly available articles on our knowledge base at help.overton.io.

The knowledge base includes information about geographical coverage, languages, coverage through time, how we determine author affiliations, and our criteria for adding new sources. 

The database changes over time, so we’ll keep the help pages under review to make sure they don’t fall out of date. Actually interpreting all this information, and deciding if and how it materially affects different types of analysis, can be difficult for non-specialists. Inside the Overton app we present a more concise view of potential data biases or limitations through “data notes” at the bottom of each page when you search in the policy or article tabs. Where this isn’t enough, we’ll make sure you can always get advice from us through our support channels.

2. Quantitative data should support, not replace, qualitative approaches

Measuring the “success” of researchers, academic departments and institutions using quantitative indicators has been a controversial feature of the research landscape for many years. As a provider of data and indicators related to research, we are acutely aware of our potential impact on, and wider responsibilities to, the scholarly community.

We think there’s a lot of value in quantitative indicators around research/policy interactions, especially when used by people familiar with the space and with existing bibliometric approaches. That said, they need to be framed, understood and used correctly.

In particular, they should be used in conjunction with qualitative, human-led approaches. We think Overton’s data can help highlight or pinpoint areas of interest, challenge or support human analysis, or provide evidence for case studies and other narratives.

We’ve tried to design the Overton platform with these kinds of use cases in mind.

We set a high bar for including indicators or metrics within the Overton app, weighing their utility against the risk of misuse, and we’ll be careful about how we present and market any that we do include.

We know that some users will use the data to create their own indicators and metrics, for example to benchmark their institution against others. Where that’s the case, we’re committed to working closely with clients and the wider sector to promote best practice for the responsible use of policy citation data.

3. Working with the wider community

We’re always keen to collaborate with our customers, funders and the bibliometrics and research administrator communities to further develop research-backed models for how to understand and use policy citation data.

The Overton data is free for non-commercial academic use, i.e. if you’re using it for an academic study, or anything else that ends up in a journal, poster or other output. You can apply for researcher access to the Overton app through the trial access form, letting us know that you’re planning to use it for this purpose.

For larger, more complex projects we can provide slices of the data as one-off data snapshots.

More generally, we remain engaged with the academic community: the team regularly reviews papers for journals and conferences, publishes peer-reviewed research and attends relevant conferences.

You can see a list of papers and projects that have used Overton data in our Zotero library.

Demonstrating our adherence to these commitments

We’re still working out exactly what this will look like, but it will likely include discussions with users, feedback surveys and regular updates on relevant activities and product developments. We’ll provide regular public reports on our progress and would welcome any thoughts on areas to prioritise.

Our commitments to academic sector standards

We are signatories to DORA, we endorse the Leiden Manifesto and we maintain an updated NISO Altmetrics Code of Conduct self-reporting table.

Our commitment to DORA: what this means for Overton

Our commitment to the Leiden Manifesto principles: what this means for Overton

NISO Altmetrics Code of Conduct self-reporting table for Overton

 

If you have any feedback on our approach to being a responsible data provider, we’d love to hear from you. Email support@overton.io
