
Evaluating FAIR Maturity Through a Scalable, Automated, Community-Governed Framework

By Mark D Wilkinson, Michel Dumontier, Susanna-Assunta Sansone, Luiz Olavo Bonino da Silva Santos, Mario Prieto Godoy, Dominique Batista, Peter McQuilton, Tobias Kuhn, Philippe Rocca-Serra, Mercè Crosas, Erik Schultes

Posted 28 May 2019
bioRxiv DOI: 10.1101/649202 (published DOI: 10.1038/s41597-019-0184-5)

Transparent evaluations of FAIRness are increasingly required by a wide range of stakeholders, from scientists to publishers, funding agencies and policy makers. We propose a scalable, automatable framework to evaluate digital resources that encompasses measurable indicators, open source tools, and participation guidelines, which come together to accommodate domain-relevant, community-defined FAIR assessments. The components of the framework are: (1) Maturity Indicators - community-authored specifications that delimit a specific, automatically-measurable FAIR behavior; (2) Compliance Tests - small Web apps that test digital resources against individual Maturity Indicators; and (3) the Evaluator, a Web application that registers, assembles, and applies community-relevant sets of Compliance Tests against a digital resource, and provides a detailed report about what a machine "sees" when it visits that resource. We discuss the technical and social considerations of FAIR assessments, and how these translate into our community-driven infrastructure. We then illustrate how the output of the Evaluator tool can serve as a roadmap to assist data stewards in incrementally and realistically improving the FAIRness of their resources.
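To make the component roles concrete, here is a minimal sketch of what a single Compliance Test might compute. This is illustrative only: the indicator name, metadata shape, and report fields are assumptions, not the framework's actual API (the real Compliance Tests are deployed as small Web apps that the Evaluator invokes).

```python
import json

def check_license_indicator(metadata_json: str) -> dict:
    """Hypothetical Compliance Test for a Maturity Indicator such as
    'metadata declares an explicit license'. Takes the metadata a
    machine retrieves when visiting a resource and returns a small
    pass/fail report of the kind the Evaluator could aggregate."""
    indicator = "metadata-includes-license"  # assumed indicator name
    try:
        metadata = json.loads(metadata_json)
    except json.JSONDecodeError:
        # If the metadata is not even parseable, the machine "sees" nothing.
        return {"indicator": indicator, "passed": False,
                "comment": "metadata is not parseable JSON"}
    license_value = metadata.get("license")
    passed = isinstance(license_value, str) and license_value.strip() != ""
    return {"indicator": indicator, "passed": passed,
            "comment": "license found" if passed else "no license declared"}

# A resource whose metadata declares a license passes this test;
# one without a license (or with unparseable metadata) fails it.
report = check_license_indicator('{"title": "Dataset X", "license": "CC-BY-4.0"}')
```

An Evaluator-style report would then be an assembly of many such per-indicator results, which is what lets the output double as an incremental improvement roadmap for data stewards.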

Download data

  • Downloaded 674 times
  • Download rankings:
    • All-time, site-wide: 46,514
    • All-time, in bioinformatics: 4,831
    • Year to date, site-wide: 127,218
    • Since beginning of last month, site-wide: 152,679
