Preprint, Working Paper. Year: 2023

The Data Provenance Initiative: A Large Scale Audit of Dataset Licensing & Attribution in AI

Shayne Longpre
  • Role: Author
Robert Mahari
  • Role: Author
Anthony Chen
  • Role: Author
Naana Obeng-Marnu
  • Role: Author
Damien Sileo
  • Role: Author
William Brannon
  • Role: Author
Niklas Muennighoff
  • Role: Author
Nathan Khazam
  • Role: Author
Jad Kabbara
  • Role: Author
Kartik Perisetla
  • Role: Author
Alexis Wu
  • Role: Author
Enrico Shippole
  • Role: Author
Kurt Bollacker
  • Role: Author
Tongshuang Wu
  • Role: Author
Luis Villa
  • Role: Author
Sandy Pentland
  • Role: Author
Sara Hooker
  • Role: Author

Abstract

The race to train language models on vast, diverse, and inconsistently documented datasets has raised pressing concerns about the legal and ethical risks for practitioners. To remedy these practices, which threaten data transparency and understanding, we convene a multidisciplinary effort between legal and machine learning experts to systematically audit and trace 1800+ text datasets. We develop tools and standards to trace the lineage of these datasets, including their source, creators, successive license conditions, properties, and subsequent use. Our landscape analysis highlights the sharp divides in composition and focus of commercially open vs. closed datasets, with closed datasets monopolizing important categories: lower-resource languages, more creative tasks, richer topic variety, and newer and more synthetic training data. This points to a deepening divide in the types of data made available under different license conditions, and heightened implications for jurisdictional legal interpretations of copyright and fair use. We also observe frequent miscategorization of licenses on widely used dataset hosting sites, with license omission rates of 70%+ and error rates of 50%+. This points to a crisis in misattribution and informed use of the most popular datasets driving many recent breakthroughs. As a contribution to ongoing improvements in dataset transparency and responsible use, we release our entire audit, along with an interactive UI, the Data Provenance Explorer, which allows practitioners to trace and filter on data provenance for the most popular open-source finetuning data collections: www.dataprovenance.org.
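
As a purely illustrative sketch (not the Data Provenance Explorer's actual API or schema), the short Python example below shows the kind of provenance filtering the released audit is meant to support: given a hypothetical list of dataset records with assumed name, license_category, and languages fields, it keeps only the datasets whose license category permits commercial use.

# Hypothetical sketch: filter dataset metadata by license category and language.
# Field names ("license_category", "languages") are illustrative assumptions,
# not the Data Provenance Initiative's actual schema.
from typing import Dict, List, Optional, Set

DATASETS: List[Dict] = [
    {"name": "dataset_a", "license_category": "commercial", "languages": ["en"]},
    {"name": "dataset_b", "license_category": "non-commercial", "languages": ["sw", "en"]},
    {"name": "dataset_c", "license_category": "unspecified", "languages": ["en", "fr"]},
]

def filter_datasets(records: List[Dict],
                    allowed_licenses: Set[str],
                    language: Optional[str] = None) -> List[Dict]:
    """Keep records whose license category is allowed and that cover the language."""
    selected = []
    for record in records:
        if record["license_category"] not in allowed_licenses:
            continue
        if language is not None and language not in record["languages"]:
            continue
        selected.append(record)
    return selected

# Example: datasets usable for commercial finetuning.
print([r["name"] for r in filter_datasets(DATASETS, {"commercial"})])  # ['dataset_a']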
Main file
2310.16787.pdf (3.61 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04290233, version 1 (16-11-2023)

Identifiers

  • HAL Id: hal-04290233, version 1

Cite

Shayne Longpre, Robert Mahari, Anthony Chen, Naana Obeng-Marnu, Damien Sileo, et al. The Data Provenance Initiative: A Large Scale Audit of Dataset Licensing & Attribution in AI. 2023. ⟨hal-04290233⟩