Journal article in Language Resources and Evaluation, 2019

Beyond lexical frequencies: using R for text analysis in the digital humanities

Taylor Arnold
Nicolas Ballier
Paula Lissón
Lauren Tilton

Abstract

This paper presents a combination of R packages—user contributed toolkits written in a common core programming language—to facilitate the humanistic investigation of digitised, text-based corpora. Our survey of text analysis packages includes those of our own creation (cleanNLP and fasttextM) as well as packages built by other research groups (stringi, readtext, hyphenatr, quanteda, and hunspell). By operating on generic object types, these packages unite research innovations in corpus linguistics, natural language processing, machine learning, statistics, and digital humanities. We begin by elaborating on the theoretical benefits of R as a glue language for bringing together several areas of expertise and compare it to linguistic concordancers and other tool-based approaches to text analysis in the digital humanities. We then showcase the practical benefits of an ecosystem by illustrating how R packages have been integrated into a digital humanities project. Throughout, the focus is on moving beyond the bag-of-words, lexical frequency model by incorporating linguistically driven analyses in research.
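To give a concrete sense of the kind of pipeline the abstract describes, the short R sketch below (an illustrative example, not code from the paper) annotates a toy two-document corpus with cleanNLP and then builds a lemma-level document-feature matrix with quanteda; the udpipe backend, the toy texts, and the restriction to content-word parts of speech are assumptions made for the example.

# Illustrative sketch (not from the paper): linguistically informed counts
# instead of raw word-form frequencies.
library(cleanNLP)   # linguistic annotation: tokens, lemmas, POS tags
library(quanteda)   # tokens objects and document-feature matrices

txt <- c(doc1 = "R unites corpus linguistics and natural language processing.",
         doc2 = "The packages exchange generic object types such as data frames.")

cnlp_init_udpipe()          # udpipe backend; fetches an English model on first use
anno <- cnlp_annotate(txt)  # returns plain data frames rather than opaque objects
head(anno$token)            # columns include doc_id, token, lemma, upos, ...

# Move beyond the bag-of-words model: keep content words and count lemmas
content <- subset(anno$token, upos %in% c("NOUN", "VERB", "ADJ"))
dfm_lemma <- dfm(as.tokens(split(content$lemma, content$doc_id)))
dfm_lemma

Because cleanNLP returns tidy data frames, the same annotations could just as easily feed stringi string operations, hunspell spell-checking, or statistical models, which is the interoperability argument the paper develops.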

Domains

Linguistics
No file deposited

Dates and versions

hal-03201084, version 1 (17-04-2021)

Identifiers

HAL Id: hal-03201084
DOI: 10.1007/s10579-019-09456-6

Cite

Taylor Arnold, Nicolas Ballier, Paula Lissón, Lauren Tilton. Beyond lexical frequencies: using R for text analysis in the digital humanities. Language Resources and Evaluation, 2019, 53 (4), pp.707-733. ⟨10.1007/s10579-019-09456-6⟩. ⟨hal-03201084⟩