Improving Information Extraction on Business Documents with Specific Pre-Training Tasks - Archive ouverte HAL
Conference Paper, Year: 2022

Improving Information Extraction on Business Documents with Specific Pre-Training Tasks

Abstract

Transformer-based Language Models are widely used in Natural Language Processing related tasks. Thanks to their pre-training, they have been successfully adapted to Information Extraction in business documents. However, most pre-training tasks proposed in the literature for business documents are too generic and not sufficient to learn more complex structures. In this paper, we use LayoutLM, a language model pre-trained on a collection of business documents, and introduce two new pre-training tasks that further improve its capacity to extract relevant information. The first is aimed at better understanding the complex layout of documents, and the second focuses on numeric values and their order of magnitude. These tasks force the model to learn better contextualized representations of the scanned documents. We further introduce a new post-processing algorithm to decode BIESO tags in Information Extraction that performs better with complex entities. Our method significantly improves extraction performance on both public (from 93.88 to 95.50 F1 score) and private (from 84.35 to 84.84 F1 score) datasets composed of expense receipts, invoices, and purchase orders.
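For context only, the sketch below shows a generic BIESO decoder (Begin, Inside, End, Single, Outside tags grouped into entity spans). It is a minimal illustration of the tagging scheme the abstract refers to, not the improved post-processing algorithm proposed in the paper; the labels and tag strings are hypothetical.

```python
# Minimal, generic BIESO decoder (illustration only): greedily groups
# token-level tags into (label, start, end) spans with an exclusive end.
# The paper's improved decoding for complex entities is not reproduced here.
from typing import List, Tuple

def decode_bieso(tags: List[str]) -> List[Tuple[str, int, int]]:
    spans = []
    start, label = None, None
    for i, tag in enumerate(tags):
        prefix, _, lab = tag.partition("-")
        if prefix == "S":                      # single-token entity
            spans.append((lab, i, i + 1))
            start, label = None, None
        elif prefix == "B":                    # open a multi-token entity
            start, label = i, lab
        elif prefix in ("I", "E") and start is not None and lab == label:
            if prefix == "E":                  # close the open entity
                spans.append((label, start, i + 1))
                start, label = None, None
        else:                                  # "O" or an inconsistent tag
            start, label = None, None
    return spans

# Hypothetical example with two entities, "total_amount" and "date".
tags = ["O", "B-total_amount", "E-total_amount", "O", "S-date"]
print(decode_bieso(tags))  # [('total_amount', 1, 3), ('date', 4, 5)]
```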
Main file
DAS_2022_DOUZON.pdf (2.14 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03676134, version 1 (25-05-2022)
hal-03676134, version 2 (14-06-2022)

Identifiers

Cite

Thibault Douzon, Stefan Duffner, Christophe Garcia, Jérémy Espinas. Improving Information Extraction on Business Documents with Specific Pre-Training Tasks. Document Analysis Systems: 15th IAPR International Workshop, DAS 2022, May 2022, La Rochelle, France. pp. 111-125, ⟨10.1007/978-3-031-06555-2_8⟩. ⟨hal-03676134v2⟩
146 Views
753 Downloads
