Conference paper, 2024

Federated Learning-Based Tokenizer for Domain-Specific Language Models in Finance

Abstract

The Federated Byte-level Byte-Pair Encoding (BPE) Tokenizer (FedByteBPE) leverages Federated Learning (FL) to train a language-model tokenizer across distributed datasets in a privacy-preserving manner. Each entity trains and refines its tokenizer locally, and a centralized server aggregates the resulting vocabularies, yielding a robust, domain-specific tokenizer without exposing raw data. Supported by theoretical analysis and empirical results on a real-world distributed financial dataset, our findings show that the federated tokenizer significantly outperforms both off-the-shelf and individually trained local tokenizers in vocabulary coverage. This highlights the potential of federated learning for training language-model tokenizers in privacy-sensitive settings.
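The abstract does not specify the exact FedByteBPE aggregation rule, so the following is only a minimal sketch of the overall flow it describes: each client trains a byte-level BPE tokenizer on its private corpus, and the server merges the learned vocabularies. It assumes the HuggingFace `tokenizers` library and a simple count-threshold union as the server-side merge; the names `client_corpora`, `aggregate_vocabularies`, and the `min_clients` threshold are hypothetical, not taken from the paper.

```python
# Sketch of the local-train / central-aggregate tokenizer flow.
# Assumption: a count-threshold union stands in for the paper's
# actual FedByteBPE aggregation rule, which may differ.
from collections import Counter
from tokenizers import ByteLevelBPETokenizer


def train_local_tokenizer(corpus, vocab_size=8000):
    """Client-side step: train a byte-level BPE tokenizer on private data."""
    tok = ByteLevelBPETokenizer()
    tok.train_from_iterator(corpus, vocab_size=vocab_size, min_frequency=2)
    return tok


def aggregate_vocabularies(local_tokenizers, min_clients=2):
    """Server-side step: merge locally learned vocabularies.

    Only token strings cross the network, never the clients' raw text,
    which is what makes the scheme privacy-preserving.
    """
    counts = Counter()
    for tok in local_tokenizers:
        counts.update(tok.get_vocab().keys())
    # Keep tokens learned independently by at least `min_clients` clients.
    return {token for token, c in counts.items() if c >= min_clients}


# Hypothetical usage: three clients holding disjoint financial corpora.
client_corpora = [
    ["interest rate swap maturity", "collateralized debt obligation"],
    ["interest rate futures", "swap spread widened"],
    ["rate hike priced in", "swap desk flows"],
]
local = [train_local_tokenizer(corpus) for corpus in client_corpora]
shared_vocab = aggregate_vocabularies(local)
print(f"{len(shared_vocab)} tokens in the aggregated vocabulary")
```

The threshold-based union is just one plausible merge strategy; a frequency-weighted ranking or an iterative re-training round on the merged vocabulary would fit the same protocol.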
No file deposited

Dates and versions

hal-04705757, version 1 (23-09-2024)

Identifiers

  • HAL Id: hal-04705757, version 1

Cite

Farouk Damoun, Hamida Seba, Radu State. Federated Learning-Based Tokenizer for Domain-Specific Language Models in Finance. ASONAM 2024: Advances in Social Networks Analysis and Mining, Sep 2024, Rende, Italy. ⟨hal-04705757⟩
