Journal article: Journal of Signal Processing Systems (JSPS), 2022

Generating Efficient FPGA-based CNN Accelerators from High-Level Descriptions

Abstract

The wide landscape of memory-hungry and compute-intensive Convolutional Neural Networks (CNNs) is changing rapidly. CNNs continuously evolve through new layers and optimization strategies that improve accuracy, reduce memory and computational requirements, or both. Moving such algorithms on-device enables smarter edge products. However, hardware designers find this constant evolution hard to keep up with, which leaves CNN accelerators one step behind. A growing number of approaches use reconfigurable hardware, such as FPGAs, to design customized inference accelerators better suited to newly emerging CNN algorithms. Moreover, high-level design techniques such as High-Level Synthesis (HLS) are adopted to address the time-consuming RTL-based design flow and the design space exploration problem. HLS generates RTL source code from high-level descriptions. This paper presents a hardware accelerator generation framework targeting FPGAs that proceeds in two steps. The first step characterizes the input CNN and produces hardware-aware metrics. The second step exploits these metrics to produce optimized C-HLS source code for each layer of the input CNN, then uses an HLS tool to generate a synthesizable RTL representation of the inference accelerator. The main goal of this approach is to narrow the gap between evolving CNNs and their hardware accelerators, thus reducing the design time of new systems.
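
To make the second step more concrete, below is a minimal sketch of the kind of per-layer C-HLS convolution kernel such a generator could emit, assuming a Vitis-style HLS toolchain. The layer dimensions, the floating-point data type and the pragma choices are illustrative assumptions for this sketch and do not reproduce the code actually generated by the framework.

// Illustrative per-layer C-HLS kernel (sketch only): a small "valid"
// convolution with a pipelined output loop. All sizes and types below are
// assumptions, not the framework's generated output.
#define IN_H   32   // input feature-map height (assumed)
#define IN_W   32   // input feature-map width  (assumed)
#define IN_C    3   // input channels           (assumed)
#define OUT_C  16   // output channels          (assumed)
#define K       3   // kernel size              (assumed)

typedef float data_t;  // a real generator would typically select a fixed-point type

void conv_layer(const data_t in[IN_C][IN_H][IN_W],
                const data_t weights[OUT_C][IN_C][K][K],
                const data_t bias[OUT_C],
                data_t out[OUT_C][IN_H - K + 1][IN_W - K + 1]) {
  for (int oc = 0; oc < OUT_C; oc++) {
    for (int oy = 0; oy < IN_H - K + 1; oy++) {
      for (int ox = 0; ox < IN_W - K + 1; ox++) {
#pragma HLS PIPELINE II=1
        data_t acc = bias[oc];
        // Multiply-accumulate over the receptive field of one output pixel.
        for (int ic = 0; ic < IN_C; ic++)
          for (int ky = 0; ky < K; ky++)
            for (int kx = 0; kx < K; kx++)
              acc += weights[oc][ic][ky][kx] * in[ic][oy + ky][ox + kx];
        out[oc][oy][ox] = acc;
      }
    }
  }
}

In the framework described above, choices such as loop ordering, unrolling factors or fixed-point word lengths for such a kernel would be driven by the hardware-aware metrics produced in the first step rather than hard-coded as in this sketch.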

Domains

Electronics
No file deposited

Dates and versions

cea-03760568, version 1 (25-08-2022)

Identifiers

HAL Id: cea-03760568
DOI: 10.1007/s11265-022-01797-w

Cite

Nermine Ali, Jean-Marc Philippe, Benoit Tain, Philippe Coussy. Generating Efficient FPGA-based CNN Accelerators from High-Level Descriptions. Journal of Signal Processing Systems (JSPS), 2022, pp.1. ⟨10.1007/s11265-022-01797-w⟩. ⟨cea-03760568⟩