Learning Intermediate Representations using Graph Neural Networks for NUMA and Prefetchers Optimization
Abstract
There is a large space of NUMA and hardware-prefetcher configurations that can significantly impact the performance of an application. Previous studies have demonstrated how a model can automatically select configurations based on the dynamic properties of the code to achieve speedups. This paper demonstrates how the static Intermediate Representation (IR) of the code can guide NUMA/prefetcher optimizations without the prohibitive cost of performance profiling. We propose a method to create a comprehensive dataset that includes a diverse set of intermediate representations along with their optimal configurations. We then apply a graph neural network model to validate this dataset. We show that our static, IR-based model achieves 80% of the performance gains provided by strategies based on expensive dynamic performance profiling. We further develop a hybrid model that uses both static and dynamic information. Our hybrid model achieves the same gains as the dynamic models, but at a reduced cost, by profiling only 30% of the programs.
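To make the approach concrete, the sketch below shows one plausible way a graph neural network could map a static IR graph (nodes as instructions with one-hot opcode features, edges as control/data flow) to a joint NUMA/prefetcher configuration class. This is a minimal illustration, not the authors' implementation: the class name `IRConfigGNN`, the dense-adjacency mean aggregation, and the placeholder counts of opcodes and configurations are all assumptions made for the example.

```python
# Minimal sketch (hypothetical, not the paper's implementation) of a GNN that
# classifies a static IR graph into one of several NUMA/prefetcher configurations.
import torch
import torch.nn as nn

class IRConfigGNN(nn.Module):
    def __init__(self, num_opcodes: int, hidden: int = 64, num_configs: int = 8):
        super().__init__()
        self.embed = nn.Linear(num_opcodes, hidden)      # encode one-hot opcode features
        self.msg1 = nn.Linear(hidden, hidden)            # first message-passing layer
        self.msg2 = nn.Linear(hidden, hidden)            # second message-passing layer
        self.classify = nn.Linear(hidden, num_configs)   # per-graph configuration logits

    def propagate(self, h, adj):
        # Mean aggregation over neighbours using a dense adjacency matrix.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        return adj @ h / deg

    def forward(self, node_feats, adj):
        h = torch.relu(self.embed(node_feats))
        h = torch.relu(self.msg1(self.propagate(h, adj)))
        h = torch.relu(self.msg2(self.propagate(h, adj)))
        graph_repr = h.mean(dim=0)                       # read out a whole-graph embedding
        return self.classify(graph_repr)                 # logits over candidate configurations

# Toy usage: a 5-node IR graph, 10 possible opcodes, 8 candidate configurations.
num_nodes, num_opcodes = 5, 10
node_feats = torch.eye(num_opcodes)[torch.randint(0, num_opcodes, (num_nodes,))]
adj = (torch.rand(num_nodes, num_nodes) > 0.7).float()  # random stand-in for IR edges
model = IRConfigGNN(num_opcodes)
best_config = model(node_feats, adj).argmax().item()    # predicted configuration index
```

In a hybrid setting of the kind the abstract describes, a model like this could additionally accept profiling-derived features for the subset of programs that are actually measured, while relying on the static IR graph alone for the rest.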