Determining a consistent experimental setup for benchmarking and optimizing databases

Abstract: 

Evaluating the performance of an IT system is a fundamental step in its benchmarking and optimization. However, despite the general consensus on the importance of this task, little guidance is usually available to practitioners who need to benchmark their IT systems. In particular, many works in the area of database optimization do not provide adequate information on the setup used in their experiments and analyses. In this work we report an experimental procedure that, through a sequence of experiments, analyzes the impact of various choices in the design of a database benchmark, leading to the identification of an experimental setup that balances the consistency of the results against the time needed to obtain them. We show that the minimal experimental setup we obtain is also representative of heavier scenarios, which makes it possible for the results of optimization tasks to scale.


Publication type: 
Conference paper
Published in: 
The Genetic and Evolutionary Computation Conference (GECCO 2021)
ISBN/ISSN: 
ACM ISBN 978-1-4503-8351-6
DOI:
https://doi.org/10.1145/3449726.3463180
Project: 
Energy Efficiency - Internet of Things
Publication date: 
July 2021
CeDInt Authors: 
Other Authors: 
Moisés Silva-Muñoz, Alberto Franzin, Hugues Bersini