An Empirical Study of the Impact of Test Strategies on Online Optimization for Ensemble-Learning Defect Prediction

Kensei Hamamoto, Masateru Tsunoda, Amjed Tahir, Kwabena Ebo Bennin, Akito Monden, Koji Toda, Keitaro Nakasai, Kenichi Matsumoto

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › Academic › peer-review

Abstract

Ensemble learning methods have been used to enhance the reliability of defect prediction models. However, no single ensemble method consistently attains the highest accuracy across different software projects. This work aims to improve the performance of ensemble-learning defect prediction across such projects by helping select the most accurate ensemble method. We employ bandit algorithms (BA), an online optimization method, to select the highest-accuracy ensemble method. Software modules are tested sequentially, and the bandit algorithm uses the test outcomes of the modules to evaluate the performance of the ensemble learning methods. The test strategy followed might affect both the testing effort and the prediction accuracy when online optimization is applied. Hence, we analyzed the influence of the test order on BA's performance. In our experiment, we used six popular defect prediction datasets, four ensemble learning methods (including bagging), and three test strategies (including testing positive-prediction modules first, PF). Our results show that when BA is applied with PF, prediction accuracy improved on average, and the number of defects found increased by 7% at a minimum on five out of six datasets, with only a slight increase in testing effort (about 4%) compared with ordinary ensemble learning. Hence, BA with the PF strategy is the most effective way to attain the highest prediction accuracy with ensemble methods across various projects.
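For illustration only, since the record contains no code: the sketch below shows the general idea the abstract describes, assuming an epsilon-greedy bandit that chooses among four scikit-learn ensemble classifiers while modules are tested in a positive-prediction-first (PF) order. The dataset, reward definition, epsilon value, and all identifiers are illustrative assumptions, not the authors' implementation.

    # Minimal sketch (assumptions noted above), not the paper's implementation.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import (BaggingClassifier, RandomForestClassifier,
                                  AdaBoostClassifier, GradientBoostingClassifier)
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=600, weights=[0.8, 0.2], random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Candidate ensemble methods: the "arms" of the bandit.
    arms = [BaggingClassifier(random_state=0),
            RandomForestClassifier(random_state=0),
            AdaBoostClassifier(random_state=0),
            GradientBoostingClassifier(random_state=0)]
    for clf in arms:
        clf.fit(X_train, y_train)

    # PF strategy (simplified): test modules predicted defective by any arm first.
    preds = np.array([clf.predict(X_test) for clf in arms])   # shape (n_arms, n_modules)
    order = np.argsort(-preds.max(axis=0), kind="stable")     # positive predictions first

    epsilon = 0.1
    wins = np.zeros(len(arms))    # correct predictions observed per arm
    pulls = np.zeros(len(arms))   # number of times each arm was consulted

    for i in order:               # modules are tested sequentially
        if rng.random() < epsilon:            # explore a random arm
            a = int(rng.integers(len(arms)))
        else:                                  # exploit the arm with the best observed accuracy
            rates = np.divide(wins, pulls, out=np.ones_like(wins), where=pulls > 0)
            a = int(np.argmax(rates))
        # The test outcome of module i (its actual label) is the bandit's feedback.
        wins[a] += int(preds[a, i] == y_test[i])
        pulls[a] += 1

    best = int(np.argmax(wins / np.maximum(pulls, 1)))
    print("selected ensemble:", type(arms[best]).__name__)

The design choice illustrated is that the bandit needs no separate validation set: the labels revealed by sequential testing double as the reward signal, and the PF order front-loads the modules most likely to expose defects.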

Original language: English
Title of host publication: Proceedings - 2024 IEEE International Conference on Software Maintenance and Evolution, ICSME 2024
Publisher: IEEE
Pages: 642-647
ISBN (Electronic): 9798350395686
ISBN (Print): 9798350395693
DOIs
Publication status: Published - 2024
Event: 40th IEEE International Conference on Software Maintenance and Evolution, ICSME 2024 - Flagstaff, United States
Duration: 6 Oct 2024 → 11 Oct 2024

Publication series

Name: Proceedings - 2024 IEEE International Conference on Software Maintenance and Evolution, ICSME 2024

Conference/symposium

Conference/symposium: 40th IEEE International Conference on Software Maintenance and Evolution, ICSME 2024
Country/Territory: United States
City: Flagstaff
Period: 6/10/24 → 11/10/24

Keywords

  • fault prediction
  • multi-armed bandit problem
  • overlooking
  • risk-based testing
