ZORA (Zurich Open Repository and Archive)

Applying test case prioritization to software microbenchmarks

Laaber, Christoph; Gall, Harald C; Leitner, Philipp (2021). Applying test case prioritization to software microbenchmarks. Empirical Software Engineering, 26(6):133.

Abstract

Regression testing comprises techniques that are applied during software evolution to uncover faults effectively and efficiently. While regression testing is widely studied for functional tests, performance regression testing, e.g., with software microbenchmarks, is hardly investigated. Applying test case prioritization (TCP), a regression testing technique, to software microbenchmarks may help capture large performance regressions earlier in new versions. This may be especially beneficial for microbenchmark suites, because they take considerably longer to execute than unit test suites. However, it is unclear whether traditional unit testing TCP techniques work equally well for software microbenchmarks. In this paper, we empirically study coverage-based TCP techniques, employing total and additional greedy strategies, applied to software microbenchmarks along multiple parameterization dimensions, leading to 54 unique technique instantiations. We find that TCP techniques have a mean APFD-P (average percentage of fault-detection on performance) effectiveness between 0.54 and 0.71 and are able to capture the three largest performance changes after executing 29% to 66% of the whole microbenchmark suite. Our efficiency analysis reveals that the runtime overhead of TCP varies considerably depending on the exact parameterization. The most effective technique has an overhead of 11% of the total microbenchmark suite execution time, making TCP a viable option for performance regression testing. The results demonstrate that the total strategy is superior to the additional strategy. Finally, dynamic-coverage techniques should be favored over static-coverage techniques due to their acceptable analysis overhead; however, in settings where the time for prioritization is limited, static-coverage techniques provide an attractive alternative.
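The total and additional greedy strategies mentioned in the abstract are standard coverage-based TCP heuristics: the total strategy ranks tests by how much code each covers overall, while the additional strategy repeatedly picks the test covering the most code not yet covered by earlier picks. A minimal sketch of both, with invented benchmark names and coverage sets (the paper's actual implementation and coverage granularity may differ):

```python
# Hypothetical coverage-based TCP sketch. Each benchmark maps to the
# set of covered methods; names and sets are illustrative only.

def total_greedy(coverage):
    """Order benchmarks by total number of covered methods, descending."""
    return sorted(coverage, key=lambda b: len(coverage[b]), reverse=True)

def additional_greedy(coverage):
    """Repeatedly pick the benchmark adding the most not-yet-covered methods."""
    remaining = dict(coverage)
    covered = set()
    order = []
    while remaining:
        # benchmark with the largest *additional* coverage over what is covered so far
        best = max(remaining, key=lambda b: len(remaining[b] - covered))
        order.append(best)
        covered |= remaining.pop(best)
    return order

coverage = {
    "benchA": {"m1", "m2", "m3"},
    "benchB": {"m3", "m4"},
    "benchC": {"m5"},
}
print(total_greedy(coverage))       # benchA first, as it covers the most methods
print(additional_greedy(coverage))
```

The two strategies can diverge when a benchmark with high total coverage mostly re-covers methods that higher-ranked benchmarks already exercise; the additional strategy then demotes it in favor of benchmarks contributing new coverage.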

Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 03 Faculty of Economics > Department of Informatics
Dewey Decimal Classification: 000 Computer science, knowledge & systems
Scopus Subject Areas: Physical Sciences > Software
Uncontrolled Keywords: Software
Scope: Discipline-based scholarship (basic research)
Language: English
Date: 1 November 2021
Deposited On: 14 Oct 2022 14:21
Last Modified: 27 Nov 2024 02:38
Publisher: Springer
ISSN: 1382-3256
OA Status: Hybrid
Free access at: PubMed ID. An embargo period may apply.
Publisher DOI: https://doi.org/10.1007/s10664-021-10037-x
PubMed ID: 34776757
Other Identification Number: merlin-id:22829
Project Information:
  • Funder: SNSF
  • Grant ID: 200021_165546
  • Project Title: MINCA - Models to Increase the Cost Awareness of Cloud Developers
  • Funder: Universität Zürich
  • Grant ID:
  • Project Title:
Download PDF: 'Applying test case prioritization to software microbenchmarks'
  • Content: Published Version
  • Language: English
  • Licence: Creative Commons: Attribution 4.0 International (CC BY 4.0)

Citations

5 citations in Web of Science®
8 citations in Scopus®
Downloads

10 downloads since deposited on 14 Oct 2022
4 downloads in the last 12 months