Institutsseminar/2022-07-22

Event (all events)
Date: Friday, 22 July 2022
Time: 11:30 – 12:15 (duration: 45 min)
Location: MS Teams
Web conference: https://sdqweb.ipd.kit.edu/wiki/SDQ-Oberseminar/Microsoft Teams
Previous event: Fri, 15 July 2022
Next event: Fri, 12 August 2022


Talks

Speaker: Philipp Uhrich
Title: Empirical Identification of Performance Influences of Configuration Options in High-Performance Applications
Type of talk: Master's thesis
Advisor: Larissa Schmid
Language of talk:
Mode of talk: online
Abstract: Many modern high-performance applications are highly configurable software systems that provide hundreds or even thousands of configuration options. System administrators or application users need to understand all these options and their impact on software performance in order to choose suitable configuration values. To understand the influence of configuration options on the run-time characteristics of a software system, users can rely on performance prediction models, but building such models for highly configurable high-performance applications is expensive. However, not all configuration options that a software system offers are performance-relevant. Removing these performance-irrelevant configuration options from the modeling process can reduce the construction cost. In this thesis, we explore and analyze two approaches to empirically identify configuration options that are not performance-relevant and can therefore be excluded from the performance prediction model. The first approach reuses existing performance modeling methods to build much cheaper prediction models from fewer samples and then analyzes these models to identify performance-irrelevant configuration options. The second approach uses white-box knowledge acquired through dynamic taint analysis to systematically construct the minimal number of experiments required to detect performance-irrelevant configuration options. In an evaluation with a case study, we show that the first approach identifies performance-irrelevant configuration options but also produces misclassifications. The second approach did not perform to our expectations, and further improvement is necessary.
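
As a rough illustration of the first approach, the sketch below fits a simple linear performance-influence model on a small sample of configurations and flags options whose estimated influence stays below a threshold as performance-irrelevant. The option names, the generated measurements, and the relevance threshold are hypothetical and not taken from the thesis; this is a minimal sketch of the general idea, not the method evaluated in the talk.

```python
import numpy as np

# Hypothetical sketch (not the thesis' method): fit a simple linear
# performance-influence model on a small configuration sample and flag
# options whose estimated influence is below a threshold as
# performance-irrelevant.

rng = np.random.default_rng(0)

options = ["opt_a", "opt_b", "opt_c", "opt_d"]    # hypothetical binary options
true_influence = np.array([5.0, 0.0, 2.5, 0.0])   # opt_b and opt_d have no effect

# Small random sample of configurations ("fewer samples") with simulated
# runtime measurements in seconds.
X = rng.integers(0, 2, size=(12, len(options)))
runtime = 10.0 + X @ true_influence + rng.normal(0.0, 0.3, size=len(X))

# Least-squares fit of an intercept plus one influence term per option.
A = np.column_stack([np.ones(len(X)), X])
coefficients, *_ = np.linalg.lstsq(A, runtime, rcond=None)
influence = coefficients[1:]

THRESHOLD = 0.5  # hypothetical relevance cutoff in seconds
for name, beta in zip(options, influence):
    verdict = "relevant" if abs(beta) >= THRESHOLD else "irrelevant"
    print(f"{name}: estimated influence {beta:+.2f} s -> {verdict}")
```

With only a dozen samples and a handful of independent options, such a fit is trivially well-posed; in the highly configurable systems targeted by the thesis, the difficulty lies precisely in keeping the number of samples small while still separating relevant from irrelevant options, which is what the two approaches described in the abstract address.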
