Institutsseminar/2020-08-14

From SDQ-Institutsseminar
Revision as of 10 August 2020, 07:13 by Jakob Bach (talk | contribs) (Page created: "{{Termin |datum=2020/08/14 11:30:00 |raum=Raum 348 (Gebäude 50.34) }}")
Session (all sessions)
Date: Friday, 14 August 2020
Time: 11:30 – 12:00 (duration: 30 min)
Location: Room 348 (Building 50.34)
Web conference:
Previous session: Fri, 24 July 2020
Next session: Fri, 4 September 2020


Talks

Speaker: Cem Özcan
Title: Meta-Learning for Feature Importance
Talk type: Bachelor's thesis
Advisor: Jakob Bach
Talk language:
Talk mode:
Abstract: Feature selection is essential in machine learning, since it improves both the training time and the prediction error of machine-learning models. The main problem of feature-selection algorithms is their reliance on feature-importance estimation, which requires training models and is therefore computationally expensive. To overcome this issue, we propose MetaLFI, a meta-learning system that predicts feature importance for classification tasks prior to model training. We design and implement MetaLFI by interpreting feature-importance estimation as a regression task, where meta-models are trained on meta-data sets to predict feature importance for unseen classification tasks. MetaLFI computes a meta-data set by characterizing base features with meta-features and quantifying their respective importance with model-agnostic feature-importance measures as meta-targets. We evaluate our approach on 28 real-world data sets to answer research questions concerning the effectiveness of the proposed meta-features and the predictability of the meta-targets. Additionally, we compare the feature rankings produced by MetaLFI to other feature-ranking methods by using them for feature selection. Based on our evaluation results, we conclude that predicting feature importance is a computationally cheap alternative to model-agnostic feature-importance measures.
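The pipeline the abstract describes — per-feature meta-features as inputs, model-agnostic importance scores as meta-targets, and a regression meta-model trained across tasks — can be sketched roughly as follows. The concrete meta-features (mean, standard deviation, absolute label correlation), the permutation-based importance measure, the toy tasks, and the least-squares meta-model here are all illustrative stand-ins, not the choices made in the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

def meta_features(X, y):
    # One row of meta-features per base feature (illustrative choices).
    rows = []
    for j in range(X.shape[1]):
        col = X[:, j]
        corr = np.corrcoef(col, y)[0, 1]
        rows.append([col.mean(), col.std(), abs(corr)])
    return np.array(rows)

def permutation_importance(predict, X, y, n_repeats=5):
    # Model-agnostic meta-target: accuracy drop when one feature is shuffled.
    base_acc = (predict(X) == y).mean()
    imp = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])
            imp[j] += base_acc - (predict(Xp) == y).mean()
    return imp / n_repeats

def make_task(n=200, d=4):
    # Toy classification task: the label depends only on feature 0.
    X = rng.normal(size=(n, d))
    y = (X[:, 0] > 0).astype(int)
    predict = lambda A: (A[:, 0] > 0).astype(int)  # stand-in for a trained model
    return X, y, predict

# Build the meta-data set from several base tasks: meta-features -> importance.
M, t = [], []
for _ in range(10):
    X, y, predict = make_task()
    M.append(meta_features(X, y))
    t.append(permutation_importance(predict, X, y))
M, t = np.vstack(M), np.concatenate(t)

# Meta-model: ordinary least squares with an intercept column.
w, *_ = np.linalg.lstsq(np.c_[M, np.ones(len(M))], t, rcond=None)

# Predict feature importance for an unseen task without training its model.
X_new, y_new, _ = make_task()
pred_imp = np.c_[meta_features(X_new, y_new), np.ones(X_new.shape[1])] @ w
```

The point of the sketch is the cost structure claimed in the abstract: once the meta-model is fit, scoring a new task only requires computing cheap meta-features, not training and probing a model on that task.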
