Institutsseminar/2021-02-19 Zusatztermin

{{Termin
|datum=2021/02/19 11:30:00
|online=https://conf.dfn.de/webapp/conference/979160755
}}

Current revision as of 14 January 2022, 13:17

Session (all sessions)
Date: Friday, 19 February 2021
Time: 11:30 – 12:00 (duration: 30 min)
Location: web conference, https://conf.dfn.de/webapp/conference/979160755
Previous session: Fri, 12 February 2021
Next session: Fri, 19 February 2021


Talks

Speaker: Mohamed Amine Chalghoum
Title: A comparative study of subgroup discovery methods
Type of talk: Bachelor's thesis
Advisor: Vadim Arzamasov
Language of talk:
Mode of talk:
Abstract: Subgroup discovery is a data mining technique used to extract interesting relationships in a dataset with respect to a target variable. These relationships are described in the form of rules. Multiple subgroup discovery (SD) techniques have been developed over the years. This thesis establishes a comparative study of a number of these techniques in order to identify the state-of-the-art methods. It also analyses the effects that discretization, as a preprocessing step, has on them. Furthermore, it investigates the effect of hyperparameter optimization on these methods.

Our analysis showed that PRIM, DSSD, Best Interval and FSSD outperformed the other subgroup discovery methods evaluated in this study and are to be considered state-of-the-art. It also showed that discretization offers an efficiency improvement for methods that do not employ internal discretization, while it has a negative impact on the quality of subgroups generated by methods that perform discretization internally. Finally, the results demonstrate that Apriori-SD and SD-Algorithm were the most positively affected by hyperparameter optimization.
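To illustrate the kind of rules these methods search for, the following is a minimal, self-contained Python sketch of subgroup discovery: an exhaustive scan over conjunctions of attribute=value conditions scored with the standard WRAcc (weighted relative accuracy) quality function, preceded by a simple equal-width discretization step. The toy dataset, attribute names, and the naive exhaustive search are illustrative assumptions only; the methods compared in the thesis (PRIM, DSSD, Best Interval, FSSD, Apriori-SD, SD-Algorithm) use considerably more refined search strategies.

<syntaxhighlight lang="python">
# Minimal sketch of subgroup discovery with the WRAcc quality function.
# The data and attribute names below are hypothetical; real SD methods
# replace the exhaustive scan with smarter search strategies.
from itertools import combinations

# Toy dataset: two nominal attributes, one numeric attribute,
# and a binary target ("churn") that subgroups should correlate with.
records = [
    {"plan": "basic",   "region": "north", "usage": 12, "churn": 1},
    {"plan": "basic",   "region": "south", "usage": 35, "churn": 1},
    {"plan": "premium", "region": "north", "usage": 80, "churn": 0},
    {"plan": "premium", "region": "south", "usage": 64, "churn": 0},
    {"plan": "basic",   "region": "north", "usage": 20, "churn": 1},
    {"plan": "premium", "region": "north", "usage": 55, "churn": 1},
    {"plan": "basic",   "region": "south", "usage": 70, "churn": 0},
    {"plan": "premium", "region": "south", "usage": 90, "churn": 0},
]

def discretize(records, attr, bins=2):
    """Equal-width discretization: replace a numeric attribute by bin labels.
    This is the kind of preprocessing step whose effect the thesis studies."""
    lo = min(r[attr] for r in records)
    hi = max(r[attr] for r in records)
    width = (hi - lo) / bins or 1
    for r in records:
        idx = min(int((r[attr] - lo) / width), bins - 1)
        r[attr] = f"bin{idx}"

def wracc(records, conditions, target="churn"):
    """Weighted relative accuracy of the subgroup described by `conditions`
    (a dict mapping attribute -> required value)."""
    covered = [r for r in records if all(r[a] == v for a, v in conditions.items())]
    if not covered:
        return 0.0
    p_subgroup = len(covered) / len(records)
    p_target_in_subgroup = sum(r[target] for r in covered) / len(covered)
    p_target_overall = sum(r[target] for r in records) / len(records)
    return p_subgroup * (p_target_in_subgroup - p_target_overall)

def discover(records, target="churn", depth=2, top_k=3):
    """Exhaustively score all conjunctions of up to `depth` attribute=value
    conditions and return the `top_k` best ones by WRAcc."""
    attrs = [a for a in records[0] if a != target]
    selectors = {(a, r[a]) for r in records for a in attrs}
    results = []
    for d in range(1, depth + 1):
        for combo in combinations(sorted(selectors), d):
            conds = dict(combo)
            if len(conds) < d:  # same attribute used twice -> contradictory, skip
                continue
            results.append((wracc(records, conds, target), conds))
    results.sort(key=lambda x: -x[0])
    return results[:top_k]

discretize(records, "usage", bins=2)
for quality, conditions in discover(records):
    print(f"WRAcc={quality:.3f}  {conditions}")
</syntaxhighlight>

Running the sketch prints the highest-scoring attribute–value conjunctions together with their WRAcc scores; swapping the exhaustive loop for beam search or the pattern-growth strategies used by the compared methods changes how the rule space is explored, not the notion of subgroup quality itself.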
