High-Dimensional Neural-Based Outlier Detection

Speaker: Daniel Popovic
Talk type: Diplomarbeit (diploma thesis)
Advisor: Edouard Fouché
Date: Friday, 6 October 2017
Talk language:
Talk mode:
Abstract: Outlier detection in high-dimensional spaces is a challenging task because of the curse of dimensionality. Neural networks have recently gained popularity for a wide range of applications, owing to the availability of computational power and large training data sets. Several studies examine the application of different neural network models, such as autoencoders, self-organising maps, and restricted Boltzmann machines, to outlier detection in mostly low-dimensional data sets. In this diploma thesis, we investigate whether these neural network models scale to high-dimensional spaces. We adapt the most suitable neural-network-based algorithms to the task of high-dimensional outlier detection, examine data-driven parameter selection strategies for these algorithms, develop suitable outlier score metrics for these models, and investigate the possibility of identifying the outlying subspaces of detected outliers.
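For illustration only, the sketch below shows one common way an autoencoder (one of the models named above) can be turned into an outlier detector: points that the network reconstructs poorly receive a high score. This is a minimal example assuming the Keras API; the network architecture, the reconstruction-error score, and all names (e.g. reconstruction_scores, encoding_dim) are illustrative assumptions, not the concrete algorithms, parameter selection strategies, or score metrics developed in the thesis.

<syntaxhighlight lang="python">
# Illustrative sketch (not the thesis's method): outlier scoring via the
# reconstruction error of a small autoencoder, assuming the Keras API.
import numpy as np
from tensorflow import keras

def reconstruction_scores(X, encoding_dim=8, epochs=50):
    """Train a simple autoencoder on X and return per-point reconstruction errors."""
    n_features = X.shape[1]
    model = keras.Sequential([
        keras.Input(shape=(n_features,)),
        keras.layers.Dense(encoding_dim, activation="relu"),   # encoder: compress
        keras.layers.Dense(n_features, activation="linear"),   # decoder: reconstruct
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, X, epochs=epochs, batch_size=32, verbose=0)
    X_hat = model.predict(X, verbose=0)
    # Points the autoencoder cannot reconstruct well get high outlier scores.
    return np.mean((X - X_hat) ** 2, axis=1)

# Usage: rank points by score; the highest-scoring points are outlier candidates.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100)).astype("float32")   # synthetic high-dimensional inliers
X[:5] += 6.0                                        # a few injected outliers
scores = reconstruction_scores(X)
print(np.argsort(scores)[-5:])                      # indices of the top-scoring points
</syntaxhighlight>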