Date: 2021/12/01, 12:00 – 13:00
Location: MS Teams
Speaker: Nicolas Boltz
Research Group: AbQP
Title: Making Big Data, Privacy, and Anonymization Work Together in the Enterprise: Experiences and Issues
Authors: Jeff Sedayao, Rahul Bhardwaj, Nakul Gorade
PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6906834
URL: https://www.doi.org/10.1109/BigData.Congress.2014.92
BibTeX: https://dblp.org/rec/conf/bigdata/SedayaoBG14.html?view=bibtex

Abstract:
Some scholars feel that Big Data techniques render anonymization (also known as de-identification) useless as a privacy protection technique. This paper discusses our experiences and issues encountered when we successfully combined anonymization, privacy protection, and Big Data techniques to analyze usage data while protecting the identities of users. Our Human Factors Engineering team wanted to use web page access logs and Big Data tools to improve usability of Intel's heavily used internal web portal. To protect Intel employees' privacy, they needed to remove personally identifying information (PII) from the portal's usage log repository but in a way that did not affect the use of Big Data tools to do analysis or the ability to re-identify a log entry in order to investigate unusual behavior. To meet these objectives, we created an open architecture for anonymization that allowed a variety of tools to be used for both de-identifying and re-identifying web log records. In the process of implementing our architecture, we found that enterprise data has properties different from the standard examples in anonymization literature. Our proof of concept showed that Big Data techniques could yield benefits in the enterprise environment even when working on anonymized data. We also found that despite masking obvious PII like usernames and IP addresses, the anonymized data was vulnerable to correlation attacks. We explored the tradeoffs of correcting these vulnerabilities and found that User Agent (Browser/OS) information strongly correlates to individual users. While browser fingerprinting has been known before, it has implications for tools and products currently used to de-identify enterprise data. We conclude that Big Data, anonymization, and privacy can be successfully combined but requires analysis of data sets to make sure that anonymization is not vulnerable to correlation attacks.
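The core mechanism the abstract describes, masking direct identifiers such as usernames and IP addresses while keeping a controlled way to re-identify a specific log entry, can be illustrated with a small sketch. The Python snippet below is not the paper's actual architecture or tooling; the field names, the HMAC-based tokens, and the key handling are assumptions made purely for illustration.

```python
# Minimal sketch (assumed design, not the paper's implementation): keyed
# pseudonymization of PII fields in a web access log record. Usernames and
# client IPs are replaced with deterministic HMAC tokens so per-user analysis
# remains possible, while a separately stored mapping allows authorized
# re-identification of individual entries.
import hmac
import hashlib

SECRET_KEY = b"rotate-and-store-separately"  # hypothetical key management


def pseudonymize(value: str) -> str:
    """Replace a PII value with a deterministic, keyed token."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]


def anonymize_log_entry(entry: dict, reid_map: dict) -> dict:
    """Mask obvious PII (username, client IP) and record the reverse mapping."""
    masked = dict(entry)
    for field in ("username", "client_ip"):
        token = pseudonymize(entry[field])
        reid_map[token] = entry[field]  # kept in a restricted, separate store
        masked[field] = token
    # The User-Agent string is left untouched here; as the paper finds, it can
    # still re-identify individual users through correlation attacks.
    return masked


reid_map = {}
entry = {"username": "jdoe", "client_ip": "10.1.2.3",
         "url": "/portal/home", "user_agent": "Mozilla/5.0 (...)"}
print(anonymize_log_entry(entry, reid_map))
```

Deterministic tokens keep usage analysis intact (the same user always maps to the same token), which is precisely why the paper's finding matters: quasi-identifiers such as the User Agent string can still single out individuals even after the obvious PII has been masked.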