Show simple item record

dc.contributor.advisor: Otneim, Håkon
dc.contributor.author: Juelsen, Eirik
dc.contributor.author: Thoresen, Marius Andre
dc.date.accessioned: 2022-04-06T11:53:20Z
dc.date.available: 2022-04-06T11:53:20Z
dc.date.issued: 2021
dc.identifier.uri: https://hdl.handle.net/11250/2990213
dc.description.abstract: The General Data Protection Regulation, implemented in 2018 by the European Union, imposes strict requirements on the handling of European citizens' personal data. This is especially true when processing said data in combination with machine learning and AI. Within these requirements lies an inherent focus on the rights of the data subject and their ability to exercise those rights. Under the GDPR, the subject must be informed of the existence of any machine learning models utilising their personal data and provided with meaningful information concerning the logic of the model and an explanation of the inferences it makes. This master thesis examines the possibility of employing Shapley values in the process of building a machine learning model and their ability to provide meaningful information to the subjects affected by this model. We review the compliance of Shapley values with the GDPR throughout the machine learning process and highlight how the framework is affected by specific articles of the GDPR. We argue that the most applicable categories of the GDPR in relation to machine learning models explained with Shapley values are Consent, Personal Data, Processing, and the Right to be Informed. The GDPR significantly affects all aspects of a machine learning model, from data collection to prediction explanation. By utilising Shapley values as a framework throughout the process, we have trained, and are able to explain the predictions of, a binary classification model. We believe this model both complies with the strict demands set forth by the GDPR and provides strong predictions, indicative of the ability to utilise Shapley values within the legal framework of the GDPR.
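The abstract's central tool, the Shapley value, attributes a model's prediction to individual features as their weighted average marginal contribution over all feature coalitions. A minimal sketch of the exact computation follows; the feature names, weights, and the additive toy model are illustrative assumptions, not taken from the thesis:

```python
from itertools import combinations
from math import factorial

def shapley_values(features, value):
    """Exact Shapley values: each feature's weighted average marginal
    contribution to the value function over all coalitions of the others."""
    n = len(features)
    phi = {}
    for f in features:
        others = [g for g in features if g != f]
        total = 0.0
        for k in range(n):
            for subset in combinations(others, k):
                # Shapley weight for a coalition of size k
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += w * (value(set(subset) | {f}) - value(set(subset)))
        phi[f] = total
    return phi

# Hypothetical additive "model": the prediction for a coalition is the
# sum of illustrative per-feature weights (names are made up).
weights = {"income": 0.5, "age": 0.3, "tenure": 0.2}
model = lambda coalition: sum(weights[f] for f in coalition)

print(shapley_values(list(weights), model))
# For an additive model, each feature's Shapley value equals its own weight.
```

The exact computation enumerates all 2^(n-1) coalitions per feature, so in practice approximation methods (e.g. sampling) are used for models with many features; the sketch only illustrates the definition.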
dc.language.iso: eng
dc.subject: business analytics
dc.title: Shapley values in the context of GDPR: Can Shapley values be used as a means of interpreting black-box machine learning models while also complying with the General Data Protection Regulation?
dc.type: Master thesis
dc.description.localcode: nhhmas


Associated file(s)


This item appears in the following collection(s)
