dc.contributor.author: CONTISSA, Giuseppe
dc.contributor.author: DOCTER, Koen
dc.contributor.author: LAGIOIA, Francesca
dc.contributor.author: LIPPI, Marco
dc.contributor.author: MICKLITZ, Hans-Wolfgang
dc.contributor.author: PALKA, Przemyslaw
dc.contributor.author: SARTOR, Giovanni
dc.contributor.author: TORRONI, Paolo
dc.date.accessioned: 2019-02-04T12:50:43Z
dc.date.available: 2019-02-04T12:50:43Z
dc.date.issued: 2018
dc.identifier.uri: https://hdl.handle.net/1814/60795
dc.description: Posted: 25 Jul 2018
dc.description.abstract: This report contains preliminary results of a study aiming to automate the legal evaluation of privacy policies under the GDPR, using artificial intelligence (machine learning), in order to empower civil society representing the interests of consumers. We outline the requirements a GDPR-compliant privacy policy should meet (comprehensive information, clear language, fair processing), as well as the ways in which these documents can be unlawful (if the required information is insufficient, the language is unclear, or potentially unfair processing is indicated). Further, we analyse the contents of the privacy policies of Google, Facebook (and Instagram), Amazon, Apple, Microsoft, WhatsApp, Twitter, Uber, AirBnB, Booking.com, Skyscanner, Netflix, Steam and Epic Games. The experiments we conducted on these documents, using various machine learning techniques, led us to the conclusion that this task can, to a significant degree, be performed by computers, provided a sufficiently large data set is created. Given the number of privacy policies online, this is a task worth investing time and effort in. Our study indicates that none of the analysed privacy policies meets the requirements of the GDPR. The evaluated corpus, comprising 3,658 sentences (80,398 words), contains 401 sentences (11.0%) that we marked as containing unclear language, and 1,240 sentences (33.9%) that we marked as potentially unlawful clauses, i.e. either "problematic processing" clauses or "insufficient information" clauses (under Articles 13 and 14 of the GDPR). Hence, there is significant room for improvement on the side of business, as well as for action on the side of consumer organizations and supervisory authorities.
dc.language.iso: en
dc.relation.ispartofseries: European Consumer Organisation (BEUC) Study Report
dc.relation.ispartofseries: 2018
dc.relation.uri: http://www.claudette.eu/gdpr/
dc.relation.uri: https://www.beuc.eu/publications/beuc-x-2018-066_claudette_meets_gdpr_report.pdf
dc.rights: info:eu-repo/semantics/openAccess
dc.title: CLAUDETTE meets GDPR : automating the evaluation of privacy policies using Artificial Intelligence
dc.type: Working Paper
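
The abstract above frames the task as sentence-level classification of privacy-policy clauses (unclear language, "problematic processing", "insufficient information") learned from a hand-labelled corpus. The report only states that "various machine learning techniques" were used, so the snippet below is a minimal illustrative sketch, not the study's actual pipeline: it trains a baseline TF-IDF plus linear SVM classifier (Python, scikit-learn) on a few invented example sentences, and the label names and training data are assumptions made purely for illustration.

# Illustrative sketch only: a baseline sentence classifier for privacy-policy
# clauses. The label set, the example sentences, and the TF-IDF + linear SVM
# pipeline are assumptions; the report itself only says that "various machine
# learning techniques" were applied to a hand-labelled corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Hypothetical labelled sentences (in the study, roughly 3,658 sentences
# from real privacy policies were annotated by legal experts).
train_sentences = [
    "We may share your data with partners for purposes we deem appropriate.",
    "We process your order data to deliver the products you purchased.",
    "Your information may be used to improve our services and for other purposes.",
    "You can contact our data protection officer at the address below.",
]
train_labels = [
    "problematic_processing",
    "compliant",
    "unclear_language",
    "compliant",
]

# Word unigram/bigram TF-IDF features fed into a linear SVM.
classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), lowercase=True),
    LinearSVC(),
)
classifier.fit(train_sentences, train_labels)

# Classify new sentences taken from a privacy policy under review.
new_sentences = ["We may transfer your data to third parties as needed."]
print(classifier.predict(new_sentences))

A real experiment along these lines would replace the toy examples with the expert-annotated corpus, evaluate with cross-validation, and compare several learning methods, which is the kind of comparison the abstract alludes to.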