BEUC: Large Tech Companies' Privacy Policies Are Not GDPR Compliant

The European Consumer Organization (BEUC) used artificial intelligence to review the privacy policies of 14 large tech companies, including those of Google, Facebook, Amazon and Apple, and found that a number of them are not fully compliant with the European Union's General Data Protection Regulation (GDPR).

Using a technology called Claudette, BEUC examined the privacy policies and, with the assistance of researchers, identified language that was problematic or confusing. Claudette is a web crawler that monitors privacy policies; the collected policies are then processed using supervised machine learning. The technology highlights sentences and categorizes them into three groups: insufficient information, unclear language and problematic processing.
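The pipeline described above, in which a supervised model assigns each sentence of a privacy policy to one of three categories, can be sketched in miniature. The training sentences, tokenizer and naive Bayes model below are illustrative assumptions for demonstration only; they are not BEUC's actual corpus or Claudette's real classifier.

```python
import math
from collections import Counter

# The three categories the report says Claudette assigns to sentences.
CATEGORIES = ("insufficient information", "unclear language", "problematic processing")

# Toy labeled sentences (hypothetical examples, not BEUC's training data).
TRAIN = [
    ("we collect some data when you use the service", "insufficient information"),
    ("data is retained for an unspecified period", "insufficient information"),
    ("we may use your data for various other purposes", "unclear language"),
    ("your information might be processed as appropriate", "unclear language"),
    ("we share your personal data with third party advertisers", "problematic processing"),
    ("your profile is combined with data from other sources", "problematic processing"),
]

def tokenize(sentence):
    """Minimal whitespace tokenizer; a real system would do far more."""
    return sentence.lower().split()

class SentenceClassifier:
    """Multinomial naive Bayes with Laplace smoothing over sentence tokens."""

    def fit(self, examples):
        self.word_counts = {c: Counter() for c in CATEGORIES}
        self.class_counts = Counter()
        vocab = set()
        for text, label in examples:
            self.class_counts[label] += 1
            for tok in tokenize(text):
                self.word_counts[label][tok] += 1
                vocab.add(tok)
        self.vocab_size = len(vocab)
        self.total_examples = sum(self.class_counts.values())
        return self

    def predict(self, sentence):
        tokens = tokenize(sentence)
        best_label, best_logprob = None, -math.inf
        for c in CATEGORIES:
            # log prior + smoothed log likelihood of each token under class c
            logprob = math.log(self.class_counts[c] / self.total_examples)
            denom = sum(self.word_counts[c].values()) + self.vocab_size
            for tok in tokens:
                logprob += math.log((self.word_counts[c][tok] + 1) / denom)
            if logprob > best_logprob:
                best_label, best_logprob = c, logprob
        return best_label

clf = SentenceClassifier().fit(TRAIN)
print(clf.predict("we share data with advertisers"))          # → problematic processing
print(clf.predict("your data might be used for various purposes"))  # → unclear language
```

On this toy data the model flags data-sharing language as "problematic processing" and hedged wording as "unclear language", mirroring the kind of per-sentence labeling the report describes, albeit at a fraction of the sophistication of the real system.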

The report stated that privacy policies "are the main point of reference for civil society and individual consumers when it comes to controlling how personal data is being processed by the data controllers. None of the 14 analyzed privacy policies gets close to meeting the standards put forward by the GDPR," the report continued. "Unsatisfactory treatment of the information requirements; large amounts of sentences employing vague language; and an alarming number of 'problematic' clauses cannot be deemed satisfactory."

A problematic clause, for the purposes of the report, is one that is possibly unlawful. Eleven percent of the sentences across all of the privacy policies reviewed contained confusing terminology, according to the published findings.

For instance, the report finds that Facebook's privacy policy shows an awareness of the GDPR "but gives rather the impression of the company using…legal terms and buzzwords and catch-phrases, [instead of attempting to construct] a truly user-centric, GDPR compliant policy."

The report expressed hope that companies "would start taking a more user-centric approach towards these documents, instead of treating them simply as a box to be checked. Moreover, if this study is treated as an inspiration to others, civil society might be soon equipped with artificial intelligence tools for the automated analysis of privacy policies. When this is the case, they will leave no stone untouched, no policy unread, no infringement unnoticed."