
NECESSITY AND PROPORTIONALITY

14.1 The principle of proportionality

The concept of necessity is made up of two related principles: proportionality and subsidiarity. Personal data that are processed must be necessary for the purpose pursued by the processing activity. Proportionality means that the intrusion on the privacy and the protection of the personal data of the data subjects must be proportionate to the purposes of the processing. Subsidiarity means that the purposes of the processing cannot reasonably be achieved with other, less invasive means. If such less invasive means exist, they must be used.

Proportionality demands a balancing act between the interests of the data subject and the data controller. Proportionate data processing means that the amount of data processed is not excessive in relation to the purpose of the processing. If the purpose can be achieved by processing fewer personal data, then the data controller needs to limit the processing to personal data that are necessary.

Therefore, data controllers may only process personal data that are necessary to achieve a legitimate purpose. The application of the principle of proportionality is thus closely related to the data protection principles of Article 5 GDPR.

14.2 Assessment of the proportionality

The key questions are: are the interests properly balanced, and does the processing go no further than what is necessary?

To assess whether the processing is proportionate to the interests pursued by the data controller(s), the processing must first meet the principles of Article 5 GDPR. As legal conditions, these principles have to be complied with in order to make the data processing legitimate.

Fair and lawful

Data must be ‘processed lawfully, fairly and in a transparent manner in relation to the data subject’ (Article 5(1)(a) GDPR). This means that data subjects must be informed about the processing of their data, that all the legal conditions for data processing are adhered to, and that the principle of proportionality is respected. As analysed in Sections 11.1 and 11.2 of this report, neither Google nor the universities currently have a legal ground for any of the processing through G Suite (Enterprise) for Education. This means the personal data are not processed lawfully.

Transparent

Google does not process the data in a transparent manner either. Google does publish extensive documentation for administrators about the 19 different audit log files they can access to monitor end-user behaviour. However, at the time of completion of this DPIA, Google did not publish documentation about the other Diagnostic Data it collects through its own system-generated log files.

The logs that can be accessed by admins do not contain any information about the website data Google collects, nor any information about the use of Features, Additional Services, the Technical Support Services or the Other related services, nor an exhaustive overview of all activities performed with a Google Account.

Google equally fails to provide any public explanation to its Enterprise customers in the EU about the other kinds of Diagnostic Data it collects through the use of the G Suite (Enterprise) for Education services, such as the telemetry data. Administrators and end-users cannot inspect the contents of these telemetry data either, nor does Google provide access thereto in response to a formal Data Subject Access request, as laid down in Article 15 of the GDPR.

The lack of transparency makes the data processing inherently unfair. The lack of transparency also makes it impossible to assess the proportionality of the processing.

Data minimisation and privacy by design

The principles of data minimisation and privacy by design require that the processing of personal data is limited to what is necessary: the data must be 'adequate, relevant and limited to what is necessary for the purposes for which they are processed' (Article 5(1)(c) GDPR). This means that the controller may not collect and store data which are not directly related to a legitimate purpose.

The principle of privacy by design (Article 25 (2) GDPR) requires that “the data controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed.” According to this principle, the default settings for the data collection should be set in such a way as to minimise data collection by using the most privacy friendly settings.

As described in Section 3 of this report, Google processes personal data in the Core Services with a privacy unfriendly default setting. This is the case for the accessibility of the Other related service Feedback. Through Feedback, Google can process personal data in Customer Data for unauthorised purposes. In view of the possibly sensitive nature of such data, the lack of transparency, the possible risks for data subjects if Customer Data are processed for unlawful purposes, and the absence of opt-out controls, this processing is disproportionate.

Google equally fails to apply the principle of privacy by design with regard to data processing in the context of the Google Account, the Diagnostic Data, the Technical Support Services and the Additional Services.

Google frequently offers opt-out choices instead of active opt-in choices, and spreads these opt-outs over many locations (different menus on the devices, in the browser and on different webpages). This places an unnecessary burden on the shoulders of the employees and students, and makes the data processing disproportionate.

There is only one type of Google Account, which can be used in both an enterprise and a consumer environment. All end-users with a Google Account must accept the same general (consumer) Terms of Service and the (consumer) Privacy Policy, regardless of whether they create the account as a consumer or as an employee. Google explains that this is the case because end-users may use their Google Account to sign in to, and use, any of Google’s 92 consumer services, if their administrator does not restrict such use.

Google also allows end-users to sign in with multiple Google Accounts. This design of the services does not sufficiently and systematically take into account the specific data protection risks for students, employees and the universities. Universities need to draw strict lines between the processing of personal data in the consumer and the educational environments, in order to prevent data breaches and unauthorised processing of personal data and confidential information.

At the time of writing of this report, administrators could block access to the existing Additional Services for Google Education accounts (by default, such access is not blocked for any new Additional Services). However, admins could not completely prevent logged-in users from accessing Additional Services. When an end-user accesses an Additional Service such as YouTube or Google Search while logged in with a Google Education account, and the administrator has centrally disabled the use of the Additional Services, the user can still use those services, but Google ensures that the end-user is logged out from the Google Education account. However, if the user has simultaneously logged in with a private (consumer) Google Account, Google simply switches from the school account to the private account, and thus processes information about behaviour in the school environment for targeted advertising purposes in the private environment.

As described in Sections 3.2.1 and 3.2.5 of this report, admins cannot technically prevent such simultaneous log-in on all devices. They need to procure the separate Chrome Enterprise service (with its own privacy conditions, outside of the negotiated privacy amendment), apply an HTTPS proxy on the school network, and/or force end-users with Android devices to download the Device Policy App from the Play Store (an Additional Service outside of the negotiated privacy guarantees).
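By way of illustration only (this sketch does not come from the report), the HTTPS proxy measure referred to above is typically implemented by injecting Google's documented X-GoogApps-Allowed-Domains request header, which restricts sign-in on the network to Google Accounts of the listed domains. The example below uses mitmproxy and a hypothetical university domain; any HTTPS-intercepting proxy that can add request headers would serve, and managed devices must trust the proxy's CA certificate.

    # Minimal mitmproxy addon (illustrative sketch, not from the report):
    # inject the X-GoogApps-Allowed-Domains header on requests to Google
    # hosts, so that sign-in to consumer Google Accounts is blocked on
    # the school network. Run with: mitmdump -s block_consumer_accounts.py
    from mitmproxy import http

    # Hypothetical domain; replace with the university's own domain(s),
    # comma-separated if there are several.
    ALLOWED_DOMAINS = "university.example"


    def request(flow: http.HTTPFlow) -> None:
        # Only modify traffic to Google hosts; all other traffic passes
        # through unchanged.
        if flow.request.pretty_host.endswith("google.com"):
            flow.request.headers["X-GoogApps-Allowed-Domains"] = ALLOWED_DOMAINS

Such a proxy only covers traffic that actually traverses the school network; devices used elsewhere are not affected, which is consistent with the observation above that admins cannot prevent simultaneous log-in on all devices.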

This automatic log-out (a privacy friendly procedure) does not apply to the use of all Additional Services. End-users can for example use Google Photos with their G Suite (Enterprise) for Education Google Account. Google has not provided such a guarantee for Google Scholar, but has explained that Google Scholar is a part of Google Search. It is not clear why Google applies different rules to different Additional Services. The absence of a technical separation between educational and consumer Google Accounts, combined with the privacy unfriendly default setting that Personalized Advertising is turned on, leads to a spill-over of personal data in Customer Data to Google’s consumer environment. This is the case for (i) Ad Personalization, (ii) the access to all Customer Data granted to the Chrome browser as a ‘trusted’ app, (iii) the sending of telemetry data (Diagnostic Data) from Android devices, Chrome OS and the Chrome browser, with data about app usage and the use of biometric authentication, and (iv) the installation of three kinds of unique identifiers in Chrome OS and the Chrome browser, which Google uses for installation tracking, tracking of promotional campaigns and field trials.

As long as these settings remain privacy unfriendly by default, and admins do not have controls to block or at least minimise the data processing with tools provided in G Suite (Enterprise) for Education, the use of Chrome OS, the Chrome browser and Android devices disproportionately infringes on the interests and rights of data subjects, in particular as regards confidential data, data of a sensitive nature or special categories of data. As joint controllers with Google, universities are accountable for the risks of any unlawful processing of personal data.

Storage limitation

The principle of storage limitation requires that personal data should only be kept for as long as necessary for the purpose for which the data are processed. Data must 'not be kept in a form which permits identification of data subjects for longer than is necessary for the purposes for which the personal data are processed' (Article 5(1)(e), first sentence, GDPR). This principle therefore requires that personal data be deleted as soon as they are no longer necessary to achieve the purpose pursued by the controller. The text of this provision further clarifies that 'personal data may be kept longer in so far as the personal data are processed solely for archiving purposes in the public interest, for scientific or historical research purposes or for statistical purposes in accordance with Article 89(1), subject to the implementation of appropriate technical and organisational measures required by this Regulation in order to safeguard the rights and freedoms of the data subject' (Article 5(1)(e), second sentence, GDPR).

As explained in Section 10 of this report, Google will delete Customer Data that have been actively deleted by the customer as soon as reasonably practicable, but may retain these data for up to half a year. This maximum period seems long once a university has decided to delete Customer Data.

With regard to the Diagnostic Data, the retention period at Google of 6 months for most of the audit logs seems proportionate to the objectives pursued by admins: being able to look back in case of data security incidents, and regularly inspecting the logs for correct application of the access rules.
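As an aside (this example is not part of the report), such periodic inspection of the audit logs can be automated through Google's Admin SDK Reports API. The sketch below assumes a service account with domain-wide delegation and the read-only audit scope; the key file name, the impersonated admin address and the choice of the 'login' application are placeholders.

    # Illustrative sketch: list recent login audit events via the Admin
    # SDK Reports API, e.g. for a periodic security review. The key file
    # and admin address are placeholders.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/admin.reports.audit.readonly"]
    creds = service_account.Credentials.from_service_account_file(
        "service-account-key.json",          # placeholder key file
        scopes=SCOPES,
        subject="admin@university.example",  # placeholder admin account
    )
    service = build("admin", "reports_v1", credentials=creds)

    # 'login' is one of the available audit applications; others include
    # 'drive', 'admin' and 'token'.
    response = service.activities().list(
        userKey="all", applicationName="login", maxResults=25
    ).execute()

    for activity in response.get("items", []):
        actor = activity.get("actor", {}).get("email", "unknown")
        time = activity.get("id", {}).get("time")
        events = [e.get("name") for e in activity.get("events", [])]
        print(time, actor, events)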

Google does not have a fixed retention period for other types of Diagnostic Data, such as the telemetry and website data. The general rule is to retain these data for 6 months as well, but Google explained that “other Diagnostic Data is retained for much longer periods (e.g. account deletion events).”320 Cookie-based data are generally anonymised after 18 months, wrote Google. G Suite admins cannot customise these retention periods.

It is difficult to argue that 6-month-old Diagnostic Data, or 18-month-old data in the case of cookies, are necessary, adequate and relevant for all of the 22 or 33 purposes for which Google processes the Diagnostic Data as joint controller with the universities.

In sum, the processing of the Diagnostic Data through the Core Services and Additional Services, including the telemetry data and website data, does not meet the proportionality requirements. This is due to the lack of transparency, the privacy unfriendly default settings, the absence of technical opt-outs and the risk of unauthorised further processing of personal data in Customer Data by Google.

14.3 Assessment of the subsidiarity

When making an assessment of subsidiarity, the key question is whether universities can reach the same objectives (of using secure, bug-free, modern communication and productivity software) with less intrusive means.

Google takes the view that end-users of its G Suite (Enterprise) for Education services voluntarily provide their consent to, or enter into a contract with, Google, (also) for the purpose of using consumer services. However, Google does not seem to take into account that the processing occurs in the context of an employment relationship. As assessed in Sections 11.2.1 and 11.2.2 of this report, employees are not free to give consent to, or enter into a contract with, Google. There is no evidence that the specific contract with the data subject cannot be performed if the specific processing of the personal data in question does not occur. Reliance on either of these two legal grounds requires adequate purpose limitation, to ensure that the personal data will not be processed for other purposes for which no legal ground is available.

The consumer Terms of Service and the (consumer) Privacy Policy (as well as the Additional Product Terms) apply to all the Additional Services, including Chrome OS and the Chrome browser, to the use of the Google Account in these Additional Services, and to all Diagnostic Data. These terms allow Google to process personal data for 33 broad purposes.

Universities can choose an alternative software provider and use a different browser. They can for example decide to use Microsoft Office 365 or open source software as an alternative. SLM Microsoft Rijk has published several DPIAs on Microsoft 365. Regardless of the choice for an alternative software provider, universities must identify the privacy and security risks of any software or cloud service they plan to use, and assess whether the software offers the necessary functionalities.