By Dave Whitelegg | Published May 25, 2018
An application’s privacy risk considers the potential distress and harm caused to application users should they lose control over their application-managed personal information. Organizations that directly provision applications to users are subject to their own privacy risk, which takes into account the controller organization’s potential reputational damage and financial impact should the application fail to provide EU citizen users with the privacy rights afforded by the General Data Protection Regulation (GDPR). Therefore, integrating privacy by design, referred to by the GDPR as data protection by design, within the software development lifecycle is an essential practice to ensure an application’s privacy risk is thoroughly understood, evaluated, and, where deemed appropriate, mitigated to an acceptable level.
This article is Part 2 in a series of articles that provides guidance for developing applications that are compliant with the European Union’s General Data Protection Regulation. Part 1 summarizes the GDPR and Part 3 explores practical application development techniques that can alleviate an application’s privacy risk.
GDPR Article 25 states that data protection by design applies to organizations in the role of a controller; this directly applies to applications developed internally for use by the developer’s own organization. Where applications are developed for use by external organizations, those organizations, as controllers, will seek to understand and verify the application’s privacy risk with the application’s developers. Therefore, a data protection by design development approach should be adopted regardless of whether an application is intended for internal use or for use by other organizations in a controller role.
GDPR Article 25 (2) requires data protection by design by default. This means that all of the application’s privacy settings must be set to protect a user’s privacy when the application is provisioned to users.
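As a minimal sketch of what “data protection by default” can mean in code, the settings class below defaults every optional data-sharing feature to off, so a newly provisioned user is opted out until they intervene. The class and field names are hypothetical, not part of any specific application:

```python
from dataclasses import dataclass

# Hypothetical sketch: application privacy settings whose defaults are
# the most protective values (GDPR Article 25(2) — data protection by
# default). A newly provisioned user shares nothing until they opt in.
@dataclass
class PrivacySettings:
    profile_public: bool = False        # profile hidden by default
    share_with_partners: bool = False   # no third-party sharing by default
    analytics_tracking: bool = False    # no behavioral analytics by default
    marketing_emails: bool = False      # no marketing until consent is given

# Defaults apply automatically when a user account is created.
settings = PrivacySettings()
assert not settings.share_with_partners
```

The key design point is that enabling any data sharing requires an explicit user action; the application never has to remember to switch protections on.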
Article 25 Data protection by design and by default

1. Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.

2. The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.

3. An approved certification mechanism pursuant to Article 42 may be used as an element to demonstrate compliance with the requirements set out in paragraphs 1 and 2 of this Article.

Supported by Recital 78
An essential tool in developing applications with data protection by design is a Data Protection Impact Assessment.
An application data protection impact assessment (DPIA) helps developers to identify, analyze, and explain how an application impacts the privacy rights and freedoms of its users. A DPIA should be performed against any application intending to collect, maintain, or share personal information.
By embedding a DPIA early within the software development lifecycle, developers are able to mitigate unacceptable privacy risks and resolve GDPR user privacy rights compliance issues before the completion of the application’s design. This ensures the application is efficiently developed to minimize privacy risk, benefiting both users and the data controller organizations provisioning the application.
DPIAs should also be performed retrospectively against applications that have been previously released and in production to assure their privacy risk is acceptable and to verify their compliance with the GDPR.
The expected roles and responsibilities involved with an application DPIA include:
The application’s GDPR privacy rights obligations should be documented as requirements within the requirements analysis phase of the Software Development Lifecycle (SDLC). The DPIA should be performed within the design phase of the SDLC. Then, further privacy risk verification should be conducted throughout the latter phases of the SDLC, to assure the application’s privacy requirements are all achieved, and the application design mitigates or eliminates privacy risks as intended.
The Application SDLC (Waterfall Model) phases include:
The GDPR privacy rights obligations are explained in the first part of this guidance series. Identified privacy risks should be evaluated with a view to designing technical application solutions that minimize the privacy risk, such as database encryption and pseudonymization; these technical solutions are discussed in the third part of this guidance series.
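To give a flavor of the kind of technical mitigation referred to above, the sketch below pseudonymizes a direct identifier with a keyed hash (HMAC-SHA256), so records can still be linked without exposing the underlying value. This is an illustrative sketch, not a prescribed implementation; the key value shown is a placeholder:

```python
import hmac
import hashlib

# Hypothetical pseudonymization sketch: replace a direct identifier with
# a keyed hash. The secret key must be stored separately from the
# pseudonymized data, otherwise re-identification is trivial.
SECRET_KEY = b"store-this-key-separately"  # placeholder, illustrative only

def pseudonymize(identifier: str) -> str:
    """Return a stable, non-reversible token for the given identifier."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

token = pseudonymize("alice@example.com")
# The same input always yields the same token, so records remain linkable
# across tables without storing the raw email address.
assert token == pseudonymize("alice@example.com")
```

Note that under the GDPR pseudonymized data is still personal data, because the holder of the key can reverse the pseudonymization; it reduces risk rather than removing the data from scope.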
There are six phases to an application DPIA:
GDPR Article 35 (1) requires a data protection impact assessment to be performed where “new technologies” are “likely to result in a high risk” to individuals. The supporting Recital 75 describes privacy risks that are typical of applications processing personal data, stating “where the processing may give rise to discrimination, identity theft or fraud, financial loss, damage to the reputation, loss of confidentiality of personal data protected by professional secrecy, unauthorised reversal of pseudonymisation, or any other significant economic or social disadvantage”.
The development of a new application can be regarded as a “new technology” under Article 35 (1). While the GDPR does not specifically define “high risk”, where applications process personal data en masse, it is likely most supervisory authorities would consider such applications as having an inherent high risk to user privacy, especially where the application is internet facing. Therefore, unless the application has no potential to process EU citizens’ personal information, it is recommended to perform a DPIA on all developed applications to comply with GDPR Article 35 (1).
Article 35 (1)

Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. A single assessment may address a set of similar processing operations that present similar high risks.
Article 35 (7) Data protection impact assessment

“The assessment shall contain at least: (a) a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller; (b) an assessment of the necessity and proportionality of the processing operations in relation to the purposes; (c) an assessment of the risks to the rights and freedoms of data subjects referred to in paragraph 1; and (d) the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned.”
The application DPIA must be documented, which should start with a description of the application’s general purpose, and a simple overview of the application’s functions. Also record whether the application developing organization will be in a GDPR role of a controller (i.e. provisioning the application), processor (i.e. just hosting the application) or neither.
The next step is to determine and describe how personal information will be obtained, processed, stored, and moved out of the application. With assistance from the application architect, review and map the application’s data flows and processes. Documenting with a data flow chart allows the application’s processes and personal data flows to be clearly described and understood; an example is shown in Figure 1.
In addition to the data flow diagram, further, more comprehensive details about the application’s personal data usage should be recorded. Review the application’s design and explore the following aspects with the technical leads.
A spreadsheet or a table is a useful method for documenting the application’s technical personal data usage. Figure 2 depicts an example.
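For illustration, a personal data inventory of this kind can be kept in a machine-readable form and exported to a spreadsheet for review. The field names below are hypothetical, loosely modeled on the sort of columns shown in Figure 2:

```python
import csv
import io

# Hypothetical sketch of a personal data inventory row; field names are
# illustrative, not taken from any specific template.
inventory = [
    {
        "data_item": "email address",
        "purpose": "account login and password reset",
        "collection_point": "registration form",
        "storage_location": "users table, application database",
        "retention_period": "until account deletion",
        "shared_with": "none",
    },
]

# Export the inventory as CSV so it can be reviewed in a spreadsheet.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=list(inventory[0].keys()))
writer.writeheader()
writer.writerows(inventory)
print(buffer.getvalue())
```

Keeping the inventory as structured data rather than free text makes it straightforward to diff between application releases during later DPIA reviews.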
The next step of the DPIA is to identify privacy risks by reviewing the personal data flow documentation. This review should be supported by the data protection officer or a data privacy subject matter expert. Also involve a risk manager or risk professional, to formally guide and document the privacy risk assessment.
There are three general groups of privacy risk to consider and document with an application DPIA:
When assessing the application’s user privacy risk, consider the possible application intrusion scenarios affecting a user’s privacy rights and freedoms, the likelihood of each privacy intrusion, and the potential impact on the user, such as personal distress and financial loss.
The following types of application privacy intrusions can negatively impact users and should be explored:
Describe each privacy risk scenario identified on a risk register. Each risk on the register should include an evaluated risk rating derived from factoring the privacy intrusion likelihood with the impact on the user’s privacy. An example of an application risk register is depicted in Figure 3.
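A common way to derive the risk rating described above is to score likelihood and impact on a simple scale and take their product. The sketch below assumes a 1 (low) to 5 (high) scale; the risk descriptions and scores are hypothetical examples, not taken from Figure 3:

```python
# Hypothetical risk register sketch: the risk rating is the product of
# likelihood and impact, each scored 1 (low) to 5 (high).
def risk_rating(likelihood: int, impact: int) -> int:
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("scores must be between 1 and 5")
    return likelihood * impact

register = [
    {"risk": "excessive collection of user data", "likelihood": 3, "impact": 4},
    {"risk": "insecure transfer to a third party", "likelihood": 2, "impact": 5},
]
for entry in register:
    entry["rating"] = risk_rating(entry["likelihood"], entry["impact"])

# Sort so the highest-rated risks are addressed first.
register.sort(key=lambda e: e["rating"], reverse=True)
```

After mitigation changes are made, the same function can be re-applied with the reassessed scores to record a residual risk rating alongside the original.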
There are a number of GDPR user privacy rights that impact applications; these rights are explained in the first part of this guidance series. With the support of the application architect, review the application’s design documentation and assess how the application facilitates, or fails to facilitate, each of the applicable GDPR user privacy rights, which include:
Document the application processes that adequately facilitate each of the GDPR privacy rights within the DPIA documentation. Where any of the applicable GDPR privacy rights are not adequately facilitated by the application, document these as privacy risks on the risk register.
Organizations in the role of a controller or processor that fail to comply with the GDPR are at risk of significant sanctions, including large fines by supervisory authorities, legal claims for damages by users, bans on processing personal data, and reputational damages. The organization’s privacy risks should be assessed with the support of the DPO, and also recorded on the risk register.
Where the software developing organization intends to be in the role of a controller, namely provide the application directly to users, the following risks and their potential impact on the developing organization should be considered:
The next stage is to assess each privacy risk recorded on the risk register with a view to accepting, reducing or eliminating each risk. The mitigation of privacy risk may be achieved by devising solutions and making changes to the application’s design, implementing enhancements to the SDLC processes, or introducing procedures and controls at the controller organization. The application architect, technical lead, the Information Security Officer, and the DPO should all be involved in determining and evaluating privacy risk mitigating solutions.
The third part of this guidance series explains several common application privacy risk mitigation solutions, such as database encryption and pseudonymization.
The privacy risks on the risk register should be updated to reflect any risk mitigation changes made to the application’s design, and include a residual risk rating, which is the assessed risk rating after the change. Ensure all privacy risk mitigation changes and solutions are incorporated within the application’s design and project plan. The development project manager should be informed of the DPIA outcomes, to ensure risk mitigation actions are formally included within the development plan.
The completed DPIA documentation should include:
The DPO must review the completed DPIA documentation. If the DPIA is considered an accurate representation, the privacy risk is deemed acceptable, and the application, and where applicable the organization as a controller or processor, is fully compliant with the GDPR, the DPO should sign off the DPIA. Otherwise, the DPO should cite any issues and request further assessment.
The development project sponsors should be provided with a copy of the completed DPIA. As project sponsors are typically responsible for the overall privacy risk of the developed application, it is recommended that they also sign off the DPIA.
Where applications are intended to be resold, hosted, or used by external controller organizations, developers should write and provide application implementation guidance, to assist controller organizations to provision the application to protect user privacy as designed, and by default. The implementation guidance should include concise step-by-step instructions on installing and configuring the application’s privacy settings. The guidance should also recommend the security configuration of the application’s supporting components, such as web servers, databases and the network environment.
To provide additional transparency and GDPR support to controller organizations, the implementation guidance should include a full explanation of all privacy-related application functions and processes. The guidance should also detail where the controller organization is expected to be responsible for providing any user GDPR privacy rights that are not facilitated by the application, and for mitigating the application’s privacy risk. This information will have already been obtained and included within the DPIA documentation.
GDPR Article 35 requires a data protection impact assessment (DPIA) to be performed where ‘new technologies’ are likely to result in a high risk to the privacy rights and freedoms of individuals. This requirement can be construed to apply to the development of any application that intends to process EU citizen personal data.
Completing a formal DPIA early within the software development lifecycle is an essential practice to efficiently develop applications that protect user privacy rights from the outset. Developers should seek support from data privacy subject matter experts, information security, and risk management professionals throughout the DPIA process. In leveraging their expertise, developers are able to formally identify and evaluate both application user and organization privacy risks, and where risks are considered unacceptable, devise technical solutions to mitigate them within the application’s design.