2021-04 How Not To Do A DPIA

Oct 28, 2022

(aka: ‘how most organisations do DPIAs’)

Think of something really cool to do with shiny technology.

Conjure up rationalisations that make your idea look like it will achieve something useful.

Generate a document that puts the most positive and optimistic spin on the mechanisms and outcomes of the idea. Don’t lose this, because you’ll need to copy and paste from it later.

Tackle governance and compliance paperwork by simply assuming nothing could possibly go wrong; everything will work exactly as it needs to, and everyone gets a pony.

Assume that all human beings involved in executing the idea are perfect robots, never under the influence of confusion, stress, personal dislike, nefarious intent, overwhelming pressure, conflicting imperatives or ignorance.

Pretend that all the corporate entities involved have robust, mature systems of governance, risk management, data management, acceptable conduct, management capacity, knowledge transfer and ethics overwatch. Proceed on the basis that unlimited amounts of time, money and energy will be infinitely available to maintain all of the above.

Buy a tool to do all your thinking for you and reduce the whole ‘impact assessment’ exercise to a series of clicks on a screen, for the sake of expediency.

Turn the tool you bought into a hurdle that automates away the really hard parts of the DPIA (like actually thinking about impacts to people and assessing how they might occur) but makes your own staff spend stupid amounts of time trying to find the least-inaccurate option among its preconfigured multiple-guess menus. Bonus points if the interface is 70% vendor advertising and 30% functionality.

Input the bare minimum that the system will let you get away with, leaving non-mandatory fields blank and liberally applying copypasta from your earlier documents to save you the effort of thinking more deeply about what you’re doing.

Choose inputs that you know will result in low-risk findings, and omit or misrepresent aspects that you know will produce any of those pesky high-risk indicators which would require critical thinking skills to tackle. Senior management doesn’t like to see red on a report; your goal is to make sure that you only show them soothing shades of green.

Apply as many lawful bases at once as you can stretch to fit over some part of the processing, but do not back up those assertions with explanation, justification or precision. The more the merrier! You’ve got prefabricated answers available for the data subject rights questions anyway, so there’s really no need to narrow the lawful basis down or refine the purpose descriptions so that they align. Decisions about the data and processing activities themselves have already been made and can’t/won’t be altered at this point, but that would only worry you if you didn’t have a paragraph on how you take minimisation very seriously, ready and waiting to be pasted in.

Make sure you constrain your assessment of impact so that you only consider consequences that will noticeably affect the organisation - and focus on security breaches, because that’s the only part of data protection law that exists.

Add any controls which you think might look good on a report, whether or not they are applicable to any particular risk or any more effective at mitigating that risk than doing nothing. You’re not quite sure how encryption helps establish the lawfulness of the processing, or what any of the proposed mitigations will actually look like in practice, but the tool didn’t spit out anything red, so it must be fine. That’s comforting, because the tool is omniscient, which makes up for the lack of thoughtful consideration by the human beings involved.

Save the report as a PDF with no OCR, content indexing, metadata or accessibility-compatible features.

Distribute with a ‘job done’ message to the project team and hope the fact of having gone through some kind of paperwork-producing process is enough to shield you from the outcomes of having missed the entire point of the exercise.

If/when later challenged on the fairness, lawfulness, transparency, necessity, efficacy or morality of your processing, make sure you shout about how you DID A DPIA, as though the ritual itself should invoke some protective enchantment.

From this day on, refuse to provide copies of the DPIA to anyone who might look at it with a critical eye. That’s your proprietary, commercial, confidential information and too valuable to show to anyone else. Data subjects in particular must be prevented from discovering how much time and attention the organisation has paid to protecting their rights and freedoms – they might start looking harder at your privacy info as well, and then where would you be?! Prospective corporate clients just want you to tell them everything’s okay; after all, they came to you because you’re fast and cheap and top of the sponsored ads in their search results, and they don’t want to hear about things that might cost them to put right.

Ta-da!


Now, obviously this is an extreme and hyperbolic example which I have confected for educational and cathartic purposes; however, every attitude, decision and error I’ve included here is one that I have encountered in the real world at some point or other in my data protection career. Most have been made unconsciously: out of habit, under pressure, in competition with other – more easily quantifiable – imperatives like cost, time, convenience, volume, growth and service level metrics. It’s easy to spot them with hindsight, much more challenging to recognise them in real time, and harder still to make a case for slowing down and thinking really carefully about why and how and who, but most of all what if.

Forget software, forget checklists, and yeet that terrible ICO template into the sun. The difference between a rigorous DPIA and a waste of time is the amount of thinking that went into it. Thinking is not highly valued in the culture of management-by-metrics, being indistinguishable from non-productivity through the lens of surveillance, and unquantifiable in nature. In fact, thinking gums up the wheels of the prevailing just-in-time response model most of us are required to operate within. Unfortunately, some things are complex and need to be thought about carefully if they’re going to have any chance of being useful.

One such thing is a Data Protection Impact Assessment. GDPR Article 35 says:

carry out an assessment of the impact of the envisaged processing operations on the protection of personal data

The assessment shall contain at least:

(a) a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;

(b) an assessment of the necessity and proportionality of the processing operations in relation to the purposes;

(c) an assessment of the risks to the rights and freedoms of data subjects referred to in paragraph 1; and

(d) the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned.

If you’re a supporter, you can also read my much less snarky but possibly more helpful commentary on tips to avoid the ‘how not to’ pitfalls.
