New on @InfosecurityMag talked to @datachainrisk about her PhD research into data privacy and the effect of breaches on people, as well as how #GDPR came along at the right time. @CityUniLondon https://t.co/lq29fQ6ans
— DanRaywood (@DanRaywood) May 24, 2018
On 17th April 2018, I was one of the speakers at the GDPR Press Briefing held at City, University of London (City). Check out the hot-off-the-press article ‘City academics discuss GDPR at press briefing’.
My prepared written talk is shared below.
Privacy and the Individual – What difference will GDPR Make?
Thanks John for the introduction. A warm welcome to all.
Any talk on privacy and the GDPR invariably uses terms or phrases that may be blurry or obscure. So just to set the scene, when I say the ICO I’m referring to the UK’s data protection watchdog – The Information Commissioner’s Office. When I say ‘data’ I’m referring to personal data as described in the GDPR.
Although the GDPR does not reference privacy directly – itself a complex term – privacy is embedded as information or data privacy and expressed in phrases such as:
‘respect for human rights and fundamental freedoms’ (Art. 12 – exercise of the rights of the data subject); ‘high risk to the rights and freedoms of natural persons’ (Art. 35 – data protection impact assessment); and ‘risks to the rights and freedoms of natural persons (individuals)’ (Recital 75).
It is no longer just about protecting personal data or the processing of personal data, but about data privacy.
With this come obscure or unclear terms.
What is ‘high risk’? How do you express ‘rights and freedoms’ of natural persons (individuals) especially in the context of privacy impact assessment (PIA) or data protection impact assessment (DPIA)?
We know that the GDPR describes DPIA (Art. 35) and also breach notification (Art. 33 – notify the ICO, and Art. 34 – communicate to the data subjects).
Fresh in our minds, I know, is the recent Facebook-Cambridge Analytica scandal. Flashback to October 2015: does anyone here still remember the TalkTalk data breach incident?
Would you all agree that, in the eyes of the public and the affected individuals, both Facebook and TalkTalk handled the breach announcement or notification to affected individuals rather badly, or failed to do so altogether?
Certainly, under the GDPR both would be required to notify the ICO within 72 hours and affected UK individuals without undue delay, or ‘as soon as possible’ (Guidelines on personal data breach notification under Regulation 2016/679).
As we know, the GDPR requires organisations to notify the ICO where there is a risk to the rights and freedoms of individuals, and to notify the individuals only where there is a high risk.
My research examines data incident response, in particular the privacy harm to individuals as a consequence of a data incident. I have designed a prototype dashboard and conducted a user evaluation study with industry practitioners. The dashboard is for assessing privacy harm during initial data incident response by addressing the initial breach notification question: whether or not to notify affected individuals and the ICO.
There is still fear in organisations when it comes to disclosure of data incidents. However, the GDPR will hold organisations accountable, e.g. through fines and penalties, and will require them to be transparent and to report data incidents. Affected individuals have the right to know.
The outcome of my study also revealed that it is possible to do an initial data breach assessment even with the unclear terms ‘high risk’ and the ‘rights and freedoms’ of individuals. The prototype dashboard also shows notification alerts with a countdown to 72 hours from the point of becoming aware of the incident. One participant remarked: ‘It (the dashboard) provides a calm objectivity in time of panic & stress. Because you’re going to be stressed, you immediately think your personal reputation and your organisation’s reputation. Would we be fined? And all these things come in rather than actual thinking of the consequences to individuals’.
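To make the notification logic concrete, here is a minimal illustrative sketch in Python – not the actual prototype code – of the two things the dashboard surfaces: the countdown to the 72-hour ICO deadline from the point of becoming aware of the incident, and the notify-or-not decision based on the assessed risk to individuals. The risk levels and function names are my own assumptions for the example.

```python
# Illustrative sketch only - not the prototype dashboard's actual code.
# It mirrors the GDPR logic described above: notify the ICO where there is
# a risk to individuals' rights and freedoms (Art. 33, within 72 hours of
# becoming aware), and notify affected individuals where that risk is high
# (Art. 34, without undue delay).
from datetime import datetime, timedelta

ICO_DEADLINE = timedelta(hours=72)

def hours_remaining(aware_at: datetime, now: datetime) -> float:
    """Countdown (in hours) to the 72-hour ICO notification deadline."""
    return (aware_at + ICO_DEADLINE - now).total_seconds() / 3600

def notification_duties(risk_level: str) -> dict:
    """Map an assessed risk level ('none', 'risk', 'high') to notification duties."""
    return {
        "notify_ico": risk_level in ("risk", "high"),
        "notify_individuals": risk_level == "high",
    }

if __name__ == "__main__":
    aware_at = datetime(2018, 4, 17, 9, 0)
    now = datetime(2018, 4, 18, 9, 0)
    print(f"{hours_remaining(aware_at, now):.0f} hours left to notify the ICO")
    print(notification_duties("high"))
```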
When a data incident happens, the genie is out of the bottle, out in the wild – the harm is already done.
The GDPR will not put the genie back into the bottle or stop the harm. So as a matter of good business practice, and in the spirit of the law, organisations should notify their customers.
Thank you.
Cher
p.s.
I may post a photo taken by John Stevenson (City’s Senior Communications Officer).
Normally I enjoy learning and exploring new apps/tools, especially tools that help me to ‘think’ and/or communicate or express my ideas visually.
Today I watched 14 YouTube videos on NVivo. Nothing wrong with those ‘introductory’ videos. It’s just me. I learn by doing and experimenting. I don’t have a lot of time to learn new tools as I’m on a very tight schedule to finish my thesis. So I need to get up to speed with NVivo FAST. Arghhh… If I had a choice I would not use NVivo at all!
Maybe one day someone will do research on wuxing for cyber warfare or cyber this & that. Anything is possible as long as one is prepared to go beyond the confines of ‘scientific thinking’.
Just reminiscing about the episodes of my interest in Chinese philosophy. Perhaps one day I will revisit wuxing and rekindle my ‘mad/crazy’ interests.
My initial website on wuxing
I have created so many mindmaps using Freemind – my favourite tool for capturing & logging stuff.
Now I just need to revisit most of them and start writing up my Thesis.
In going through my many maps I came across the first map I did on ‘data’. Posting the data map here.
Yesterday evening, 29th March 2018, I attended a BCS Law Specialist Group event – GDPR: Anonymisation, re-identification risk and GDPR profiling. The talk was presented by Dr. Amandine Jambert from the French data protection authority, the CNIL. The anonymisation slide is interesting. I asked whether the WP29 thinking (and their opinions) about the three properties applies to the ‘direct and indirect’ ways of identifying personal data. The answer was that it is not about the method itself: the properties are for ‘all data types’, i.e. any dataset. Her exact wording: ‘use by anyone on any dataset’. Also, the DPA (DPO/organisation?) needs to prove (or justify, or show) that the dataset has indeed been anonymised (using either of the two options). My understanding is that anonymisation, if done properly (whether risk-based, database-driven and/or algorithm-driven), should not enable direct or indirect re-identification of the individual(s).
As noted on this slide: ‘No single technique eliminates all risks’.
It’s near impossible to identify/isolate ‘all the direct/indirect re-identification risks’ associated with any dataset, assuming the dataset is even available and not hidden in some cloud and/or in a chain of hidden registers.
We really need to rethink personal data in terms of ‘the harm to individuals’, as there is no absolutely sure way of preventing re-identification risks (i.e. singling out, linkability or inference/deduction).
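As a toy illustration of the ‘singling out’ risk mentioned above (my own example, not something shown in the talk), the Python sketch below counts how many records share each combination of quasi-identifiers in a dataset; any combination that occurs only once singles out an individual. The column names and records are invented for the example.

```python
# Toy illustration of the 'singling out' re-identification risk:
# a k-anonymity style uniqueness check over made-up quasi-identifiers.
from collections import Counter

# Invented records for the example.
records = [
    {"postcode": "EC1V", "birth_year": 1980, "gender": "F"},
    {"postcode": "EC1V", "birth_year": 1980, "gender": "F"},
    {"postcode": "N1",   "birth_year": 1975, "gender": "M"},
]

quasi_identifiers = ("postcode", "birth_year", "gender")

# Count how many records share each quasi-identifier combination.
counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)

# Any combination seen only once (k = 1) singles out an individual.
singled_out = [combo for combo, k in counts.items() if k == 1]
print("Combinations that single out an individual:", singled_out)
```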
Overall a great talk.
I just noticed the slides and talk are available online: BCS Law talk 29th March 2018
My final piece of study – user evaluation
I am now extending my user evaluation (January–February schedule) to March, as January was a quiet month. It has been difficult to get practitioners in industry to commit their time to participate in my user evaluation study. Personal data incidents are still regarded as ‘scary’ stuff to disclose or talk about openly, or even privately with a researcher.
Even after I reassure folks that my research does not require disclosing any personal or commercially sensitive information, folks (especially senior managers) still won’t allow their employees (those who have the relevant knowledge/experience) to share and participate in my research. This is a pity, as they would certainly learn something by sharing and participating in my user evaluation. According to this news, the #FCA is to require UK banks to make details of cyber security #incidents public from August 2018. Under the GDPR, organisations processing the personal data of EU residents/citizens will need to report certain breaches to the ICO and also to affected individuals. My prototype dashboard will help organisations to conduct an initial personal data harm assessment.
So far, practitioners who took my user evaluation study involving a questionnaire and the prototype dashboard have expressed positive remarks and provided suggestions for further improvement or commercialisation of the prototype concepts.
I’m hopeless at making New Year resolutions (& don’t believe in making them), so I will just remind myself by posting this blog with this message:
I will finish my PhD by the end of October 2018
One nicety of being a research student is learning interesting stuff – stuff that other researchers have done, in particular in data visualisation (one of my many interests!).
Most recently I attended two seminars at City, both interesting in many ways. Their websites:
Microsoft researchers on data driven storytelling
Fanny CHEVALIER – Research Scientist at Inria
If I have spare time, I’ll blog about my own challenges (and less notable achievements, but still achievements on my own terms!) with data visualisation tools. Finding and getting to grips with ONE storytelling tool to do a nice, neat & brain-cracking (or mind-blowing) visualisation of what I THINK my intended users want to see from my research output is beyond my research domain.
ICO’s brief on Breach Notification.
Next up: the Article 29 Working Party guidelines.
Great video from Sisense on How to build a better dashboard.