I’ve just posted on my ‘other blog’ (reserved for my PhD stuff). Been reflecting today…
One practical application of my privacy harm model is for making sense of “high risk to individuals’ interests” in carrying out data protection impact assessments or privacy impact assessments.
For anyone interested in what I did for my PhD, here’s a bit about it.
Enjoy!
I participated in the SPRITE+ ‘Accountability & Ethics in a Digital Ecosystem’ workshop led by Dr Jonathan Foster and Dr Julie Gore last week. There were many interesting brainstorming discussions in the Zoom breakout rooms, with a diverse mix of academics and industry practitioners. It was well facilitated and far more enjoyable (and productive) than all the Zoom and similar webinars and meetings I’ve attended over the past couple of weeks. Still, I will be taking a break from online meetings and webinars – ‘online’ fatigue.
Accountability and ethics are interesting and topical issues, not only in the digital ecosystem but beyond the digital sphere. A good example is the current lockdown, in which we are all dependent on connecting online or digitally because we have to observe social distancing (outside the digital sphere). Have ethics or ethical norms changed because we are experiencing ‘social distancing’ (more time alone or with family) while also spending more time online to connect with people? I think researchers need to ask new questions about the interplay between the digital and non-digital (‘real world’) spheres, and where accountability and/or ethics matter or not. Do we have accountability, or can we impose accountability (via laws?), when we have runaway ethics? Just a random question!
A long time ago I was involved in assessing and using contact tracing solutions. It was during the Kosovo War (1998–1999), and I was working with an international charity. At that time, I had not heard of Data Protection, privacy or Human Rights, and I did not have any safety procedures or standards to follow.
Did I do any impact assessments? I probably did, but nothing related to data protection or privacy. Did the solution cause any harm to anyone? I don’t think so. However, I remember spending a fair amount of time reflecting on the users of the tracing systems. The main users were those affected/impacted by the War and the administrators of the systems. Would any Data Protection Impact Assessment (DPIA) or Privacy Impact Assessment (PIA) method have helped me back in 1998?
As part of my PhD, I examined various DPIAs and PIAs, and I can say I’m not any wiser about using those impact assessment methods, except that now I am aware of Data Protection, data privacy and Human Rights.
Right now, I’m reflecting on the contact tracing app after reading the article – The NHS Contact Tracing App: 11 key talking points – published by the BCS. I wonder whether any DPIA or PIA method was used by the app designers. I suspect not.
Today, 25th May 2020 marks the 2nd anniversary of the GDPR.
accessnow.org issued this report: TWO YEARS UNDER THE EU GDPR
Extracted from the conclusion:
Crippled by a lack of resources, tight budgets, and administrative hurdles, Data
Protection Authorities have not yet been able to enforce the law adequately. Worse,
some public authorities have misused the GDPR to undermine other fundamental rights.
While the GDPR in itself is not to blame for these failures, fingers are sure to be pointed
at the law if urgent actions are not taken. We hope that the recommendations put
forward in this report will help address the situation.
I wonder: even with more resources available, would the DPAs be better able to enforce the law adequately?
I suspect not. More resources will not guarantee better outcomes.
Governments should ensure the application of the GDPR and the
protection of the right to data protection in their COVID-19 response,
particularly in the areas concerning the collection and use of health
data, the use of tracking and geolocation, and the conclusion of
public-private partnerships for the development and deployment of
contact-tracing apps.
Balancing rights – the rights and freedoms of individuals against the public interest – in the Covid-19 response is probably where we will see, or have already seen, the workings or misuses of the GDPR.
I was out walking in my local park today. I couldn’t help noticing that there were fewer people in the park than two days ago. Why? Well, the weather wasn’t as nice today – dry but not sunny (between 5.35 and 6.30pm). Also, I forgot to wear my mask and didn’t notice until I was back home. I suddenly realised that I’m back to my ‘normal’ self, or routine. Well… what is ‘normal’, or the ‘new norm’? For me, breaking or changing my habits is not ‘normal’. Perhaps it’s because I’m quite an introvert at heart and enjoy a peaceful stroll in quiet parks and quiet spaces with fewer people around. Also, I tend to put my mobile on silent mode, or forget to bring it with me.
So, it’s just me and my mobile on my ‘normal’ walk. Do I want to be traced, tracked or monitored by my mobile, or by ‘someone’ capturing my location or my interaction with a tree, a passer-by or another nearby walker? Isn’t this an intrusion into my private life or space, or sacred space? Sacred because I just want to be left in peace, or to be left alone (privacy?). Does the ‘new norm’ – i.e. when the lockdown is lifted and we regain a sense of normality – also mean we regain our sense of freedom and privacy? Do I value my privacy above all else, or above what the public or others demand? I value my privacy, but I know I don’t have absolute rights.
In the current epidemic, public interest, public health and ‘save our NHS’ take centre stage above an individual’s right to privacy. The NHS is the hero in saving lives, along with other essential services which we ‘normally’ take for granted. What I find upsetting is the uptake of, and the urgency in using, technology (tech) for tracing or locating people – contact tracing. Yes, we all want to get back to ‘normal’ and be rid of the virus. Which is more dreadful – the virus, or being under surveillance or tracked (without an opt-out)? The virus can kill if your immune system is weak and you have no access to the medical aid and care you need. Contact tracing, or any tech that collects data about our ‘normal’ life (privacy?), won’t kill anyone outright, right? It can save lives, so we’re told.
As we know, we’re relying on tech to save lives, such as the use of tech in hospitals. Also, as in normal times, we use tech daily, and now in lockdown we rely on it more than ever. More than ever, we also need to check that the tech we use does not go beyond one’s safety net (our privacy) or intrude into one’s normal life. This is what privacy is all about, isn’t it? A crisis isn’t ‘normal’. Privacy is sacred in normal times, and more so during a crisis.
August 28, 2019 – 8:58 pm
This post at pressread.com caught my attention today and prompted me to attend the networking event.
More action from Facebook to watch out for: ‘We have heard that words and apologies are not enough and that we need to show action.’
Also, note the FTC press release:
‘As part of Facebook’s order-mandated privacy program, which covers WhatsApp and Instagram, Facebook must conduct a privacy review of every new or modified product, service, or practice before it is implemented, and document its decisions about user privacy. The designated compliance officers must generate a quarterly privacy review report, which they must share with the CEO and the independent assessor, as well as with the FTC upon request by the agency. The order also requires Facebook to document incidents when data of 500 or more users has been compromised and its efforts to address such an incident, and deliver this documentation to the Commission and the assessor within 30 days of the company’s discovery of the incident.’