SMO v TikTok
On BBC news: TikTok faces legal action from 12-year-old girl in England.
Some interesting remarks and statements are extracted from the Judgment below; my additional comments are enclosed in brackets and italics:
‘This is a pre-action application for anonymity on behalf of a child claimant in an intended claim for breach of privacy’.
‘The papers explained that the urgency stemmed from the fact that the end of the Brexit transition period on 31 December 2020 will bring about changes in the law which are, or are at least said to be, relevant to the intended claim. One change relates to the GDPR. It is said that under the law as it stands before the end of the period this Court has jurisdiction over that aspect of the claim and over the Second Intended Defendant, which is a company registered in England and Wales. The position from 1 January 2021 is “less clear”; jurisdiction will be decided on the basis of the common law rules “which may prejudice the ability of the claimant to bring the claim and/or defend any jurisdictional challenge brought by the Intended Defendants (i.e. defendants outside UK. What about the UK GDPR?)’.
‘Some of the claimant’s paperwork devotes attention to the importance of keeping the claimant’s address a secret. I do not regard that as an issue of particular significance in the context of this case. It is said that its disclosure might give rise to a risk of harm, regardless of the facts of the case, as it would increase the risk of attention from people who intend the claimant serious harm. That appears to me to be unsupported by the evidence. In any event, the claimant’s address is not a weighty aspect of open justice, save in so far as it may lead to the identification of the claimant. The real issue is whether the claimant should be identified. If not, an order for non-disclosure of the address would seem to follow.’
‘The common law exceptions did not include the rights or interests of children, other than in the context of wardship. But by virtue of the Human Rights Act 1998 there is now, effectively, a statutory exception. The Court must act compatibly with the Convention Rights, including the right to respect for private life protected by Article 8. And Article 6 provides that the general rule of open justice may be departed from
“where the interests of juveniles or the protection of the private life of the parties so require.” This does not provide any automatic protection for children, regardless of the circumstances: see ZH (Tanzania) v Secretary of State for the Home Department [2011] UKSC 4 [46] (Lord Kerr), ETK v News Group Newspapers Ltd [2011] EWCA Civ 439 [19] (Ward LJ). A balance must always be struck, and attention must be paid to the specifics of the individual case, not just generalities. But, as Mr Ciumei QC has pointed out in presenting his client’s case, Article 3(1) of the United Nations Convention on the Rights of the Child and other international and domestic instruments require the Court to accord “a primacy of importance” to the best interests of a child: ZH (Tanzania) ibid. (NB: UNCRC Art 3(1) provides the balancing or tipping act when it comes to a child’s privacy rights).
‘It is reasonable to suppose that some of that attention would be focussed on the claimant, if their identity was known. But that is not enough of itself to justify anonymity. Nor is the mere fact that the claimant is 12 years old. It is necessary to consider the nature of the likely attention, and the harm that it could cause. (NB: the likely attention is a trigger for harm).
‘The Commissioner’s witness statement identifies a risk of direct online bullying by other children or users of the TikTok app; and a risk of negative or hostile reactions from social media influencers who might feel their status or earnings were under threat. Both appear to me to be realistic assessments. That is not to say that such behaviour is inevitable, but it is reasonably foreseeable. (NB: risk associated with social media influencers).
‘…the intended claim involves serious criticisms of what may be key aspects of the platform’s (TikTok) mode of operation’
‘I accept the Commissioner’s evidence that children are particularly sensitive to the sort of attention and scrutiny to which she has referred, and that such attention can have a marked and detrimental impact on a child’s mental health, and emotional and educational development. I would characterise the risk of harm as significant’
‘The assessment of the parents deserves respectful attention.’
‘The main characteristics of importance appear to be age and use of TikTok, and those are shared with all the represented parties. The evidence is that the damages claim will not be peculiar to the circumstances of the claimant, as for instance with a claim to compensate for distress. As in Lloyd v Google, the claim will be for a standard “tariff” figure to compensate the claimant and each of the represented parties for the abstract “loss of control” over personal data. In all likelihood, the main focus of attention for those who wish to understand and scrutinise the workings of the justice system in the intended litigation will be the activities or alleged conduct of TikTok and the role of the defendant companies in its operation’.
‘…if the Court required the claimant to be named that could have a chilling effect on the bringing of claims by children to vindicate their data protection rights. On that footing, the grant of anonymity supports the legitimate and important aim of affording access to justice, and the order is necessary in order to secure the administration of justice.’
Anonymisation Decision-making Framework
I’ve received the following announcement from the team at UKAN, University of Manchester:
Just to let you know that the second edition of the Anonymisation Decision-making Framework has now been published.
The Framework has been given a significant overhaul and for the first time there is a systematic method for evaluating your data environment.
You can download the new book and the companion documents from here:
https://ukanon.net/framework/
Many thanks to Prof Mark Elliot for sharing the book and the various companion documents.
From noyb.eu (Schrems): steps for EU companies
Following from the CJEU’s judgment on EU-US data transfers (SchremsII), Schrems has posted comprehensive steps and FAQs on noyb.eu.
I tweeted the news on 24th July 2020:
From #Schrems & https://t.co/c7eI7oxpgT – Next Steps for EU companies & FAQs https://t.co/GPsRPP03L7
— cher devey (@datachainrisk) July 24, 2020
International data transfer
I just browsed #Schrems on my twitter streams. We now have a sequel to SchremsI – the SchremsII judgment was handed down on 16 July 2020. Never a dull day when it comes to human rights and fundamental freedoms, especially when such inalienable rights shine as actionable rights against other ‘rights’.
The CJEU’s judgment and the press release have been summarised by various folks. The essence of #SchremsII – extracted from Center for Democracy & Technology:
No doubt international data transfer or international trade will continue to flow (and flourish) even without Privacy Shield as there is still GDPR Article 49. Data transfer has to be read in terms of adequacy, derogation, surveillance and also trade politics.
For now, our inalienable rights shine until another round of drama in the courts.
Thesis
I’ve just posted in my ‘other blog’ (just for my PhD stuff). Been reflecting today…
One practical application of my privacy harm model is for making sense of “high risk to individuals’ interests” in carrying out data protection impact assessments or privacy impact assessments.
For anyone interested in what I did for my PhD, here’s a bit about it.
Enjoy!
runaway ethics
I participated in the SPRITE+ ‘Accountability & Ethics in a Digital Ecosystem’ Workshop led by Dr Jonathan Foster and Dr Julie Gore last week. There were many interesting brainstorming discussions via Zoom breakout rooms among diverse academics and industry practitioners. It was well facilitated and far more enjoyable (and productive) than all the Zoom and similar webinars and meetings that I’ve attended during the past couple of weeks. I will take a break from online meetings/webinars – ‘online’ fatigue.
Accountability & ethics are both interesting and topical issues, not only in a digital ecosystem but beyond the digital sphere. A good example is the current lockdown situation, whereby we are all dependent on getting connected online or digitally connected because we have to obey or adopt social distancing (outside the digital sphere). Have ethics or ethical norms changed because we have experienced ‘social distancing’ (more time alone or with family) and are also spending more time online to connect with people? I think researchers need to ask new questions about the interplay between the digital and non-digital or ‘real world’ spheres, and where accountability and/or ethics matter or not. Do we have accountability, or can we impose accountability (via laws?) when we have runaway ethics? Just a random question!
Contact Tracing
A long time ago I was involved in assessing and using contact tracing solutions. It was during the Kosovo War (1998-1999), when I was working with an international charity organisation. At that time, I had not heard of Data Protection, privacy or Human Rights, and did not have any safety procedures or standards to follow.
Did I do any impact assessments? I probably did, but nothing related to data protection or privacy. Did the solution cause any harm to anyone? I don’t think so. However, I remember spending a fair amount of time reflecting on the users of the tracing systems. The main users were those affected/impacted by the War and the administrators of the systems. Would any Data Protection Impact Assessment (DPIA) or Privacy Impact Assessment (PIA) method have helped me back in 1998?
As part of my PhD, I examined various DPIA and PIA methods, and I can say I’m not any wiser in using them, except that now I am aware of Data Protection, data privacy and Human Rights.
Right now, I’m reflecting on the contact tracing app after reading the article – The NHS Contact Tracing App: 11 key talking points – published by the BCS. I wonder whether any DPIA or PIA method has been used by the app designers. I suspect not.
#GDPR
Today, 25th May 2020 marks the 2nd anniversary of the GDPR.
accessnow.org issued this report: TWO YEARS UNDER THE EU GDPR
Extracted from the conclusion:
Crippled by a lack of resources, tight budgets, and administrative hurdles, Data Protection Authorities have not yet been able to enforce the law adequately. Worse, some public authorities have misused the GDPR to undermine other fundamental rights. While the GDPR in itself is not to blame for these failures, fingers are sure to be pointed at the law if urgent actions are not taken. We hope that the recommendations put forward in this report will help address the situation.
I wonder whether, even with more resources available, the DPAs would be better able to enforce the law adequately. I suspect not. More resources will not guarantee better outcomes.
Governments should ensure the application of the GDPR and the protection of the right to data protection in their COVID-19 response, particularly in the areas concerning the collection and use of health data, the use of tracking and geolocation, and the conclusion of public-private partnerships for the development and deployment of contact-tracing apps.
Balancing the rights – the rights and freedoms of individuals & public interests – for the Covid-19 response is probably where we will see or have seen the workings or misuses of GDPR.