
Is a disproportionate fear of “Big Brother” preventing us from seeing the big picture?

FOI Man asks if we’re in danger of throwing the baby out with the bathwater through an increasingly negative portrayal of the use of personal data.

It’s easy to see why many of us have concerns over the possibility of the security services accessing our email or listening in to our phone calls. What I’m increasingly worried about is what appears to be a widely held and instinctive view that any sharing of personal data – and even data that has been anonymised – is necessarily a “bad thing”.

The Liberal Democrats in particular were highly critical of the last government’s use of technology. One development which David Laws, now a Minister, criticised as “intrusive” was a national database called ContactPoint. It had been developed as a result of a recommendation by Lord Laming in his report on the death of Victoria Climbie. It allowed doctors, social workers and police to access details of any child, thereby helping to prevent situations where abuse of children went undiscovered because of poor communication between these services. When the current Government came to power, the system was scrapped.

The last government also tried to introduce central medical records for all NHS patients, which would mean that when you turned up at a hospital far from home, as I have done myself, doctors would have access to your medical records and history. Believe me, when you are in pain and desperate to be treated, the last thing you want to do is answer questions about your medical history. And that's if you are in a position to answer those questions. Fundamentally, the project was scuppered by its complexity and expense, but there was also a big campaign by critics to encourage patients not to allow their doctor to upload their details.

One aspect of recent NHS reforms is that GPs will be asked to share data about their patients’ care with a central body called the Health and Social Care Information Centre. Patients can choose to opt out if they wish by writing to their GP. The data will be shared with approved partners, for example the Department of Health. It will be used, for example, by medical researchers trying to find out what treatments are effective. The data is invaluable to such researchers – it could well save more lives than donating organs or the odd litre of blood. It will normally be shared in anonymised form unless the research concerned requires more information to be effective.

There has been the predictable outcry against this. And that's really my point. It has become fashionable to criticise any sharing of personal data, even if anonymised, no matter what the purpose. It's all about "Big Brother".

I can understand some of the concerns. There are risks in building up big central datasets. There are lots of stories of individuals abusing access to personal data: police workers who misuse the Police National Computer to check up on a neighbour, or GPs' receptionists who read their ex-husband's new wife's medical records. But firstly, where this is discovered, staff can be – and should be – disciplined and/or prosecuted. Protection of this data is what the Data Protection Act is all about, and breaches should be taken seriously. And secondly, we're surely not saying that the Police National Computer should be shut down as a result of breaches. The greater good of being able to solve crimes through linking a large pool of data is generally accepted as justification. Indeed, police were criticised following the Soham murders for not keeping data on it. Instead, what we really want is a proportionate use of this data, and effective safeguards to be put in place.

One popular claim is that there is no such thing as “anonymised data”. Academic studies are widely cited showing that it is possible to identify individuals within large datasets. However, what isn’t so widely reported is that there are other academics who argue that there are deficiencies in those studies and that they are, in any case, being misreported.

As a Data Protection Officer (as well as an FOI Officer), I would certainly want any organisation to assess the impact on individuals' privacy of any proposed plan involving their personal data. I would expect them to consider which condition of the Data Protection Act justified this processing of the data. But it does worry me that we seem to be moving to a position where we assume that any processing of our data must be wrong by its very nature – where organisations are discouraged from innovating or using data to potentially save lives because there is a risk, however small, that an individual might be identified (and an even smaller risk that this would actually have any real impact on the individual concerned). What's more, because this has become a political issue, there are few in government now prepared to champion the use of personal data for the benefit of all.

In my view, the current trend is damaging. If we continue to portray all use of personal data as wrong, it will become more and more difficult to offer public as well as private sector services. It will certainly become more difficult to improve them. Contributing personal data to society is at least as important as paying our way financially. Data Protection shouldn’t be about saying “no” all the time.



News Bulletin – week to 9 August 2013

FOI Man summarises the key developments in a busy week in the information law arena.

You would think that this time of year would be a quiet one in the information law arena, but apparently not. Last week saw the publication of new fees regulations and Information Commissioner’s Office (ICO) guidance to support new FOI requirements in relation to datasets which come into force on 1 September; a new Code of Practice from the ICO on handling subject access requests under the Data Protection Act; and a draft Code of Practice on conducting privacy impact assessments. That’s leaving aside criticism of the Information Commissioner – and a forthright response from him – over the way he has reacted to the Cabinet Office’s apparent attitude to FOI. Here I’ve summarised the key developments last week and provided some links in case you want to read more about them.

FOI and Datasets

The Protection of Freedoms Act 2012 amended the Freedom of Information Act to oblige public authorities to release “datasets” in a reusable format. I’ve written about what these changes mean in a previous post. Late last month it was announced that the changes would come into force on 1 September this year, and a special datasets Code of Practice under section 45 of the Act has already been published by the Ministry of Justice. On Friday new regulations setting out the circumstances under which public authorities can charge to licence re-use of datasets were published, as was new guidance on these provisions from the Information Commissioner. Steve Wood, the ICO’s Head of Policy Delivery, has blogged about what these changes mean for public authorities and the ICO.

Data Protection and Subject Access Requests

Probably the best-known provision of the Data Protection Act 1998 (DPA) is the right of individuals (or "data subjects") to ask organisations for information held about them. Earlier this year the ICO consulted on a Code of Practice on the handling of such requests, and this week the finalised Code was published. Anya Proops of 11KBW has given her reaction to the new Code, highlighting an apparent conflict between case law and the Commissioner's approach in respect of the requester's purpose. Meanwhile, the DPA and subject access requests were considered in a High Court case. It is rare for the DPA to be tested in the courts, certainly at that level, so this was an important ruling.

Data Protection and Privacy Impact Assessments

The ICO likes a good Code of Practice these days, so no sooner had the ink dried on its subject access Code of Practice than it published a new draft Code on privacy impact assessments. Privacy impact assessments have been promoted by the ICO for many years as a form of risk assessment to be carried out at an early stage of projects that are likely to involve, or relate to processes that involve, the processing of personal data. The ICO wants to know what individuals and organisations think of the draft Code.

ICO publishes statistics on data breach reports

Breaches of DPA have become big news these days, often featured in the national media. Last week the ICO began to publish statistics on data breach reports made to them, starting with the period from 1 April to 30 June 2013. In a welcome move, they have also put together a spreadsheet listing details of all civil monetary penalties issued and this can be accessed on their website. Sally-Anne Poole, Group Enforcement Manager at the ICO, has blogged about the thinking behind these latest developments.

Three more organisations to be monitored over FOI response times

The ICO has announced that the Home Office, Sussex Police and South Tyneside Council will be monitored for three months due to concerns over delays in responding to FOI requests. At the same time, the results of the January to March monitoring period have been reported. The ICO has now cleared the Department for Education and the Department for Work and Pensions, but has remaining concerns about the Office of the First Minister and Deputy First Minister of Northern Ireland. The Chief Executive of another public authority, Wirral Borough Council, has had to sign an undertaking promising to make improvements.

Mr Cook, the teeth, Cabinet Office strife and some bother

The start of last week saw a newspaper editorial criticise the Information Commissioner himself for failing to enforce FOI in respect of failings by the Cabinet Office. This follows FT journalist Chris Cook's investigations into the use of private email accounts by government ministers and special advisers. In a typically robust response, Christopher Graham made clear that he felt these criticisms were unfair. I commented here that I thought the Commissioner's defensiveness misplaced. Others – Jon Baines and Tim Turner – highlighted evidence of the Cabinet Office's shortcomings and missed opportunities for ICO action. This one will run and run.


Do information practitioners need to get out of the way?

FOI Man reviews a seminar hosted by the University of Winchester’s Centre for Information Rights and questions whether those of us who are information practitioners are helping or hindering attempts to protect the vulnerable.

Back at the end of April I attended a seminar hosted by the University of Winchester’s excellent new Centre for Information Rights. The title for the seminar was “Data Sharing and the Vulnerable”, and given recent scandals around Jimmy Savile and Winterbourne View, it was timely.

The first speaker was Sue Gold, who is a solicitor with Osborne Clarke, but had previously worked for the Disney Corporation. Sue highlighted the difficulties not so much of sharing data, but of collecting data – specifically from children. If you asked most organisations whether they collect data from children, their automatic response would be no. But Sue pointed out that most websites will, even if their owners don’t intend them to, at some point collect data (eg registration data) from those who are much younger than their target audience. Most companies have some form of “age gating” to try to prevent children accessing products or services, but as Sue demonstrated, very few – if any – of these are effective. Frankly if you can think of a way to prevent children from accessing your site, they will have already thought of a way to bypass it. And if you have a system that requires parental consent…well, you probably remember what happened when you had to get someone to sign your homework diary.
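Sue's point about the weakness of age gating is easy to demonstrate. Here is a minimal sketch (my own illustration, not anything from the seminar) of the typical self-declared age gate most sites use: it simply trusts whatever date of birth the visitor types in, so a child need only claim an earlier year to get through.

```python
from datetime import date
from typing import Optional

def age_from_dob(dob: date, today: date) -> int:
    """Age in whole years on a given day."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def age_gate(claimed_dob: date, minimum_age: int = 13,
             today: Optional[date] = None) -> bool:
    """A typical self-declared age gate: it trusts whatever date of birth
    the visitor enters, so it only blocks children who answer honestly."""
    today = today or date.today()
    return age_from_dob(claimed_dob, today) >= minimum_age

# A ten-year-old is blocked only if they enter their real date of birth...
print(age_gate(date(2004, 6, 1), today=date(2014, 6, 2)))   # False
# ...and admitted the moment they claim an earlier one.
print(age_gate(date(1990, 6, 1), today=date(2014, 6, 2)))   # True
```

The gate records nothing it can verify, which is precisely why, as Sue argued, very few of these mechanisms are effective.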

This was followed by Helen James, Winchester’s Head of Law, who talked about the limitations of the UK’s whistleblowing legislation in a culture where 80% of nurses in a survey thought they would be victimised if they blew the whistle on their employer. Helen pointed out that there are hints that the mood is changing following Winterbourne View and the Mid-Staffordshire NHS Foundation Trust inquiry.

Finally, Jerry Brady of Dorset County Council’s Children’s Services looked at information sharing in services for vulnerable children. Jerry pointed out that following the Laming Report into the circumstances surrounding the death of Victoria Climbie, there had been a new emphasis on the importance of sharing data to protect vulnerable children. The aim was to integrate services, but limited progress has been made, not least because of political arguments over information sharing. There has been a shift away from integration towards alignment of services.

One of the important points that Jerry made was that the key to ensuring that data is shared where it needs to be is the development of trust between frontline team members. Noticing that he hadn’t mentioned the role of data protection officers or information governance staff, I asked him what his experience of working with information professionals was. His response was a little disheartening for me as one of those information professionals. His experience has been that if you ask an information professional for advice, you get an information professional’s answer – a cautious one. This isn’t helpful for frontline staff who need to feel confident that they are doing the right thing.

But Jerry was far from dismissive of data protection. One of his 7 golden rules for information sharing is:

“Remember that the Data Protection Act is not a barrier to sharing information but provides a framework to ensure that personal information about living persons is shared appropriately.”

Information sharing is very contentious. This week we’ve been hearing about data sharing in a very different context – between technology companies and the US security services. Not so long ago we were debating whether GPs’ surgeries should be sharing data with the Health and Social Care Information Centre. All of this adds to the difficulty for us as supposed “experts” when asked to advise whether data sharing is appropriate, even in circumstances where there seem clear benefits for data subjects. I know from my own experience how difficult it can be to balance my own concerns with data protection compliance (which is after all the expertise I’m paid for) with the desire to help an employer achieve its – usually well-intentioned – aims. But it seems to me that if practitioners like me are seen by Jerry and his peers as being barriers to protecting the vulnerable, then we need to find a better way of working.

The Information Commissioner earns his spurs, says Committee

FOI Man highlights a new report from the Justice Select Committee calling for more help for the Information Commissioner.

Say what you like about the Information Commissioner’s Office (ICO), but without it, the handling of personal data and FOI would be a little like the old west. Your rights would only be meaningful if you could afford a gunslinger (or expensive lawyer for those not following the metaphor).

The Justice Select Committee has taken a good hard look at the ICO and identified some major issues. And they’re worth noting.

In particular, the Committee has highlighted a major problem which may result from the proposed EU Data Protection Regulation, which, if passed, will replace our existing Data Protection Act (DPA) in the next couple of years. The existing draft will see the end of notification, which currently requires every organisation that processes personal data (with a few exceptions) to register (or notify) with the Information Commissioner every year. Depending on the size of your organisation, you have to pay either £35 or £500 for the privilege.

And that’s the problem. The ICO’s data protection work is financed by this notification fee. So even if you don’t have much time for the form-filling, box-ticking nature of the notification process (I’m a little lukewarm about it in all truth), that fee is essential to ensuring that the ICO can do its job on DPA. If the regulation removes the requirement from our statute book, the ICO will be left with a shortfall of £42.8 million. Bearing in mind that some suggest that the ICO doesn’t do enough as it is – including criticisms from Lord Justice Leveson – and the fact that it is highly unlikely that the Government will want to fund data protection enforcement directly – this is a major problem. As the Committee says, “No one seems to know where resources would come from to replace the notification fee if it is abolished.”

Interestingly, the Committee is not impressed with Leveson’s recommendation to change the status of the Commissioner’s Office to create an “Information Commission”. It repeats the call (which it rolls out every time it looks at anything to do with the ICO) for the Information Commissioner to be made directly responsible to and funded by Parliament. This is just as regularly rejected by Government, but it’s worth another shot.

Others have pointed out that successive Governments have failed to commence existing sections of the Criminal Justice and Immigration Act 2008 which introduced custodial sentences for data protection breaches. Some have suggested that bringing these into force would have been a better way to deal with the problems discovered by Leveson than the Royal Charter. The Committee calls for the sections to be brought into force.

Similarly, Government has the power to bring in regulations allowing the ICO to carry out compulsory audits of parts of the public sector. This power hasn’t been used much, and the Committee suggests that it should be now to allow the ICO to go into councils and hospitals when there appears to be a problem.

So overall, the ICO will be happy with this report. Let’s hope the Ministry of Justice take note and enact at least some of these recommendations, as otherwise, we’ll be back in the wild west. And I’m rubbish at riding horses.




FOI Man reports on the ICO’s new Code of Practice on anonymisation.

FOI Officers tend to be caught between a rock and a hard place on a pretty much continual basis. If it isn't navigating between the Scylla of senior management and the Charybdis of requester ire, then it's trying to balance the often competing demands of the Freedom of Information and Data Protection (DP) Acts.

So new guidance from the Information Commissioner on the important subject of anonymisation is very welcome. Though at over 100 pages, it may be a struggle for some FOI and DP Officers to find the time to read it between fielding requests and CMP notices. But, ever at your service, I attempt to extract the key points for you here.

The Code notes that the DPA does not require anonymisation to be completely risk-free – the role of the Code is to help organisations mitigate the risks involved with anonymisation. Similarly, it points out that – in line with R (on the application of the Department of Health) v Information Commissioner [2011] EWHC 1430 (Admin) – anonymised information ceases to be personal data. So if your data is truly anonymised, section 40 of FOI won't apply to it, and the sort of large datasets that that nice Mr Maude likes Government departments to publish can be unleashed without concern.

But that’s the trick. We’ve got to be very careful that what we put out there is truly anonymised. The Code summarises the problems with that neatly – firstly, there are a number of ways that an individual could be identified, so just taking a name out may not be enough. And secondly, we have no way of knowing what information you folks out there might already have access to.

There are well documented examples of how individuals have been identified from supposedly anonymised datasets once put together with information available on the internet or with personal knowledge. The ICO point out that organisations aren’t omniscient – they can’t know for sure what is, and what will be, available to people. So what do they say about how FOI and DP Officers should reach the judgment as to whether or not it is safe to disclose an anonymised dataset?

Effectively – and I hate to throw a buzz word at you – it's a risk assessment. They cite a Tribunal concept of the "motivated intruder". Basically this is someone who will do anything short of committing a crime to identify individuals where there is some motive, eg the information is newsworthy, of interest to the village gossip, or perhaps politically sensitive. We need to consider whether someone like that could identify people using libraries, archives, the internet or social media. In other words, we're talking about those people you sometimes see on TV tracking down heirs to an inheritance. Or the producers of Who Do You Think You Are. Could they identify individuals from the data?
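To make the motivated intruder concrete, here is a minimal Python sketch of the sort of linkage attack the Code is worried about. Everything in it – the dataset, the "public knowledge", the names, and the k-anonymity measure – is my own illustrative assumption, not anything taken from the Code: it simply shows why removing names is not enough when quasi-identifiers (postcode district, birth year) remain and can be matched against outside information.

```python
from collections import Counter

# Hypothetical "anonymised" dataset: names removed, but
# quasi-identifiers (postcode district, birth year) remain.
released = [
    {"postcode": "SW1A", "birth_year": 1975, "condition": "asthma"},
    {"postcode": "SW1A", "birth_year": 1975, "condition": "diabetes"},
    {"postcode": "GU32", "birth_year": 1982, "condition": "hypertension"},
]

# What a motivated intruder might glean from public sources
# (electoral roll, social media): a name plus the same quasi-identifiers.
public_knowledge = [
    {"name": "A. Smith", "postcode": "GU32", "birth_year": 1982},
    {"name": "B. Jones", "postcode": "SW1A", "birth_year": 1975},
]

def quasi_id(record):
    """The combination of attributes an intruder could match on."""
    return (record["postcode"], record["birth_year"])

def k_anonymity(dataset):
    """Smallest number of records sharing any one quasi-identifier
    combination. k = 1 means at least one record is unique in the
    dataset, and so at high risk of re-identification."""
    counts = Counter(quasi_id(r) for r in dataset)
    return min(counts.values())

def linkage_matches(dataset, known_person):
    """Released records consistent with what the intruder knows."""
    return [r for r in dataset if quasi_id(r) == quasi_id(known_person)]

# A. Smith matches exactly one record: their condition is exposed.
print(len(linkage_matches(released, public_knowledge[0])))  # 1
# B. Jones matches two records: the intruder can't tell which is theirs.
print(len(linkage_matches(released, public_knowledge[1])))  # 2
print(k_anonymity(released))  # 1: the GU32 record is unique
```

A real assessment along the Code's lines would ask the same question at scale: for how many records would a determined searcher, armed with realistically available sources, get a unique match?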

Of course, this is better than nothing, but it still relies on FOI and DP Officers or their colleagues to have the time to work out whether someone could be identified from all of these sources. If they haven’t got that time, then there is a risk that the Code just leaves us where we started – with authorities reluctant to release information for fear of individuals being identified.

Thankfully the ICO do recognise the difficulty of this with large datasets – the desire for publication of which is pretty much what prompted this Code. They say:

“It will often be acceptable [with larger datasets] to make a more general assessment of the risk of prior knowledge leading to identification, for at least some of the individuals recorded in the information and then make a global decision about the information.”

But it still means that many FOI and DP Officers will be left feeling uncomfortable whenever considering disclosure of anonymised datasets. Have I checked enough sources? What if I’d tried that other search engine? Should I subscribe to that genealogy site to check what someone could find there? It’s difficult to see what else the ICO could have advised, but FOI Officers will take limited comfort from the Code on this point.

There is some useful practical advice in the Code such as the best ways to present personal and spatial data (eg in crime maps). The case studies that form the last half of the publication will be helpful as well.

Overall, the Code is a useful guide to the issue of anonymisation for FOI and DP Officers and anyone working with datasets containing personal data. But it won’t be the last word and it will be interesting to see what comes out of the new UK Anonymisation Network announced yesterday by the Information Commissioner.