I think breaches like this are the reason that there's been more of a focus on creating 'trusted research environments' lately, rather than actually transferring data out of the healthcare system.
There was a really interesting review recently by Prof. Ben Goldacre which touches on a lot of this stuff, I recommend skimming it: https://www.goldacrereview.org/
This is the perfect area for a vigilante to regulate the market
Most crimes are left unpunished, even more so corporate crimes, because it's usually very hard to prove one occurred, hard to demonstrate who is guilty, and costly to sue for little gain.
In fact for corporate crimes it's even worse, because there is little skin in the game: rare jail time, and fines that are only a fraction of the money earned by committing the crime, when there are any fines at all.
Make it 2 months of jail plus one day for each patient record breached, for everybody in the chain of command responsible, and I'd wager it will happen a lot less.
> If an organisation is non-compliant with their agreement, we work with them to address any problems and conduct follow up audits to ensure they are fully resolved.
This feels like the right response to me. In most of these cases, we're talking about a data provider with reasonable governance controls in place, who grants access to a requester who says they'll use the data responsibly, then just does not.
If the requester is part of a large research university, it doesn't make sense to say "researchers in Study A violated the data use agreement, therefore hundreds of other researchers in studies B-Z must now erase the data they've already downloaded, and never apply for access to more data from the largest research data provider in the country ever again." Those other studies had nothing to do with the violation, so shouldn't be punished.
The institution should punish the offending individuals, and the data provider should blacklist those individuals, as well as carefully audit both the institution (for its education and oversight of its research teams) and the principal investigators of the offending study for some length of time.
Perhaps the US needs national/federal GDPR/CCPA-style legislation?
> Should NHS Digital curtail their access?
Depends. Will curtailed access harm them or harm their patients?
Selective enforcement is a key method for... something. Some political thing. I forget the name.
What's disappointing about issues like this is I worked on a specific transfer of health information from a government health system to a university, and the attitude of the school researchers to privacy was practically obnoxious. They absolutely exploited the pandemic to squeeze the toothpaste out of the tube and get their hands on data sets that we actually have privacy-preserving technologies to mediate access to, but they were using the emergency powers to do a wholesale seizure of the data sets themselves.
Among the risks I specifically identified were that access to the patient data was the decision of a research ethics board whose decisions were not covered under privacy and access to information laws, and that the research organizations outright refused to allow their researchers to be identified individually, despite privacy laws that require all access to PHI to be by named individuals. The greater concern was that once the data was in the hands of the university, they had no way of formally separating clinical research from broader access by social scientists, or worse, administrators with similar agendas, and questioning the integrity of some of the truly demented individuals who inhabit those institutions is apparently just not done.
Government health information systems have rigorous privacy logs that admins check on a weekly and monthly basis to see if their staff are trawling through records of people they know, but the universities have no such controls, and their IT organizations are not enterprise quality. Modern tools like differential privacy, tokenization, data synthesis, and other techniques are absolutely sufficient to test hypotheses before working on production data, but their skillsets appeared to be more in navigating bureaucracy and political leverage, so they used the tools they had. I say obnoxious because there is a certain archetype of person who spits and sneers when they hear words like "privacy," and whose skillset often reduces to creating crises that always seem to have themselves in the middle, and it was well represented in the groups I dealt with. The NHS should take heed, as they are clearly being hustled.
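To make the differential-privacy point concrete, here's a minimal sketch of the classic Laplace mechanism applied to a counting query. This is an illustrative toy, not a production mechanism: the record fields, function name, and epsilon value are all invented for the example, and real deployments would use a vetted library rather than hand-rolled noise sampling.

```python
import math
import random

def dp_count(records, predicate, epsilon):
    """Differentially private count of records matching a predicate.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the count by at most 1), so adding Laplace noise with
    scale 1/epsilon yields epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    scale = 1.0 / epsilon
    # Sample Laplace(0, scale) via the inverse-CDF method
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Hypothetical example: researchers get a noisy count, never raw rows
patients = [{"age": a} for a in range(100)]
noisy = dp_count(patients, lambda r: r["age"] >= 50, epsilon=0.5)
```

The point of the sketch is that a researcher testing a hypothesis can query aggregates like this (or work against a synthetic data set) without ever holding the production records, which is exactly the access model the wholesale seizures bypassed.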