Google’s DeepMind artificial intelligence research was dealt a minor setback Monday after U.K. regulators ruled that a hospital it partnered with on a medical diagnostic app had improperly handled patient data, and Google admitted it had moved too fast on the project.
The ruling, issued Monday by the Information Commissioner’s Office (ICO), found that a trial project between the Royal Free NHS Foundation Trust in London and DeepMind improperly handled patient data, failing to adequately inform patients how their data was being used. The ICO required the hospital to re-evaluate its data-handling policies and publish an audit of its decision-making process throughout the trial.
DeepMind and the Royal Free struck a deal in 2015 to let DeepMind access 1.6 million patient records in order to build an app that could help diagnose serious kidney issues, but the full extent of that deal was not made public until 2016. “Our investigation found that the Trust did carry out a privacy impact assessment, but only after Google DeepMind had already been given patient data. This is not how things should work,” the ICO wrote in a blog post.
For its part, Google’s DeepMind unit has not been accused by regulators of doing anything wrong, but the company admitted that patient privacy rights weren’t at the forefront of its thinking as the trial unfolded.
“We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole. We got that wrong, and we need to do better,” DeepMind wrote in a blog post.
The saga could make it harder for London-based DeepMind to find medical partners in the U.K., which could slow the progress of artificial intelligence research into health issues. Google said the Streams app developed as part of the research saved nurses several hours a day in processing test results and detected serious kidney issues faster than the traditional process would have allowed.
“No-one suggests that red tape should get in the way of progress,” the ICO wrote. “But when you’re setting out to test the clinical safety of a new service, remember that the rules are there for a reason.”