Ethics expert Hugh Whittall says it is not apparent how collected data will be used
The government’s proposed Covid-19 contact-tracing app lacks clarity of purpose, a leading ethics expert has said.
Hugh Whittall, director of the Nuffield Council on Bioethics, made the comment during a House of Lords Science and Technology Committee hearing on the science of Covid-19, which examined the ethics of the government’s test-and-trace strategy.
“It is not evident that, on the one hand, there has been a gathering of understanding of what people might expect their data to be used for, and what kind of values and priorities they might have,” he told peers. “Nor… is it necessarily understood which data will be gathered, in which context, and how it might be combined.”
Whittall added that there was still “work to be done both in involving the public in the design, preparation and management of the systems, as well as in explaining how it will be played out and how those two things will be governed so as to secure correspondence between expectations and actions”.
His comments come after the government announced it had abandoned plans for the rollout of the NHSX app in favour of a model based on technology provided by Apple and Google.
Asked about the possible ethical implications of so-called “immunity passports”, Whittall said he could “absolutely see the prospect of discrimination, disadvantage and stigmatisation that might arise from that”, because people with positive antibody tests or a low-risk profile would gain access to employment or services that others would not. Moreover, he added, people who are already subject to structural disadvantages would be doubly affected.
“This is not just about the state introducing its own certification system,” he continued. “What might be even more concerning would be if we had private and commercial organisations applying effectively an informal verification system, whereby people needed to have a private test of immunity to gain access to private commercial spaces.
“It’s not a question of ‘should the government introduce this’ but how might the government think about this when other people might also introduce equivalent systems?”
Michael Veale, a lecturer in digital rights and regulation at University College London and a digital charter fellow at the Alan Turing Institute, said that legislation was lacking on protection against such abuses.
Commenting on the ethical implications around access to data, Veale warned: “There is a risk here that some academics in epidemiology see this as a chance to gather a data trove for analysis to have PhD students to work on for years and years in advance. But if that comes at the cost of public trust and success of these schemes, then we must push back against that.”
Whittall said it was important to keep in mind the foundations of the Data Protection Act, which is about the fair, lawful and transparent use of data.
“What we should not do is hoover up as much data as possible simply so that we can have more…[or] keep data longer than it is needed.”
To build public trust, he suggested that people be invited to participate in the design of data-gathering systems, so as to develop “a good understanding of what people expect from those systems and what they expect them to deliver”.
Veale added that the publishing of data-protection impact assessments was “critical”.
“We haven’t seen it happen and indeed it appears that NHS England has not carried out a data-protection impact assessment for its test and trace system,” he said.