UK review urges mandatory transparency on use of algorithms in public sector

By Catherine Early on 01/12/2020 | Updated on 01/12/2020
A new report calls for public sector organisations in the UK to open up about how they use algorithms to make decisions. Credit: Jan Tinneberg/Unsplash

There should be a “mandatory transparency obligation” for UK public sector organisations that use algorithms to make decisions affecting people’s lives, an independent review has advised.

The Centre for Data Ethics and Innovation (CDEI), a panel that advises the UK government on artificial intelligence and data-driven technology, said public bodies should be required to publish information on how the decision to use an algorithm was made, the type of algorithm used, how it was used, and the steps taken to ensure fair treatment.

Transparency is needed to “build and maintain public trust”, the CDEI said in its final report on the risk of bias in algorithmic decision-making, commissioned by the UK government in 2018. The report is expected to kick off work with the Cabinet Office’s Government Digital Service to embed its principles into public sector operations.

Problematic perception

The organisation noted that there is significant growth both in the amount of data available and the use of algorithmic decision-making across many sectors. “There is a window of opportunity to ensure that we get this right as adoption starts to increase,” the report said.

When the review began, there was little public attention to or awareness of the use of algorithms, it said. But the issue has since “exploded into mainstream attention in the context of exam results”. This refers to the controversy in England this summer over the use of an algorithm to moderate A-level results: nearly 40% of teacher-assessed grades were downgraded, and the government was forced to overturn the results.

The episode has resulted in a “strong narrative that algorithms were inherently problematic”, the CDEI said.

But the problem is not just domestic. To date, the design and deployment of algorithmic tools has not been good enough, the report said. “There are numerous examples worldwide of the introduction of algorithms persisting or amplifying historical biases, or introducing new ones.”

Public sector accountability

In the UK, it proved difficult to find out what algorithmic systems the public sector was using and where, making it impossible to get a true sense of the scale of adoption of the technology or to understand the potential harms, risks and opportunities, the CDEI noted.

But organisational leaders need to be clear that they retain accountability for decisions made by their organisations, regardless of whether an algorithm or humans made them, the review stressed.

This extends to the use of third-party suppliers too. Decision-making using an algorithm often involves external suppliers, the report noted, but “ultimate accountability for good decision-making always sits with the public body”.

The report recommended that the Cabinet Office and the Crown Commercial Service “update model contracts and framework agreements for public sector procurement to incorporate a set of minimum standards around ethical use of AI, with particular focus on expected levels of transparency and explainability, and ongoing testing for fairness.”

Organisations should actively use data to identify and mitigate bias, make sure that they understand the capabilities and limitations of the tools they are using, and carefully consider how they will ensure that individuals are fairly treated, it said.

Government should also issue guidance that clarifies the application of UK equality legislation to algorithmic decision-making, including on the collection of data to measure bias and the lawfulness of bias mitigation techniques.

Opportunity tool

Although CDEI found strong evidence that digital tools could entrench previous human biases or introduce new ones, it was far less clear whether algorithmic decision-making tools carry more or less risk than previous human decision-making processes, according to Mark Durkee, review team leader at the CDEI.

There are good reasons to think that better use of data could have a role in making decisions fairer, if done with appropriate care, he wrote in a blog post.

“Data gives us a powerful weapon to see where bias is occurring and measure whether our efforts to combat it are effective; if an organisation has hard data about differences in how it treats people, it can build insight into what is driving those differences, and seek to address them,” he noted.

The CDEI will be working with the Government Digital Service on a pilot approach to algorithmic transparency in the UK public sector in the coming months, it said.

About Catherine Early

Catherine is a journalist and editor specialising in government policy and regulation. She writes predominantly about environmental issues and has held permanent roles at the Environmentalist (now known as Transform), the ENDS Report, Planning magazine and Windpower Monthly, and has also written for the Guardian, the Ecologist and China Dialogue. She was a finalist in the Guardian’s International Development Journalism competition 2009, and was part of the team that won PPA Business Magazine of the Year 2011 for Windpower Monthly. She also won an outstanding content award at Haymarket Media Group’s employee awards for data-led stories in Planning magazine. She holds a 2:1 honours degree in English language and literature from Birmingham University.
