OECD report highlights ethical risks around use of algorithms

By Colin Marrs on 18/02/2019 | Updated on 24/09/2020
Algorithms can be used to support citizens – or to monitor and coerce them (Image courtesy: bluelightpictures/Pixabay).

Governments around the world are experimenting with the use of data-driven algorithms to develop policy, but the technique raises new ethical risks and challenges, according to a new report on innovation by an OECD body.

The conclusion is included in the Innovation in Government Global Trends 2019 report, released by the Observatory of Public Sector Innovation (OPSI) last week. The report highlights a number of countries where data scores are being used to classify citizens and make decisions based on these groupings.

“Innovative and sometimes controversial efforts are underway in many countries to make human characteristics and decisions machine-readable, enabling them to be analysed by automated decision algorithms,” says the report.

Use cases emerging

These techniques have long been used by financial institutions to assess the creditworthiness of potential customers, the report notes, but they’re now gaining traction within the public sector. In one Swiss pilot initiative, it says, data on individual refugees is being analysed by algorithms in a system designed to place them in neighbourhoods where they might have the best chance of integrating successfully.

According to a desktop study, this algorithm could increase employment outcomes by 40-70% on average, compared to the status quo. The Swiss government is now piloting the algorithm to ascertain whether it works in practice.
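The Swiss system's internals are not described in the report, but the basic idea of matching people to locations using predicted outcomes can be illustrated with a toy sketch. Everything here is invented for illustration: the names, the predicted employment probabilities and the neighbourhood capacities are hypothetical, and a real system would optimise the assignment globally rather than greedily.

```python
def assign_placements(pred_prob, capacity):
    """Greedily assign each person to the remaining location where their
    predicted employment probability is highest.

    pred_prob: dict mapping person -> {location: predicted probability}
    capacity:  dict mapping location -> number of open places
    Returns a dict mapping person -> chosen location.
    """
    remaining = dict(capacity)
    assignment = {}
    # Process people in a fixed order; a production system would solve
    # this as a global assignment problem rather than greedily.
    for person, probs in pred_prob.items():
        open_locs = [loc for loc in probs if remaining.get(loc, 0) > 0]
        if not open_locs:
            continue  # no capacity left in any eligible location
        best = max(open_locs, key=lambda loc: probs[loc])
        assignment[person] = best
        remaining[best] -= 1
    return assignment

# Toy example: two people, two neighbourhoods with one place each.
preds = {
    "A": {"zurich": 0.7, "geneva": 0.4},
    "B": {"zurich": 0.6, "geneva": 0.5},
}
caps = {"zurich": 1, "geneva": 1}
print(assign_placements(preds, caps))  # {'A': 'zurich', 'B': 'geneva'}
```

Note how the capacity constraint forces a trade-off: person B would do slightly better in Zurich, but the place has gone to person A, which is exactly the kind of allocation decision the algorithm automates.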

At the local level, in the UK the London Borough of Hackney is developing a predictive risk model which brings together data from multiple agencies to identify the children most at risk, according to the OPSI report.

And in Australia, the state Government of Queensland is using machine-learning algorithms to map and classify land use features, aiming to improve its planning of natural resource and agricultural land use, and its monitoring of disease vectors.

“By using technology to make land features machine readable, governments and scientists can make much faster decisions about how to use land, much earlier than was previously possible,” the report says.

Ethical risks and misuse

However, the use of these technologies raises questions over how decisions are being made. At last year’s Global Government Summit, speakers raised concerns over the use in public service delivery of algorithms so complex that public sector staff can’t see how individual decisions are being made. In a “system beyond human cognition,” said UK civil service chief executive John Manzoni, “who’s responsible for that algorithm?”

In some countries, these technologies are being used in ways which dramatically increase the government’s power over citizens. The report points to China’s ‘social credit’ system, which is used to punish or reward citizens based on rankings which can be affected by actions such as posting news deemed ‘fake’ by government, buying large numbers of video games, and poor driving.

OPSI says that around nine million Chinese citizens have already been banned from buying domestic plane tickets, while some parents have seen their children denied places at the best schools. According to the report: “Such profiles and algorithms raise a number of ethical questions for governments.”

A lot to learn

Some of the detrimental approaches resulting from the use of algorithms to make decisions are only starting to come to light, the report says. “As the technology and capabilities of these types of algorithms are only going to become more advanced, governments will need to understand how these approaches work and be able to consider the underlying ethics, in order to determine if and how they can be used to improve public services – as well as understand and react to their use in other sectors.”

The report finds that if these ethical issues can be overcome, algorithms could be extremely useful tools. However, governments should exercise caution when experimenting with emerging technologies, which carry a greater risk of failure than tried-and-tested methods, it says.

Although some researchers argue that governments have a crucial role to play in funding early-stage research into new technologies, the report notes: “One could argue that governments should take a ‘wait and see’ approach, and let the private sector ascertain which applications of emerging tech work well and which do not, instead of pursuing risky ventures with public funds.”

About Colin Marrs

Colin is a journalist and editor with long experience in the government and built environment sectors. He cut his teeth in local newspaper journalism before moving to Inside Housing in 1999. He has worked in a variety of roles for built environment titles including Planning, Regeneration & Renewal and Property Week. After a spell at advertising industry bible Campaign magazine, he became a freelancer in 2010. Since then he has edited PublicTechnology.net and the local government finance publication Room151.co.uk, contributed news and features to Civil Service World, Architects’ Journal, Social Housing and management titles, and written white papers for major corporate and public sector clients.
