By Sam Trendall
Research has found that the majority of citizens oppose the use of automation tools in making decisions in a number of areas of the public sector, including criminal justice, immigration and social support.
The Royal Society for the Encouragement of Arts, Manufactures and Commerce (RSA) commissioned a YouGov poll of 2,000 UK citizens, who were asked whether or not they were aware of the use of automated decision-making tools for a variety of sectors and use cases.
Large numbers of respondents cited in the report, ‘Artificial Intelligence: Real Public Engagement’, were not aware of the use of automation in many areas, with just nine per cent familiar with its use in the criminal justice system.
The figures were only slightly higher for immigration, at 14 per cent, healthcare, at 18 per cent, and social support, at 19 per cent.
The study also found a stark lack of public support for the use of automation in all these areas.
In criminal justice, just 12 per cent supported the use of automated decision-making tools, with 60 per cent opposing them.
For immigration, 16 per cent were supportive and 54 per cent opposed, while in healthcare, 20 per cent of respondents supported the use of the technology and 48 per cent opposed it.
The use of automation in decisions about social welfare had support levels of just 17 per cent, with 53 per cent opposed.
Respondents were asked to pick their two biggest concerns from a list of six.
The absence of empathy in making decisions that affect individuals and communities emerged as the greatest worry, cited by 61 per cent of respondents.
A lack of accountability in decision-making was picked by 31 per cent, ahead of a lack of oversight and regulation on 26 per cent, and the loss of jobs on 22 per cent.
Around 18 per cent said they believed artificial intelligence could reinforce the biases of decision-making systems, and 13 per cent felt there was a lack of clarity in how decisions are reached.
Just six per cent indicated that they have no concerns.
When asked how their support for the use of automation could be increased, 29 per cent said that nothing could do so. But 36 per cent said they would be more supportive if people had a right to request an explanation of how decisions were reached, while 33 per cent wanted to see punishments for companies that fail to comply with regulation on the monitoring and auditing of systems.
About one in four would feel better about the use of AI if there were common principles to guide organisations’ use of automated tools, while 24 per cent wanted governments and businesses to engage more with the public, and 20 per cent would be receptive to automation if it was only used “if it could be explained to someone with no technical expertise”.
In his foreword to the report, RSA chief executive Matthew Taylor said: “Currently, it can feel that the growing ubiquity and sophistication of AI is closely matched by growing public concern about its implications.
“On the one hand, unless the public feels informed and respected in shaping our technological future, the sense will grow that ordinary people have no agency – a sense that is a major driver in the appeal of populism.
“At worst, it could lead to a concerted backlash against those perceived to be exploiting technological change for their own narrow benefit.”