UK's AI strategy does not address equality concerns, regulator says

13 November 2023, Hi-network.com

The UK's Equality and Human Rights Commission (EHRC) has expressed concern that current proposals regarding the regulation of AI in the country are inadequate to protect human rights.

"Proposals to regulate Artificial Intelligence fall short of what's needed to tackle the risks to human rights and equality," the watchdog wrote in an open letter, noting that while responsible and ethical use of AI can bring many benefits, "we recognise that with increased use of AI comes an increased risk of existing discrimination being exacerbated by algorithmic biases."

The EHRC said it was undertaking research with local authorities to understand how they are deploying the technology and had published guidance for public bodies, specifically addressing the Public Sector Equality Duty (PSED). The PSED is a duty on public authorities to consider how their policies or decisions affect people who are protected under the Equality Act.

While the regulatory body said it had found some good practice in how local authorities consider equality when purchasing and using AI, it noted that there is more that can be done to ensure the AI systems being rolled out do not lead to discrimination.

"We have also found a lack of transparency in how AI systems work, limiting the ability of local authorities to consider the equality impact of such technologies," the EHRC said.

How is the UK government proposing to regulate AI?

In a white paper outlining its policy on AI regulation, the UK government said it was opting not to give responsibility for AI governance to a new single regulator, instead calling on existing regulators such as the Health and Safety Executive, Equality and Human Rights Commission, and Competition and Markets Authority to come up with their own approaches that best suit the way AI is being used in their sectors.

In the coming months, regulators are expected to start issuing practical guidance to organizations, handing out risk assessment templates and setting out how to implement the government's principles of safety, security and robustness; transparency and explainability; fairness; accountability and governance; and contestability and redress.

The EHRC said that while it is committed to promoting responsible AI innovation, the government needs to ensure that the regulatory bodies tasked with supporting the strategy are provided with sufficient funding and resources to ensure they can fulfil their roles effectively.

"If any new technology is to bring innovation while keeping us safe, it needs careful oversight. This includes oversight to ensure that AI does not worsen existing biases in society or lead to new discrimination," said Baroness Kishwer Falkner, Chairwoman of the EHRC, in comments posted alongside the open letter.

"To rise to this challenge, we need to boost our capability and scale up our operation as a regulator of equality and human rights. We cannot do that without government funding," she said.

Threat of discrimination is greatest AI risk

The EHRC's expression of concern comes weeks after Margrethe Vestager, the European Commissioner for Competition, argued that AI-fuelled discrimination poses a greater risk to society than the prospect of human extinction.

"Probably [the risk of extinction] may exist, but I think the likelihood is quite small. I think the AI risks are more that people will be discriminated [against], they will not be seen as who they are," Vestager said in an interview with the BBC earlier this month.

"If it's a bank using it to decide whether I can get a mortgage or not, or if it's social services on your municipality, then you want to make sure that you're not being discriminated [against] because of your gender or your colour or your postal code," she said.
