Regular Users Could Detect AI Bias Before Deployment: New Research

UK researchers propose participatory AI auditing, a model where everyday users help detect bias and risks in AI systems before deployment, adding a human layer to machine accountability.

Researchers from multiple UK universities have introduced a framework that enables people without technical expertise to help identify bias and potential harm in artificial intelligence systems during early development stages.

The study, presented at the ACM CHI 2026 conference, outlines a concept called participatory AI auditing, which involves everyday users in evaluating AI applications before deployment.

The approach aims to improve the fairness and reliability of automated decision-making systems by incorporating social and ethical perspectives often missed in traditional audits.

The research suggests that while non-experts may lack technical knowledge, they can effectively identify risks and unintended consequences. However, they need structured guidance and tools to determine how those risks should be measured.

Professor Simone Stumpf, of the University of Glasgow’s School of Computing Science and the project’s lead investigator, said governments, financial institutions, and private-sector organizations increasingly rely on AI to make decisions, and that the use of such applications is likely to expand.

She said the research aims to provide a systematic framework and tools to help people without AI expertise use their lived experience to identify and report harms through participatory audits. 

The findings are based on co-design workshops with 17 participants who had no AI background, including patient representatives, teachers, and parents. They were asked to audit two AI applications: SPARRA (Scottish Patients at Risk of Readmission and Admission), used by NHS Scotland to predict which patients are likely to require hospital treatment, and the School Attachment Monitor (SAM), a prototype developed at the University of Glasgow to analyze children’s speech and assess caregiver relationships.

Participants were asked to identify potential impacts of each system, determine how those impacts could be measured, and propose how audit tools should function. All of them said that people affected by AI applications should be involved throughout development, from the design stage onward.
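
The article does not describe the researchers’ actual tooling, so the Python sketch below is purely illustrative: it imagines how an audit flow covering those three tasks might be structured in software. Every class name, prompt, and suggested measurement is a hypothetical assumption, not anything drawn from the study.

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Hypothetical sketch of a participatory audit flow. The structure and
# prompts are illustrative assumptions, not the study's actual tools.

@dataclass
class Impact:
    description: str                # the potential impact, in the auditor's own words
    affected_group: str             # who the auditor believes is affected
    measurement: str | None = None  # how the impact could be checked, once chosen

@dataclass
class AuditSession:
    application: str                # the AI system under review, e.g. "SPARRA"
    auditor_role: str               # lived-experience role, e.g. "patient representative"
    impacts: list[Impact] = field(default_factory=list)

    def add_impact(self, description: str, affected_group: str) -> Impact:
        """Task 1: record a potential impact of the system."""
        impact = Impact(description, affected_group)
        self.impacts.append(impact)
        return impact

    def suggest_measurements(self, impact: Impact) -> list[str]:
        """Task 2: offer step-by-step example measurements, since
        measurement is where non-experts need the most guidance."""
        return [
            f"Compare outcomes for '{impact.affected_group}' against other groups",
            "Ask people in the affected group whether the described harm occurs",
            "Track how often staff override the system's predictions",
        ]

# Example use: a patient representative auditing a readmission predictor.
session = AuditSession(application="SPARRA", auditor_role="patient representative")
impact = session.add_impact(
    description="Patients with fragmented records may be scored as lower risk",
    affected_group="patients with incomplete medical histories",
)
impact.measurement = session.suggest_measurements(impact)[0]  # Task 3: auditor picks one
```

The step-by-step suggestions in Task 2 reflect a finding reported below: participants responded well to prompts and examples when deciding how impacts should be measured.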

Dr. Eva Fringi, one of the study’s first authors, said participants easily identified potential problems, sometimes flagging issues that developers did not appear to have considered. She noted they found it more difficult to determine how impacts should be measured, but responded positively to step-by-step prompts and examples. 

Participants also emphasized the need for transparency in audit processes, including clear explanations of an application’s purpose and disclosure of who is conducting the review. They noted that audits should capture both positive and negative outcomes, and allow for results that are not strictly binary.
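
For illustration only, a report format reflecting those requirements might look like the sketch below; the field names and the graded outcome scale are hypothetical assumptions, not a format described in the paper.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical audit report structure; the fields and scale are
# illustrative assumptions, not taken from the study.

class Finding(Enum):
    """A non-binary outcome scale, since participants wanted results
    that are not strictly pass/fail."""
    CLEAR_BENEFIT = 2
    LIKELY_BENEFIT = 1
    UNCERTAIN = 0
    LIKELY_HARM = -1
    CLEAR_HARM = -2

@dataclass
class AuditReport:
    application_purpose: str      # plain-language explanation of what the system is for
    conducted_by: str             # disclosure of who carried out the review
    positive_outcomes: list[str]  # benefits the auditors observed or expect
    negative_outcomes: list[str]  # harms the auditors observed or expect
    overall_finding: Finding

# Example use, loosely modeled on the SPARRA audit described above.
report = AuditReport(
    application_purpose="Predict which patients may need hospital treatment",
    conducted_by="Panel of participants without AI expertise",
    positive_outcomes=["Earlier intervention for high-risk patients"],
    negative_outcomes=["Possible under-flagging of patients with sparse records"],
    overall_finding=Finding.UNCERTAIN,
)
```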

“What participatory auditing offers is a potential new selling point for AI: this application has been vetted by a diverse range of people right from the start, making it much safer and more reliable than software developed with a philosophy of ‘move fast and break things’,” said Dr Patrizia Di Campli San Vito, a co-author of the paper.

Researchers from the Universities of Sheffield, Stirling, Strathclyde, and York also contributed to the work. The study was funded by the Engineering and Physical Sciences Research Council through Responsible AI UK.

The paper is titled “Empowering Stakeholders with Participatory Auditing of Predictive AI: Perspectives from End-Users and Decision Subjects without AI Expertise.”
