This work was done during one weekend by research workshop participants and does not represent the work of Apart Research.
Howard University AI Safety Summit & Policy Hackathon
November 21, 2024
Accepted at the Howard University AI Safety Summit & Policy Hackathon research sprint.

Implementing a Human-centered AI Assessment Framework (HAAF) for Equitable AI Development

Current AI development, concentrated in the Global North, creates measurable harms for billions worldwide. Healthcare AI systems provide suboptimal care in Global South contexts, facial recognition technologies misidentify non-white individuals (Birhane, 2022; Buolamwini & Gebru, 2018), and content moderation systems fail to understand cultural nuances (Sambasivan et al., 2021). With 14 of the 15 largest AI companies based in the US (Stash, 2024), affected communities lack meaningful opportunities to shape how these technologies are developed and deployed in their contexts. This memo proposes mandatory implementation of the Human-centered AI Assessment Framework (HAAF), which requires pre-deployment impact assessments, resourced community participation, and clear accountability mechanisms. Implementation requires $10M over 24 months, beginning with pilot programs at five organizations. Success metrics include increased AI adoption in underserved contexts, improved system performance across diverse populations, and meaningful transfer of decision-making power to affected communities. The framework's emphasis on building local capacity and ensuring fair compensation for community contributions offers a practical pathway to more equitable AI development. Early adopters will build trust with affected communities while developing more effective systems, delivering benefits for both industry and communities.

By Elise Racine
