This paper examines three key questions in participatory HCI: who initiates, directs, and benefits from user participation; in what forms it occurs; and how control is shared with users. It also addresses conceptual, ethical, and pragmatic challenges and suggests directions for future research.
This article explores how legal documents can be treated like software programs, using methods like software testing and mutation analysis to enhance AI-driven statutory analysis, aiding legal decision-making and error detection.
Led by the Digital Benefits Network in partnership with Public Policy Lab, the Digital Doorways research project amplifies the lived experiences of beneficiaries, providing new insights into how people experience digital identity processes and technology in public benefits. This executive summary gives an overview of the project’s findings.
This article explores how integrating behavioral science into public administration can improve government effectiveness, equity, and trust by redesigning public services with human behavior in mind.
This paper examines the challenges U.S. state and local digital service teams face in retaining talent and offers strategies to improve retention and team stability.
This article explores how AI and Rules as Code are turning law into automated systems, and how governance focused on transparency, explainability, and risk management can keep these digital legal frameworks reliable and fair.
This video documents the Digital Benefits Network's Digital Identity Community of Practice launch, covering mission review, 2025 goals, California authentication innovations, and peer networking for equitable and effective digital identity in public benefits.
The Digital Benefits Network's Digital Identity Community of Practice held a session to hear considerations from civil rights technologists and human-centered design practitioners on ways to ensure program security while simultaneously promoting equity, enabling accessibility, and minimizing bias.
Algorithmic impact assessments (AIAs) are an emergent form of accountability for organizations that build and deploy automated decision-support systems. This academic paper explores how to co-construct impacts that closely reflect harms, and emphasizes the need for input from diverse forms of expertise and from affected communities.
ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT)