
FormFest 2024 Recap: Bringing SNAP Benefits to Scale

An event recap from one of FormFest 2024’s breakout sessions featuring speakers from the State of Maryland and the U.S. Office of Evaluation Sciences.

Author: Quaylin Dang
Published Date: Mar 24, 2025
Last Updated: Mar 25, 2025
Event graphic: Bringing SNAP Benefits to Scale, FormFest event recap, March 2025, with the Beeck Center and Code for America logos.

These FormFest 2024 sessions focused on the innovative approaches governments are using to bring their Supplemental Nutrition Assistance Program (SNAP) benefits programs to scale. Participants heard about the State of Maryland’s integrated benefits application and the Office of Evaluation Sciences’ (OES) lessons from adjusting the questions on the SNAP application.

Key Learnings from Maryland’s Integrated Benefits Pilot Application

Session graphic: headshots of Sahoko Yui (Code for America, Senior Qualitative Researcher), Jamie Renee Williams (Code for America, Staff UX Designer), Sheetul Agrawal (State of Maryland, Chief Product Officer, MD THINK), and Felicia Billingsley (Baltimore County Department of Social Services, Assistant Director).

Code for America’s Sahoko Yui and Jamie Renee Williams partnered with Maryland government officials Sheetul Agrawal and Felicia Billingsley to share their experiences creating a pilot application to improve user experiences when applying for state benefits. A study published in 2023 found that the Maryland benefits application took customers an average of 70 minutes to complete, making it one of the longest among the 12 states that participated in the research. The need to reduce the burden of this application was clear, prompting Maryland to team up with Code for America to create a pilot program. However, the team faced two major challenges: (1) the existing application was complex, unclear, and not user-friendly, and (2) the application was not optimized for mobile devices, making it inaccessible to households without a computer. To address these challenges, the team focused on creating an improved application that used mobile-first design, featured a transparent application process, and included clear calls to action written in simplified, trauma-informed language.

To complete this work, Yui and Williams stressed the importance of building relationships with stakeholders early and often. Before the project began, they held in-person workshops with stakeholders to listen carefully to client needs and cater to their specific circumstances. Throughout the three-month pilot, they engaged stakeholders, applicants, and case managers in their feedback process through Miro brainstorming sessions, EasyRetro feedback, weekly meetings, and additional in-person workshops. Using the feedback they collected, the team reduced the benefits application time to 18 minutes. They also improved the completion rate to 85%, which helped reduce the administrative burden on both sides of the application. Overall, these changes were instrumental in improving the application process for individuals and building trust between applicants, caseworkers, and the state of Maryland.

Evaluating Form Modifications at Scale to Encourage Benefits Application Submission

Session graphic: headshots of Stephanie Tepper (Office of Evaluation Sciences, Associate Evaluator) and Blair Read (Office of Evaluation Sciences, Portfolio Lead and Behavioral Scientist).

This session, led by Stephanie Tepper and Blair Read from the Office of Evaluation Sciences (OES), explored how randomized evaluations, or A/B testing, can help agencies design and conduct impact evaluations of forms. This approach involves randomly assigning individuals to two groups: one group receives the standard version of the form, while the second group receives the improved version. The collected data on user experiences from both groups is then compared. Tepper and Read argue that randomized evaluations enable organizations to determine whether a form improvement works at scale, because any differences between the groups can be attributed to the variations in the forms.
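
To illustrate the mechanics Tepper and Read described, here is a minimal sketch of a randomized evaluation in Python. The helper names (assign_condition, compare_submission_rates) and the submission counts are assumptions made up for this example, not OES’s tooling or data: applicants are randomly assigned to a standard or modified form, and the two submission rates are compared with a simple two-proportion z-test.

```python
import math
import random

def assign_condition(applicant_ids, seed=42):
    """Randomly assign each applicant to the standard or modified form."""
    rng = random.Random(seed)
    groups = {"standard": [], "modified": []}
    for applicant_id in applicant_ids:
        groups[rng.choice(["standard", "modified"])].append(applicant_id)
    return groups

def compare_submission_rates(submitted_a, total_a, submitted_b, total_b):
    """Two-proportion z-test on submission rates. Because assignment was
    random, a reliable difference can be attributed to the form variation
    rather than to who happened to receive each version."""
    p_a, p_b = submitted_a / total_a, submitted_b / total_b
    pooled = (submitted_a + submitted_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return p_a - p_b, (p_a - p_b) / se

groups = assign_condition(range(2000))
# In practice, outcomes would be collected for each group; suppose 860
# modified-form applicants and 850 standard-form applicants submitted.
diff, z = compare_submission_rates(
    860, len(groups["modified"]), 850, len(groups["standard"])
)
print(f"difference in submission rates: {diff:.3f}, z = {z:.2f}")
```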

As an example of this type of testing, OES walked participants through a case study on how changing the income reporting format on the SNAP application affected applicants with multiple jobs. The OES team noted that SNAP applicants with multiple jobs were statistically less likely to submit their applications. To simplify income reporting for this population, they tested a structured reporting condition in which applicants listed income for each job separately instead of totaling it in one field. Surprisingly, after completing the A/B test, the team found no significant difference between the submission rates for the structured reporting condition and the unstructured reporting condition. However, the testing did provide other valuable insights, such as the finding that applicants in one-person households, or those applying on mobile devices, were less likely to submit the SNAP application under the structured reporting condition. So while the outcome was unexpected, the randomized evaluation approach helped the team evaluate whether their form “improvement” actually improved the user experience. To replicate this testing in other organizations, Tepper and Read pointed to the importance of using baseline data to first identify the target population for a form improvement and then define the key question about the user experience. Being specific about what you are looking to improve for users, such as administrative burden, emotional experience, submission rates, or data accuracy, helps organizations better evaluate whether their form modifications work at scale.
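
A subgroup breakdown like the one that surfaced the one-person-household and mobile-device findings can be sketched in a few lines. The records below are randomly generated and the column names are assumptions, not the structure of OES’s actual dataset; the point is only to show how submission rates are compared by condition within each subgroup.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 2000

# Illustrative applicant-level records: assigned condition, a subgroup
# flag, and whether the application was submitted.
df = pd.DataFrame({
    "condition": rng.choice(["structured", "unstructured"], size=n),
    "one_person_household": rng.choice([True, False], size=n),
    "submitted": rng.binomial(1, 0.85, size=n),
})

# Mean submission rate by condition within each subgroup. A gap that shows
# up only inside a subgroup (e.g., one-person households) can surface even
# when the overall difference between conditions is negligible.
subgroup_rates = (
    df.groupby(["one_person_household", "condition"])["submitted"]
      .mean()
      .unstack("condition")
)
print(subgroup_rates)
```

Because assignment is random within every subgroup, these within-subgroup comparisons remain fair, though findings for subgroups that were not specified in advance are best treated as exploratory.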

Key Takeaways

  • Relationships are the key to success. Innovation doesn’t happen in a vacuum, so learning from others is essential for making good forms. Meeting with stakeholders early and often can help ensure that the improvements you implement positively impact form users. Collaborating with other organizations, whether by learning from their experiences or encouraging them to use your form, can also help ensure it remains effective.
  • Conduct usability testing and randomized evaluations to evaluate your form. Developing a pilot for testing can serve as a low-risk foundation to build trust with users and identify service delivery gaps. To test form improvements, pair usability testing with randomized evaluations to identify how people are engaging with updates and changes.
  • Be specific! Define who your users are and what needs and abilities they bring with them when attempting to fill out your form. Williams recommends using persona spectrums as a tool to easily build user profiles to better understand and anticipate user needs. Additionally, to effectively measure if an improvement works, identify its intended outcome and define the data that would indicate a positive impact.

Watch the session recording and more from FormFest 2024. 

Bringing SNAP Benefits to Scale

This session from FormFest 2024 focused on how governments are scaling their SNAP benefits programs, with Maryland’s improved integrated benefits application and the Office of Evaluation Sciences’ changes to questions on the SNAP application.

About FormFest

FormFest is a free virtual event showcasing governments working to make services accessible to everyone through online forms. Discover best practices and tools that are shaping the future of form design and service delivery.