Over the spring and summer of 2023, Baltimore Digital Services helped the City of Baltimore tackle improvements to its special event permitting process. We covered the problems and the solution we created in a previous post by developer Maggie Epps; now we're sharing the user research process and assets that shaped this iterative work.
1. Discovery
Before we built anything, we conducted qualitative and quantitative user research, led by Emily Ianacone, the City's Director of Government Transformation. We interviewed 5 event organizers and sent a survey to everyone who had applied for a permit in the past couple of years. While our goal was 100 survey responses, we received only 17. Getting high participation in surveys has been a challenge in other City projects as well.
Findings:
- The best part of the permitting process for organizers was the staff of the Department of Transportation (DOT) event permitting office. They're friendly, knowledgeable, and helpful, providing excellent customer service.
- Common pain points included lack of transparency into the process once an application was submitted, long review times, unclear document requirements, lack of visibility into which applications were approved or rejected, and the inability to submit online.
2. Observation
We shadowed permit technicians at DOT and Baltimore City Rec and Parks (BCRP). While DOT handles all "Special Event" permits, BCRP manages permits for some types of park events, and some special events held in parks require coordination with BCRP, which DOT facilitates. Event organizers don't always know which department handles which type of event permit, however, and may come to BCRP before DOT. That's why we decided to shadow permit staff at BCRP as well as DOT.
Shadowing meant sitting quietly and watching while permit technicians processed applications. We were silent observers when technicians interacted with clients; between clients, we asked technicians questions and learned more about their processes and software systems.
Findings:
- DOT's permitting software is a couple of decades old (it's being upgraded this year!) and leaves a lot to be desired in terms of functionality and user experience for permit technicians. BCRP uses a different system that does not integrate with DOT's, requiring duplicate data entry.
- The in-person application process can be incredibly time-intensive for event organizers: waiting in line, walking between buildings, and trips home and back to pick up missing documents. On top of all this, if you drove, you had to pay for parking and might have to step out of line to feed the meter or risk a ticket.
- A small number of event organizers still strongly prefer an in-person experience and told us they would not use an online version.
3. Usability testing
Once we launched the first version of the online form and updated website, we ran in-person and remote usability testing: a formative study aimed at identifying design inconsistencies and usability problem areas in the interface and content of our new web pages and form.
For remote user testing, we recruited participants who had submitted permit applications previously. For in-person testing, we wanted to reach people who had never applied for a permit before and were coming into the office for the first time to do so.
Our assumptions entering the study were that event organizers were confused about the requirements for a Special Event permit application and that the complicated form structure was causing users to submit incomplete applications, which was adding burden to DOT permit technicians.
Check out our testing scenarios in our notetaker's guide!
Findings:
- Overall, users liked the new web content and form design, but we could improve the content by providing better structure, making it less wordy, and using consistent terminology.
- Users liked the idea of the checklist for required documents, but the current implementation was confusing and took too many clicks.
Next Steps
After the usability study, we updated the web content and redesigned the checklist to be shorter, structured, and more user-friendly. We also changed the workflow to require that users complete a required information checklist before going on to the online application. The DOT team reported a decrease in incomplete applications afterwards, and we have consistently received positive user feedback and a Net Promoter Score (NPS) of around 8.
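For readers unfamiliar with the metric: NPS is derived from a single 0–10 "how likely are you to recommend" question, with the score computed as the percentage of promoters (9–10) minus the percentage of detractors (0–6). Here's a minimal sketch of that standard calculation; the ratings below are hypothetical, not our actual survey data:

```python
def net_promoter_score(ratings):
    """Standard NPS: percent promoters (9-10) minus percent detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical batch of 0-10 responses to the recommend question
sample = [10, 9, 9, 8, 10, 7, 9, 6, 10, 9]
print(net_promoter_score(sample))  # 60.0
```

Note that passives (7–8, like the 8 and 7 in the sample) count toward the total but neither add to nor subtract from the score.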
Even though we were done with this phase of user research, we wanted to continue measuring success and identify areas of improvement in an ongoing way. We created an online feedback survey with these questions, which users see after they submit an online application. This helps us ensure user feedback continues to drive future enhancements.
Assets
Our user research assets, such as interview guides and surveys, are available in this public GitHub repository.