Tech-Enabled
Data Consistency
Role
UX Designer
Platform
Responsive Desktop App
Deliverables
Wireframes, Mockups, Prototype
Results
Automated data sanity checks.
New repair-shop-specific user logins for the centralized system.
Empowered repair vendors to upload their own reports.
Proactive alerts notify users of next steps or errors.
Overview
Bottlenecked Process for Monthly Uploads
The vehicle repair data management process at Access, the ADA Complementary Paratransit service for functionally disabled individuals in Los Angeles County, relied on manual input and verification by the fleet manager, who oversaw 3,300 vehicles. This approach resulted in inefficiencies, errors, and a significant time burden on the manager.
Despite doing similar maintenance and repairs on the same fleet of vehicles, each service vendor tracked work with different codes.
Before each upload to the fleet management system, the fleet manager was responsible for ensuring that every vendor report used the correct repair codes from Access' master list. This process consumed most of the fleet manager's time each month, and any discrepancy required following up with a shop technician at the service provider to confirm data accuracy.
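As a rough sketch of the translation work this involved (the shop names, codes, and descriptions below are hypothetical, not Access' actual data), each vendor's shorthand had to be mapped back to a single master code before upload, and anything the mapping didn't cover meant a phone call:

```python
# Hypothetical illustration of the manual code translation the fleet manager
# performed each month. All shop names and codes here are made up.

MASTER_CODES = {"BRK-01": "Brake pad replacement", "OIL-03": "Oil and filter change"}

# Each service vendor used its own shorthand for the same repairs.
VENDOR_CODE_MAP = {
    "Shop A": {"BRAKE-PAD": "BRK-01", "LOF": "OIL-03"},
    "Shop B": {"BP-R": "BRK-01", "OILCHG": "OIL-03"},
}

def translate(vendor: str, vendor_code: str):
    """Map a vendor's repair code to the master code, or None if unknown."""
    return VENDOR_CODE_MAP.get(vendor, {}).get(vendor_code)

# An unrecognized code is exactly the kind of record that forced a
# follow-up call to the shop technician.
print(translate("Shop A", "LOF"))     # -> "OIL-03"
print(translate("Shop B", "COOLFL"))  # -> None (needs follow-up)
```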
Design Solution
Vendor-Specific Logic
After interviewing the fleet manager to understand their business goals, process, and problems, I proposed a solution: a web portal.
CollectiveFleet would run the data sanity checks and send email notifications if anything went wrong. Shop technicians could also upload reports themselves instead of emailing them to the fleet manager.
So within the existing fleet maintenance system, a "web portal" was designed for the service vendors.
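A minimal sketch of the check-and-notify flow the portal handled, shown here in Python purely for illustration — the field names, validation rules, and email helper are my assumptions, since the real checks were configured inside Collective Data's product rather than hand-coded:

```python
import smtplib
from email.message import EmailMessage

# Hypothetical validation rules standing in for the portal's sanity checks.
MASTER_CODES = {"BRK-01", "OIL-03", "TIRE-02"}

def check_report_row(row: dict) -> list[str]:
    """Return a list of problems found in one uploaded repair record."""
    issues = []
    if row.get("repair_code") not in MASTER_CODES:
        issues.append(f"Unknown repair code: {row.get('repair_code')}")
    if not row.get("vehicle_id"):
        issues.append("Missing vehicle ID")
    if row.get("odometer", 0) <= 0:
        issues.append("Odometer reading missing or invalid")
    return issues

def notify(vendor_email: str, issues: list[str]) -> None:
    """Email the vendor a summary of anything that failed the checks."""
    msg = EmailMessage()
    msg["Subject"] = "Repair report upload: issues found"
    msg["From"] = "fleet-portal@example.com"   # placeholder address
    msg["To"] = vendor_email
    msg.set_content("Please correct the following before resubmitting:\n- "
                    + "\n- ".join(issues))
    with smtplib.SMTP("localhost") as smtp:    # assumes a local mail relay
        smtp.send_message(msg)
```

The point of the design was that this loop closes with the vendor directly, so flagged records no longer land on the fleet manager's desk by default.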
Constraints
Limited Proprietary Editors
All design decisions were constrained by the proprietary app editors from which all Collective Data applications are built. My design decisions were based on which elements the user needed to interact with on a particular view, what data that view referenced, and what logic could be implemented to aid the user.
This meant I needed to get creative to solve user problems when I couldn't implement a more obvious solution.
Challenges
Nervous Users
This project wasn't without its challenges. The two main roadblocks came from verifying the data sanity checks, both internally and with the client.
🐞 Q/A Testing
Given the scope of the project, it took around 2 weeks to write up the testing specs and verify them.
📝 Fleet Manager Sign-Off
The fleet manager was very busy and was hesitant to dedicate the time required to confirm the new functionality on their end.
To earn that sign-off, I partnered with them and walked them through the process over many meetings. Sometimes holding the customer's hand and guiding them through it is exactly what's required.
Measuring Success
Audit Logging Analytics
The system automatically logged actions such as user logins and database record creation. This let us quickly verify whether the service provider techs were logging in to the system, which ones were using the new workflow, and which had more or fewer data issues.
Our customer had KPIs to track user logins, which vendors had the most errors, and where those errors occurred most often, so they could communicate with the service providers and assist them.
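Roughly how those KPIs could be tallied from the audit log — the log format below is invented for illustration, since the real audit schema came with the product rather than from my design:

```python
from collections import Counter

# Invented audit-log entries; the real system recorded logins and
# record creation automatically.
audit_log = [
    {"vendor": "Shop A", "event": "login"},
    {"vendor": "Shop A", "event": "record_created"},
    {"vendor": "Shop B", "event": "login"},
    {"vendor": "Shop B", "event": "data_error"},
    {"vendor": "Shop B", "event": "data_error"},
]

logins = Counter(e["vendor"] for e in audit_log if e["event"] == "login")
errors = Counter(e["vendor"] for e in audit_log if e["event"] == "data_error")

# Who is adopting the new workflow, and who needs the most help?
for vendor in sorted(logins | errors):
    print(f"{vendor}: {logins[vendor]} logins, {errors[vendor]} data errors")
```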
Reflection and Growth
AI Does It Better
If I had this project to do all over again, I would bring AI/LLMs into the data verification process, specifically using Make's automation platform, Airtable, and ChatGPT.
Records would still be flagged, but an LLM could "best guess" the correct answer based on past data corrections, saving time for the users.
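A sketch of the "best guess" step I have in mind, written against the OpenAI Python SDK in place of the Make/Airtable plumbing — the prompt, model choice, and record fields are all assumptions, and a human would still confirm the suggestion:

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def suggest_correction(flagged_record: dict, past_corrections: list[dict]) -> str:
    """Ask the model for a best-guess master code based on past corrections.

    The record stays flagged for human review; the model only drafts a
    suggestion, so the fleet manager keeps final say.
    """
    examples = "\n".join(
        f"Vendor entry: {c['vendor_code']} -> Master code: {c['master_code']}"
        for c in past_corrections
    )
    prompt = (
        "Given these past corrections:\n"
        f"{examples}\n\n"
        "Suggest the most likely master repair code for: "
        f"{flagged_record['vendor_code']}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```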