Coordinators are essential in ensuring that payments to clinical study participants and athletes are tracked and paid on time.
Vincent is a payment solution application that lets clinical study coordinators and university athletic coaches issue anonymous, reloadable stored value cards. It enables distributed payments while maintaining centralized control.
The app's design grew out of the need for a better way to pay clinical study participants. When the tool was later opened to other uses, like university athletic per-diems, additional discovery and design led to a whole new commercialized product.
Initial discovery included a heuristic evaluation of the system, as well as several workshops with Study Coordinators. This is a sample of the report, written to summarize the findings of both the review and the workshops.
The main issue uncovered by the discovery research was that the information architecture was flat: everything in the system was given equal weight, which created a frustrating navigation structure.
Working collaboratively with a few designers from another team, we whiteboarded solutions. With those concepts in hand, I generated wireframes to improve WePay.
... and then a Pivot! Several months after I had been moved to another project, I was brought back onto the team to commercialize the product for other use cases, primarily for Universities.
Aligning the new use cases to the system required more research with the Athletic Department at the University of Pittsburgh. I analyzed the use cases and the system architecture to ensure it would work for both study participants and Universities.
The main thing that needed to change to accommodate the new use case for Universities was the addition of “Groups”. Additionally, elements in the system were renamed to be more generic and less study participant-centric.
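To make the change concrete, here is a minimal sketch of how a “Groups” layer could sit on top of the existing card and member concepts. The names and fields are assumptions for illustration only, not Vincent's actual data model.

```typescript
// Hypothetical data model illustrating the "Groups" concept.
// All names and fields here are assumptions, not Vincent's real schema.

interface Member {
  id: string;
  displayName: string;      // generic label, no longer "participant"
  cardId: string | null;    // the anonymous, reloadable stored value card
}

interface Group {
  id: string;
  name: string;             // e.g. a study cohort or an athletic team roster
  coordinatorId: string;    // study coordinator or coach who issues cards
  members: Member[];
}

// A coordinator could load the same amount onto every card in a group,
// e.g. an athletic per-diem before travel.
function loadGroup(group: Group, amountCents: number): { cardId: string; amountCents: number }[] {
  return group.members
    .filter((m) => m.cardId !== null)
    .map((m) => ({ cardId: m.cardId as string, amountCents }));
}
```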
While frameworks and components are great for design patterns and visual elements, they don’t mean anything to the user. The core content makes up the parts of the system that make the most sense to users.
After ranking the content, the page hierarchy begins to come together based on what’s most important to users, and relationships between pages and content start to emerge.
I then began to explore the layout and design patterns of each section and map out the workflows. Each workflow was assessed over several days of whiteboard sessions with Product and Engineering.
In-person usability testing was performed across multiple roles in the system, over a few iterations, resulting in an SUS score of 82.5. One issue surfaced around adding a team: large rosters were difficult to enter into the system one-by-one.
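For reference, SUS is scored from ten 1–5 items: odd-numbered items contribute (rating − 1), even-numbered items contribute (5 − rating), and the sum is multiplied by 2.5. The snippet below shows that arithmetic with a made-up response set, not the actual test data.

```typescript
// Standard SUS (System Usability Scale) scoring: 10 items rated 1-5.
// Odd items contribute (rating - 1), even items contribute (5 - rating);
// the sum is multiplied by 2.5 to yield a 0-100 score.

function susScore(ratings: number[]): number {
  if (ratings.length !== 10) throw new Error("SUS requires exactly 10 ratings");
  const sum = ratings.reduce(
    (acc, rating, i) => acc + (i % 2 === 0 ? rating - 1 : 5 - rating),
    0
  );
  return sum * 2.5;
}

// One hypothetical participant's answers:
console.log(susScore([5, 2, 4, 1, 5, 2, 5, 1, 4, 2])); // 87.5
```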
For each iteration of testing we increased the fidelity of the design.
The high-fidelity design was finalized by creating a design system in Sketch. I established naming conventions based on the BEM framework, and the components were built in HTML and packaged for developers.
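As a quick illustration of the BEM (Block__Element--Modifier) convention, the sketch below builds class names the way the design system’s components might. The “card” block and its parts are hypothetical examples, not the real component names.

```typescript
// Minimal sketch of BEM-style (block__element--modifier) class naming.
// The "card" block and its parts are illustrative examples only.

function bem(block: string, element?: string, modifier?: string): string {
  let name = block;
  if (element) name += `__${element}`;
  if (modifier) name += `--${modifier}`;
  return name;
}

console.log(bem("card"));                        // "card"
console.log(bem("card", "balance"));             // "card__balance"
console.log(bem("card", "balance", "negative")); // "card__balance--negative"
```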