Overview: Designing a portal for jurors to evaluate competition entries/submissions.
Client: UNI.xyz, a web platform for designers focused on crowd-sourcing design solutions and making design more accessible.
Timeline: Nov 2019 — Jan 2020
UNI.xyz is a web platform that organizes design challenges and hosts one of the largest competition catalogues. Although UNI had had an evaluation portal since the beginning, it was largely disconnected from the other products. So, as part of our effort to unify the platform, we needed to build a new portal (integrating the existing one posed technical constraints and compatibility issues). That gave me the chance to reflect on what an evaluation portal should be and to design accordingly.
For this update, I redesigned nearly the entire portal and stayed involved throughout development. Beyond design, I also took part in quality testing.
UNI invites pioneers of the industry to sit on the evaluation panel. These jurors graciously spend their time evaluating entries and providing quality feedback to participants. We therefore wanted the portal to be hassle-free, letting jurors spend their entire time evaluating rather than struggling with navigation.
In the initial stages of design, I gathered input in the form of the feedback, queries, and suggestions we had received from our jurors. These helped me create a preliminary list of items to introduce and items to get rid of. To understand the pain points better, I also ran a mock jury session in which I acted as a juror and evaluated about 30 sample entries.
The seemingly simple flow turned out to be hectic: there was no way for jurors to jump directly from one project to another. Repeating the process 30 times helped me spot the extra steps that needed to be cut and the features we needed to introduce to smooth the flow. We also wanted to reuse existing components as much as possible, so the portal was designed with that constraint in mind.
When it comes to evaluation, fairness becomes the prime consideration. Let's face it: it isn't always possible for jurors to set aside their biases while evaluating. So instead of leaving it entirely to them to score fairly, we needed a system in place. Every metric that could unfairly sway a juror's decision was hidden from the portal, including participants' personal information, public votes on their entries, and even comments.
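As an illustration, this anonymization can be thought of as stripping bias-prone fields from each entry before it reaches the juror's view. The sketch below is hypothetical; the field names and the `anonymize_entry` helper are my assumptions for illustration, not UNI.xyz's actual implementation:

```python
# Hypothetical fields that could bias a juror; these never reach the portal.
BIAS_PRONE_FIELDS = {"participant_name", "participant_email", "public_votes", "comments"}

def anonymize_entry(entry: dict) -> dict:
    """Return a copy of the entry with bias-prone fields removed."""
    return {key: value for key, value in entry.items() if key not in BIAS_PRONE_FIELDS}

# Example: a raw entry record vs. what the juror would see.
raw_entry = {
    "id": 101,
    "title": "Floating Pavilion",
    "participant_name": "Jane Doe",
    "public_votes": 240,
    "submission_url": "https://example.com/entry/101",
}
juror_view = anonymize_entry(raw_entry)
```

The point of filtering on the server side, rather than merely hiding fields in the UI, is that the bias-prone data never reaches the juror's browser at all.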
However, the biggest challenge in the project wasn't the design but the execution. Given the large number of competitions, at any given time there is at least one evaluation round scheduled on the platform, so it was hard to find a free window to test our changes live. Local testing wasn't enough: the portal was built from scratch, and conflicts when integrating with existing products were possible. We therefore decided to move some functionality from the MVP to a future version and launch with the bare essentials. This meant creating a new design for the MVP, but it was necessary at the time.
Although the update was a stripped-down version of the planned product, it received a fair response from the jurors. Queries dropped drastically and feedback was mostly positive. Impressed by the authenticity and fairness of the system, some jurors even went on to participate in other competitions themselves.
This project made me realize the importance of stepping into the user's shoes to understand a product's issues. The insight I gained from mimicking the entire evaluation process would not have come from simply sketching wireframes and running a sample test.