Angela Han



Earning spare money made easier with paid user testing
0 to 1   Shipped   Startup   Internship




Role:
Product Designer



Team:
1 Designer, 5 Engineers,
1 Product Manager



Users:
College students, stay-at-home parents, freelancers

Duration:
3 Weeks (2024)



Skills:
0 to 1, Interaction Design, User Testing, Iteration, Collaboration


Goals:
Increase user engagement and streamline the overall process


VisualCamp is a Series B startup specializing in eye-tracking software. We are developing a new service to help businesses gain insights and improve their products.

I designed Eyedid Survey, a new mobile app that lets users participate in paid user testing and redeem their earnings for gift cards.

This case study outlines how, as the sole designer, I crafted the experience end to end, from exploring study options to completing a test. In early testing, the app received 90% positive feedback and drove 238 beta user registrations.

Context

Get rewarded for helping with research.


In April 2024, we launched Eyedid, a web-based platform that helps researchers and designers set up eye-tracking studies and analyze the resulting data. However, we discovered that customers struggled to recruit participants and collect data.

To solve this, we built Eyedid Survey, a mobile app that offers paid user testing and encourages participation from people seeking easy ways to earn money.



[Fig. 1] Illustration of the steps before and after the app implementation. With Eyedid and Eyedid Survey, UX researchers can manage the eye-tracking research process more efficiently.


Highlight

We track your gaze, you track your earnings.


Unlike traditional survey apps that make you click and type endlessly, Eyedid Survey makes earning money easy. Simply pick a topic you're interested in, watch images or videos, answer a few quick questions, and see your balance grow. 

[Fig. 2] Just look at images or videos, answer a few questions, and get paid instantly.


Research Summary: User Interviews
What were competitors’ users saying?


Eyedid Survey was a brand-new app with no users or features, so our first challenge was understanding current industry practices. To ensure the product met real user needs, I worked closely with the product manager to develop interview questions focused on user pain points.

We interviewed seven experienced users who had earned over $10 using competitor research apps. Their feedback revealed gaps in existing apps and guided us in addressing user needs.

[Fig. 3] Three user pain points to address.


Research Summary: Competitive Analysis
Analyzing competitor solutions to address user pain points.


After identifying key issues, I compared how leading survey platforms addressed them. This analysis helped pinpoint potential features to incorporate and provided insights into industry trends and competitor strategies.

[Fig. 4] Survey platform feature comparison showing what works for our product.


Research Summary: Visual Audit
Evaluating study layouts across competitor platforms.


User interviews revealed frustration with buried study details. To understand effective solutions, I analyzed ten competitor app layouts. The audit identified three effective presentation patterns and showed that incentives (10/10), time (9/10), and topics (5/10) are the most commonly highlighted details.

[Fig. 5] Color-coded incentive, time, and topic to compare how they are displayed.


I've collected the insights; now it's time to focus on design.


Design Explorations 01: Points Display and Filter

Enhancing efficiency in discovering relevant studies and tracking earnings.


User interviews revealed frustration with finding studies and checking earnings. Since both issues impacted efficiency, I designed layouts that combined a persistent earnings display with streamlined study filtering. 

The goal was to balance clear earnings visibility with easy study discovery. Testing with six regular survey takers showed that the 'Filter-First' layout was most effective, with 5/6 preferring its visible earnings at the top and horizontal-scrolling filter chips.

[Fig. 6] Testing wireframes to balance easy earnings visibility and study discovery.


Design Explorations 02: Study Details
Enhancing study layout for better readability.


Drawing on the effective patterns I found in competitor apps, I designed three variations of the study details layout, focusing on how incentives are presented, since that's what users care about most. I then tested these layouts to measure how quickly users could identify and compare payment amounts across studies; five of six participants preferred Layout 3 because it made payments the fastest to compare.

[Fig. 7] Testing wireframes to improve scanning study details.


Anatomy

Designing an intuitive interface.


The three most important things a user should notice immediately on one screen are the points balance, filters, and available test options.

[Fig. 8] Displaying the main screen's essential features: points balance, filters, and test options.


Iteration

The need for more visible incentives.


Regular feedback is essential for improving design. Through usability testing and team reviews, I actively sought input to refine the experience. One key takeaway was the need for more visible incentives. Taking this feedback to heart, I adjusted the layout and information hierarchy to make the incentives more prominent.

[Fig. 9] Improved visibility of incentives based on user suggestions.


Solutions

Seamless paid user testing at your fingertips.


Users can track earnings, filter, and compare studies in one place without navigating between menus. They can filter studies by type, category, or compensation to find opportunities that suit their preferences. After selecting an opportunity, users proceed through the steps to agree to terms, view content, answer questions, and see their earnings displayed upon test completion.

[Fig. 10] Main Screen.


[Fig. 11] Filter screen.


[Fig. 12] Points balance screen.


[Fig. 13] User testing flow.


[Fig. 14] Completion screen.


Impact
Early testers are loving it!


I led this project from initial concept to final handoff, and my designs are now in development for a Q1 2025 launch. Of 21 early testers, 19 gave positive feedback, and the app has driven 238 beta user registrations so far. I can't wait to see these features help more people earn while contributing to research.

[Fig. 15] Positive feedback from early users.


Learnings
Things I wish I'd known on day one.


Share early and often (even the messy bits).
Sharing ideas and prototypes early helped me gather the feedback I needed to refine my designs and stay on track. It also improved communication and built trust along the way. Future me, don't overthink it. Just share.

Trust your gut, but keep those ears open.
Owning my projects helped me trust my design decisions while staying open to feedback. This balance made it easier to adapt quickly and pivot when faced with new requirements or user insights.

[Fig. 16] A quick glimpse of the promo site I designed to drive beta user registrations.