Card Benefit Question

A question within the pre-approval flow to help us understand our customers’ interests

 
 
 

Role

Lead UX/UI Designer

Responsibilities

Find alignment amongst multiple stakeholders, perform competitor analysis, run and synthesize results from unmoderated user tests, design high-fidelity mockups, create prototypes, and hand off spec docs for development

Duration

One year

Note:

Cannot post certain mocks and pictures due to Capital One policy.

Users & Audience

All customers who go through the pre-approval process at Capital One.

Context

When I first joined Capital One, the pre-approval team had already tested a version of the card benefit question. It was originally requested by the marketing team, who wanted more data on our customers so that they could send relevant marketing material.

Goal

Measure the impact of the card benefit question on completion rate, with the goal of using this data to inform risk assessment and personalization in the future.

Solution

Allow customers to select multiple benefits that are most important to them, so we can eventually use this information to call out differentiators between Capital One credit cards and influence the hierarchy of offers shown, making decision-making easier.

Scope & Constraints

Unfortunately, we couldn’t immediately use the data from this question to inform recommendation logic, offer sorting, and the like, because another team handles feeding this data into the systems that power those features.


01 | understand the why

When I first joined Capital One, the pre-approval team had already tested out a version of the card benefit question. The question asked “What’s most important to you?” and had “Rewards,” “Low Interest,” and “Not sure yet” as answer choices.

With a bit of digging, I realized that the question wasn’t solving any user problem. It had originally been requested by the marketing team, who wanted more data on our customers so they could send relevant marketing material. Not ideal.

I asked my design team lead for a list of stakeholders and set up a Zoom meeting to get clarity on what they hoped to learn from this question within the pre-approval flow. I learned that our stakeholders eventually wanted to use the question to help assess risk before pre-approving someone for a credit card. Because pre-approval had so many stakeholders, the conversation came with a lot of swirl: no one could agree on terminology that would capture all the nuances they wanted to convey, and the user was getting lost. To bring the focus back to the user, I recommended an unmoderated qualitative test to understand how users interpret the card benefit question and to surface any potential user problems.

02 | unmoderated user testing

I recruited a content designer, Kate O’Toole, to help me refine the content of the question and figure out what research questions to ask. The research objective was to understand users’ expectations of and sentiment around the card benefit question, so we could increase the form submit rate and use the question to (1) display relevant, personalized offers and (2) inform credit decisioning.

Research questions:

  1. What do users think of the card benefit question?

  2. What are users’ reactions to seeing the name question after the card benefit question? Why might a user drop off after the card benefit question?

  3. What kind of information do users think is required to evaluate pre-approval status?

  4. Do users recall being asked about their preferences in the beginning when they see the offer page?

  5. What questions do users expect to be asked when we’re trying to determine the right card?

Through user testing, we learned these key findings:

  • 3/9 participants experienced friction after the card benefit question because they thought the carousel, which showed different cards, was showing their pre-approved card offers.

  • 6/9 participants thought the question would impact their offer in some way (by getting a recommendation, narrowing their results, etc.), even though we weren’t yet using the question to influence the offer page.

  • Participants could easily decide between rewards and low interest, but they exerted some mental effort weighing the two against each other. They also expressed interest in other benefits, such as no annual fee.

  • 7/9 participants expected to see a question like the card benefit question when filling out a pre-approval form. Those who didn’t expect it still liked having it there, noting that they usually don’t get to choose.

03 | socialize recommendations to stakeholders

Based on our research, I created a presentation deck for our stakeholders that included the findings, recommendations, and plans for iterative testing. I’ve learned through many conversations with stakeholders that they feel more confident in our design strategy when they can visually understand what we’re describing and see an actionable plan for getting to an ideal end state. There was also more buy-in amongst stakeholders when we approached testing iteratively and cautiously. Once the deck was presented, the iterative phases were translated into Jira tickets. Here was my recommended phased approach to making iterative changes:

  1. Phase 1: update content and design to address the user problems discovered in qualitative testing

  2. Phase 2: test the impact of adding more answer choices, such as new card member offer, no annual fee, travel rewards, and cash back

  3. Phase 3: test single-select versus multi-select questions

04 | hi-fi designs, build, and test in-market

Using Figma and our design system, I created all the high-fidelity mocks and spec docs for development, incorporating the findings from our previous user research. The specs were handed off to developers and built out to be tested in-market.

05 | results

In the final test, my recommended design was the winning variant, resulting in:

  • 1.7% statistically significant increase in incremental pre-approval submits (291.9k)

  • 2.1% statistically significant increase in incremental new accounts booked (~70.9k)

  • 1.9% statistically significant increase in incremental present value ($25.5 million)

Lessons Learned

Customers like to feel they have a say in what kind of card they’re being pre-approved for. They don’t like being forced to pick the single benefit that matters most to them, because it feels like we don’t have the complete picture of what they’re looking for in a card. Having a “fun” question like the card benefit question leads to higher engagement with the form.

Next Steps

We want to use the answers to recommend certain cards or call out differentiators, so that customers can make decisions more easily and have a clearer understanding of our different cards.