OVO Tech Blog

Alex Morris
Author

Senior UX Designer @ Boost Power (OVO Energy Group)

Delighting customers by thinking beyond energy

OVO launched Boost, a Pay As You Go (PAYG) energy brand, in 2017. With PAYG, you need to keep your energy account topped up with credit to ‘keep the lights on’. Boost’s main product, ‘Smart PAYG+’, lets customers do this through a smartphone app linked to their smart meter.

PAYG customers are free to change suppliers whenever they want without paying exit fees. And they were leaving us in droves. The team had lots of ideas for improvements but as we’d never done in-depth customer research, it was impossible to judge if these ideas would solve real customer problems. We needed to take a step back to identify the problems PAYG users actually face. And then agree which problems to go after to improve customer retention.

Discovery phase – Thinking beyond energy

To better understand customers’ pain points, I reviewed customer feedback from social media and our leavers survey. Many complained about the cost of their energy, particularly in the winter, when they might fork out five times as much as in the summer.


Complaint on social media about energy costs in winter

The team had formed the following assumption to explain why people leave:

Customers think they’re paying more in the winter because we’ve put our prices up (we hadn’t) rather than understanding that they’re using more energy.

I was keen to use the research to help get to the bottom of this.

Lots of our customers are lower-income mums, and as most of the negative feedback was around energy costs, I was keen to explore:

  1. What challenges do these mums face managing a tight budget?
  2. Where does managing their energy costs sit in this broader context of managing their money?


To really understand our customers, we needed to understand the broader context

Our research process

We believed that interviewing PAYG users in their homes would help us build rapport, making them more comfortable talking about the sensitive subject of money. We’d also get a better sense of how they live their lives and what they really spend their money on, reducing the social desirability bias we might have picked up through interviews in a neutral location.

With a limited research budget, we decided to run a pilot study to help inform the interview guide for the more costly home visits. We ran online interviews through usertesting.com with PAYG users, experimenting with different questions as we went along.

At Boost, we strive to have a multidisciplinary team of problem solvers. It was therefore key that we involved different team members, from developers to copywriters, right from the beginning, to build a shared understanding around the customers’ problems.

We watched the usertesting.com interviews as a team and captured observations as we went, which we then grouped into pain points, behaviours, attitudes and understanding and goals.
Analysing the online interviews as a team

Through this activity, some key insights emerged.

These insights allowed us to create a more focussed discussion guide for the home visits.

Home visits

As we didn’t have any experience running interviews in people’s homes, we pulled in another researcher from the OVO group to help out. She trained us and joined for the first interview.
We travelled around the country interviewing financially struggling mums. This included our customers and competitors’ customers, so we could learn how they’d solved problems that our customers faced.

At OVO and Boost, we give a lot of thought to the participant experience when we’re conducting research (check out our OVO Research Lead’s post about this).
As we were interviewing single mums in their homes, we wanted to ensure they felt comfortable, so we made sure a female team member joined every session. Choosing the right number of people to attend was also challenging. We needed to balance the benefits of letting team members experience the interviews first-hand against the risk of overwhelming the interviewee. We had 3 people attend one interview, but it felt too awkward, so we stuck to 1 interviewer and 1 note taker for the remaining interviews. Despite these constraints, we managed to have different members of the development team join all but one interview.

Additional interviews

In parallel to running the home interviews, we had phone interviews with customers who’d downgraded from our main PAYG+ product. We stayed away from the topic of how they manage their money as we felt they were unlikely to open up as much over the phone. Instead, we focussed on why they were unhappy with the PAYG+ product.

Research analysis

Following the interviews, we regrouped as a team to analyse the interview transcripts (which I’d written up in-between interviews).


Developers sharing their observations from analysing the interview transcripts

We learnt that interviewees really struggled to save money generally and with no reserves, paying for energy in the winter was using most of their budget.

Some people had strategies like topping up more than they needed in the warmer months, building up a surplus, which would ease the pain in the winter. Not everyone was that savvy. We heard stories about others with no savings who had to choose between heating and eating.

Based on these insights, we came up with an alternative interpretation of the social media comments mentioned earlier.

Our original assumption: customers think they’re paying more in the winter because we’ve put our prices up (we hadn’t) rather than understanding that they’re using more energy.

Our new interpretation: customers are venting their frustration because they’re really struggling with the winter energy costs.

We agreed to tackle this and reframed the problem as:

HOW MIGHT WE...
Help customers save money in the warmer months so they can cope better with energy costs in the colder months

Design phase

Using behaviour change insights

Over the last half century, there’s been a growing body of psychology and behavioural economics research that’s helping us understand why we behave the way we do. To be successful, we’d need to help our customers change their behaviour (start saving for the winter) so it made sense to build our solutions on top of existing evidence of what works.

In our workshop the next day, I presented some of the relevant behaviour change research and then I facilitated a ‘design studio’. To avoid groupthink, where everyone’s thinking about solving the problem in one way, the design studio format allowed the team to generate a variety of ideas independently, receive critique from the group and then iterate on those ideas.


Team member sketching ideas during design studio


Team reviewing ideas from design studio

With the next round of interviews booked in, we had the opportunity to test these concepts so I quickly refined the most promising ideas into a basic prototype. We had 2 ways of helping people cope better with the additional winter energy costs:

1. Energy Savings Pot
You set a savings target and decide what percentage of each future energy top up you make should be transferred to a separate savings pot, which you’d be able to use in the winter.
Energy Savings Pot concept

2. Scheduled Top-ups
You see a breakdown of your predicted energy usage across the next 6 months. You can set up a scheduled top-up into your energy account, e.g. £25 a week. This would be more than you actually use in the warmer months, so credit builds up on your energy account, which you’d rely on in the winter months when your weekly payments on their own wouldn’t cover your energy usage. (There’s a rough sketch of both concepts’ arithmetic after these descriptions.)

Scheduled Top-ups concept
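
To make the mechanics of the two concepts concrete, here’s a minimal sketch in Python. The numbers and function names are made up for this post — they’re not real tariffs or our production code.

```python
# Illustrative sketch of the two concepts' arithmetic (made-up numbers, not real tariffs).

def savings_pot_split(top_up: float, savings_percent: float) -> tuple[float, float]:
    """Concept 1: divert a chosen percentage of each energy top-up into a separate savings pot."""
    to_pot = round(top_up * savings_percent / 100, 2)
    to_energy = round(top_up - to_pot, 2)
    return to_energy, to_pot

def scheduled_top_up_balance(weekly_top_up: float, weekly_usage: list[float]) -> list[float]:
    """Concept 2: a fixed weekly top-up builds surplus credit in warm weeks,
    which is drawn down in colder weeks when usage exceeds the payment."""
    balance, history = 0.0, []
    for usage in weekly_usage:
        balance += weekly_top_up - usage
        history.append(round(balance, 2))
    return history

# Example: a £30 top-up with 10% diverted -> £27 to energy, £3 to the pot.
print(savings_pot_split(30, 10))
# Example: £25/week against usage that rises towards winter -> surplus builds, then drains.
print(scheduled_top_up_balance(25, [15, 15, 20, 25, 35, 40]))
```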

Concept testing
We tested the concepts in one-to-one interviews with financially struggling mums held in our office. The idea of a separate savings pot got people really excited.

"I think people would be stupid if they didn’t take it up.... ‘It’s no different to saving for Christmas"

Financially struggling mum

In contrast, the scheduled payments concept got a mixed reaction. Some felt like they’d lose one of the key benefits of being PAYG — the ability to track how much energy they were using each week.

Something that we’d not previously considered was people’s maths literacy. Testers who struggled with maths found the percentages in the savings pot concept and the graphs in the scheduled payments concept confusing.

We also explored different reward mechanics to incentivise people to keep saving. Not surprisingly, the idea of rewarding people with a fixed bonus if they hit their target was also very positively received.

Iterating on the savings pot concept

Through further user testing and peer feedback, I refined how you contribute to your savings pot. I ditched the fixed percentage idea, which was overcomplicated and didn’t suit those who are less maths literate.

We still wanted to keep the incidental nature of contributing to your savings pot: if you’re topping up your energy account weekly, that’s a perfect opportunity to encourage you to top up your savings pot too.

With the next iteration, once you’ve chosen how much to top up your energy account, you’re asked to contribute to your savings pot. We added instant feedback, showing how close you are to reaching your savings target based on the amount you’ve chosen. And to reduce decision paralysis, we set a £20 default amount (spoiler alert: this came back to bite us when we launched).


Revised designs for savings pot concept

Going live

As summer was coming to a close, we knew we had to launch the most basic version of the savings pot straight away so people still had time to build up savings before winter. We launched the initiative, which we rebranded the Winter Wallet, in the last week of August. The key features of the Winter Wallet were:

  1. Choose a savings target, and if you hit it by the end of the savings period you’ll get a 5% bonus, e.g. a £7.50 bonus for hitting a £150 target. Applying the learnings from the concept testing, we converted the percentages into concrete amounts to make them easy to understand. (There’s a rough sketch of these rules after this list.)
  2. Add money to your Winter Wallet once you’ve decided how much you want to top up your energy account.
  3. You can’t access your savings until the end of the savings period (unless you called us). We decided on this because
    a) we didn’t have time to build the functionality needed
    b) we learnt from our interviews that our customers have self-control issues when it comes to saving money. Understandably, when managing a tight budget, it’s extremely challenging to save for the future when you’ve got more immediate needs to take care of. We believed that by adding this friction, we’d remove the temptation of taking out your savings too early and increase your chance of successfully saving for the winter, when you need it most. This approach demonstrates how we prioritised what users need over what they may say they want.
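
As a rough illustration of how those rules fit together, here’s a sketch of the bonus and lock-up logic. The function name and dates are hypothetical, not our production code.

```python
from datetime import date

BONUS_RATE = 0.05  # 5% bonus for hitting your savings target

def winter_wallet_payout(saved: float, target: float,
                         today: date, period_end: date) -> float:
    """Return the amount released at the end of the savings period.
    Funds stay locked until period_end; the bonus applies only if the target was hit."""
    if today < period_end:
        return 0.0  # savings are locked until the end of the period
    bonus = saved * BONUS_RATE if saved >= target else 0.0
    return round(saved + bonus, 2)

# Example from the post: hitting a £150 target earns a £7.50 bonus -> £157.50 released.
print(winter_wallet_payout(150, 150, date(2018, 12, 1), date(2018, 12, 1)))
```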

Measuring success
We’d hypothesised that helping customers save money for the winter using the Winter Wallet, would lead to happy customers who want to stay with us for longer.

Based on this hypothesis, we created 2 key performance indicators (KPIs) that we’d track and try to optimise: the proportion of customers opting in to the Winter Wallet, and the proportion of those customers going on to hit their savings target.

From tracking the opt-in rate in the first week, we could see that we were unlikely to hit our opt-in target, so we needed to take action quickly.

Putting an ear to the ground

We introduced a variety of feedback mechanisms to learn quickly what customers were doing and why.

The feedback mechanisms we introduced

Launching our on-site survey helped us learn whether customers who’d visited the Winter Wallet landing page intended to sign up to it in the future and, if not, what the barriers were. Creating a conversion funnel highlighted where people were dropping out of the sign-up flow.


Conversion funnel showing where people were dropping out
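
As an aside, a funnel like this is straightforward to compute once each step of the sign-up flow is being logged. Here’s a minimal sketch; the step names and counts are invented for illustration, not our real analytics.

```python
# Minimal funnel calculation; step names and counts are invented for illustration.
steps = [
    ("Saw homepage promo", 10_000),
    ("Visited Winter Wallet landing page", 2_400),
    ("Started sign-up", 900),
    ("Completed sign-up", 600),
]

total = steps[0][1]
previous = total
for name, count in steps:
    print(f"{name}: {count} "
          f"({count / total:.0%} of visitors, {count / previous:.0%} of previous step)")
    previous = count
```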

Optimisation phase

We ran a series of A/B experiments using Optimizely to optimise the numbers opting in and hitting their target.

Experiments to drive opt-ins

The data highlighted that the majority of people weren’t reaching the Winter Wallet landing page. What was going on here?

Were customers seeing the homepage promo but lacking the motivation to click through?

Returning to the behavioural economics literature allowed us to build on learnings from similar experiments. For example, an experiment run by the ‘Common Cents Lab’ found that using ‘claim your discount’ rather than ‘sign up for a discount’ successfully increased open rates and click-through rates in an email experiment. The authors reasoned that ‘claim’ signals a sense of ownership, which triggers feelings of loss aversion. Loss aversion is the tendency to prefer avoiding losses to acquiring equivalent gains. The word also triggers the scarcity bias (we unconsciously assume things that are scarce are more valuable than things that are abundant).
We tried changing the language to ‘Claim your 5% savings bonus...With our Winter Wallet’, which led to a massive 56% improvement in sign-up rates.


A/B test of different call-to-actions
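
The 56% figure is the relative uplift in sign-up rate between the two variants. As a rough way to sanity-check a result like this, here’s a sketch using a standard two-proportion z-test with made-up visitor and conversion counts — not Optimizely’s own stats engine or our real data.

```python
from math import sqrt
from statistics import NormalDist

def ab_uplift(visitors_a, conversions_a, visitors_b, conversions_b):
    """Relative uplift of variant B over A, with a two-proportion z-test p-value."""
    p_a, p_b = conversions_a / visitors_a, conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return (p_b - p_a) / p_a, p_value

# Made-up counts: control 5% sign-up rate vs variant 7.8% -> ~56% relative uplift.
uplift, p = ab_uplift(4000, 200, 4000, 312)
print(f"uplift: {uplift:.0%}, p-value: {p:.4f}")
```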

Were users even paying attention to the homepage promo?

We experimented with making the promo stand out more by changing its background colour. This created more contrast between the now-blue promo and the rest of the page, and resulted in a 22% improvement in sign-ups to the Winter Wallet.


A/B test of different colour promo

Or were they intrigued by the Winter Wallet promo but needed to focus on the more immediate goal of topping up their energy?

Hardly anyone was clicking the promos we had throughout the top-up flow, so we introduced a message once they had completed their top-up task, with an animation to draw extra attention to it.

This created a huge spike in sign-ups.

Effective Winter Wallet promo after user has topped up energy account

Too frictionless?
There’s a danger when you’re trying to optimise for a certain behaviour that you make the action too frictionless. Customers who sped through the top flow, repeatedly clicking on the Confirm button, accidentally added the default £20 contribution to their Winter Wallet. We learnt through feedback from the customer services team that this was driving calls from upset customers who couldn’t afford to lock away that money in their Winter Wallet.

Why didn’t we pick this up through the user testing we performed pre-launch?

In a test environment, it’s difficult to recreate real-life scenarios: someone desperate to top up their energy (possibly about to be disconnected), or someone for whom English is a second language. With user testing, these biases can creep in without you realising. For us, the fact that participants were recruited through usertesting.com meant their reading level was high enough to follow the on-screen instructions. Also, user testers tend to spend more time working through a prototype than they would in reality.

Fortunately, we had the right feedback mechanisms in place to quickly capture and address these issues. In this case, we removed any default so the user had to actively pick an option (including ‘Not now’ if they didn’t want to contribute anything to the Winter Wallet) before they could move on to the payment page.
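
In flow terms, the fix was to make the contribution step refuse to continue until the customer has actively selected an amount or ‘Not now’. Here’s a simplified sketch of that validation, with hypothetical option values rather than our real payment flow.

```python
from typing import Optional

# Simplified sketch of the contribution step after removing the default
# (hypothetical option values, not our production payment flow).
VALID_CHOICES = {"not_now", "5", "10", "20"}

def validate_wallet_contribution(choice: Optional[str]) -> float:
    """Return the chosen Winter Wallet contribution in pounds.
    Raises if the customer hasn't actively made a choice, so the flow
    can't proceed to the payment page with an accidental default."""
    if choice not in VALID_CHOICES:
        raise ValueError("Choose a contribution amount or 'Not now' before continuing.")
    return 0.0 if choice == "not_now" else float(choice)
```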

How did the Winter Wallet go down?

Nearly 25,000 Boost Smart PAYG+ customers opted in to the Winter Wallet, and they managed to save a whopping £1 million.

7,000 of these customers reached their ambitious targets and received their 5% bonus credit, meaning we credited those customers with a total of around £35,000 towards their energy bills this winter.

Remember we’d hypothesised that helping customers save money for the winter using the Winter Wallet, would lead to happy customers who want to stay with us for longer?

To ensure we had the right data to support or reject this hypothesis, we made sure we had a control condition of customers who weren’t exposed to the Winter Wallet. An analysis of the 2 conditions showed that you’re significantly more likely to stay with us if you were in the Winter Wallet condition.
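
For anyone wanting to run a similar comparison, one simple approach is a chi-squared test on a 2x2 table of stayed vs. left counts for each condition. The sketch below uses invented numbers; it isn’t the analysis our data team actually ran.

```python
# Sketch of comparing retention between the control and Winter Wallet conditions
# with a chi-squared test on a 2x2 contingency table. Counts are invented for
# illustration; this isn't the analysis our data team actually ran.
from scipy.stats import chi2_contingency

#                [stayed, left]
control       = [8_200, 1_800]
winter_wallet = [9_100,   900]

chi2, p_value, dof, expected = chi2_contingency([control, winter_wallet])
print(f"chi-squared = {chi2:.1f}, p = {p_value:.2g}")
```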

What have customers said about the Winter Wallet?
In addition to improving retention, our customers have been very positive about the Winter Wallet on social media, which should help attract new customers to the brand.

Customer feedback about the Winter Wallet on social media

Takeaways

There are a few things I hope you take away from this work.

  1. Discovery research done as a team allows you to explore the problem space together and build a shared understanding around customers’ latent needs. In our case, none of our customers asked for a way to save for winter. But by understanding their context, we were able to come up with a solution unique to the Pay As You Go energy market, which customers really valued.

  2. Don't stop when you've launched. Build feedback loops to get quick insights around what's happening and why. Then apply these learnings to optimisation experiments to drive improvements.

  3. Don't reinvent the wheel — if there's existing research out there around what works, try applying it to your context.

Want to know how we redesigned the scheme the following year to quadruple its impact? Check out our article about Winter Wallet 2019.
