OVO Tech Blog

Our journey navigating the technosphere




Our first experience of running a design sprint

This is a story about our first time running a GV-style design sprint. If you’ve never heard of one before, you can read about it here.

As you may already be aware, the energy industry as a whole is pretty complex, even for those of us who have worked in it for a while. This means it can be really tricky to keep things simple for customers while also providing full transparency — but both are really important to us.

A major goal for customers is understanding how their bill relates to the energy they’ve used, and we know that the charts and data we provide in their online account still require too much effort from the customer to figure this out.

We decided this was exactly the sort of problem that a design sprint could help us solve.

Monday — Jumping into the deep end.

We knew we wanted to use real-time smart meter data to provide a simple, easy-to-understand experience, so we started out with the ambitious but fairly vague goal of “make it simple for our customers to manage their energy account”.

After deciding on our main goal, we needed to break it down into questions we could try to answer in the sprint. After spending some time discussing what ‘make it simple’ meant to us, we came up with the following sprint questions:

Our sprint questions (the ones with ❓ and ✔️ were our targets for Friday’s testing)

After deciding on our sprint questions, we started trying to map the customer journey.

We found that the example map in the sprint book didn’t really work for us because the journey of using and paying for energy is cyclical, not linear, so we tried a different approach. We also decided that instead of a successful ‘end point’, we were aiming for a repeated positive outcome — that every time a customer tried to interpret their usage and cost, they would feel confident and in control.


Once we’d interviewed our experts, turned our assumptions into questions, and covered our walls in ‘How might we…?’ notes, we decided on the area of our customer journey map where we’d focus our efforts for the rest of the week: ‘helping users interpret how their usage/DD affects their balance’. This also fit perfectly with what we felt would be most impactful for the business.

Then we were all set for Tuesday.

Tuesday — Let’s Sketch.

We started the day with a quick round of lightning demos, which involved each team member showing the rest of the sprint team user interfaces and interactions that they found inspiring. This was a useful exercise to get creative thoughts flowing and help us consider how these designs and solutions could factor into our own sketches. A scribe captured the key components from each demo as we went along.

‘Notes’ from the lightning talks, showing our UI inspirations

Working alone using the four-step sketch process, we were able to quickly capture notes and start sketching our ideas on paper. Once we’d narrowed down to our best idea, we created a 3-step storyboard with annotations of how our sketches worked and how they would interact. The ideas in these solution sketches would become the foundation of our prototype to be tested on Friday.


Wednesday — Time to decide

We started out Wednesday morning pinning up all of the sketches from Tuesday and then spent a few minutes reviewing the individual solutions, taking notes on things we liked about them and sticking up any questions we had about each solution. This let each solution get an equal voice and reduced bias in selecting our next step.

Reviewing Tuesday’s sketches

After a busy morning of reviewing and voting on which sketches we thought would best help us answer our sprint questions, we separated our ‘winners’ from our ‘maybe laters’ and began working on our storyboard.

Dot voting highlighted which aspects of our sketches we thought were most useful

For us, this was the most challenging part of the week. Picking our ‘opening scene’, which was a customer receiving a mobile notification, was straightforward but everything after that point took a bit longer to iron out. It was easy to get bogged down in the detail of each screen, and with eight people in the room working on the same thing, it sometimes felt like design by committee.


If we were to do it again, we would map out how the user would move through the flow and what we’d expect them to do at each stage to get to the next step. That way, the whole team agrees on the high-level journey before the person making the prototype starts adding the details.

Thursday — Prototyping

We started Thursday by having a quick recap over the storyboard to make sure we were all aligned on what we’d be building. At this point, we still had a few decisions to make about roles and responsibilities for the day (Note: Another thing we would do differently is to actually make these decisions at the end of the day on Wednesday).

According to the Sprint plan, the roles we needed were Maker, Stitcher, Writer, Asset Collector, and Interviewer.

As our team’s skill sets varied widely, we found there was some overlap in roles. Our prototype was going to rely heavily on data, so we needed to make sure the numbers made sense. We made the most of the technical skill sets in the room, and the role of Asset Collector shifted from collecting images to creating data. So we didn’t follow the book exactly on this one, but we found a way that worked for us to get the job done.
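We didn’t share the actual numbers in this post, but as a purely illustrative sketch, creating plausible daily smart-meter readings for a prototype could look something like this (the function name, rates, and data shape are all assumptions, not our real implementation):

```python
import random
from datetime import date, timedelta

def generate_usage(start, days, base_kwh=8.0, unit_rate=0.15, standing=0.25):
    """Generate plausible daily usage/cost rows for a prototype screen."""
    random.seed(42)  # deterministic, so the prototype looks the same every run
    rows = []
    for i in range(days):
        day = start + timedelta(days=i)
        kwh = round(base_kwh * random.uniform(0.7, 1.3), 2)  # day-to-day variation
        cost = round(kwh * unit_rate + standing, 2)          # usage + standing charge
        rows.append({"date": day.isoformat(), "kwh": kwh, "cost": cost})
    return rows

sample = generate_usage(date(2019, 1, 1), 7)
```

Numbers like these just need to be internally consistent (usage, rate, and cost should add up), so that test participants don’t get distracted by data that feels wrong.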

By the end of a long shift on Thursday, we had a fully working prototype ready to be tested the next day.

Finishing the prototype Thursday evening 😎

Friday — We tested with 5 real humans

After a full week of defining the problem, interviewing experts, lightning demos, sketching sessions, LOTS of decisions, storyboarding and building, we were finally ready to put our prototype to the test.
For this, we had one person interviewing and the rest of the sprint team watching remotely in another room. This let the interviewer focus on walking the user through the prototype, making sure no bits were missed, while the team made notes and pulled out all the insight from each interview.

It only took a few interviews for trends to emerge, so we actually made a few tweaks to the prototype in between interviews to maximise the amount of learning we could take away from the testing.


Once we had finished all five interviews and captured hundreds of Post-Its of observations, we organised them by what was positive, negative, and neutral. This final piece of the process helped us distil the primary themes and decide which of our questions (the ones we wrote on Monday) had actually been answered, and which needed further investigation.

So what did we learn from the design sprint?

Firstly, we gained valuable insight into how customers think about their energy usage and costs. Our prototype definitely didn’t nail the solution, but it gave us a clear direction for further iteration.

Five days sounds like a lot of time, but it enabled us to focus in a way we hadn’t been able to previously, and to ensure that we had a shared understanding. This was key, because even as late as Tuesday, we discovered that some of us had different interpretations of the language we were using. For truly complex problems, the time investment is definitely worth it.

Finally, with the Sprint process itself, we learned where we can be flexible and where it’s important to stick to the plan. We definitely want to use this approach again for other problem scenarios, but we also can see how to adapt it to two or three days instead of five.
