Coming to the end of the year provides us with a natural break in which to reflect on what we’ve achieved. This post serves as something of an end of year report to mark our progress as a User Research practice.
User Experience (UX) within OVO has been in a period of growth this year. Our community of practice is made up of design specialists and research specialists. At the start of 2018 there was 1 recognised User Researcher in OVO, but we’re finishing the year with a practice of 4 User Researchers.
Setting up the practice
We began 2018 by defining the behaviours we wanted to live up to. To get there we needed a shared understanding of what we each pictured when we imagined a User Research practice. Sharing where we drew inspiration from, what had served us well in the past, and where we each wanted to grow gave us a platform for that conversation.
Our behaviours included statements such as:
- involve teams at all parts of the research process
- discover rather than validate
- talk about biases and limitations in research (which we do through sharpening our practice)
Our behaviours have acted as a compass, directing our attention and giving us a framework for conversations about the activities we’re carrying out. They have given us a way to share stories of our successes with each other and, conversely, a space in which to ask for advice when we come up against obstacles to embodying certain behaviours.
The act of writing them down, though, is where the real power lay. Crafting the specific words forced conversations about what we really meant by certain phrases. Talking around the behaviours in this way meant we left with a shared understanding, and knowing where the others were coming from, and where we wanted to get to, built an early trust among us.
ResearchOps
A key component of conducting user research is ensuring that we protect our participants. This means conducting research that is ethical.
We focused our early attention on key parts of the participant journey, creating templates and processes that considered participant needs, including GDPR requirements. A consistent template for an information sheet and consent form meant that participants in all our studies were informed and aware of how their data would be used.
These templates were the first building block in ensuring we embodied our behaviours. Further templates and patterns developed throughout the year have helped maintain our ethics whilst empowering designers to conduct their own studies.
Speaking with users
As a UX community we have had in-depth conversations with 146 people this year. I realise that, this being qualitative research, the number is less significant than what we actually learned, but I couldn’t quite believe it. I think it’s worth calling out to give context to our journey as a practice, and also to acknowledge each individual person who has given up their time to teach us something new.
The topics of these research conversations have ranged widely:
- From seeing how behaviours adapted to owning an electric car, to learning how boiler-related emergencies are handled.
- From discovering how people use their energy usage data to fine-tune new appliances, to understanding the variables, and their relative weights, that people consider when choosing an energy supplier.
- Evaluating the accessibility and usability of our online account for people with a visual impairment or dyslexia when managing an energy utility.
We’ve used a range of methodologies to form the base of these studies. We’ve been invited into the homes of people who own electric cars to see first hand how they charge their vehicle. Research studios have provided a more controlled environment for when we’ve conducted usability testing. Remote depth interviews have provided us with an opportunity to speak with people from Montrose to Dartmouth, and everywhere in between.
Team sport
Research is a team sport, and our teams have come along with us for the journey. Over the course of the year our teammates have observed, taken notes and even asked questions. The supposed magic number for research exposure is 2 hours every 6 weeks.
Halfway through 2018 we started to experiment with measuring exposure hours in a few of our teams. We observed that, whilst there were members of our teams interested in being part of the research, their taking part was irregular and attendance wasn’t evenly distributed. Tracking a team’s research exposure, and making it visible, allowed observation to be seen as just as important as other work to be done, and the shared value created accountability across the whole team.
For those who haven’t been able to observe research live, we’ve hosted research story-slams on a Friday afternoon. These differ from research playbacks, which are played back to the specific team after a round of research, in that they focus on the people we’ve spoken to and their stories rather than the research questions. They are a tool for building empathy with our users.
We’ve hosted 20 story-slams now, and we’re learning how to make the most of them:
- Advertising helps get people in the room. We started by making posters where we just changed the title each time, but attendance really took off when we created an individual graphic identity for each session.
- The content does need to be curated. People who aren't researchers can find the dry bits a bit, well, dry. Curating means snipping our research clips so that they are concise, and focusing on individuals rather than every participant in a round. This makes the content easier to follow and allows for rich conversations among attendees about the stories we’re hearing.
- Friday afternoons with snacks work well for the London office, but not for all offices. Different offices have different rhythms, and although we experimented with different times at the other offices, we found turnout was not as strong.
Informing conversations and decisions
If a research study hasn’t informed a decision or made a change, did the research ever happen? That’s somewhat of a silly question; I’m trying to be philosophical, like the question of whether a tree falling in the woods makes a sound. But it does raise an important question: what has been the effect of our research? Setting up our processes and speaking to all those people would have been in vain if it had not been part of some wider impact.
Some of the impact that research has had includes:
- Articulating the needs that customers on prepayment meters have during winter, informing the development of a new service that aids in managing money during the warmer months for the colder months.
- Illustrating the access needs that some people have, which in turn has informed the creation of an internal community centred around accessibility, whilst also teaching us how to better implement our tables of usage data so they are easier to navigate for someone using a screen reader.
- Influencing the product roadmap of our Green energy team: research identified where we weren’t meeting some people’s environmental needs, and as a result the team has prioritised ideas that might help people live a more carbon-neutral life.
What’s in store for 2019
This year has been about setting the foundations for our user research practice to embed itself into how products are developed at OVO. The foundations feel really steady now, and that's down to Jess, Karen, Pete, Kiran and the whole of the UX community.
This foundation will help us to move forward with our practice in 2019, enabling us to focus on:
- Meeting more customers to understand their needs.
- Increasing participation from our teams in user research.
- Influencing the priority of product roadmaps to tackle users’ problems.