What we’ve learned about gathering insights (it’s not as easy as we thought!)

Janine Woodward-Grant
Published in BanesCarersCentre
Oct 13, 2021


Over the past 6–9 months we’ve been spending a lot of time gathering insights from a lot of different places: adult carers, young carers, third sector partners, health and social care colleagues, employers, the general public; the list goes on. We’ve used interviews, surveys, desktop research and more. Some of this we’ve done well; some we have definitely learned lessons from!

Design research with your audience in mind

Fairly obvious, but it became really clear to us as we went through this process that certain tools and ways of working suited some groups much better than others. Here are a few examples:

  • Back in April, we wanted to understand the view of the health and social care sector on unpaid carers. The best way to do this would have been in-depth interviews with some of our health partners. But the vaccine response was in full swing, and a friendly GP advised that this wasn’t going to get us the best result, whereas a quick survey might. We took this advice on board and kept the survey quite short, meaning we were able to gather more than 50 responses containing really valuable insight.
  • Similarly, we wanted to ask the general public to give us their views on carers. Our standard approach is to ask both quantitative and qualitative questions, to give an in-depth understanding of the issue. But a local PR company advised that this approach could come at the cost of not having many people fill in the survey at all. Most respondents would likely be on mobile devices and not want to type much. Additionally, as we have no relationship with this group, why would they go to the effort of a detailed written form? What’s in it for them? We amended the survey and included a prize draw to encourage people who might not otherwise engage with us to take part. We feel it’s important to thank people for giving you their valuable time and views, and it was a small price to pay for data which could help us better support carers.
  • We wanted to reach a wide audience with our public research, but that inevitably means going online, and what about those who are digitally excluded? What tool could we use to hear their views? Thanks to Louise Clapton from Dorothy House, we found Community First, who undertake door-to-door research using a Community Organising approach. By mixing this face-to-face approach with an online survey, we were able to reach all the audiences we wanted to speak to.

It’s not always a survey (after 9 months I really wish it was that simple!) and you might need to ask questions in a different way. For the best results it helps to think about what your stakeholders want to answer, rather than simply what you want to ask.

Consider how you can ensure you get the right people responding

This was a big learning curve. As users of the Microsoft 365 suite, we’ve been using Microsoft Forms for a lot of our survey work. Sure, there are other tools which are a bit slicker, but it’s free and we’re comfortable using it.

Back in August we put together our survey for local residents to capture their views, and decided to pay for social media advertising to reach more people. Once it went live, I expected a few dozen entries over the first day or so, from regular followers and those captured by the ad. Yet on the first night there were 400 responses! We had never really leveraged social advertising before, and clearly we had been missing out. The next day, with responses up to 1,300 and rising, we could not have been happier.

Screenshot showing the final ‘responses’ to the survey, totalling 1,509

Then the research team sat down to celebrate and look at the results. The answers to a lot of the questions were really not what we would have expected: for example, most people felt completely confident they could spot a carer in need and would offer significant amounts of help. This was completely out of line with what carers had been telling us, but research can reveal things you don’t expect, so we went with it. As we continued to review the results, however, it became very clear we had been spammed. Very few of the responses were actually legitimate, taking us from a high to a total low. Maybe we were unlucky, or maybe it was the lure of the gift voucher. Who knows?

Picking ourselves up, we realised we needed to add a security layer, such as a reCAPTCHA tool. Unfortunately, that can only be added to a form embedded in your own website (and only if you have the technology enabled); it can’t be added to Microsoft Forms. We opted for the next best thing: a question you had to answer to prove you were a real person. All respondents who answered ‘A panda is black and …?’ with ‘white’ were included; those who didn’t were not. We also checked all 1,300 original entries to find those which were legitimate, to make sure no one genuine missed out. Thankfully, we still ended up with more responses than we had anticipated, so all’s well that ends well. But it’s not a mistake we’ll be making again!
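If you find yourself doing the same clean-up, a small script can save a lot of manual checking. Below is a minimal sketch, in Python using only the standard library, of how you might screen an exported spreadsheet of responses against a verification question like ours. It assumes the Forms responses have been exported to CSV; the file names and the column header are hypothetical, so adjust them to match your own export.

```python
import csv

# Hypothetical names: Microsoft Forms uses the question text as the CSV
# column header, so change these to match your actual export.
VERIFY_COLUMN = "A panda is black and ...?"
EXPECTED_ANSWER = "white"

def filter_legitimate(in_path, out_path):
    """Copy across only the rows whose verification answer matches."""
    kept = 0
    with open(in_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            # Trim whitespace and ignore capitalisation, so 'White ' passes too
            answer = (row.get(VERIFY_COLUMN) or "").strip().lower()
            if answer == EXPECTED_ANSWER:
                writer.writerow(row)
                kept += 1
    return kept

if __name__ == "__main__":
    kept = filter_legitimate("responses.csv", "responses_filtered.csv")
    print(f"Kept {kept} legitimate responses")
```

Even with a filter like this, you’d still want to eyeball the borderline entries by hand, as we did with the original 1,300.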

Plan your comms strategy

Sometimes when you start research you can assume everyone is as interested in a topic as you. “It’ll only take 5 minutes, of course [insert group being surveyed] will all want to take part. They’ll definitely see the benefit.” Turns out, that’s not exactly the case. Even with a free prize draw, we failed to get as many responses as we might have expected in some areas. I don’t know for certain why, and I’m sure there are more reasons than these, but here are my top three:

  1. Social media is so busy that it can be hard, without paying for promotion and with organic reach alone, to make sure your survey is getting in front of the right people.
  2. Other people’s priorities are different to yours. You might think it’s no big deal to ask them to promote your survey, but you have no idea how this fits into their current comms or organisational priorities, or what their timetable is like.
  3. If you don’t have a relationship with someone, why would they go out of their way to help you? We emailed lots of businesses asking them to help us with our survey. Few responded, and I don’t blame those who didn’t. We were a bolt from the blue: a charity they had likely never heard of, asking them to do something which might not have cost anything but could involve employee time, and which wasn’t for a cause they related to. When we emailed the businesses we were already engaged with, the response was almost immediate.
  4. Other people’s priorities are different to yours. Yes, I’ve said it once, but I’ll say it again. Everyone is busy these days, and fitting in something which isn’t a priority for them is unlikely to happen.

This doesn’t mean you can’t get people to fill in surveys, but you really need to spend time considering: what’s in it for them? What can you tell them so they begin to see it as a priority? Where and how can you promote it to ensure the right people see it? How much time will you need in order to get the results you want? What assets do you have that you can use to help you (the wider staff team, trustees, friendly partners and so on)? Where we were able to do this, results went through the roof. Where we didn’t, they weren’t quite so good.

Recruiting is hard!

Linked to the comms strategy issue above, if you want in-depth research or user testing, it can be hard to find people willing to give the time you need, even with incentives. People are busy, and they can’t necessarily see a direct benefit to participating. On top of that, lives change, and with the best will in the world, user testing or research is not a priority and will be the first thing cancelled when something has to give. We also didn’t give ourselves long enough to recruit; more lead time (and probably a better comms strategy, see above!) could have compensated for the difficulty involved. We still had enough people to give us insights, but it was hard work and we’ll definitely bear this in mind for the future.

Be open to learning things you weren’t looking to hear

This is most relevant in face-to-face research, but several times we’ve gone out looking to answer a certain set of questions and, although we have learned relevant things, we’ve also come back with answers to a completely different question. Two key instances come to mind:

  • We were looking to do some on-the-ground research into how the community around carers feels. When we were ‘on the ground’, a lot of the people we talked to were actually carers themselves. This gave us a completely different perspective to the one we had been looking for, but the insights were still incredibly valuable in helping us understand where the community can help and what carers really need.
  • When we were doing some user testing on our new website (another type of research), we wanted to know how easy it was to use. But the overriding message that came back was about how confusing our name is, rather than the usability of the website itself! People saw ‘Bath and North East Somerset’ and assumed we were part of the Council, which affected their perception of us as an organisation.

I think if you’re too keen to answer one specific question or set of questions, you can miss really important messages from the people you’re talking to. By having a conversation rather than sticking to a script which only allows you to ask certain things, you give users or stakeholders (whatever you’d like to call them) the chance to drive the direction, which helps ensure that the issues important to them, not you, are the ones that come up.

We’re no experts, and we can definitely improve how we undertake user research. But we’re getting there, and hopefully the key lessons we’ve learned will help improve the research we do in future!

You can also check out our ‘Gathering Insights’ posts to find out what all of this research actually taught us about carers and caring in Bath and North East Somerset…

Janine Woodward-Grant
Deputy Chief Executive & Digital Lead at B&NES Carers' Centre #tech #carers #community