Back to the Future – a brief history of the residents’ survey and what the future may hold

This week, Westco’s Jen Compton presents the value of the resources being developed by the Westco Insight Network, a free informal network focused on getting the very best out of opinion research in the public sector. We want the network to help all those involved reduce duplication, share best practice and make the best use of their time and resources.

If Marty McFly had landed his DeLorean in England in June 1966, he would have witnessed a very exciting moment in time – not England winning the World Cup (although if he had stayed around until July he could have seen that too), but the first residents’ survey. It was developed by the Redcliffe-Maud Commission, whose survey asked a sample of the electorate aged 21 and over how well they perceived the LAA ran things, on a rather questionable scale of very well, fairly well, not at all well or don’t know. It also asked whether the electorate had ‘heard anything about what the LAA is doing’ and whether they felt there was ‘enough being done to help people and improve things in the LAA or should more be done?’

When Marty goes back to the future – his own 1985 – things have changed. By this time the more familiar residents’ survey has appeared, following the classic MORI model and looking at service satisfaction: ‘How satisfied or dissatisfied are you with the way [named council] runs things/is running the area?’

However, when he lands in 2015 he doesn’t see much change in those 30 years – and no flying cars! We had the BVPI survey and then the Place Survey from 2000 to 2008. These consisted of four waves of statutory government surveys using the same methodology and questionnaire (with some tweaks), providing a plethora of benchmarks. The evidence from the BVPI/Place Survey helped the development of the Reputation Campaign in 2006.

We currently have the LGA National Poll, which offers some benchmarking and follows the established model for residents’ surveys: it sets out a core of standard questions, proposes a suggested methodology and recommends the use of an effective base. According to our residents’ surveys and those of the LGA, local government is in a reasonably good place in terms of satisfaction, and has probably held up far better than we all expected. But is this built on a weak foundation if measuring “satisfaction” doesn’t really get us to the heart of the issue? In particular, should we focus more on engaging people in understanding the changes local government will have to make (rather than just informing them)?

When SOLACE members were asked to reflect on future local government priorities and key drivers, the top three service priorities named were economic development and growth, housing, and adult services. The key drivers for their organisations were financial sustainability, stimulating economic growth, and cost reduction/efficiency. So, getting back to the future – the challenge now is how we can ensure that residents’ surveys remain relevant, especially if (or when) they start to show a decline in ratings that may seem beyond our control. Should we ask more about community resilience and capacity for growth and change? The findings from a recent SOLACE survey seem to suggest that yes, we should.