
Confessions of a Better connected reviewer

Today saw the publication of the annual Better connected survey of all UK local authority websites, produced by Socitm, which identifies good (and bad) practice based on extensive evidence-based research.

Last summer I joined the team responsible for conducting this survey, and now that the report is out I thought it good timing to reflect on the experience from both sides of the fence – as both a local gov web manager and a reviewer.

I’d been aware of the Better connected report for many years – not least because Edinburgh has achieved the maximum 4-star rating for the last 3 years. When the chance came up to join the team, I was excited by the opportunity but also wary that some might see it as “joining the dark side”. After all, the report pulls no punches when it comes to pointing out the failings of websites, and its strong emphasis on “top tasks” has generated lots of national debate.

Nevertheless I applied and, after a rigorous selection process and phone interview in August 2012, I was invited to join the team. Just days later, I found myself knee-deep in the first task – conducting a pilot of the survey on a number of sites.

The results of that pilot were top of the agenda at my first meeting with the team in September 2012 (held in Reading although the team come from far and wide). One of my initial concerns about the survey was whether it would be too subjective, leading to erratic findings depending on the reviewers’ experience (or even just their mood!) on the day. I was pleased to find everyone striving to make the questions as consistent and objective as possible, and developing specific guidance for the reviewers to follow if in any doubt, to further ensure consistency. After that meeting, I felt a lot more confident about getting stuck into my main allocation of reviews.

Goodbye social life…

Then, in October 2012, came the review itself – 6 intensive weeks of reviewing around 40 council websites, each one taking somewhere between 90 and 210 minutes (the better sites were almost always the quickest). The survey took the form of a huge spreadsheet of over 200 questions, split across several tasks announced by Socitm ahead of time – for example, paying a parking fine or reserving a library book. The questions took you through each task, testing various aspects of the user experience and looking out for common pitfalls.

This was seriously hard work, especially on top of a full-time day job, but also very rewarding. Being a complete geek, I had prepared a lovely Google spreadsheet mapping out the work I had to do each week. Seeing the spreadsheet, and accompanying progress graphs, slowly turn from red to green was one of the things that kept me going on those long, late Sunday nights!

I learnt something new on almost every site I reviewed – whether it was noting some really great content or a lovely bit of innovation, or identifying common errors or usability dead-ends. And I must honestly say that it was eye-opening how many basic mistakes were being made – typos and broken links, obscure content and puzzling navigation, meaningless acronyms and other jargon, and sometimes even factually incorrect content.

This was the biggest surprise of all. For the most part the sites weren’t too bad, with more than a few absolute gems. But time and time again I would find obvious usability issues that anyone – let alone a web professional – could have spotted and fixed. Without knowing how the different sites are managed, it’s hard to point fingers at the underlying cause. Maybe it was a lack of internal resources, not enough training for web publishers, or a failure to enforce standards. Impossible to tell, and certainly of no interest to the poor citizen just trying to get something done!

Being honest about the bad things

For me, this first-hand experience cast the Better connected report in a new light – the findings reflect general trends rather than seeking to grab headlines. We’re bound to see stories over the coming days and weeks focusing on the failures, and as someone passionate about local government services I know how painful that can be, but the sad reality is that a large number of sites are failing in certain areas. Of course, I’m also hopeful that we can celebrate the good points, and I’ll blog about some of those in the near future. But by being realistic about the failures, web managers can hopefully take this opportunity to underline what they’ve probably already been saying to their management: that the web needs to be taken seriously, and that means getting buy-in across the organisation and making sufficient resources available to make things happen.

Third-party sites

One of the common concerns I’ve heard expressed about previous Better connected reports is that much of what they test is “out of the council’s control” because services are provided by a third party – library catalogues, planning portals and leisure websites, for example.

I discovered that the team are acutely aware of this, and in fact have been in discussions with many of the big vendors to try to get them to improve their products. But ultimately the councils are the vendors’ customers, and it’s vital that councils demand those improvements too.

Also, it was very often the case that councils using the same third-party applications varied in success depending on how they’d customised the software. For example, many websites forced users to register before they could comment on a planning application, yet other councils using the same planning software had simply turned this requirement off, vastly improving the user experience. It’s therefore clear that councils do have some (perhaps limited) ability to improve these user journeys, even when the user is taken off the main council site. I think this is a really important message to take back to the teams responsible for integrating these applications on their website.

And, of course, the main point is that the public don’t really care – they just want to get stuff done and are unlikely to be impressed by a council giving the excuse that parts of their website are weak because they are actually provided by someone else!

Re-reviews

One aspect of the Better connected annual cycle that I hadn’t known about was the robust re-review process that takes place after the main reviews are done. Any review that triggers certain conditions – for example, a site that appears to have got dramatically worse since last year – is subject to a re-review by two other reviewers. The team told me they have been refining this process for years, revisiting it after every cycle to ensure the results are as fair and consistent as possible.

The team met again in January 2013 for two days to discuss these re-reviews – debating the scores given and making sure all was as fair as it could be. I really liked this stage – it validated the whole process and allowed us to iron out any potential blips, and also gave rise to some really good discussion and debate.

It’s also worth mentioning that the team kept in constant communication during this whole process, via email and an online collaboration space. We supported each other, shared good and bad examples, asked questions, invited second opinions and generally encouraged each other. As we were all using different devices and browsers, we could also easily ask others to double-check errors that we’d encountered, to see if they were device-specific or universal. We also used Dropbox to share files. As with all dispersed teams, this way of working is essential and it was good to be part of a team using these online tools so well.

Mobile, cookies and digital engagement

In addition to the main reviews, I was involved in a number of smaller surveys looking at specific topics. This was a good chance to put some of my professional experience to use, helping to develop the surveys to ensure our findings would be as meaningful as possible.

The mobile survey looked at how mobile-friendly the various sites were. We know that the number of people using mobile devices is continuing to boom, so it’s vital that public services make their web content universally available. Sadly we found limited evidence of this happening yet – there were a few wonderfully responsive layouts (my preferred approach if done well), some mobile sites (some good, some awfully limited) and the occasional app (very mixed in terms of usefulness and rarely well promoted). But such efforts were few and far between.
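To give a flavour of the kind of quick signal a web team could check for themselves (and emphatically not the survey’s own methodology, which relied on hands-on testing with real devices), here’s a minimal Python sketch that looks for a responsive viewport meta tag on a homepage. The URL is a hypothetical placeholder, and a missing tag is only a hint, not proof, that a site isn’t mobile-friendly.

```python
# Minimal sketch: check a homepage for a responsive viewport meta tag,
# a rough first signal of mobile-friendliness. Real assessment needs
# hands-on testing across devices and browsers.
import re
import requests

def has_viewport_meta(url: str) -> bool:
    """Return True if the page declares a viewport meta tag."""
    html = requests.get(url, timeout=10).text
    return re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.I) is not None

# Hypothetical URL for illustration only.
print(has_viewport_meta("https://www.example.gov.uk/"))
```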

The cookies survey looked at how well sites had implemented the requirements of the EU regulations on the use of cookies. Compliance was generally not bad, although the range of solutions was immense, with all manner of methods for handling cookies. More worryingly, we found anecdotal evidence that many councils didn’t fully understand which cookies their site was deploying, nor what the full impact of those cookies was.
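For teams wanting a starting point for their own audit, a minimal sketch along these lines (assuming Python with the requests library, and a placeholder URL) will list the cookies a server sets on first load. Note that it won’t catch cookies set by JavaScript, such as most analytics tags, which need a real browser to detect.

```python
# Minimal sketch: list the cookies a site sets via HTTP headers on
# first load. Cookies set client-side by JavaScript (e.g. analytics
# scripts) won't appear here; auditing those needs a real browser.
import requests

def server_set_cookies(url: str) -> list[tuple[str, str]]:
    """Return (name, domain) pairs for cookies set in the response."""
    session = requests.Session()
    session.get(url, timeout=10)
    return [(c.name, c.domain) for c in session.cookies]

# Hypothetical URL for illustration only.
for name, domain in server_set_cookies("https://www.example.gov.uk/"):
    print(f"{name}  (domain: {domain})")
```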

The digital engagement survey looked at the integration of social media and other forms of digital engagement, such as the ability to register and sign in to a site. We found lots of great use of social media, with 4 in 5 sites promoting their accounts on their homepage, although they were not always well promoted elsewhere.

Conclusions

This work was a valuable experience for me, both personally and professionally. It’s vital that web teams look at their site from the point of view of their citizens, and the Better connected report is one way of gaining insight into how real users are experiencing it.

If your council did well in this year’s report – well done! Pat yourselves on the back and be proud that your efforts have been acknowledged. But don’t rest on your laurels – a year is a long time on the web, and citizens’ expectations of online services are rising all the time.

In Edinburgh, for example, we have a rolling programme of identifying and testing our own, locally determined “top tasks” – mixing user testing and analytics to identify gaps and streamline the customer journeys. It’s great to get recognition from Socitm that we’re on the right track, and we always make an effort to disseminate the findings to managers across the council, but it’s even more important to us that we’re serving our citizens well.

If your council didn’t do so well, my advice would be not to take it too personally – treat the findings as constructive criticism and see if there are messages that might actually reinforce and support your own efforts. Use it as a way to show your senior management teams that you need more support, more coordination, more buy-in, more resources. Go through the recommendations and cherry-pick some of the simpler amendments to deliver quick wins. Also engage your citizens, if you haven’t already, and get them to tell you what they think. Work with them to make the improvements that will matter most to them.

There’s no such thing as a perfect council website, and they have a habit of leap-frogging each other – even a brilliant site will be left behind within a few years if not properly supported. The most important thing is to be on a journey to building something better for your residents, and if the Better connected findings can help you with that, then all those late nights will have been worthwhile.

The final meeting for this year’s cycle will be in May, when we’ll review the process and lessons learned to make improvements for next year. All feedback welcome. There’s also bound to be plenty of discussion on Twitter (hashtag is #socitmBC) and on the “Web Improvement and Usage” group on the Knowledge Hub - see you there!

The full Better connected 2013 report is only available to Socitm Insight subscribers, but anyone can access the headline results and reviewers’ comments on some of the key areas such as housing, planning and social care.

