In a future article I'll walk through my thinking behind the UX design for the home page of the BigTown Council website. I've no idea whether the UX, or indeed the graphic design - happy with it as I am - will actually work on a real council website used by real users, many of whom will probably be cross about needing to use the council website at all. I'm doing this as an independent personal project, separate from my day-to-day employment, after realising that LocalGovDrupal makes a useful platform for developing and demonstrating some of the ideas I've been writing about for years without a real council website of my own to show them on. Because it's an independent personal project, I don't have the resources to undertake the proper user research and user testing a council would have.
Real council webteams^Wonline service delivery teams^W^W^W^Wdigital teams^W^Wwhatever we want to call ourselves, of course, shouldn't just take the confident assertions of a weirdo randomer who speaks with a funny East Lancashire / West Lancashire / West Midlands fusion accent at face value; they should test the ideas being presented with real users before committing to them.
Back to the BigTown home page UX design. Rather than grouping all the principal service links together in one place and all the 'campaigns' in another, I've presented them in alternating rows. I think this is a more effective way to do what a home page is there to do than the current LocalGovDigital Received Wisdom, but I don't know that.

I'm not going to talk through the many techniques and tools in the UX Researcher's kitchen cupboard here, but one of them is A/B, or Comparison, Testing: presenting two versions of the proposed deliverable and asking users for their preference, whether that's an emotional preference or a functional one. You can show both versions to the same test group and explicitly ask which they prefer, or you can show one version to half your testers and the other version to the other half and gather satisfaction metrics from each group. Rumour has it that when Facebook was trialling new features and designs, they'd silently release them to a small subset of users, monitor for posts along the lines of 'this new update sucks harder than a Dyson Bagless' or 'really liking the new update', and decide from textual sentiment analysis whether to roll the update out further or revert it.
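To make the 'split your testers in half' idea a bit more concrete, here's a minimal Python sketch of the mechanics: a deterministic way to bucket users into variant A or B, and a very crude sentiment-style rollout decision of the sort the Facebook rumour describes. Everything in it (the function names, the keyword matching, the 60% approval threshold) is illustrative and assumes nothing about any real CMS or analytics product.

```python
# Minimal sketch: deterministic A/B bucketing plus a naive sentiment-based
# rollout decision. All names and thresholds here are illustrative
# assumptions, not part of any real CMS or analytics API.
import hashlib


def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into variant 'A' or 'B'.

    Hashing the user id together with the experiment name means the same
    user always sees the same version of the page, and different
    experiments split the audience independently of each other.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"


def decide_rollout(feedback: list[str], approval_threshold: float = 0.6) -> bool:
    """Crude textual 'sentiment' check over user comments.

    Real sentiment analysis would use a trained model; here we just count
    comments containing an obviously positive or negative phrase.
    """
    positive = sum(1 for text in feedback if "liking" in text.lower())
    negative = sum(1 for text in feedback if "sucks" in text.lower())
    total = positive + negative
    if total == 0:
        return False  # not enough signal to justify rolling out further
    return positive / total >= approval_threshold


if __name__ == "__main__":
    # Bucket a few (hypothetical) residents, then make a rollout call
    # from some mock feedback.
    for uid in ("resident-001", "resident-002", "resident-003"):
        print(uid, assign_variant(uid, "homepage-alternating-rows"))

    comments = [
        "really liking the new update",
        "this new update sucks harder than a Dyson Bagless",
        "liking the alternating rows a lot",
    ]
    print("roll out further?", decide_rollout(comments))
```

In a real setting the bucketing would sit in the CMS or at the edge, and the feedback would come from proper user research rather than keyword matching, but the shape of the decision is the same: split the audience, gather signal from each half, and only roll out what the users actually prefer.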
Web Content Management should be considered a mature technology; some of the most popular Web Content Management Systems are over 20 years old. Some of them are better than others at managing and publishing content (and it's not for me to express my own preferences in that regard here).
I think it's now time for them to get better at enabling us to test content. I hope to see CMS manufacturers and maintainers introduce functionality into the productspace specifically to enable comparative testing - letting content and UX designers easily present multiple versions of the same page or design to the user base and gather explicit feedback from users - so we can all accelerate our improvement journeys and collectively deliver a better service to the people who matter: our end users.