The role of user-centred interaction design in digital customer experience: exploring the importance of usability testing and sharing top tips for getting it right
Technology is the backbone of public sector delivery and has enabled the major transformation of public services over the past twenty years.
It also presents a major opportunity for local authorities aspiring to boost revenue collection, drive efficiencies and improve services for residents and local businesses. Take council tax as an example: increasingly, residents want to pay online. If local authorities are not integrating payment technology – and making it a seamless process for the customer – they will simply not be able to maximise the revenue they collect. They also run the risk of ‘digital residents’ defaulting to costly, traditional channels, limiting the return on investment.
Technology will always be an enabler, but councils will invariably need a deep understanding of customers’ expectations across all channels. Far more effort needs to be focused on building the digital capability to serve customers exactly how they want to be served.
This is where ‘usability testing’ comes into play and is arguably one of the most effective ways to increase efficiency within a council’s digital processes.
Too often we assume a process or customer behaviour will work in a certain way when using portals, mobile devices or automated switchboards. Usability testing challenges these assumptions and looks at how everyday users genuinely interact with these digital solutions and processes.
Let’s look at the process for residents wanting to apply for a parking permit, for example. What steps does a resident have to go through? Is there an online form? Do they know about it? What do they like about it? How clear is the information, and how easy is it to use? Is the process as slick as it could be? Are there any points where you might lose the online user to non-online options?
Having worked on a number of usability testing projects of different sizes and at different stages of implementation, my top tips for getting it right are:
The digital solution
The solution should ideally have an interface and be testable – for example, websites, web forms or IVRs. This creates a realistic situation in which the participant can be observed performing a list of tasks using the solution you are testing.
Test with ‘real’ users
Recruit potential users across different age ranges, demographics and levels of digital skill to get the broadest picture of usability. Local cafes, one-stop shops and libraries are good places to find real users. Alternatively, consider residents who have sent in complaints and wish to be contacted about their feedback.
Don’t forget the developers
The solution development team (business analysts, designers, solution architects, developers, testers and so on) should be part of the usability testing. This gives them a first-hand view of what customers really want, insight into the customers’ cognitive process, and a positive influence on software development.
The earlier the better
Usability testing is often done retrospectively. This can still bring benefits, but the earlier it is done in a project, the better the customer satisfaction, efficiency and uptake tend to be. Prototypes and clickable pages can be used for testing; a fully built product is not required. Early testing avoids huge redesign costs, increases user retention and allows issues to be raised with the solution development team.
Little and often
Ideally, each user testing session should be conducted by a maximum of two people; any more may make the customer feel uncomfortable or less likely to participate. It should be brief – no more than 10 minutes, with a maximum of five testers at each session, until the web form is fully developed. Running tests in small groups over a period ensures you can spread your testing budget across many small tests rather than spending it all on one elaborate test. During testing, encourage users to think out loud and share their experience, including thoughts and ideas. The testers should record findings and analyse test results immediately.
Don’t be afraid to redesign
User testing results should be reported and shared with the project stakeholders, along with an agreed prioritisation plan. Once briefed, the solution development team should come up with ways to solve the identified issues and redesign the process – for example, the web form – to better suit user needs.
Repeat the usability tests with a similar range of users to confirm that users no longer encounter the same issues and that no new ones have arisen. You need to repeat this process several times to get the best out of it.
Councils that involve users at the early stages of design have an added advantage. They are more likely to discover any glitches in the service or user journey before it has been fully built. Ultimately, changes can be made in a more cost-effective way, and there is likely to be higher satisfaction with, and uptake of, the digital channels – maximising the return on the digital investment overall.
The creation of the Government Digital Strategy (GDS) became a huge incentive for the public sector to align even more closely with digital. Capita’s Local Government digital customer experience team is at the forefront of this initiative, working with a wide range of local authorities to drive digital transformation across service areas. Our engagement with a cross-section of stakeholders shows that a significant number of local authorities are more invested in working with us to transform their customer experience and prioritise their channel shift strategy.
By Ify Madu, Business process management manager, Capita Local Public Services
Ify has worked as a senior business analyst and project manager for over seven years in multi-disciplinary organisations across the private and public sectors, joining Capita in 2015. Ify has broad experience of working on digital and business transformation projects. She is currently Business Process Reengineering Manager for Capita, leading a team of six business analysts on digital projects.