Government Digital Service Standards – Our Top Tips on How to Work Towards Passing the GDS Assessment

May 24, 2019 10:04 am




Is your service subject to GDS Assessment? If so, you will need to consider the points set out in the GDS standard to ensure your service meets the criteria and, more importantly, provides a great user experience.

Many of the 18 points involve UX methods and user research activities. Having worked on a number of government and public sector projects – working closely with the organisations involved, carrying out UX research, and taking part in GDS assessments – we have gained valuable insight into what makes a successful GDS service.

Here are our key learnings and advice, concentrating on the four points that relate most closely to UX and user research activities.

Understand user needs

Understanding your users and their needs should start as early as possible in your project life cycle, to give your service the best chance of a successful launch down the line. Research into user needs can, and often should, begin before you even have a service to test.

In Alpha, this can take the form of activities such as user interviews, focus groups, concept testing, or exploration into how potential users interact with similar existing services. With a brand new service, your first task is to define who your users are, then explore their needs, goals, and frustrations.

Your findings should then be translated into user stories, so the whole team can work towards them and build the service with user needs at the centre. A set of robust personas is also an effective way to embed user needs across the whole team: personas help everyone visualise who they are building the service for, and what experiences and frustrations users might bring with them when they interact with your organisation.

As your build progresses, you can move on to testing higher-fidelity prototypes, testing environments, and live services, to make sure the service still meets user needs and to uncover any new user needs, or even new user groups, that have emerged since you began.

A tip on engaging your audience at a GDS assessment: present your personas as early as possible, before you walk through the service. This introduces the panel to your users, so they can visualise who will be using the service, and provides human context for all your design decisions.

Do ongoing user research

User research doesn’t end once you have defined your user needs. Regular user research is something we recommend to all clients, regardless of sector. It’s important to include user input at every stage of development to maintain a user-centred approach, checking that what you’re building is working for them, and validating any changes made from previous iterations.

During Alpha and Beta, we recommend incorporating some user research, such as usability testing, into each sprint. We have found that rapid testing sessions with a quick turnaround in planning, findings, and recommendations can fit perfectly into an agile framework, and help to maintain the velocity of development and keep it steered in a user-centred direction.

One way we have supported clients in doing this is by embedding one of our researchers in the development team, providing an on-site UX consultancy service and attending and contributing to all agile ceremonies. This means that the client has direct access to UX expertise at every step of the project, and that when it comes to usability testing we know the project inside out and can focus each round accordingly.

We are also on hand at your GDS assessment to provide a first-hand account of all the user research completed and answer any questions on user needs and how we plan to conduct research in the next design phase.

Make sure users succeed first time

The aim of all your research, planning, and adherence to style guidelines is to make your service as easy as possible for all users to use. This means considering and involving users with different abilities and levels of digital inclusion.

From the start of a project, when we begin our participant recruitment planning, we ensure accessibility and digital inclusion are accounted for, and carry this through into usability and accessibility testing. An accessibility consultant can also be a fantastic addition to your delivery team – helping to embed accessibility considerations at every step of the design.

Keep track of where all your participants sit on the digital inclusion scale so that, in your assessment, you can demonstrate how you have included these users in your decision-making. Of course, there isn't a one-size-fits-all approach, and some services will need to consider lower-skilled users more than others. The table below shows an example of the spread of digital inclusion scores from a previous public sector research project we ran.

[Table: number of users at each digital inclusion score]
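
As an illustration of the kind of record-keeping this implies, the sketch below (our own example, not GDS guidance) tallies research participants by their digital inclusion score so the spread can be reported at assessment. The participant data and the assumption of a simple numeric scale are ours, purely for demonstration.

```typescript
// A minimal sketch (not GDS guidance) of tracking participants against a
// numeric digital inclusion scale so the spread can be reported later.
// Lower scores indicate lower digital skills or access.
type Participant = {
  id: string;
  digitalInclusionScore: number; // e.g. 1 (lowest) to 9 (highest)
};

// Count how many participants sit at each point on the scale.
function tallyByInclusionScore(participants: Participant[]): Map<number, number> {
  const counts = new Map<number, number>();
  for (const p of participants) {
    counts.set(p.digitalInclusionScore, (counts.get(p.digitalInclusionScore) ?? 0) + 1);
  }
  return counts;
}

// Example usage with made-up participants.
const spread = tallyByInclusionScore([
  { id: "P1", digitalInclusionScore: 3 },
  { id: "P2", digitalInclusionScore: 7 },
  { id: "P3", digitalInclusionScore: 3 },
]);
console.log(spread); // Map(2) { 3 => 2, 7 => 1 }
```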

Make the user experience consistent with GOV.UK

This point is about using the same language and design patterns as the rest of GOV.UK. These guidelines are based on best practice and user research, so they are a great resource for making your design user-friendly and familiar. These links cover how to do this:

https://design-system.service.gov.uk/

https://www.gov.uk/guidance/style-guide/a-to-z-of-gov-uk-style

However, more than just the services hosted on GOV.UK are subject to GDS Assessment, so this point is not always critical, and sometimes what works for central government might not be appropriate for your organisation. So whilst referring to GOV.UK patterns is a good starting point, it is still important to test the designs with your users to ensure they work in the context of your service.
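
For web teams that do adopt the GOV.UK Design System, its components are typically brought into a build via the govuk-frontend npm package. The sketch below assumes a TypeScript build; the package's import path and initialisation API have changed between releases, so check the Design System documentation for the version you are using.

```typescript
// A minimal sketch, assuming the govuk-frontend npm package is installed.
// The import path and initialisation API differ between releases, so treat
// this as indicative rather than a drop-in snippet.
import { initAll } from "govuk-frontend";

// Wire up the JavaScript behaviour for any GOV.UK Design System components
// (error summaries, accordions, character counts, etc.) on the current page.
initAll();
```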

Improvements to the Government Service Standard


https://gds.blog.gov.uk/2019/05/09/welcome-to-the-updated-service-standard

This year's evolution of the Government Service Standard aims to improve government digital services even further, by making the standard work better for a wider range of organisations and across more channels than purely digital ones.

Having just supported a new public sector organisation through its Alpha phase, we are pleased to see that the updates to the standard should make it more transferable to similar organisations working outside of central government.

For example, our client was not required to make the user experience strictly consistent with GOV.UK, so point 13 (Make the user experience consistent with GOV.UK) was not relevant. The new version, 'Use and contribute to open standards, common components and patterns', is a much more inclusive approach, and it also benefits the UX community by encouraging teams to contribute their own work to evolve best practice.

The way the points have been reworded, organised, and consolidated is a great demonstration of how GDS are constantly striving to improve the user experience of their services. The 14 points are much easier to understand, and grouping them into three areas of focus will help teams organise their work so that they naturally cover each point. It should also make assessments run more smoothly, as teams can be confident in presenting their research and worry less about meeting strict government design conventions.

We look forward to working with more teams to support their public sector projects, working towards the new standard to deliver more great user experiences.

If your public sector project is in need of UX support, get in touch to find out more. https://www.simpleusability.com/contact/