
Measuring and reviewing content

Content reviews

A content review allows us to understand our content and users over time.

Numbers are important. They support any recommendations we might make. They do not, however, tell the whole story: a review should give context and a narrative, because a story is more relatable than numbers alone. How do you do this?

Compare and show progression

What is the content doing now that it didn’t before? Look at the data from previous reviews. Where there is no previous review, put this in your executive summary. You can make some assumptions based on content design principles; just state them. As you progress from one review to another you might see patterns. Content might be performing better than 3 months ago but less well than 6 or 12 months ago.

Once we start understanding how content performs over time we can start to project into the future. How much do we want to improve our content, and what does that look like in terms of key metrics?
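
As a rough illustration, here is a minimal Python sketch of a period-over-period comparison and a simple improvement target. The metric and all of the figures are hypothetical - substitute your own review data.

```python
# Period-over-period comparison for one metric - all figures hypothetical
reviews = {
    "12 months ago": 19800,
    "6 months ago": 18600,
    "3 months ago": 16200,
}
current = 17400

for period, value in reviews.items():
    change = (current - value) / value * 100
    print(f"vs {period}: {change:+.1f}%")

# Projecting forward: what a 10% improvement would mean for this metric
target = round(current * 1.10)
print(f"Target for next review: {target:,}")
```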

Compare metrics and interrogate which metrics are most important for the content (user needs) you’re reviewing. Is there something you’re missing? Is there an event, a change in process (online or offline) or functionality that means that you will need to change how you measure the content?

Context

Aligned to comparative analysis is contextual analysis. What has happened in the last 3/6/12 months (in addition to the content changes you’ve put in place)? Has there been a cyclical event that is worth noting (school holidays/term times, major holidays, tax year etc.)?

Again, this is important when presenting a review to people who are not designers, researchers or analysts. It shows how the content interacts with real life.

Important, but often forgotten, is looking at other content. How does your content fit in with other pages? Were your pages helped by something that happened on another area of the site?

Recommendations

As you continue telling the story of your content you’ll start to see the cycle of review, action and progress emerge.

“This is what’s happening, these are the numbers to support it, we recommended…and this is what happened…”

Try to frame recommendations within the user need. Where there is no user need, state this in your executive summary. Any assumptions based on content design principles should be stated. You might end up recommending that the content is tested with users, or that a user need is worked up with the researchers, but initially work from broad insights about the users (maybe from other pages that do have a user need). Are the pages still fulfilling the need? Has the need changed?

Recommendations could also:

  • be inconclusive - you might want to recommend further action, such as user research
  • include changing content type or format
  • refer to similar content on essex.gov.uk or elsewhere that could serve the user need better

What does it look like?

  • Surface the narrative. Recommendations should be at the top of what you present. Remember the inverted pyramid: big ideas at the top, detail below
  • Keep the narrative concise and easy to read
  • Remember you should be talking to as many people as possible (not just designers and researchers) so keep it compelling and jargon free (yes, we also use jargon)
  • Remember to show the numbers, especially with comparative analysis - for example, the current number with the previous number in brackets (see the sketch after this list)
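
A tiny Python sketch of the “current number (previous number)” convention - the metric names and figures are invented:

```python
# Formatting review metrics as "current (previous)".
# Metric names and figures are invented examples.
metrics = [
    ("Unique pageviews", 17400, 16200),
    ("Avg. time on page (secs)", 74, 61),
]

for name, current, previous in metrics:
    print(f"{name}: {current:,} ({previous:,})")
# Unique pageviews: 17,400 (16,200)
# Avg. time on page (secs): 74 (61)
```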

Measuring content

Content should be measured and reviewed regularly to make sure it’s working for users. If it’s not, it should be iterated and improved.

Content will have different review periods depending on demand, format and legislative requirements. Generally, you should consider reviewing sections of content every:

  • 3 months for very high-volume content
  • 6 months for high to average volume content
  • 12 months for low volume content
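
If you track review dates in a script or spreadsheet, this cadence could be encoded like the sketch below. The tier names mirror the list above; the date arithmetic is a deliberately crude assumption.

```python
# Working out the next review date from content volume.
# Tier names mirror the list above; the month arithmetic is crude
# on purpose (it clamps to day 28 to avoid invalid dates).
from datetime import date

REVIEW_MONTHS = {"very high": 3, "high to average": 6, "low": 12}

def next_review(last_review: date, volume: str) -> date:
    months = REVIEW_MONTHS[volume]
    year = last_review.year + (last_review.month - 1 + months) // 12
    month = (last_review.month - 1 + months) % 12 + 1
    return last_review.replace(year=year, month=month, day=min(last_review.day, 28))

print(next_review(date(2023, 1, 31), "very high"))  # 2023-04-28
```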

How to review content

It’s helpful to think about the following things when reviewing sections of content.

Usability

Find out if the content is working for users. Look at the original user needs and test the acceptance criteria.

Are users achieving their goals? You can test this by:

  • using Google Analytics or Hotjar heatmaps to see if users are clicking on a call to action
  • using Google Analytics to assess if there are any broken journeys
  • looking at scroll depth in Hotjar to see if users are reading the content
  • seeing if the average time spent on the page (data from Google Analytics) is comparable to the average reading time of the page (data from a readability checker) - see the sketch after this list
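
The reading-time comparison in the last point can be scripted. This sketch assumes an average silent reading speed of roughly 238 words per minute (a commonly cited adult average); the word count and Google Analytics figure are hypothetical.

```python
# Compare actual time on page with an estimated reading time.
# Assumes ~238 words per minute (a commonly cited adult average);
# the word count and Google Analytics figure are hypothetical.
WORDS_PER_MINUTE = 238

word_count = 620            # from a readability checker or word count
avg_time_on_page_secs = 48  # from Google Analytics

expected_secs = word_count / WORDS_PER_MINUTE * 60
print(f"Expected reading time: {expected_secs:.0f}s, actual: {avg_time_on_page_secs}s")

if avg_time_on_page_secs < expected_secs * 0.5:
    print("Users may be skimming or leaving early - worth investigating.")
```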

Some content will have dashboards set up that will collate this information. Speak to the Digital Analyst to see if your content has a dashboard.

Accessibility

Check the content is accessible by:

  • using the SiteImprove accessibility checker
  • testing the content’s readability in SiteImprove or through a readability checker (such as the Hemingway app or Readable) - aim for a reading age of 9 to 11 years old or 5th Grade (a scripted check is sketched after this list)
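
If you want to script the readability check, one option is the third-party textstat Python package - an assumption for illustration, not one of the tools named above:

```python
# Scripted readability check using the third-party textstat package
# (pip install textstat) - an assumption, not one of the named tools.
import textstat

text = open("page_copy.txt").read()  # hypothetical export of the page copy

grade = textstat.flesch_kincaid_grade(text)
print(f"Flesch-Kincaid grade: {grade:.1f}")

# 5th Grade roughly corresponds to a reading age of 9 to 11,
# so flag anything above grade 5
if grade > 5:
    print("Above the target reading age - consider simplifying.")
```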

Findability

Are users easily able to find your content? Use:

  • Cludo to see if any keywords relating to your content are leading to ineffective or no results
  • Google Analytics to see if traffic has remained consistent or if you’re seeing unusual activity (see the sketch after this list)
  • Google to see how highly your page ranks – if it’s not that high, perhaps the meta description or title needs tweaking
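
“Unusual activity” can be checked with a simple rule of thumb, such as flagging weeks more than 2 standard deviations from the mean. A sketch, using hypothetical weekly pageview figures exported from Google Analytics:

```python
# Flag weeks more than 2 standard deviations from the mean.
# Weekly pageview figures are hypothetical Google Analytics exports.
from statistics import mean, stdev

weekly_views = [1180, 1240, 1150, 1310, 1220, 2940, 1270, 1190]

avg, sd = mean(weekly_views), stdev(weekly_views)
for week, views in enumerate(weekly_views, start=1):
    if abs(views - avg) > 2 * sd:
        print(f"Week {week}: {views} views - unusual, investigate")
```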

Satisfaction

Use Hotjar to look at user feedback. There’s no need to change content based on one person’s views but try to spot any patterns in comments.
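
One way to spot patterns is a quick word-frequency count over exported comments. A minimal sketch - the comments and stopword list are invented:

```python
# Word-frequency count over feedback comments - all examples invented.
from collections import Counter
import re

comments = [
    "Could not find the phone number",
    "Page is confusing",
    "No phone number anywhere on this page",
]

STOPWORDS = {"could", "not", "find", "the", "is", "no", "anywhere", "on", "this", "a"}

words = Counter(
    w for comment in comments
    for w in re.findall(r"[a-z]+", comment.lower())
    if w not in STOPWORDS
)
print(words.most_common(3))  # [('phone', 2), ('number', 2), ('page', 2)]
```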

Accuracy

Check for inaccuracies and errors in your content by:

  • looking for broken links and spelling mistakes in SiteImprove (a quick scripted check is sketched after this list)
  • asking the content owner or SME if there are any factual changes to the content
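
SiteImprove handles broken links for you, but a quick scripted spot-check is also possible. This sketch uses the requests package; the URLs are placeholders.

```python
# Quick spot-check for broken links using the requests package
# (pip install requests). The URLs are placeholders.
import requests

links = ["https://www.essex.gov.uk/", "https://example.com/old-page"]

for url in links:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "no response"
    print(url, status)
```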

This is also a good opportunity to ask the content owner or SME if they expect any upcoming changes to the content, such as changes in prices, dates or legislation. Changes should be added to the content calendar.

When you’ve finished your review

Any quick and easy fixes should be made to the content.

If your review highlights problems with the content, a redesign should be scheduled into the work backlog. You should also engage the User Research team about testing redesigned content.

When you’ve finished, remember to set a new review date!