StatusCake

Performance and Availability of UK Government Websites in 2018


Over the period of 1st June – 30th June 2018, StatusCake tested 2653 .gov.uk domains for website availability and load-time. To establish performance benchmarks, we referred to the StatusCake Uptime and Downtime Cheat Sheet, which converts uptime percentages into monthly and yearly downtime in days, hours, and minutes.

In the period tested, the average uptime of the domains was 97.12%, which equates to over 20 hours of average downtime per month, and over ten days of downtime over the course of the year! This also falls short of the recognised standard of 99.90% minimum average uptime.
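The conversion from an uptime percentage to downtime figures is straightforward arithmetic. A minimal sketch (the function name and the 30-day-month assumption are ours, not taken from the cheat sheet):

```python
# Convert an uptime percentage into downtime per month and per year.
# Assumes a 30-day month and a 365-day year for simplicity.

def downtime_from_uptime(uptime_pct: float) -> tuple[float, float]:
    """Return (hours of downtime per month, days of downtime per year)."""
    downtime_fraction = (100.0 - uptime_pct) / 100.0
    hours_per_month = downtime_fraction * 30 * 24
    days_per_year = downtime_fraction * 365
    return hours_per_month, days_per_year

monthly, yearly = downtime_from_uptime(97.12)
print(f"{monthly:.1f} hours/month, {yearly:.1f} days/year")
# → 20.7 hours/month, 10.5 days/year
```

Running it on the 97.12% average reproduces the figures above: just over 20 hours per month and over ten days per year.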

In terms of load time, the average performance was a respectable 0.97 seconds, with Sandown Council experiencing the slowest load time of 9.76 seconds, and Stirling Council enjoying the fastest load time of 0.01 seconds.

How We Tested Performance

To undertake this research, StatusCake tested the availability of the homepage of 2653 .gov.uk domains every ten minutes over the period of 1st June – 30th June 2018 (inclusive). If a website was unavailable when tested, an error was recorded. Over the course of the month, average uptime was calculated as the number of successful checks of a particular domain divided by the total number of times it was checked. Therefore, a domain with 100% uptime was available every time it was tested in this period, while a domain with 0% uptime was down every time it was tested.
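The calculation above can be sketched in a few lines. This is an illustration of the formula, not StatusCake's internal code; the function name is our own:

```python
def average_uptime(total_checks: int, errors: int) -> float:
    """Uptime as a percentage: successful checks divided by total checks."""
    if total_checks == 0:
        raise ValueError("no checks recorded")
    return (total_checks - errors) / total_checks * 100.0

# A 30-day month with a check every ten minutes gives 4,320 checks.
checks_in_june = 30 * 24 * 6
print(average_uptime(checks_in_june, 0))    # 100.0 — never down
print(average_uptime(checks_in_june, 124))  # ≈ 97.13 — close to the average
```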

Load time was tested from the point at which the DNS lookup ends to the point at which the page has fully loaded. This takes into account files and images but does not include scripts, such as JavaScript and Ajax. It was calculated to the nearest millisecond, and forms an accurate picture of the type of load time the typical end user would experience.

Best Performing Domains

We are pleased to report that 964 .gov.uk domains experienced no downtime at all during our June 2018 testing period.

Of the 36% of sites that enjoyed 100% uptime in June, the average load time was 0.47 seconds. This is under half the average load time of the domains overall, suggesting that reliable websites also tend to be faster and ultimately easier to use.

To calculate a best-performing website overall, we have looked at the 964 sites which experienced 100% uptime, and filtered these domains by load time. This method identified the Food Standards Scotland website as the best performing .gov.uk domain, registering 100% uptime and an average load time of 0.005 seconds.
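The selection step described above — restrict to the 100%-uptime sites, then rank by load time — can be sketched as follows. The sample records are hypothetical except the winner's figures, which come from the research above:

```python
# Pick the best performer: keep only 100%-uptime domains, then take the
# fastest by average load time. Domain names here are illustrative.
results = [
    {"domain": "food-standards.gov.uk", "uptime": 100.0, "load_time": 0.005},
    {"domain": "example-council.gov.uk", "uptime": 100.0, "load_time": 0.47},
    {"domain": "another-council.gov.uk", "uptime": 99.2, "load_time": 0.12},
]

perfect = [r for r in results if r["uptime"] == 100.0]
best = min(perfect, key=lambda r: r["load_time"])
print(best["domain"])  # → food-standards.gov.uk
```

Note that a 99.2%-uptime site is excluded even though it loads faster than one of the perfect-uptime sites: availability is the first filter, speed the tiebreaker.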

http://datawrapper.dwcdn.net/wm5te/2/

Worst Performing Domains

We’ve looked at some of the best performing .gov.uk domains, but now it’s time to take a look at some of the domains which didn’t fare so well over the course of our testing. We were surprised to find that 287 domains (11%) experienced average downtime of over two hours during the month, which equates to over a day of downtime over the course of a year.

If two hours of downtime per month is a worry for those 11% of domains, the ten hours of monthly downtime must be a serious concern for the 6% of websites which registered an average uptime of 98.6% or lower. Unfortunately, for some domains it gets worse: 4% of the .gov.uk websites tested experienced average downtime of over one full day in the course of the month, representing over 12 days of downtime over the course of a year!

Having identified a best performing domain in our testing period, it’s only fair that we also recognise the poorest performing domain we tested. That dubious honour goes to Kidderminster Town Council, which experienced an average uptime of just 17.47% in our testing period of June 2018.

Conclusion

Our motivation in carrying out this research was not to identify the best or worst performing .gov.uk domains, but to understand how UK government websites fare when it comes to average uptime. The answer is a mixed bag, with 22% of domains falling short of the generally recognised standard of 99.90% minimum average uptime. Despite this, over three quarters of the domains we tested either met or exceeded this standard, meaning that the government is doing more right than wrong when it comes to maintaining its websites.

Nevertheless, as we have highlighted in the course of our research, website downtime is as serious a problem for national and local authorities as it is for businesses. If a local council website is down for just two hours a month, that adds up to one whole day over the course of the year in which local residents cannot access potentially crucial information about their council.

The research also raises the question as to what extent local authorities are satisfying the ‘Local Government Digital Service Standard’, a 15-point common approach for local authorities to deliver good quality, user-centred, value-for-money digital services. While there are no specific guidelines on average uptime, the standard does stipulate that local authority websites should ‘plan for the event of the digital service being taken temporarily offline’. You can find out more about the Local Government Digital Service Standard in our article, StatusCake & UK Local Government Digital Service Standard.

Ultimately, this research reinforces the need to invest in a dedicated website uptime and performance monitoring service, which can alert you instantly the moment your site goes down. StatusCake provide a suite of performance monitoring tools which are easy to set up and use, and provide you with invaluable insights into how your website’s performance is impacting your customers’ experiences.

Click here to start your free trial today.
