Over the period of 1st June – 30th June 2018, StatusCake tested 2653 .gov.uk domains for website availability and load time. To establish performance benchmarks, we referred to the StatusCake Uptime and Downtime Cheat Sheet, which converts uptime percentages into monthly and yearly downtime in days, hours, and minutes.
In the period tested, the average uptime of the domains was 97.12%, which equates to over 20 hours of average downtime per month, and more than ten days of downtime over the course of the year! This falls well short of the recognized standard of 99.90% minimum average uptime.
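For readers who want to reproduce these figures, the conversion is simple arithmetic. The Python sketch below is our own illustration rather than an excerpt from the cheat sheet, and assumes a 30-day month and a 365-day year.

```python
def downtime_hours(uptime_pct: float, period_hours: float) -> float:
    """Hours of downtime implied by an uptime percentage over a given period."""
    return (1 - uptime_pct / 100) * period_hours

# 97.12% average uptime across the tested domains
monthly = downtime_hours(97.12, 30 * 24)     # ~20.7 hours of downtime per month
yearly = downtime_hours(97.12, 365 * 24)     # ~252 hours, i.e. over 10 days per year
print(f"{monthly:.1f} hours/month, {yearly / 24:.1f} days/year")
```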
In terms of load time, the average performance was a respectable 0.97 seconds, with Sandown Council experiencing the slowest load time of 9.76 seconds, and Stirling Council enjoying the fastest load time of 0.01 seconds.
To undertake this research, StatusCake tested the availability of the homepage of 2653 .gov.uk domains every ten minutes over the period of 1st June – 30th June 2018 (inclusive). If a website was unavailable when tested, an error was recorded. Over the course of the month, average uptime was calculated as the proportion of checks for a particular domain that did not record an error. Therefore, a domain with 100% uptime was available every time it was tested in this period, while a domain with 0% uptime was down every time it was tested.
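Expressed as code, the calculation looks something like the following sketch. The function name and sample figures are illustrative only, but the numbers line up with the methodology above: a domain checked every ten minutes for 30 days is tested 4,320 times, so around 124 failed checks works out at roughly 97.1% uptime.

```python
def uptime_percentage(total_checks: int, failed_checks: int) -> float:
    """Share of checks in which the site responded, expressed as a percentage."""
    if total_checks == 0:
        raise ValueError("no checks recorded for this domain")
    return (total_checks - failed_checks) / total_checks * 100

checks_per_month = 30 * 24 * 6                      # one check every ten minutes for 30 days
print(uptime_percentage(checks_per_month, 0))       # 100.0 -> available every time it was tested
print(uptime_percentage(checks_per_month, 124))     # ~97.1 -> close to the .gov.uk average
```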
Load time was tested from the point after which the DNS lookup ends to the point at which the page has fully loaded. This takes into account files and images but does not include scripts, such as JavaScript and Ajax. It was calculated to the nearest millisecond, and forms an accurate picture of the type of load time the typical end user would experience.
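As a rough illustration of that idea (and only that; a real monitoring agent would also fetch page assets such as images, which this sketch does not), the Python below resolves the hostname first and then times the download of the main document. The example URL is arbitrary.

```python
import socket
import time
from urllib.parse import urlparse

import requests  # third-party HTTP client, assumed to be installed


def approximate_load_time(url: str) -> float:
    """Very rough load-time estimate: warm the DNS cache, then time the page fetch."""
    host = urlparse(url).hostname
    socket.getaddrinfo(host, 443)         # resolve the hostname up front so the
                                          # fetch below is not dominated by DNS
    start = time.perf_counter()
    requests.get(url, timeout=30)         # downloads the page HTML only
    return time.perf_counter() - start


print(f"{approximate_load_time('https://www.gov.uk'):.3f} s")
```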
We are pleased to report that 964 .gov.uk domains experienced no downtime at all over the course of our June 2018 testing period.
Of the 36% of sites that enjoyed 100% uptime in June, the average load time was 0.47 seconds. This is under half the average load time of the domains overall, suggesting that reliable websites also tend to be faster and ultimately easier to use.
To identify the best-performing website overall, we took the 964 sites which experienced 100% uptime and ranked these domains by load time. This method identified the Food Standards Scotland website as the best performing .gov.uk domain, registering 100% uptime and an average load time of 0.005 seconds.
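The ranking itself is a straightforward filter-and-sort, along the lines of the sketch below. The field names and sample rows are invented for illustration and are not StatusCake's data schema.

```python
# Each result row: domain, average uptime (%) and average load time (seconds).
# Domain names and values here are placeholders, not real test results.
sample_results = [
    {"domain": "a.example.gov.uk", "uptime_pct": 100.0, "avg_load_s": 0.005},
    {"domain": "b.example.gov.uk", "uptime_pct": 100.0, "avg_load_s": 0.47},
    {"domain": "c.example.gov.uk", "uptime_pct": 97.1,  "avg_load_s": 0.12},
]

def rank_best_performers(results):
    """Keep only domains with 100% uptime, fastest average load time first."""
    perfect = [r for r in results if r["uptime_pct"] == 100.0]
    return sorted(perfect, key=lambda r: r["avg_load_s"])

print(rank_best_performers(sample_results)[0]["domain"])   # a.example.gov.uk
```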
[Interactive chart: http://datawrapper.dwcdn.net/wm5te/2/]
We’ve looked at some of the best performing .gov.uk domains, but now it’s time to take a look at some of the domains which didn’t fare so well over the course of our testing. We were surprised to find that 287 (11%) of domains experienced average downtime of over two hours during the course of a month, which equates to over a day of downtime over the course of a year.

If the two hours of downtime per month experienced by 11% of domains is a worry, the 10 hours of monthly downtime must be a serious concern for the 6% of websites which registered an average uptime of 98.6% or lower. Unfortunately, for some domains, it gets worse, as 4% of .gov.uk websites tested experienced average downtime of over one full day in the course of the month, representing over 12 days of downtime over the course of a year!

We were able to identify a best performing domain in our testing period, and it’s only fair that we also recognize the poorest performing domain we tested. That dubious honour goes to Kidderminster Town Council, who experienced an average uptime of just 17.47% in our testing period of June 2018.
Our motivation in carrying out this research was not to identify the best or worst performing .gov.uk domains, but to understand how UK government websites fare when it comes to average uptime. The answer is a mixed bag, with 22% of domains falling short of the generally recognised standard of 99.90% minimum average uptime. Despite this, over three quarters of the domains we tested either met or exceeded this standard, meaning that the government is doing more right than wrong when it comes to maintaining its websites.
Nevertheless, as we have highlighted in the course of our research, website downtime is as serious a problem for national and local authorities as it is for businesses. If a local council website is down for just two hours a month, that adds up to one whole day over the course of the year in which local residents cannot access potentially crucial information about their council.
The research also raises the question of the extent to which local authorities are satisfying the ‘Local Government Digital Service Standard’, a 15-point common approach for local authorities to deliver good quality, user-centred, value-for-money digital services. While there are no specific guidelines on average uptime, the standard does stipulate that local authority websites should ‘plan for the event of the digital service being taken temporarily offline’. You can find out more about the Local Government Digital Service Standard in our article, StatusCake & UK Local Government Digital Service Standard.
Ultimately, this research reinforces the need to invest in a dedicated website uptime and performance monitoring service that can alert you the moment your site goes down. StatusCake provides a suite of performance monitoring tools which are easy to set up and use, and which give you invaluable insights into how your website’s performance is impacting your customers’ experiences.