


Google recently announced that Google Analytics 3 (Universal Analytics) will come to an end in July 2023 (now extended until October 2023), and it has encouraged all users to start using the new GA4. Google Analytics 4 is the next version of the GA reporting portal that users rely on to analyse the performance of their sites.
When the launch of GA4 was made public in October 2020, Google gave all existing users until July 2023 to migrate from GA3 to the newer and better GA4, which is plenty of time for both small and large organisations to switch.
The question now, however, is this: should you switch Universal Analytics off and move to GA4 immediately, or should you wait? The best approach is to run both simultaneously at first, to make sure the new Google Analytics has all the relevant reports in place; otherwise you will miss out on that data. Although Google Analytics 4 offers many new features, it may take some time to get all of your reports set up.
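As a rough sketch of what "running both simultaneously" can look like in practice, dual tagging with gtag.js sends the same hits to both properties by adding a second config command. The measurement IDs below are placeholders, and the data layer is stubbed here so the snippet stands alone without the gtag.js library:

```javascript
// Minimal dual-tagging sketch (placeholder IDs; gtag.js itself not loaded here).
// On a real page this lives in the <head>, after the async gtag.js <script> tag.
globalThis.dataLayer = globalThis.dataLayer || [];
function gtag() { dataLayer.push(arguments); }

gtag('js', new Date());
gtag('config', 'UA-XXXXXXX-1'); // existing Universal Analytics (GA3) property
gtag('config', 'G-XXXXXXXXXX'); // new GA4 property; both now receive the same hits

// Each command above is queued on the data layer for gtag.js to process:
console.log(dataLayer.length); // 3
```

Because both `config` commands share one tag, you keep your historical GA3 reporting running while GA4 starts accumulating data in parallel.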
So do you need to switch at all? The short answer is yes! If you want to get the most out of Google Analytics and keep using the reports you rely on, you must update to GA4. If you don't, then by the time GA3 stops working in October 2023, you and your business will be left high and dry without the tracking and reports you need to monitor your website's performance. The result? Endless issues for the internal teams that rely on this data for both strategy and sales, and that use it to deploy improvements and adjustments across your business.
So what's actually different? Firstly, one of the fundamental differences is the UI. When you first open the reporting UI in GA4, you may be surprised by how much it differs from what you're used to in GA3.
View functionality is also changing: GA4 offers only one view for your reports, which differs significantly from the 25 currently offered by GA3. The reason? A cleaner, simpler layout that shows everything you need to report on in one place. You can still filter what you see, however, using "data streams".
Probably the biggest, and most shocking, difference of all is how GA4 measures activity on your website. For example, GA3 uses timed sessions, whilst GA4 doesn't use time alone to define a session. Surprised yet? Wait until you hear this one…
Forget about bounce rate. Yep, I said it. Bounce rate is going in GA4, and in its place you'll see "engagement rate". Unlike bounce rate, engagement rate takes into account the time visitors spend on the landing page, which can give you an even better understanding of how they interact with your website.
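To make the contrast concrete, Google defines an "engaged session" in GA4 as one that lasts at least 10 seconds, has a conversion event, or has at least two page views; engagement rate is the share of sessions that qualify. The sketch below illustrates that definition with hypothetical session objects (these fields are not a real GA4 API):

```javascript
// GA4's engaged-session rule: 10+ seconds, OR a conversion, OR 2+ page views.
function isEngaged(session) {
  return session.durationSeconds >= 10
    || session.hadConversion
    || session.pageViews >= 2;
}

// Hypothetical sessions for illustration only:
const sessions = [
  { durationSeconds: 4,  hadConversion: false, pageViews: 1 }, // a GA3-style "bounce"
  { durationSeconds: 45, hadConversion: false, pageViews: 1 }, // engaged (time on page)
  { durationSeconds: 8,  hadConversion: true,  pageViews: 1 }, // engaged (conversion)
];

// Engagement rate = engaged sessions / total sessions.
const engagementRate = sessions.filter(isEngaged).length / sessions.length;
console.log(engagementRate.toFixed(2)); // "0.67"
```

Notice that the first session would count against you as a bounce in GA3, while the second, a single-page visit where the reader actually spent time, counts as engaged in GA4.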
These are just the basics. Stay tuned for our comprehensive guide to the changes between GA3 and GA4, coming soon!