When the drop hits
- doug56778
- Nov 7
- 3 min read
Building a real-time tracker for GA4 data delivery times
I have a low tolerance for the misuse of instant messages.
You know the type. The single "Hi" in a chat, followed by an agonising silence. Or worse, the classic late-Friday manager message: "We need to talk." That one sends a whole weekend's worth of anxiety spiralling, even if the "talk" is just to offer praise.
But when messaging systems are used well, they're brilliant.
At Duga Digital, we love good, timely, automated communication. Our team Slack is wired up to tell us when a Git commit happens, when new release notes are posted, or, most importantly for this story, when our Google Analytics (GA4) data is exported to BigQuery (BQ).

This level of information is incredibly useful... until it raises a new question.
The "Spidey-Sense" Problem

"Dan," I asked one day, "is it me, or are our BigQuery drops getting later and later?"
You know that feeling. Your brain gets an itch. Your spidey-sense tingles. Something just feels off.
Dan's response was logical: "The data is all right there in Slack. Run the numbers."
He was right, but parsing Slack messages is messy. I'd much rather analyze this in BigQuery itself. This "itch" was the perfect excuse to build a proper tracking system.
From Slack Noise to BQ Gold
We already had the foundation in place. Our system uses Eventarc triggers to push messages to Pub/Sub, which in turn trigger Cloud Run functions to handle our BQ workloads and send those handy webhook messages to Slack.
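To make the flow concrete, here is a minimal sketch of how a Cloud Run function might decode the Pub/Sub push envelope for a freshly created GA4 export table. The field layout of the payload is an assumption modelled on Cloud Audit Logs, and `handle_pubsub_push` is an illustrative name, not our production code.

```python
import base64
import json

def handle_pubsub_push(envelope: dict) -> dict:
    """Decode a Pub/Sub push envelope of the kind an Eventarc
    trigger delivers. Payload field names are illustrative,
    loosely based on the Cloud Audit Logs shape."""
    msg = envelope["message"]
    payload = json.loads(base64.b64decode(msg["data"]).decode("utf-8"))
    # Hypothetical audit-log resource name for a new GA4 export table,
    # e.g. "projects/p/datasets/analytics_123/tables/events_20250101".
    resource = payload["protoPayload"]["resourceName"]
    parts = resource.split("/")
    return {"dataset_id": parts[3], "table_id": parts[5]}
```

From the returned `dataset_id` and `table_id` the function has everything it needs for both the Slack message and the log row.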
The goal: Modify this system to not only notify us but to log the delivery time.
With my recent interest in MCP servers, the Slack MCP server was an easy setup. I was already using Conversational Analytics and a few other tools, so I had the agents pull the historical data out of Slack and import it into a new BQ table.
While they did the heavy lifting, I modified our existing Cloud Run Function.
Now, when the function confirms the data set is available, it does two things:
Sends the usual Slack message.
Inserts a new row into our bq_delivery_logs table.
# Insert into BigQuery
from datetime import datetime, timezone

now = datetime.now(timezone.utc)
row = [{
    "event_timestamp": now.isoformat(),
    "project_id": project_id,
    "dataset_id": dataset_id,
    "readable_date": now.date().isoformat(),
}]
# insert_rows_json returns a list of errors; empty means success
errors = bq_client.insert_rows_json(BQ_TABLE, row)
if errors:
    raise RuntimeError(f"BigQuery insert failed: {errors}")
(Accessibility note: The code block above shows a Python list of dictionaries being passed to the BigQuery client's insert_rows_json method. The row includes an ISO-formatted timestamp, the project ID, the dataset ID, and a readable date.)
With this in place, the fun bit started. We could finally graph the data.
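Before graphing, the logged timestamps have to be turned into a delay figure. One plausible definition, and the one assumed in this sketch, is hours past midnight UTC on the day the export lands; `delivery_delay_hours` is an illustrative helper, not our exact query.

```python
from datetime import datetime

def delivery_delay_hours(event_timestamp: str) -> float:
    """Hours between midnight UTC on the delivery day and the logged
    event_timestamp -- one plausible way to define 'delivery time'."""
    ts = datetime.fromisoformat(event_timestamp)
    midnight = ts.replace(hour=0, minute=0, second=0, microsecond=0)
    return (ts - midnight).total_seconds() / 3600

# Two sample rows shaped like bq_delivery_logs entries
rows = [
    {"event_timestamp": "2025-11-01T09:50:00+00:00"},
    {"event_timestamp": "2025-11-02T10:10:00+00:00"},
]
avg = sum(delivery_delay_hours(r["event_timestamp"]) for r in rows) / len(rows)
```

The same arithmetic is easy to express as a scheduled query over the log table once you settle on a definition of "delay".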
The Results: A Tale of Two GA4 Tiers
We have two main GA4 properties we track: one on the free tier (Standard) and one on the paid tier (360). The data volume is otherwise identical. We expected 360 to be faster, but the pattern surprised us.
GA4 Standard (Free Tier)
The free tier has been remarkably consistent. Aside from a few glitches, the data delivery time has slowly grown, which seems to line up with our increasing traffic and data volume.
On average, the free GA4 tier delivers its BigQuery dataset in 9 hours and 50 minutes.

GA4 360 (Paid Tier)
And the 360 property? This is where it got interesting.
Yes, it's faster, averaging just over 7 hours. But look at the volatility. It has recently shown significant increases in delay, climbing from under 7 hours to over 8 hours, with no corresponding configuration changes, imports, or new workloads on our end.

Our Takeaway (And Your Turn)
This simple tracking system turned a "gut feeling" into a valuable, actionable insight. For us, it's clear that while GA4 360 is faster, it's not immune to processing delays, and in our case, those delays are currently more volatile than the free tier.
We're watching this closely.
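"Watching closely" can itself be automated. A minimal sketch: flag any day whose delay exceeds the trailing average by some margin. The window and threshold here are illustrative values, not what we run in production.

```python
from statistics import mean

def delay_alert(delays_hours: list[float], window: int = 14,
                threshold: float = 1.5) -> bool:
    """Flag when the latest delivery delay exceeds the trailing
    `window`-day average by more than `threshold` hours."""
    if len(delays_hours) <= window:
        return False  # not enough history for a baseline
    baseline = mean(delays_hours[-window - 1:-1])
    return delays_hours[-1] - baseline > threshold
```

Wire the result back into the same Slack webhook and the drift announces itself instead of tingling your spidey-sense.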
It leaves us with a few questions, and I'd love to hear your thoughts:
Do you track your BigQuery data drop delays?
Have you seen similar patterns with 360 vs. Standard?
How do you automate your delivery notices and downstream workloads?
Share your experiences in the comments.