
Superweek 2026 - Day Three

  • 5 days ago
  • 4 min read


KPIs, Consent, Complexity and the Humans Left Standing

If Day One was existential and Day Two was operational, Day Three was foundational.


It forced a deeper question:

Are we actually clear on what we’re trying to achieve before we optimise anything?

From KPIs to tool selection, anomaly detection to AI browsers, consent automation to mindfulness, the third day felt less like a technical summit and more like a systems audit of our industry.


Here’s how it unfolded.


Start With the KPI, Not the Dashboard

Tim Wilson – Doing KPIs Right


Tim opened with something that should be obvious but rarely is: most analytics programs are built on poorly defined KPIs.


He reframed data usage into three buckets:

  1. Performance measurement

  2. Hypothesis validation

  3. Operational enablement


But the core of his talk boiled down to two “magic questions”:

  • What are we trying to achieve?

  • How will we know if we’ve achieved it?


Simple.


Rarely answered properly.


Tim dismantled the classic “we expected 8,500 leads” nonsense. That’s not a KPI. That’s a number pulled from the ether. If you want more leads, just spend more money.


The real measure is efficiency.


100 leads per £1,000. Profit per acquisition.


Margin return on spend.
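Tim's efficiency framing can be sketched as a small calculation. All the numbers below are hypothetical, purely to show the shape of the metrics he contrasted with raw lead counts:

```python
# Illustrative efficiency metrics (all figures are invented for the example).
spend = 1_000.0          # £ spent on acquisition
leads = 100              # leads generated by that spend
revenue = 4_000.0        # revenue attributed to those leads
margin_rate = 0.30       # gross margin on that revenue

cost_per_lead = spend / leads                              # £ per lead
leads_per_thousand = leads / (spend / 1_000)               # leads per £1,000
margin_return_on_spend = (revenue * margin_rate) / spend   # margin earned per £ spent

print(cost_per_lead, leads_per_thousand, margin_return_on_spend)
```

The point of the ratios is that "8,500 leads" can always be bought with more budget; cost per lead and margin return on spend cannot.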


Business outcomes aren’t “harder to measure”. They are measurements. Outputs are verbs. Outcomes are metrics.


One of the sharpest lines of the week:

Alignment is more important than accuracy.

In other words, a slightly imperfect target that everyone believes in beats a perfect metric nobody understands.


If you don’t know what your KPI target should be, you probably don’t understand the commercial plan.


That’s not an analytics problem. That’s a business problem.



Choosing Tools Without Falling Off the Cliff

Fosca Fimiani & Jason Packer – A Journey Into the Unknown


This was a refreshingly sober take on the “GA4 sucks” discourse.


Instead of ranking tools like dog breeds, they evaluated platforms against constraints and organisational reality.


Jason installed 20 different analytics tools and tested them against real use cases.


The takeaway:

  • Rankings are distractions

  • Hype is noise

  • Price has layers (today vs later)

  • Hidden costs matter more than list prices


A few archetypes emerged:

  • Product-led tools with free tiers but limited governance

  • Enterprise platforms with integration depth but complexity

  • Data warehouse + BI custom builds with transparency but high dependency on data architecture


If you’re good at using tools, that doesn’t mean you’re good at selecting them.


Optimise for your organisation, not your ego.



Third-Party Tags: Necessary Evil or Structural Debt?

Lukáš Cech – The Future of Third-Party Tags

Nobody loves third-party tags. Yet nobody removes them.


Lukáš surfaced the quiet tension between marketing and IT that still hasn’t disappeared.


Questions he raised:

  • Does every tag have a business case?

  • Is that business case ongoing or one-time?

  • Why is performance degrading but tags never leaving?


Google and Meta want client-side persistence. Vendors optimise for their own needs.


Meanwhile, performance suffers and governance grows murky.


The room didn’t get a clean answer, and that was the point.


Technology debt accumulates quietly. Very few organisations have a removal strategy.



Hunting for the Unknown Unknowns

Siavash Kanani – Seeking Signals


Siavash delivered one of the most technically impressive sessions of the week.


We’re good at monitoring what we expect. We’re terrible at spotting what we didn’t anticipate.


His anomaly detection engine in GA4 + BigQuery reframed failure as:

A deviation that hasn’t been named yet.

The initial system was powerful but expensive — ARIMA+ models trained across multiple time series daily.


Cost drivers:

  • Runs per day

  • Series per run

  • Bytes processed per training


The breakthrough came from:

  • Incremental Dataform processing

  • Lean training datasets

  • Weekly retraining

  • WARM_START adjustments

  • Surfacing only >98% confidence anomalies

  • Ranking based on business priority


100x cost reduction.


But then came the human problem: too many anomalies.


Volatility isn’t insight.


The solution? Weighting, clustering, context, escalation logic. Human judgment still required.
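The surfacing logic he described can be sketched in a few lines. The field names, threshold, and data below are assumptions for illustration, not his implementation:

```python
# Hypothetical sketch of the surfacing step: drop low-confidence anomalies,
# then rank the rest by business priority (1 = most important).
anomalies = [
    {"series": "checkout_errors", "confidence": 0.995, "priority": 1},
    {"series": "blog_pageviews",  "confidence": 0.990, "priority": 5},
    {"series": "add_to_cart",     "confidence": 0.970, "priority": 2},  # filtered out
]

CONFIDENCE_THRESHOLD = 0.98  # only surface >98% confidence anomalies

surfaced = sorted(
    (a for a in anomalies if a["confidence"] > CONFIDENCE_THRESHOLD),
    key=lambda a: a["priority"],
)

print([a["series"] for a in surfaced])
```

The filtering is what turns a firehose of statistical deviations into a short list a human arbiter can actually act on.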


Anomaly detection doesn’t remove humans. It changes their role from searchers to arbiters.



Consent Automation in the Age of Troll Law Firms

Fred Pike – Consent or Chaos


Fred’s story felt more real than theoretical.


US companies facing five-figure fines now treat compliance as a cost of doing business.


Troll law firms are actively targeting CMP weaknesses.


He demonstrated using LLMs, MCP servers, and Claude to automate CMP updates and container manipulation.


It worked. Then it didn’t. Then Python took over.


The honest takeaway:

AI-assisted operations scale faster than manual processes. But hallucinations are still real. Humans must remain in the loop.


Automation without oversight becomes liability.




AI Browsers and the End of Assumptions

Denis Golubovskyi – Tracking in AI Browsers

AI browsers such as Atlas are restrictive by default:

  • Script blocking

  • Consent suppression

  • Agent-like browsing behaviour


Even with first-party setups, user agents are homogenised. Detection gets harder.

Patterns emerging in BigQuery:

  • Heavy, rapid event sequences

  • Strange UAs

  • Non-human interaction timing
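One of those signals, non-human interaction timing, can be sketched as a simple heuristic. The threshold below is an assumption, not a value from the talk:

```python
# Hedged sketch: flag sessions whose inter-event timing looks non-human.
# The 150ms minimum gap is an illustrative assumption.
def looks_automated(event_timestamps_ms, min_gap_ms=150):
    """Return True if any two consecutive events arrive faster than a human plausibly could."""
    gaps = [b - a for a, b in zip(event_timestamps_ms, event_timestamps_ms[1:])]
    return bool(gaps) and min(gaps) < min_gap_ms

human_session = [0, 1200, 3400, 7800]   # seconds-scale gaps between events
agent_session = [0, 40, 85, 130]        # tens of milliseconds between events

print(looks_automated(human_session))  # False
print(looks_automated(agent_session))  # True
```

Like all heuristics in this space, it produces signals, not verdicts.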


But here’s the uncomfortable truth:

GA4 doesn’t filter bots meaningfully. Detection is heuristic. The arms race continues.


Tools like Clarity can help. Browser APIs like event.isTrusted offer signals.

But the bigger shift is behavioural.


Browsers are becoming agents. Users are outsourcing navigation.


Your funnel assumptions are already outdated.



Complexity: Worth It or Not?

Matt Gershoff – Interactions

Matt revisited complexity through a statistical lens.


Run an additive model. Run a multiplicative model. If the complex one doesn’t materially reduce error, don’t use it.
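Matt's rule can be illustrated with a toy comparison. The data, the candidate models, and the 20% materiality threshold are all invented for the sketch:

```python
# Minimal sketch of "is complexity worth it?": keep the simpler model unless
# the complex one materially reduces error. All values are illustrative.
data = [  # (x1, x2, y) — y is roughly additive in x1 and x2
    (1, 1, 2.0), (2, 1, 3.1), (1, 2, 2.9), (2, 2, 4.0),
]

def mse(predict):
    errs = [(predict(x1, x2) - y) ** 2 for x1, x2, y in data]
    return sum(errs) / len(errs)

additive = lambda x1, x2: x1 + x2                          # no interaction term
with_interaction = lambda x1, x2: x1 + x2 + 0.05 * x1 * x2  # adds an interaction

simple_err, complex_err = mse(additive), mse(with_interaction)

MATERIAL_IMPROVEMENT = 0.20  # demand a 20% error reduction to justify complexity
use_complex = complex_err < simple_err * (1 - MATERIAL_IMPROVEMENT)
print(use_complex)
```

On this data the interaction term buys nothing, so the rule keeps the additive model: utility over truth.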


Truth is less useful than utility.


This theme echoed all week: just because we can build complexity doesn’t mean we should.



Data to Action Requires Architecture

Nicolas Hinternesch – The Activation Framework

Nicolas closed the technical arc.



Data doesn’t turn into action by magic. It requires:

  • Clear questions (why, what, who, where, how)

  • Direct activation (automated systems)

  • Indirect activation (people making decisions)


The concept of “human on the loop” stood out.


Not fully embedded in the system. Not removed from it. Adjacent. Monitoring. Intervening.


Autonomy with oversight.


That’s the pattern emerging across AI, analytics, and governance.



The Quietest but Most Important Talk

Jeremy Lutz – The Mindful Way of Web Analytics

Jeremy ended the day by turning the analytical lens inward.



If we train ourselves to constantly look for what’s broken, we start seeing broken patterns everywhere: in dashboards and in ourselves.


Stress is structural in this industry:

  • Constant tool change

  • Vendor pressure

  • Compliance risk

  • AI uncertainty


His prescription wasn’t fluffy:

  • Breathwork

  • Meditation

  • Journaling

  • Gratitude

  • Choosing your battles


When asked whether the increased need for mindfulness reflects the state of our industry, the answer was yes.


And yes.


The final reflection wasn’t about KPIs or AI.


It was about sustainability.




Day 4 to come!
