Using GA4 with BigQuery for Product Analytics

Leverage GA4 and BigQuery with Mitzu.io to gain deeper insights into user behavior and revenue impact with a streamlined, warehouse-native solution.
István Mészáros · 8 min read

Google Analytics 4 (GA4), introduced in October 2020, is positioned as the successor to Universal Analytics (UA).
GA4 aimed to raise privacy standards in data collection and incorporated more AI and machine learning than UA, offering potential benefits for marketers.

In a nutshell, Google wanted GA4 to become the next marketing and product analytics solution for any business.
Google announced that GA4 would replace Universal Analytics in July 2023 and that historical data would not transfer from UA to GA4.
Two years later, it is painfully clear that the transition was neither smooth nor well received.

Here are the top complaints about GA4:

  • GA4 is Difficult to Use: GA4's charts, reports, and dashboards are complicated to navigate, and most marketers easily get lost.
  • Bad User Interface: The primary concern marketers raise about GA4 is its user interface. Filtering, segmenting users, or simply finding a feature is complex at best; users easily get lost in endless menus and modals.
  • Data Lag: In the free tier, GA4 data lags by a day or more, which is often insufficient for marketers.
  • Data Discrepancies: Data quality is often mentioned as GA4's primary problem. Insights in GA4 can diverge from, or even contradict, insights from other tools, which usually leads to confusion.
  • Insufficient for Complex Analytics: While GA4 delivers simple insights well enough, anyone who wants to dig deeper into the data quickly hits a wall.

BigQuery to the rescue 

While the GA4 user interface may be sufficient for basic requirements, answering more complex questions with it can be impossible. Luckily, you can export all your GA4 data to BigQuery for free (up to a limit).
Here is a great description and video from Google on how to do it:

https://support.google.com/analytics/answer/9358801?hl=en

You can set up daily batch exports or stream events to BigQuery continuously.
While the streaming approach requires you to share your credit card details (it needs a project with billing enabled), it solves the data lag issue:
you will see all your GA4-tracked events in BigQuery almost immediately.

If data lag does not concern you, stick to the daily batch export. It is free.
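
If you enable both modes, you can check how fresh the streamed data is. A minimal sanity check, assuming the standard export table names (daily batches land in events_YYYYMMDD tables, streamed events in events_intraday_YYYYMMDD):


SELECT
  -- Latest event the streaming export has delivered so far today.
  MAX(TIMESTAMP_MICROS(event_timestamp)) AS latest_streamed_event
FROM `<project>.<dataset>.events_intraday_*`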

GA4 data model in BigQuery

Having your GA4 data in BigQuery comes with many benefits. The obvious one is that you can connect Looker Studio or similar data tools to it.
However, there is a caveat: the GA4 data model is very complex. You practically must be an SQL expert to answer any question from the raw BigQuery data.
The main issue with the BigQuery GA4 data is the REPEATED type of the event_params column.

Schema for GA4 with Repeated Types

Repeated types are BigQuery's arrays. The main issue with arrays in SQL is that you need to know the index of the element you want to access.
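
For illustration, positional access looks like this. It is a sketch with a placeholder table name, and it is brittle because nothing guarantees that a given parameter sits at a given position:


SELECT
  -- Grabs the *first* parameter of each event, whatever it happens to be.
  event_params[OFFSET(0)].value.string_value
FROM `<project>.<dataset>.events_YYYYMMDD`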


SELECT
  event_name,
  user_pseudo_id,
  event_timestamp,
  params.key,
  params.value
FROM `<project>.<dataset>.events_YYYYMMDD`,
  -- Flatten the repeated event_params array: one output row per parameter.
  UNNEST(event_params) AS params
WHERE
  event_name = 'page_view'

The only sensible way to access repeated types is BigQuery's UNNEST function, as the query above shows.

However, UNNEST multiplies the number of rows per event, which often makes the dataset impractical to query. Worse, multiple rows per event break a fundamental concept of event modeling in the data warehouse: every user interaction with your landing page or app is modeled as a single row.
The single-row-per-event concept is essential for larger projects because it is easy to reason about as a data analyst or data scientist. With multiple rows per event, you always have to keep deduplication in mind.
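
You can make the blow-up visible by comparing the raw event count with the unnested row count (a sketch; the table name is a placeholder):


-- unnested_rows will be roughly event_rows times the average number of parameters per event.
SELECT
  (SELECT COUNT(*)
   FROM `<project>.<dataset>.events_YYYYMMDD`) AS event_rows,
  (SELECT COUNT(*)
   FROM `<project>.<dataset>.events_YYYYMMDD`,
     UNNEST(event_params)) AS unnested_rows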

Unnested GA4 data in BigQuery

Transforming the GA4 model into a clean event model

A widely accepted data model for product or landing page events is called one big table (OBT), and GA4 data already follows it.
Still, dealing with event properties stored in REPEATED types is a real headache in SQL. As we've discussed, unnesting these repeated types breaks the fundamental concept of one event per row.

Currently, the only viable way in BigQuery to avoid REPEATED types while keeping the single-row-per-event concept is to model the properties, initially stored as REPEATED types, as a JSON column.

Why? The keys of the JSON object represent the names of the event properties, while the JSON values hold the corresponding property values.
This way, a property value is addressed not by its index in an array (REPEATED type) but by the property's name.
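
As a toy example of the idea, BigQuery's JSON_OBJECT function builds such an object from an array of keys and an array of values (the property names below are illustrative):


SELECT
  -- Keys become property names; values become the property values.
  JSON_OBJECT(
    ['page_title', 'page_location'],
    ['Home', 'https://example.com/']) AS event_params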

More about event modeling concepts here: Modeling events in a data warehouse

One caveat is that this model change requires a periodic data transformation.
The easiest way is to create a new table daily that contains all the event and user properties in JSON columns.

The code below transforms the REPEATED types of varying lengths into JSON columns, preserving the single-row-per-event concept while making the properties accessible by name instead of by index.


CREATE OR REPLACE TABLE `<project>.<dataset>.ga4_events` AS (
  WITH
    -- Collapse the repeated event_params into one JSON value per event.
    event_properties AS (
      SELECT
        event_timestamp,
        user_id,
        user_pseudo_id,
        event_name,
        JSON_OBJECT(
          ARRAY_AGG(p.key),
          ARRAY_AGG(COALESCE(
              p.value.string_value,
              CAST(p.value.int_value AS STRING),
              CAST(p.value.float_value AS STRING),
              CAST(p.value.double_value AS STRING)))) AS event_params
      FROM
        `<project>.<dataset>.events_*`,
        UNNEST(event_params) AS p
      GROUP BY 1, 2, 3, 4),
    -- Collapse the repeated user_properties the same way.
    user_properties AS (
      SELECT
        event_timestamp,
        user_id,
        user_pseudo_id,
        event_name,
        JSON_OBJECT(
          ARRAY_AGG(p.key),
          ARRAY_AGG(COALESCE(
              p.value.string_value,
              CAST(p.value.int_value AS STRING),
              CAST(p.value.float_value AS STRING),
              CAST(p.value.double_value AS STRING)))) AS user_properties
      FROM
        `<project>.<dataset>.events_*`,
        UNNEST(user_properties) AS p
      GROUP BY 1, 2, 3, 4)
  -- Join the JSON columns back to the raw events: one row per event.
  SELECT
    TIMESTAMP_MICROS(evt.event_timestamp) AS event_time,
    ep.event_params,
    up.user_properties,
    evt.* EXCEPT (event_timestamp, event_params, user_properties, items)
  FROM
    `<project>.<dataset>.events_*` AS evt
  LEFT JOIN
    event_properties ep
  ON
    evt.event_timestamp = ep.event_timestamp
    AND COALESCE(evt.user_id, '') = COALESCE(ep.user_id, '')
    AND evt.user_pseudo_id = ep.user_pseudo_id
    AND evt.event_name = ep.event_name
  LEFT JOIN
    user_properties up
  ON
    evt.event_timestamp = up.event_timestamp
    AND COALESCE(evt.user_id, '') = COALESCE(up.user_id, '')
    AND evt.user_pseudo_id = up.user_pseudo_id
    AND evt.event_name = up.event_name)

Visit GitHub for the most up-to-date code.
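
If you schedule this transformation daily, you probably want to process only the newest export table instead of rebuilding everything. One way is to filter the wildcard tables on _TABLE_SUFFIX; a minimal sketch, assuming the job runs the morning after each daily export lands:


SELECT *
FROM `<project>.<dataset>.events_*`
-- Restrict the wildcard scan to yesterday's daily export table.
WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))

Applying the same filter to each events_* scan in the query above, and appending the result with an INSERT instead of CREATE OR REPLACE TABLE, turns the full rebuild into an incremental daily load.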

The new schema keeps one row per event: an event_time timestamp, the event_params and user_properties JSON columns, and the remaining original event columns.

Using the transformed event model

The transformed data holds one row per event, with every property readable from the JSON columns.

Once the transformation is done, you can access events and their properties in a human-readable way. Admittedly, you now have to call an extra built-in function (JSON_VALUE).
This is something the BigQuery team should address; other data warehouse solutions support JSON or MAP types natively.


SELECT
  event_name,
  user_pseudo_id,
  event_time
FROM `<project>.<dataset>.ga4_events`
WHERE JSON_VALUE(event_params['page_title']) = 'Mitzu - Simple Warehouse Native Product Analytics'
  AND event_name = 'page_view'

The transformed model is now sufficient for answering complex questions.
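
The same pattern works for numeric properties. The transformation above stores every value as a string inside the JSON, so cast it back as needed; ga_session_number is a standard GA4 parameter, but treat the query as a sketch:


SELECT
  user_pseudo_id,
  -- JSON values were stored as strings, so cast back to a number.
  CAST(JSON_VALUE(event_params['ga_session_number']) AS INT64) AS session_number
FROM `<project>.<dataset>.ga4_events`
WHERE event_name = 'page_view'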

Final thoughts - product analytics over BigQuery

A warehouse-native product analytics solution such as Mitzu can easily generate SQL queries over a dataset modeled with JSON columns.
Mitzu provides a clean, easy-to-understand UI for the GA4 events stored in BigQuery.

GA4 data in BigQuery analysed in Mitzu

Creating funnel insights or measuring visitor retention becomes a simple task this way.
As a data analyst or data scientist, you don't need to worry about writing the SQL queries yourself.

How to get started?

Collect data

Ingest your first- and third-party data into your data warehouse. If you don't yet have a data warehouse, we can help you get started.

Set up Mitzu

Connect Mitzu to your data warehouse just like any other BI tool. List your facts and dimensions tables,
and create an events and properties catalog.

Start making better decisions faster

Surface valuable insights with just a few clicks, no SQL needed. Collaborate with your team on key business questions.

Unbeatable solution for all of your analytics needs

Get started with Mitzu for free and power your teams with data