Google Service Account power-up

Updated Apr 20, 2026

The Google Service Account power-up connects your sGTM container to Google Cloud services such as BigQuery and Firestore. A single JSON credentials file is all you need to start reading, writing, and enriching event data directly from your sGTM container.

Google Service Account is available on the Free subscription plan and higher. To check your current plan or upgrade, go to your sGTM container settings.

How to set up Google Service Account

Step 1 | Activate the power-up

1. Log in to your Stape account and select your sGTM container from the dashboard.

2. Go to Power-ups and click Use next to the Google Service Account panel.

3. Toggle the Google Service Account switch to enable it.

Step 2 | Obtain the service account credentials file

1. Open the Google Cloud Console and go to IAM & Admin > Service Accounts. Click Create service account.

2. Enter a name, and click Create and continue.

3. Assign the appropriate role:

  • BigQuery Data Editor for BigQuery access
  • Cloud Datastore User for Firestore access
  • Both roles if you need access to both services

4. Click Continue and Done.

5. Click the newly created account, open the Keys tab, then click Add key > Create new key.

6. In the pop-up, select JSON and click Create. A JSON file will download to your computer.

7. Back in Stape, click Select file and upload the JSON file you downloaded.

8. Click Save changes.
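As a sanity check before uploading, you can confirm the downloaded file looks like a valid service account key. The sketch below checks for fields that every Google service account JSON key contains (the file name in the commented example is hypothetical):

```python
import json

# Fields present in every Google service account JSON key.
REQUIRED_FIELDS = {
    "type", "project_id", "private_key_id", "private_key",
    "client_email", "client_id", "token_uri",
}

def validate_key(key: dict) -> list[str]:
    """Return a list of problems; an empty list means the key looks valid."""
    missing = sorted(REQUIRED_FIELDS - key.keys())
    if key.get("type") != "service_account":
        missing.append("type=service_account")
    return missing

# Example: load and check a downloaded key file (path is hypothetical).
# with open("my-project-key.json") as f:
#     print(validate_key(json.load(f)))
```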

Step 3 | Set up Google Cloud Services

BigQuery

1. In your sGTM container, open Templates, click Search Gallery, and add the Write to BigQuery tag from the Template Gallery or import it from GitHub.

2. Create a new tag using that template and configure what data to write:

  • All Event Data - writes all event data available in the sGTM event to your BigQuery table. Only fields that match your table schema are accepted; the rest are discarded.
  • Custom Data Only - writes only the specific fields you define, with no default event data included.

Optionally enable Add Event Timestamp to include a millisecond timestamp. The target BigQuery column must be of the INTEGER type.
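The All Event Data behavior can be illustrated with a small sketch: fields not in the table schema are dropped before the row is written, and the optional timestamp is an integer count of milliseconds (the field and column names here are illustrative, not the tag's internals):

```python
import time

def build_row(event_data: dict, schema_fields: set[str],
              add_timestamp: bool = False) -> dict:
    # Keep only fields that exist in the BigQuery table schema;
    # everything else is discarded.
    row = {k: v for k, v in event_data.items() if k in schema_fields}
    if add_timestamp:
        # Millisecond timestamp; the target column must be INTEGER.
        row["event_timestamp"] = int(time.time() * 1000)
    return row

event = {"event_name": "purchase", "value": 42.0, "debug_flag": True}
schema = {"event_name", "value", "event_timestamp"}
print(build_row(event, schema))  # debug_flag is discarded
```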

3. Set your trigger and save the tag.

Firestore

1. Open the Firebase console, click Create a project, and select the existing Google Cloud project you used for the service account.

2. Follow the Firebase instructions to create a database. Choose a Cloud Firestore location that matches your sGTM server location. You can check your server location in your Stape container settings. When selecting a starting mode, choose Production.

3. Click Start a collection and define the collection path you'll use in your tag (e.g. purchase).

4. In your sGTM container, open Templates, click Search Gallery, and add the Firestore Writer tag from the Template Gallery.

5. Create a new tag using that template and configure it:

  • Firebase Path - path to your Firestore collection or document. Mustn’t start or end with a '/'. If you point to a collection, a document with a random ID will be created automatically.
  • Add Event Data - sends all sGTM event data to the Firestore document. Custom Data fields override Event Data fields where they overlap.
  • Merge document keys - merges incoming data with the existing document instead of overwriting it entirely.
  • Override Firebase Project ID - by default, the tag reads the project ID from the GOOGLE_CLOUD_PROJECT environment variable, which is automatically set when your sGTM container runs on Google Cloud. Enable this option only if you need to use a Firebase project that is different from the one your container is running on.
  • Skip null or undefined values - allows skipping items from customDataList with undefined or null values.
  • Add Timestamp - adds a millisecond timestamp to the document.
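A rough sketch of how the path rule and the merge/skip options interact, under simplified semantics (the real tag runs inside sGTM; function names here are illustrative):

```python
def check_path(path: str) -> bool:
    # Firebase Path mustn't start or end with '/'.
    return bool(path) and not path.startswith("/") and not path.endswith("/")

def write_document(existing: dict, incoming: dict,
                   merge: bool = False, skip_nulls: bool = False) -> dict:
    if skip_nulls:
        # Skip null or undefined values: drop empty items before writing.
        incoming = {k: v for k, v in incoming.items() if v is not None}
    if merge:
        # Merge document keys: keep existing fields, overlay incoming ones.
        return {**existing, **incoming}
    # Default: the incoming data replaces the document entirely.
    return dict(incoming)
```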

Use Custom Data to add or override specific fields, such as user_email or user_id. If sending user data to production, make sure to hash it before writing.
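For example, an email can be hashed with SHA-256 from Python's standard library before it is written; the normalization step (trim and lowercase) is an assumption and should match whatever your downstream consumers expect:

```python
import hashlib

def hash_pii(value: str) -> str:
    # Lowercase and trim before hashing so the same email always
    # produces the same digest.
    normalized = value.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical Custom Data payload with the email hashed.
custom_data = {"user_id": "u-123", "user_email": hash_pii("Jane@Example.com ")}
```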

6. Set your trigger and save the tag.

7. To read data back from Firestore, create a Firestore Lookup variable in sGTM. 

8. Set your Collection Path and define a query condition (e.g. match user_id in Firestore against a _gid variable in sGTM) to retrieve stored values and use them in other tags.
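Conceptually, the lookup scans the collection for a document matching your condition and returns a stored field, as in this simplified sketch (the collection contents and field names are hypothetical):

```python
def firestore_lookup(collection: list[dict], field: str, value, return_field: str):
    """Return return_field from the first document where field == value."""
    for doc in collection:
        if doc.get(field) == value:
            return doc.get(return_field)
    return None

# Hypothetical documents stored in a 'users' collection.
users = [
    {"user_id": "GA1.2.111", "lifetime_value": 250},
    {"user_id": "GA1.2.222", "lifetime_value": 90},
]
print(firestore_lookup(users, "user_id", "GA1.2.222", "lifetime_value"))  # 90
```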

Testing

Verify the connection is working by triggering a test event through your sGTM container and checking that data lands in the expected destination:

  • BigQuery: Navigate to your dataset in the BigQuery console and confirm a new row appears after the test event fires. See Explore BigQuery in the Google Cloud console for instructions on browsing table data.
  • Firestore: Open your Firestore database in the Cloud Console and check that the relevant document was created or updated. See Use Firestore Studio for instructions on viewing and filtering documents.

Use case

Consider an eCommerce store that wants to understand which products drive the most revenue, but whose GA4 reports are sampled and can't be sliced the way the team needs. The marketing team is making decisions based on aggregated totals that hide what's actually selling.

You can fix this by pushing raw event data into BigQuery for direct querying:

  1. Set up the Google Service Account power-up with a JSON key for a service account with the BigQuery Data Editor role. Then add a Write to BigQuery tag in your sGTM container and map your purchase event fields to a BigQuery table.
  2. Once events start flowing in, run SQL queries directly against your table to break down revenue by product, date range, or traffic source without any sampling.
  3. Compare the numbers against your GA4 reports. If GA4 was previously sampling your data, you'll see more granular and accurate totals in BigQuery, and your team will be able to answer questions about products that weren't possible before.
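The kind of query in step 2 can be sketched like this. It is shown against an in-memory SQLite table for illustration; the table and column names are assumptions, and in practice you would run the equivalent SQL in the BigQuery console against your own schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_name TEXT, item_id TEXT, value REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("purchase", "sku-1", 30.0), ("purchase", "sku-2", 10.0),
     ("purchase", "sku-1", 20.0), ("page_view", "sku-1", 0.0)],
)

# Revenue by product, unsampled: every raw purchase row is counted.
rows = conn.execute(
    """SELECT item_id, SUM(value) AS revenue
       FROM events WHERE event_name = 'purchase'
       GROUP BY item_id ORDER BY revenue DESC"""
).fetchall()
print(rows)  # [('sku-1', 50.0), ('sku-2', 10.0)]
```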
