Open Container for Bot Index power-up

Updated Apr 22, 2026

The Open Container for Bot Index power-up removes the default restriction that blocks search engine bots from accessing your sGTM container, allowing crawlers to interact with it just like human visitors. This ensures search engines get a complete picture of your site's setup.

Open Container for Bot Index is available on the Free subscription plan and higher. To check your current plan or upgrade, go to your sGTM container settings.

How to set up Open Container for Bot Index

1. Log in to your Stape account and select your sGTM container from the dashboard.
2. Go to the Power-ups tab.
3. Click Use next to the Open Container for Bot Index panel.
4. Toggle the Open Container for Bot Index switch to enable it, and click Save changes.
Testing

Open the robots.txt page of your Stape sGTM container. For example, if your container URL is https://sst.stape.work, navigate to https://sst.stape.work/robots.txt.

You should see one of the following:

  • Allow - the power-up is working correctly.
  • Disallow - the power-up is not active. Double-check that the toggle is enabled and changes were saved.
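If you prefer to automate this check, the sketch below fetches and interprets a robots.txt body in Python. It is a minimal illustration, not part of the Stape product: the `bots_allowed` helper and the blanket-`Disallow: /` heuristic are assumptions, and the container URL is only an example.

```python
import urllib.request

def bots_allowed(robots_txt: str) -> bool:
    """Return False if robots.txt contains a blanket 'Disallow: /'
    directive (bots blocked), True otherwise (power-up active)."""
    for line in robots_txt.splitlines():
        # Strip comments and whitespace before inspecting the directive.
        directive = line.split("#", 1)[0].strip().lower()
        if directive.startswith("disallow:"):
            path = directive.split(":", 1)[1].strip()
            if path == "/":
                return False
    return True

# Example usage (container URL is an assumption; substitute your own):
# with urllib.request.urlopen("https://sst.stape.work/robots.txt") as resp:
#     print("Power-up active:", bots_allowed(resp.read().decode()))
```

Running it against your container URL should print `True` once the power-up is enabled and saved.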
Open the robots.txt page of your Stape sGTM container.

Use case

A sample scenario: an SEO team notices that their sGTM container injects structured data and canonical tags, but those signals aren't consistently picked up during crawls. Their Search Console reports show correct markup in manual inspections, yet coverage issues persist and certain pages fail to surface the expected rich results.

You can identify and fix this problem as follows:

  1. Check your sGTM container's robots.txt file and confirm whether the response returns Disallow for crawlers. If it does, bots are blocked from accessing your container entirely.
  2. Enable the Open Container for Bot Index power-up and save the changes. Re-check robots.txt to confirm it now returns Allow.
  3. After enabling, request re-indexing for the affected pages through Google Search Console and monitor coverage and rich result reports over the following 2–3 weeks to allow crawlers to re-process the pages.
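To confirm that a crawler can actually reach the container (step 1 above), you can send a request with a Googlebot User-Agent header and check the response status. This is a minimal sketch under assumptions: the `crawler_can_reach` helper is hypothetical, and the container URL is only an example.

```python
import urllib.request
import urllib.error

# The public Googlebot User-Agent string, used to simulate a crawler request.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def crawler_can_reach(url: str) -> bool:
    """Request the URL with a Googlebot User-Agent and report
    whether the server answers with HTTP 200."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.URLError:
        return False

# Example usage (container URL is an assumption):
# print(crawler_can_reach("https://sst.stape.work/robots.txt"))
```

Note that a successful fetch with this header only shows the container responds to bot traffic; Google's own crawl behavior is still governed by the robots.txt directives checked in steps 1 and 2.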

If bot access to the container was the root cause, you will see the previously missing rich results begin to appear and coverage errors tied to those pages start to resolve.
