What the GSC API gives you
Google Search Console is the single most important free SEO data source available. The API exposes the same data you see in the GSC dashboard — but instead of you having to log in and look, OpenClaw pulls it on a schedule and delivers exactly what matters.
- Average Position: the average ranking position of a page or keyword across all queries where it appeared.
- Impressions: how many times your pages appeared in search results, even if not clicked.
- Clicks: how many times searchers clicked through to your site from a search result.
- CTR: click-through rate, clicks divided by impressions. Low CTR on top-ranked pages often means the title or meta needs work.
- By Page: performance data broken down per URL, so you can see which pages are growing or falling.
- By Query: performance data per search query, so you can see exactly which keywords drive your traffic.
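CTR is the ratio that ties the other metrics together. A quick Python sketch of the arithmetic (the numbers are invented for illustration):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: clicks divided by impressions."""
    return clicks / impressions if impressions else 0.0

# A page in the top 5 that gets 40 clicks from 2,000 impressions
# has a 2% CTR, a hint that the title or meta needs work.
print(f"{ctr(40, 2000):.1%}")  # 2.0%
```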
Setting up GSC API access (step by step)
This is a one-time setup. It takes about 15 minutes. You need a Google account with your site verified in Google Search Console.
1. Create a Google Cloud project. Go to console.cloud.google.com, click "New Project", and give it a name like "openclaw-seo". This is free and takes 30 seconds.
2. Enable the Search Console API. In your project, go to APIs & Services → Library. Search for "Google Search Console API" and click Enable.
3. Create a service account. Go to APIs & Services → Credentials → Create Credentials → Service Account. Name it "openclaw-reader", role "Viewer". Download the JSON key file — this is what OpenClaw will use to authenticate.
4. Add the service account to your GSC property. In Google Search Console, go to your property → Settings → Users and permissions → Add user. Enter the service account email (it looks like openclaw-reader@your-project.iam.gserviceaccount.com), set permission to "Full", and click Add.
5. Store the credentials in OpenClaw. Place the downloaded JSON key file somewhere safe on your server (e.g. ~/.openclaw/gsc-credentials.json). Reference this path in your AGENTS.md configuration below.
6. Test the connection. Ask OpenClaw in chat: "Pull my top 10 ranking keywords from Google Search Console for the last 7 days." If it responds with real data, the connection is working.
Security note: the gsc-credentials.json file grants read access to your GSC data. Store it outside your web root and never commit it to a public Git repository. Set its file permissions to 600 (readable and writable only by the owner).
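If you prefer to enforce that from code rather than remember to run chmod, a small Python sketch (the function name is mine):

```python
import os
import stat

def lock_down(path: str) -> None:
    """Restrict a credentials file to owner read/write (mode 600)."""
    os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)  # equivalent to chmod 600
    mode = stat.S_IMODE(os.stat(path).st_mode)
    if mode != 0o600:
        raise PermissionError(f"unexpected mode {oct(mode)} on {path}")
```

Call it once after downloading the key, e.g. `lock_down(os.path.expanduser("~/.openclaw/gsc-credentials.json"))`.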
Configuring OpenClaw — AGENTS.md
Add this to your AGENTS.md to teach OpenClaw how to use your GSC data:
## rank-tracker
You are the SEO rank tracking agent for [yourdomain.com].
Credentials: ~/.openclaw/gsc-credentials.json
GSC property URL: https://yourdomain.com/
When asked for rank data, use the Google Search Console
API (searchanalytics.query endpoint) with these defaults:
- Date range: last 7 days (ending 3 days ago to account
for data delay)
- Dimensions: ["query", "page"]
- Row limit: 1000
- Data state: "final"
Key pages to monitor closely (alert on drops >3 positions):
- / (homepage)
- /[your most important page]
- /[your second most important page]
Alert threshold for all other pages: drop of 5+ positions
for any page currently ranked in the top 20.
When generating reports, always include:
- Current position vs. position in the comparison period
- Change (+ is improvement, - is decline)
- Clicks and impressions for the period
- CTR (flag any page with position 1-5 but CTR below 3%)
Comparison period: always compare current week vs
the same 7-day window from 4 weeks ago (not last week,
to avoid day-of-week bias).
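The defaults above translate directly into a searchanalytics.query request body. A minimal Python sketch (the helper name is mine; the field names are those of the Search Console API):

```python
from datetime import date, timedelta

def build_query_body(days: int = 7, delay: int = 3,
                     row_limit: int = 1000) -> dict:
    """Request body for searchanalytics.query: a 7-day window
    ending 3 days ago, to account for GSC's data delay."""
    end = date.today() - timedelta(days=delay)
    start = end - timedelta(days=days - 1)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query", "page"],
        "rowLimit": row_limit,
        "dataState": "final",
    }

# With google-api-python-client and google-auth installed, the call
# looks roughly like this (untested sketch):
#   from google.oauth2 import service_account
#   from googleapiclient.discovery import build
#   creds = service_account.Credentials.from_service_account_file(
#       "/home/you/.openclaw/gsc-credentials.json",
#       scopes=["https://www.googleapis.com/auth/webmasters.readonly"])
#   service = build("searchconsole", "v1", credentials=creds)
#   rows = service.searchanalytics().query(
#       siteUrl="https://yourdomain.com/",
#       body=build_query_body()).execute()
```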
HEARTBEAT.md templates
Weekly rank report (every Monday morning)
## Weekly Rank Report — Every Monday 07:30
Use rank-tracker agent to pull GSC data.
Compare this week vs. 4 weeks ago.
Generate a report with:
1. Top 10 pages by clicks this week
2. Top 10 queries by impressions
3. Pages with biggest position improvements (top 5)
4. Pages with biggest position declines (top 5)
5. Any pages that dropped out of top 20 entirely
6. Pages in positions 1-10 with CTR below 3%
(title/meta may need optimising)
Send the full report to #seo-reports on Slack.
Format numbers clearly: use ↑ for improvements,
↓ for declines, → for unchanged.
Daily drop alert (weekdays only)
## Rank Drop Alert — Weekdays 09:00
Use rank-tracker agent.
Pull position data for the last 3 days vs the
3 days prior.
If any monitored key page has dropped 3+ positions,
send an immediate alert to [your phone] via WhatsApp:
"⚠️ Rank drop detected: [page URL]
Yesterday: position [X]
Today: position [Y]
Change: [Z] positions
Queries affected: [top query losing position]"
If no significant drops, do not send a message.
Monthly deep analysis
## Monthly SEO Performance Report — 1st of month, 08:00
Use rank-tracker agent.
Pull the full 28-day period vs the same period
last month and vs the same period last year (if available).
Report:
1. Overall organic clicks and impressions trend
2. Top 20 queries by clicks — position and CTR
3. New keywords entering top 10 this month
4. Keywords that fell out of top 10 this month
5. Pages with consistent CTR below 2% despite
ranking in positions 1-5 (optimisation opportunities)
6. Any query where we rank in positions 8-12
(low-hanging fruit — content improvement could
push to page 1 top 5)
Send to [your email] as a formatted summary.
Rank drop alert logic
The most valuable single thing you can automate in SEO is catching ranking drops before they compound. Here is how to think about alert thresholds:
Tier 1 — Critical pages (homepage, top landing pages): Alert on any drop of 3+ positions. These pages drive the most revenue and traffic; a 3-position drop is meaningful and may require immediate action.
Tier 2 — Top-20 pages: Alert on drops of 5+ positions. Normal variance means 1-2 position moves happen all the time; a 5+ position drop suggests something changed: a competitor improved, your content lost freshness, or an algorithm update rolled out.
Tier 3 — All other pages: Only alert on drops out of the top 20 entirely. These pages aren't driving significant traffic, so monitoring every fluctuation adds noise without value.
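The three tiers condense into a single decision function. A Python sketch of one reading of the rules above (critical pages and thresholds are yours to configure):

```python
def should_alert(page: str, old_pos: float, new_pos: float,
                 critical_pages: set) -> bool:
    """Tiered rank-drop alerting. Positions are 1-based;
    a larger number means a worse ranking."""
    drop = new_pos - old_pos  # positive means the page fell
    if page in critical_pages:             # Tier 1: critical pages
        return drop >= 3
    if old_pos <= 20:                      # Tiers 2 and 3: top-20 pages
        return drop >= 5 or new_pos > 20   # big drop, or out of top 20
    return False                           # outside top 20: ignore
```

For example, `should_alert("/", 4, 7, {"/"})` fires (critical page, 3-position drop), while `should_alert("/blog/post", 10, 12, {"/"})` stays quiet.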
What the weekly report looks like
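As a sketch of the row format the weekly template asks for (↑ improvement, ↓ decline, → unchanged), with invented numbers:

```python
def format_row(page: str, pos_now: float, pos_then: float) -> str:
    """One report line: position four weeks ago vs now.
    Lower position numbers are better."""
    delta = pos_then - pos_now  # positive means the page improved
    arrow = "↑" if delta > 0 else "↓" if delta < 0 else "→"
    return f"{page}: {pos_then:.1f} {arrow} {pos_now:.1f} ({delta:+.1f})"

print(format_row("/pricing/", 3.2, 5.4))  # /pricing/: 5.4 ↑ 3.2 (+2.2)
```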
Going further — DataForSEO and PageSpeed
GSC is free and covers most rank tracking needs, but two additional API integrations significantly extend what OpenClaw can do:
DataForSEO SERP API: GSC only shows data for queries where you already rank. DataForSEO lets you track any keyword, whether you currently rank for it or not. This is useful for monitoring competitors and tracking keywords you're targeting before you've broken into the top positions. Cost is per-call — typically a few cents per keyword per check, making it affordable to track 50–100 keywords weekly.
## competitor-watch (using DataForSEO)
API endpoint: https://api.dataforseo.com/v3/serp/google/organic/live/regular
Credentials: [your DataForSEO login]
Weekly competitor check — every Wednesday 09:00:
Track these keywords in Google UK:
- "openclaw alternative"
- "ai agent framework"
- "[your target keyword]"
For each keyword, record:
- Our position (if any)
- Top 3 competitor URLs and their positions
- Whether a featured snippet is present, and who holds it
Send a brief summary to #seo-competitive.
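A DataForSEO live SERP request is a POST with HTTP basic auth and a JSON array of task objects. A sketch of building one task per keyword, targeting Google UK (the field names follow DataForSEO's documented task format, but verify against their current docs):

```python
def build_serp_tasks(keywords: list) -> list:
    """One DataForSEO task object per keyword, targeting Google UK."""
    return [
        {
            "keyword": kw,
            "location_name": "United Kingdom",
            "language_name": "English",
        }
        for kw in keywords
    ]

# Posting (untested sketch, requires the `requests` package):
#   import requests
#   resp = requests.post(
#       "https://api.dataforseo.com/v3/serp/google/organic/live/regular",
#       auth=(LOGIN, PASSWORD),
#       json=build_serp_tasks(["openclaw alternative"]))
```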
PageSpeed Insights API: Core Web Vitals are a confirmed ranking factor. The free PageSpeed API lets OpenClaw check your most important pages weekly and alert you if LCP, CLS, or INP scores fall into the "Needs Improvement" or "Poor" range before they affect your rankings.
## core-web-vitals-check — Every Tuesday 08:00
Check Core Web Vitals for these pages using
PageSpeed Insights API (key: [your PSI API key]):
- https://yourdomain.com/
- https://yourdomain.com/pricing/
- https://yourdomain.com/[your top landing page]/
For each page, report:
- LCP (flag if > 2.5s)
- CLS (flag if > 0.1)
- INP (flag if > 200ms)
- Overall performance score (flag if below 70)
If any metric is in "Needs Improvement" or "Poor",
send a WhatsApp alert immediately.
If all metrics are good, log quietly to
~/openclaw/reports/cwv-[date].md
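The flagging logic in that template reduces to comparing three numbers against Google's "Good" boundaries. A Python sketch, assuming you have already extracted the metric values from the PSI response (the exact response paths are in Google's PageSpeed Insights v5 docs):

```python
# Thresholds from the heartbeat template above: Google's "Good"
# boundaries for the three Core Web Vitals.
THRESHOLDS = {"LCP_MS": 2500.0, "CLS": 0.1, "INP_MS": 200.0}

def flag_cwv(metrics: dict) -> list:
    """Given {"LCP_MS": ..., "CLS": ..., "INP_MS": ...}, return
    the names of metrics that are over their threshold."""
    return [name for name, limit in THRESHOLDS.items()
            if metrics.get(name, 0) > limit]

# Fetching the raw data (untested sketch):
#   GET https://www.googleapis.com/pagespeedonline/v5/runPagespeed
#       ?url=https://yourdomain.com/&key=YOUR_PSI_KEY

print(flag_cwv({"LCP_MS": 3100.0, "CLS": 0.05, "INP_MS": 240.0}))
# ['LCP_MS', 'INP_MS']
```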
Frequently asked questions
Is Google Search Console API free?
Yes, completely free. You pay nothing for the API calls. You need a verified GSC property and a Google Cloud service account, both of which are free to create.
How far back does GSC data go?
Google Search Console keeps 16 months of data. You can query data up to 16 months into the past for trend analysis and year-over-year comparisons.
Can OpenClaw alert me when a specific page loses its featured snippet?
Yes, using a combination of GSC API and DataForSEO. GSC tells you when position changes significantly; DataForSEO's SERP features API tells you whether a featured snippet is present and who holds it. Configure OpenClaw to check both weekly for your priority keywords.
What is a good threshold for rank drop alerts?
3+ positions for your critical pages (homepage, top landing pages), 5+ positions for any other top-20 page. Avoid alerting on 1–2 position moves — GSC data has natural variance and you'll receive too many false positives.