Ahrefs Automation: How to Automate SEO Workflows with the Ahrefs API

Matthew Diakonov · 11 min read


Ahrefs provides one of the richest SEO datasets available, but clicking through the UI for every check does not scale. If you manage more than a handful of sites or run recurring audits, you need automation. This guide covers the practical ways to automate Ahrefs, from the official API to Python scripts, third-party connectors, and AI-powered agents.

What You Can Automate in Ahrefs

Not every Ahrefs feature is available programmatically. Before writing code, it helps to know which data endpoints exist and which workflows still require the web UI.

| Workflow | API Available? | Endpoint / Method |
|---|---|---|
| Backlink monitoring | Yes | /v3/site-explorer/all-backlinks |
| Keyword rank tracking | Yes | /v3/rank-tracker/overview |
| Domain overview metrics | Yes | /v3/site-explorer/overview |
| Site audit (crawl) | Yes (trigger + poll) | /v3/site-audit/ |
| Content Explorer search | Yes | /v3/content-explorer/search |
| Keyword research (volume, difficulty) | Yes | /v3/keywords-explorer/overview |
| Batch analysis | Yes | /v3/batch-analysis |
| Custom alerts | No (UI only) | Set up manually in dashboard |
| Competitive analysis overlays | Partial | Combine multiple endpoint calls |

The API uses a credit-based system: each call consumes a number of "API rows" based on how much data it returns, and your plan determines the monthly quota. Starter plans include 500 API rows per month; higher tiers include more. You can check your remaining quota at https://app.ahrefs.com/api/profile.

Setting Up API Access

You need an Ahrefs subscription that includes API access (Lite plan and above). Here is how to get started:

  1. Sign in to your Ahrefs account
  2. Navigate to Account Settings > API (or visit https://app.ahrefs.com/api)
  3. Generate an API token
  4. Store the token securely (environment variable, secrets manager, or keychain)
# Store your API token as an environment variable
export AHREFS_API_TOKEN="your_token_here"

# Quick test: fetch domain rating for a site
curl -s -H "Authorization: Bearer $AHREFS_API_TOKEN" \
  "https://api.ahrefs.com/v3/site-explorer/overview?target=example.com&mode=domain" \
  | python3 -m json.tool

The response includes domain_rating, backlinks, referring_domains, organic_keywords, and organic_traffic estimates.

Warning

API tokens grant full read access to your Ahrefs data. Never commit them to version control. Use environment variables or a secrets manager. Rotate tokens if you suspect exposure.
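The same token hygiene carries over to Python. A minimal sketch: read the token from the environment and fail fast with a clear message if it is missing (the variable name matches the `export` above):

```python
import os
import sys

def load_ahrefs_token() -> str:
    """Read the API token from the environment, failing fast if absent."""
    token = os.environ.get("AHREFS_API_TOKEN")
    if not token:
        sys.exit("AHREFS_API_TOKEN is not set. Export it or load it from your secrets manager.")
    return token
```

Failing at startup beats a cryptic 401 halfway through a cron run.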

Automating Backlink Monitoring with Python

Backlink changes are one of the highest-value things to track automatically. New referring domains, lost backlinks, and toxic link patterns all affect rankings. Here is a Python script that checks for new backlinks daily and sends a summary:

import os
import json
import requests
from datetime import datetime, timedelta

AHREFS_TOKEN = os.environ["AHREFS_API_TOKEN"]
BASE_URL = "https://api.ahrefs.com/v3"
TARGET_DOMAIN = "yoursite.com"

def get_new_backlinks(days_back=7):
    """Fetch backlinks discovered in the last N days."""
    date_from = (datetime.now() - timedelta(days=days_back)).strftime("%Y-%m-%d")
    
    resp = requests.get(
        f"{BASE_URL}/site-explorer/all-backlinks",
        headers={"Authorization": f"Bearer {AHREFS_TOKEN}"},
        params={
            "target": TARGET_DOMAIN,
            "mode": "domain",
            "date_from": date_from,
            "limit": 50,
            "order_by": "domain_rating:desc",
        },
    )
    resp.raise_for_status()
    return resp.json().get("backlinks", [])

def format_report(backlinks, days_back=7):
    """Build a plain-text summary of new backlinks."""
    if not backlinks:
        return f"No new backlinks in the last {days_back} days."
    
    lines = [f"New backlinks for {TARGET_DOMAIN} (last {days_back} days):", ""]
    for bl in backlinks:
        dr = bl.get("domain_rating", "?")
        source = bl.get("url_from", "unknown")
        anchor = bl.get("anchor", "(no anchor)")
        lines.append(f"  DR {dr} | {source}")
        lines.append(f"         anchor: {anchor}")
        lines.append("")
    return "\n".join(lines)

if __name__ == "__main__":
    backlinks = get_new_backlinks()
    report = format_report(backlinks)
    print(report)
    
    # Save to file for downstream processing
    with open("/tmp/backlink_report.json", "w") as f:
        json.dump(backlinks, f, indent=2)

Run this on a cron schedule (0 9 * * 1 for every Monday at 9am) and pipe the output to your Slack channel or email.
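For the Slack side, an incoming webhook is the simplest bridge. A sketch using only the standard library (the webhook URL is something you create in your own Slack workspace; when it is not configured, the report falls back to stdout so cron's mail capture still sees it):

```python
import json
import os
import urllib.request

def build_slack_payload(report: str) -> dict:
    """Wrap a plain-text report in the shape Slack incoming webhooks expect."""
    return {"text": report}

def post_to_slack(report: str) -> None:
    """POST the report to the webhook configured in SLACK_WEBHOOK_URL."""
    webhook = os.environ.get("SLACK_WEBHOOK_URL")
    if not webhook:
        print(report)  # no webhook configured: fall back to stdout
        return
    req = urllib.request.Request(
        webhook,
        data=json.dumps(build_slack_payload(report)).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```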

Automated Rank Tracking and Alerting

Ahrefs Rank Tracker stores historical position data. You can pull it via API and build custom alerts that are more flexible than the built-in ones:

def check_rank_drops(project_id, threshold=5):
    """Flag keywords that dropped more than N positions."""
    resp = requests.get(
        f"{BASE_URL}/rank-tracker/overview",
        headers={"Authorization": f"Bearer {AHREFS_TOKEN}"},
        params={"project_id": project_id},
    )
    resp.raise_for_status()
    keywords = resp.json().get("keywords", [])
    
    drops = []
    for kw in keywords:
        current = kw.get("position", 0)
        previous = kw.get("previous_position", 0)
        if previous > 0 and current - previous >= threshold:
            drops.append({
                "keyword": kw["keyword"],
                "was": previous,
                "now": current,
                "drop": current - previous,
                "url": kw.get("url", ""),
            })
    
    return sorted(drops, key=lambda d: d["drop"], reverse=True)

This gives you a list of keywords where you lost 5+ positions, sorted by severity. You can trigger this weekly and send alerts only when there are actual drops, cutting through the noise of the default Ahrefs email digests.
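To turn that list into an alert that only fires when something actually dropped, a small formatter on top of check_rank_drops is enough (pure string handling, no extra API calls — the caller skips the notification when it returns None):

```python
def format_drop_alert(drops, threshold=5):
    """Return an alert message, or None when there is nothing to report."""
    if not drops:
        return None
    lines = [f"{len(drops)} keyword(s) dropped {threshold}+ positions:"]
    for d in drops:
        lines.append(f"  '{d['keyword']}': {d['was']} -> {d['now']} ({d['url']})")
    return "\n".join(lines)
```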

[Diagram: Ahrefs automation pipeline — a cron job (daily/weekly) triggers the Python script, the script fetches and analyzes data via the Ahrefs API v3 endpoints, and results flow to Slack, email, or a dashboard.]

Site Audit Automation

Ahrefs Site Audit crawls your site and reports issues (broken links, missing meta tags, slow pages, redirect chains). You can trigger crawls programmatically and poll for results:

def trigger_site_audit(project_id):
    """Start a new crawl for the given project."""
    resp = requests.post(
        f"{BASE_URL}/site-audit/crawl",
        headers={"Authorization": f"Bearer {AHREFS_TOKEN}"},
        json={"project_id": project_id},
    )
    resp.raise_for_status()
    return resp.json()

def get_audit_issues(project_id, min_importance="warning"):
    """Pull issues from the latest completed audit."""
    resp = requests.get(
        f"{BASE_URL}/site-audit/issues",
        headers={"Authorization": f"Bearer {AHREFS_TOKEN}"},
        params={
            "project_id": project_id,
            "importance": min_importance,
        },
    )
    resp.raise_for_status()
    return resp.json().get("issues", [])

A common pattern: trigger a crawl every Sunday night, check for completion Monday morning, then generate a report of new issues compared to the previous week. This catches regressions (broken internal links, new 404s from content changes) before they affect rankings.
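The week-over-week comparison itself is just a set difference over issue identifiers. A sketch, assuming each issue dict carries an "id" field (adjust the key to whatever the real payload uses):

```python
import json

def diff_new_issues(current, previous, key="id"):
    """Return issues present in this week's audit but absent last week."""
    seen = {issue.get(key) for issue in previous}
    return [issue for issue in current if issue.get(key) not in seen]

def load_snapshot(path):
    """Load a previously saved issues list, tolerating a missing file."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return []  # first run: everything counts as new
```

Save each Monday's issues to a dated JSON file, then diff against the previous snapshot before sending the report.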

Third-Party Integration Options

If you prefer not to write Python scripts, several platforms connect to the Ahrefs API with pre-built workflows:

| Platform | What It Does | Best For |
|---|---|---|
| Zapier / Make | Connect Ahrefs triggers to 5,000+ apps | Non-technical teams |
| Google Sheets (via Apps Script) | Pull Ahrefs data into spreadsheets | Ad-hoc reporting |
| Looker Studio (via connector) | Build live dashboards from Ahrefs data | Client-facing reports |
| n8n (self-hosted) | Open-source workflow automation | Teams wanting full control |
| Supermetrics | Pull SEO data into Sheets, BigQuery, or Data Studio | Enterprise reporting |

For Zapier or Make, the typical flow is: schedule a trigger (e.g. every Monday), call the Ahrefs API endpoint, filter/transform the data, then push it to Slack, email, or a spreadsheet.

Automating with AI Agents

The newest approach to Ahrefs automation involves AI agents that can interpret SEO data and take action. Instead of writing rigid if/then scripts, you describe what you want in natural language and the agent handles the API calls, analysis, and reporting.

For example, an AI agent can:

  • Pull your backlink data, identify which new links come from high-authority domains, and draft outreach follow-ups
  • Monitor rank drops, cross-reference with recent content changes in your CMS, and suggest fixes
  • Run competitive gap analysis by comparing your keyword profile against three competitors and prioritizing opportunities

This is where tools like Fazm come in. Fazm lets you build AI agents that interact with your development environment, including SEO tooling. You can set up an agent that runs Ahrefs API calls on a schedule, analyzes the results, and commits changes (like updating meta descriptions or internal links) directly to your codebase.

Tip

When building automation scripts, start with the single endpoint that saves you the most time. For most teams, that is backlink monitoring or rank tracking. Add complexity only after the basic pipeline is running reliably.

Common Pitfalls

  • Burning through API credits too fast. Each API call costs rows. A naive script that fetches all backlinks for 50 domains daily will blow through your quota in a week. Use limit parameters, cache results locally, and batch requests where possible.

  • Not handling rate limits. The Ahrefs API returns 429 Too Many Requests when you exceed the per-second limit. Always implement exponential backoff:

import time

def api_call_with_retry(url, headers, params, max_retries=3):
    for attempt in range(max_retries):
        resp = requests.get(url, headers=headers, params=params)
        if resp.status_code == 429:
            wait_time = 2 ** attempt
            print(f"Rate limited. Waiting {wait_time}s...")
            time.sleep(wait_time)
            continue
        resp.raise_for_status()
        return resp.json()
    raise Exception("Max retries exceeded")
  • Comparing metrics across different time windows. Ahrefs recalculates Domain Rating, traffic estimates, and keyword positions on different schedules. If you compare a Monday snapshot to a Friday snapshot, the underlying index may have changed. Pin your comparisons to the same day of week.

  • Ignoring the mode parameter. The API supports domain, prefix, exact, and subdomains modes. Using the wrong mode gives you wildly different numbers. For most automation, domain mode is what you want.

  • Not validating responses before processing. API responses can include empty arrays, null values, or changed field names after API updates. Always check that the expected fields exist before indexing into them.
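The caching and validation advice above can be combined into one thin wrapper: serve a recent local copy when one exists, and check for the expected field before anything downstream indexes into it (the 24-hour TTL and the cache path are arbitrary choices):

```python
import json
import os
import time

CACHE_TTL = 24 * 60 * 60  # one day, in seconds

def cached_fetch(cache_path, fetch_fn, expected_key):
    """Return cached JSON when fresh; otherwise call fetch_fn and validate."""
    if os.path.exists(cache_path) and time.time() - os.path.getmtime(cache_path) < CACHE_TTL:
        with open(cache_path) as f:
            return json.load(f)
    data = fetch_fn()
    if expected_key not in data:  # guard against renamed or missing fields
        raise ValueError(f"Response missing expected field: {expected_key}")
    with open(cache_path, "w") as f:
        json.dump(data, f)
    return data
```

Pass your real API call as `fetch_fn`; repeated runs inside the TTL window cost zero API rows.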

Minimal Working Example: Weekly SEO Report

Here is a complete, runnable script that generates a weekly SEO summary for a single domain:

#!/usr/bin/env python3
"""Weekly Ahrefs SEO report. Run via cron: 0 9 * * 1 python3 weekly_report.py"""

import os
import json
import requests
from datetime import datetime

TOKEN = os.environ["AHREFS_API_TOKEN"]
DOMAIN = "yoursite.com"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
BASE = "https://api.ahrefs.com/v3"

def get_overview():
    r = requests.get(f"{BASE}/site-explorer/overview",
                     headers=HEADERS,
                     params={"target": DOMAIN, "mode": "domain"})
    r.raise_for_status()
    return r.json()

def get_top_pages(limit=10):
    r = requests.get(f"{BASE}/site-explorer/top-pages",
                     headers=HEADERS,
                     params={"target": DOMAIN, "mode": "domain",
                             "limit": limit, "order_by": "organic_traffic:desc"})
    r.raise_for_status()
    return r.json().get("pages", [])

def get_new_referring_domains(limit=20):
    r = requests.get(f"{BASE}/site-explorer/refdomains",
                     headers=HEADERS,
                     params={"target": DOMAIN, "mode": "domain",
                             "limit": limit, "order_by": "domain_rating:desc"})
    r.raise_for_status()
    return r.json().get("refdomains", [])

def main():
    overview = get_overview()
    top_pages = get_top_pages()
    ref_domains = get_new_referring_domains()
    
    report = {
        "date": datetime.now().isoformat(),
        "domain": DOMAIN,
        "domain_rating": overview.get("domain_rating"),
        "organic_keywords": overview.get("organic_keywords"),
        "organic_traffic": overview.get("organic_traffic"),
        "backlinks": overview.get("backlinks"),
        "referring_domains": overview.get("referring_domains"),
        "top_pages": [{"url": p.get("url"), "traffic": p.get("organic_traffic")}
                      for p in top_pages[:5]],
        "recent_referring_domains": [{"domain": d.get("domain"),
                                       "dr": d.get("domain_rating")}
                                      for d in ref_domains[:10]],
    }
    
    print(json.dumps(report, indent=2))

if __name__ == "__main__":
    main()

Save this as weekly_report.py, set the environment variable, and add it to your crontab. The JSON output can feed into Slack webhooks, email templates, or dashboards.

Wrapping Up

Ahrefs automation turns a manual SEO workflow into a repeatable system. Start with one script (backlink monitoring or rank tracking), get it running reliably on a cron schedule, and expand from there. The API is well-documented and the credit system is predictable once you understand the costs per endpoint.

Fazm builds AI agents that automate developer workflows, including SEO pipelines that pull from tools like Ahrefs and act on the data.
