Cloudflare Analytics Engine
Connect & Ingest data from Cloudflare Analytics Engine
Cloudflare Analytics Engine is a time-series analytics database that allows you to store and query custom metrics from your Cloudflare Workers. The Sling Cloudflare Analytics Engine connector extracts data via the SQL API, automatically discovering all tables (datasets) in your account.
CLI Pro Required: APIs require a CLI Pro token or Platform Plan.
Setup
The following credentials and inputs are accepted:
Secrets:
`api_token` (required) -> Your Cloudflare API Token with Analytics Engine read permissions
`account_id` (required) -> Your Cloudflare Account ID
Inputs:
`anchor_date` (optional) -> The starting date for historical data extraction (default: 90 days ago). Format: `YYYY-MM-DD`
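For reference, the default anchor works out to 90 days before the current date. A minimal Python sketch of that computation (the function name is illustrative, not part of Sling):

```python
from datetime import date, timedelta

def default_anchor_date(today=None):
    """Return the default anchor_date: 90 days before `today`, as YYYY-MM-DD."""
    today = today or date.today()
    return (today - timedelta(days=90)).strftime("%Y-%m-%d")

# Example: running on 2024-06-01 yields an anchor of 2024-03-03.
print(default_anchor_date(date(2024, 6, 1)))  # 2024-03-03
```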
Getting Your Credentials
Account ID
Log in to your Cloudflare Dashboard
Select any domain in your account
Your Account ID is displayed in the right sidebar under "API"
API Token
Go to Cloudflare API Tokens
Click Create Token
Select Create Custom Token
Configure the token:
Token name: Sling Analytics Engine (or your preferred name)
Permissions: Account > Account Analytics > Read
Account Resources: Include your account
Click Continue to summary and then Create Token
Copy the token immediately (it won't be shown again)
Important: Make sure your API token has the Account Analytics: Read permission. Standard API keys won't work; you need a scoped API token.
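To confirm the token and account ID work before configuring Sling, you can query the Analytics Engine SQL API directly with curl (substitute your own values for the placeholders):

```shell
# Replace $ACCOUNT_ID and $API_TOKEN with your values.
curl -s "https://api.cloudflare.com/client/v4/accounts/$ACCOUNT_ID/analytics_engine/sql" \
  -H "Authorization: Bearer $API_TOKEN" \
  -d "SHOW TABLES"
```

A successful response lists your datasets; an authentication error means the token is missing the Account Analytics: Read permission.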
Using `sling conns`
You can configure the connection with the `sling conns set` command, an environment variable, or the Sling env.yaml file.
See here to learn more about the sling env.yaml file.
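A minimal env.yaml entry might look like the following (the connection name is arbitrary, and the `type` identifier shown is an assumption; check the connector reference for the exact value):

```yaml
connections:
  cloudflare_analytics:          # arbitrary connection name
    type: cloudflare_analytics   # assumed type identifier
    api_token: "<your_api_token>"
    account_id: "<your_account_id>"
```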
With custom anchor date:
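A sketch of the same connection with an explicit `anchor_date` input (the connection shape is an assumption; adjust to match your env.yaml):

```yaml
connections:
  cloudflare_analytics:
    type: cloudflare_analytics   # assumed type identifier
    api_token: "<your_api_token>"
    account_id: "<your_account_id>"
    anchor_date: "2024-01-01"    # start historical extraction here
```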
Dynamic Endpoints
The Cloudflare Analytics Engine connector uses dynamic endpoints to automatically discover all tables (datasets) in your account. When you run sling conns discover, the connector:
- Queries `SHOW TABLES` via the SQL API
- Creates an endpoint for each discovered table
- Names each endpoint using snake_case (e.g., `my_dataset` for a table named "MyDataset")
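The snake_case naming can be illustrated with a short Python sketch (this mirrors the "MyDataset" example, not the connector's exact internal implementation):

```python
import re

def to_snake_case(name: str) -> str:
    """Insert an underscore before each interior capital, then lowercase."""
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

print(to_snake_case("MyDataset"))  # my_dataset
```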
To see available endpoints:
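Assuming a connection named `cloudflare_analytics`, the discover command lists them:

```shell
sling conns discover cloudflare_analytics
```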
Replication
Here's an example replication configuration to sync Analytics Engine data to a PostgreSQL database:
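A sketch of such a replication (connection names, dataset name, and target schema are placeholders):

```yaml
source: cloudflare_analytics
target: postgres

defaults:
  mode: incremental
  object: analytics.{stream_table}

streams:
  my_dataset:
```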
Full refresh example:
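The full-refresh variant swaps the mode (names are placeholders):

```yaml
source: cloudflare_analytics
target: postgres

defaults:
  mode: full-refresh
  object: analytics.{stream_table}

streams:
  my_dataset:
```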
Incremental Sync
The connector uses time-based incremental sync with the timestamp field:
- First run: Fetches all records from `anchor_date` (default: 90 days ago) to present
- Subsequent runs: Only fetches records with `timestamp` >= last synced timestamp
The connector automatically:
- Tracks the maximum `timestamp` value from each run
- Uses this as the starting point for the next run
- Paginates through large datasets (10,000 records per page)
Data Schema
Analytics Engine tables have a flexible schema. Common columns include:
| Column | Description |
| --- | --- |
| `timestamp` | Event timestamp (used for incremental sync) |
| `index1` | First index dimension |
| `blob1` ... `blob20` | String blob columns |
| `double1` ... `double20` | Numeric double columns |
The exact columns depend on how you've configured your Analytics Engine datasets.
Rate Limiting
The connector is configured with conservative rate limiting:
- Rate: 5 requests per second
- Concurrency: 3 parallel requests
- Retry: Exponential backoff on 429 (rate limit) responses
Common Use Cases
Sync All Analytics Data
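For example, using a wildcard stream to replicate every discovered dataset (assuming Sling's `*` stream wildcard; other names are placeholders):

```yaml
source: cloudflare_analytics
target: postgres

defaults:
  mode: incremental
  object: analytics.{stream_table}

streams:
  "*":
```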
Sync Specific Datasets
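To replicate only selected datasets, list them explicitly (the dataset names here are hypothetical):

```yaml
source: cloudflare_analytics
target: postgres

defaults:
  mode: incremental
  object: analytics.{stream_table}

streams:
  page_views:
  api_requests:
```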
Historical Backfill
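One way to backfill more history is to set an earlier `anchor_date` on the connection and then run the replication in full-refresh mode (the connection shape and date are illustrative):

```yaml
# env.yaml: push the anchor back before running a full-refresh replication
connections:
  cloudflare_analytics:
    type: cloudflare_analytics   # assumed type identifier
    api_token: "<your_api_token>"
    account_id: "<your_account_id>"
    anchor_date: "2023-01-01"    # extract from this date forward
```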
If you are facing issues connecting, please reach out to us at [email protected], on Discord, or open a GitHub Issue here.