Cloudflare Analytics Engine

Connect & Ingest data from Cloudflare Analytics Engine

Cloudflare Analytics Engine is a time-series analytics database that allows you to store and query custom metrics from your Cloudflare Workers. The Sling Cloudflare Analytics Engine connector extracts data via the SQL API, automatically discovering all tables (datasets) in your account.

Setup

The following credentials and inputs are accepted:

Secrets:

  • api_token (required) -> Your Cloudflare API Token with Analytics Engine read permissions

  • account_id (required) -> Your Cloudflare Account ID

Inputs:

  • anchor_date (optional) -> The starting date for historical data extraction (default: 90 days ago). Format: YYYY-MM-DD

Getting Your Credentials

Account ID

  1. Log in to your Cloudflare Dashboard

  2. Select any domain in your account

  3. Your Account ID is displayed in the right sidebar under "API"

API Token

  1. In the Cloudflare Dashboard, go to My Profile > API Tokens, then click Create Token

  2. Select Create Custom Token

  3. Configure the token:

    • Token name: Sling Analytics Engine (or your preferred name)

    • Permissions: Account > Account Analytics > Read

    • Account Resources: Include your account

  4. Click Continue to summary and then Create Token

  5. Copy the token immediately (it won't be shown again)

Using sling conns
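
One way to set up the connection is with the sling conns set command. This is a sketch: the connection name CLOUDFLARE_ANALYTICS and the type value cloudflare_analytics are assumptions — substitute the connector's actual type identifier:

```bash
# Connection name and type identifier are assumptions; adjust as needed
sling conns set CLOUDFLARE_ANALYTICS type=cloudflare_analytics \
  api_token='<your-api-token>' \
  account_id='<your-account-id>'

# Verify the connection works
sling conns test CLOUDFLARE_ANALYTICS
```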

Environment Variable
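
Alternatively, export the connection as a JSON environment variable. Again, the connection name and type value below are assumptions:

```bash
# Connection name and type identifier are assumptions; adjust as needed
export CLOUDFLARE_ANALYTICS='{
  "type": "cloudflare_analytics",
  "api_token": "<your-api-token>",
  "account_id": "<your-account-id>"
}'
```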

Sling Env File YAML

See here to learn more about the sling env.yaml file.
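
A sketch of the env.yaml entry, assuming the type identifier is cloudflare_analytics:

```yaml
connections:
  cloudflare_analytics:
    type: cloudflare_analytics  # assumed type identifier
    api_token: <your-api-token>
    account_id: <your-account-id>
```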

With custom anchor date:
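
The same entry with the optional anchor_date input set (values are placeholders):

```yaml
connections:
  cloudflare_analytics:
    type: cloudflare_analytics  # assumed type identifier
    api_token: <your-api-token>
    account_id: <your-account-id>
    anchor_date: "2024-01-01"   # start historical extraction from this date
```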

Dynamic Endpoints

The Cloudflare Analytics Engine connector uses dynamic endpoints to automatically discover all tables (datasets) in your account. When you run sling conns discover, the connector:

  1. Queries SHOW TABLES via the SQL API

  2. Creates an endpoint for each discovered table

  3. Names each endpoint using snake_case (e.g., my_dataset for a table named "MyDataset")

To see available endpoints:
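
For example, assuming a connection named cloudflare_analytics:

```bash
# Lists the endpoints (datasets) discovered via SHOW TABLES
sling conns discover cloudflare_analytics
```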

Replication

Here's an example replication configuration to sync Analytics Engine data to a PostgreSQL database:
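
A minimal sketch, assuming a source connection named cloudflare_analytics, a target connection named postgres, and a hypothetical dataset called my_dataset; {stream_name} is Sling's runtime variable for the current stream:

```yaml
source: cloudflare_analytics
target: postgres

defaults:
  mode: incremental
  object: 'analytics.{stream_name}'  # one target table per dataset

streams:
  my_dataset:
```

Run it with sling run -r followed by the path to this file.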

Full refresh example:
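
The same configuration with full-refresh mode, which reloads the whole retention window on every run (connection and dataset names assumed as above):

```yaml
source: cloudflare_analytics
target: postgres

defaults:
  mode: full-refresh
  object: 'analytics.{stream_name}'

streams:
  my_dataset:
```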

Incremental Sync

The connector uses time-based incremental sync with the timestamp field:

  • First run: Fetches all records from anchor_date (default: 90 days ago) to present

  • Subsequent runs: Only fetches records with timestamp >= last synced timestamp

The connector automatically:

  • Tracks the maximum timestamp value from each run

  • Uses this as the starting point for the next run

  • Paginates through large datasets (10,000 records per page)

Analytics Engine data has a 90-day retention by default. Set anchor_date appropriately based on your retention settings.

Data Schema

Analytics Engine tables have a flexible schema. Common columns include:

  • timestamp -> Event timestamp (used for incremental sync)

  • index1 -> First index dimension

  • blob1...blob20 -> String blob columns

  • double1...double20 -> Numeric double columns

The exact columns depend on how you've configured your Analytics Engine datasets.

Rate Limiting

The connector is configured with conservative rate limiting:

  • Rate: 5 requests per second

  • Concurrency: 3 parallel requests

  • Retry: Exponential backoff on 429 (rate limit) responses

Common Use Cases

Sync All Analytics Data
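
A sketch using Sling's wildcard stream to pull every discovered dataset (connection names assumed):

```yaml
source: cloudflare_analytics
target: postgres

defaults:
  mode: incremental
  object: 'analytics.{stream_name}'

streams:
  '*':  # expands to all discovered datasets
```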

Sync Specific Datasets
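
To sync only certain datasets, list them as streams. Here worker_metrics and api_requests are hypothetical dataset names:

```yaml
source: cloudflare_analytics
target: postgres

defaults:
  mode: incremental
  object: 'analytics.{stream_name}'

streams:
  worker_metrics:
  api_requests:
```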

Historical Backfill
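
Since anchor_date is a connection input, a backfill is configured on the connection itself. A sketch, keeping the 90-day default retention in mind (date is a placeholder):

```yaml
connections:
  cloudflare_analytics:
    type: cloudflare_analytics  # assumed type identifier
    api_token: <your-api-token>
    account_id: <your-account-id>
    anchor_date: "2024-10-01"   # earliest date still within your retention window
```

Then run the replication once in full-refresh mode to load the history, and switch back to incremental for ongoing syncs.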

If you are facing issues connecting, please reach out to us at [email protected], on Discord, or open a GitHub Issue here.
