Azure Table
Extract data from Azure Table Storage
Setup
The following credential keys are accepted:
conn_str (optional) -> The full connection string
account_name (optional) -> The Azure Storage account name
account_key (optional) -> The account key for authentication
sas_token (optional) -> The SAS token for authentication
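If you do not have these values at hand, the Azure CLI can retrieve them for an existing storage account. A minimal sketch, assuming the Azure CLI is installed and you are logged in; the account and resource group names are placeholders:
# Full connection string
$ az storage account show-connection-string --name myaccount --resource-group myresourcegroup
# Account keys
$ az storage account keys list --account-name myaccount --resource-group myresourcegroup
# Account-level SAS token scoped to the Table service (read/list; adjust permissions and expiry as needed)
$ az storage account generate-sas --account-name myaccount --services t --resource-types sco --permissions rl --expiry 2030-01-01T00:00Z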
Using sling conns
Here are examples of setting a connection named AZURE_TABLE. We must provide the type=azuretable property:
# Using connection string
$ sling conns set AZURE_TABLE type=azuretable conn_str="<connection_string>"
# Using account key
$ sling conns set AZURE_TABLE type=azuretable account_name=<account_name> account_key=<account_key>
# Using SAS token
$ sling conns set AZURE_TABLE type=azuretable account_name=<account_name> sas_token=<sas_token>
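After setting the connection, you can verify it and list the available tables:
# Test connectivity
$ sling conns test AZURE_TABLE
# List available tables (streams)
$ sling conns discover AZURE_TABLE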
Environment Variable
# Using connection string
export AZURE_TABLE='{ type: azuretable, conn_str: "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey;EndpointSuffix=core.windows.net" }'
# Using account key
export AZURE_TABLE='{ type: azuretable, account_name: "myaccount", account_key: "mykey" }'
# Using SAS token
export AZURE_TABLE='{ type: azuretable, account_name: "myaccount", sas_token: "?sv=2020-08-04&ss=t&srt=sco..." }'
Sling Env File YAML
See here to learn more about the sling env.yaml file.
connections:
  AZURE_TABLE:
    type: azuretable
    account_name: myaccount
    account_key: <account_key>
  AZURE_TABLE_SAS:
    type: azuretable
    account_name: myaccount
    sas_token: '?sv=2020-08-04&ss=t&srt=sco...'
  AZURE_TABLE_CONN_STR:
    type: azuretable
    conn_str: DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey;EndpointSuffix=core.windows.net
  AZURE_TABLE_DEFAULT_AUTH:
    type: azuretable
    account_name: myaccount
    # Uses DefaultAzureCredential when no key/sas/conn_str provided
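When only account_name is set (as in AZURE_TABLE_DEFAULT_AUTH above), authentication falls back to DefaultAzureCredential, which resolves credentials from the standard Azure identity sources (environment variables, managed identity, or an Azure CLI login). A minimal sketch, assuming a service principal with access to the storage account; all values are placeholders:
# Service principal variables picked up by DefaultAzureCredential
export AZURE_TENANT_ID="<tenant_id>"
export AZURE_CLIENT_ID="<client_id>"
export AZURE_CLIENT_SECRET="<client_secret>"
# Alternatively, an Azure CLI login session also satisfies DefaultAzureCredential
az login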
Examples
Extract data from Azure Table Storage
source: azure_table
target: postgres
defaults:
  mode: full-refresh
  object: public.{stream_table}
streams:
  default.customers:
  default.orders:
    select: [PartitionKey, RowKey, OrderId, CustomerId, Amount, OrderDate]
  default.products:
    limit: 1000
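To execute a replication such as the one above, save the YAML to a file and pass it to sling run (the file name below is just an example):
$ sling run -r azure_table_to_postgres.yaml
# add -d for debug output
$ sling run -r azure_table_to_postgres.yaml -d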
Incremental Loading
source: postgres
target: azure_table
defaults:
  mode: incremental
  primary_key: [id]
  update_key: Timestamp
streams:
  public.events:
    object: default.events
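In incremental mode, sling uses the update_key to pull only records newer than what was previously loaded and the primary_key to upsert them, so these replications are usually scheduled. A minimal cron sketch; the file path and log location are assumptions:
# Run the incremental replication every 15 minutes
*/15 * * * * sling run -r /path/to/postgres_to_azure_table.yaml >> /var/log/sling_events.log 2>&1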
Working with Filters
source: azure_table
target: snowflake
streams:
  default.logs:
    object: public.logs
    # Use OData filter syntax
    where: "PartitionKey eq '2024-01' and Status eq 'active'"
  default.transactions:
    object: public.transactions
    # Filter by timestamp
    where: "Timestamp ge datetime'2024-01-01T00:00:00Z'"
If you are facing issues connecting, please reach out to us at [email protected], on Discord, or open a GitHub issue here.