Sling looks for connection credentials in several places. One of the easiest ways to manage your connections is with the sling conns
sub-command, covered in the next section.
Managing Connections
Sling makes it easy to set, list, and test connections. You can even see the available streams in a connection by using the discover sub-command.
$ sling conns -h
conns - Manage and interact with local connections

See more details at https://docs.slingdata.io/sling-cli/

Usage:
  conns [discover | list | set | test]

Subcommands:
  discover    list available streams in connection
  list        list local connections detected
  test        test a local connection
  unset       remove a connection from the sling env file
  set         set a connection in the sling env file
  exec        execute a SQL query on a Database connection

Flags:
  --version   Displays the program version string.
  -h --help   Displays help with available flag, subcommand, and positional value parameters.
Set Connections
Here we can easily set a connection with the sling conns set
command and later refer to it by name. This ensures credentials are not visible to other users when using process monitors, for example.
# set a connection by providing the key=value pairs
$ sling conns set AWS_S3 type=s3 bucket=sling-bucket access_key_id=ACCESS_KEY_ID secret_access_key="SECRET_ACCESS_KEY"

# we can set a database connection with just the url
$ sling conns set MY_PG url='postgresql://postgres:myPassword@pghost:5432/postgres'
To see what credential keys are necessary/accepted for each type of connector, click below:
File/Storage Connections (see here)
Database Connections (see here)
List Connections
Once connections are set, we can run the sling conns list
command to list our detected connections:
$ sling conns list
+--------------------------+-----------------+-------------------+
| CONN NAME                | CONN TYPE       | SOURCE            |
+--------------------------+-----------------+-------------------+
| AWS_S3                   | FileSys - S3    | sling env yaml    |
| FINANCE_BQ               | DB - BigQuery   | sling env yaml    |
| DO_SPACES                | FileSys - S3    | sling env yaml    |
| LOCALHOST_DEV            | DB - PostgreSQL | dbt profiles yaml |
| MSSQL                    | DB - SQLServer  | sling env yaml    |
| MYSQL                    | DB - MySQL      | sling env yaml    |
| ORACLE_DB                | DB - Oracle     | env variable      |
| MY_PG                    | DB - PostgreSQL | sling env yaml    |
+--------------------------+-----------------+-------------------+
Test Connections
We can also test a connection by running the sling conns test
command:
$ sling conns test LOCALHOST_DEV
9:04AM INF success!
Discover Connections
We can easily discover streams available in a connection with the sling conns discover
command:
$ sling conns discover postgres --pattern public.work*
+---+--------+-------------------+-------+---------+
| # | SCHEMA | NAME              | TYPE  | COLUMNS |
+---+--------+-------------------+-------+---------+
| 1 | public | worker_heartbeats | table | 14      |
| 2 | public | workers           | table | 20      |
| 3 | public | workspaces        | table | 9       |
+---+--------+-------------------+-------+---------+
$ sling conns discover aws_s3
+---+------------------+-----------+---------+-------------------------------+
| # | NAME             | TYPE      | SIZE    | LAST UPDATED (UTC)            |
+---+------------------+-----------+---------+-------------------------------+
| 1 | logging/         | directory | -       | -                             |
| 2 | sling_test/      | directory | -       | -                             |
| 3 | work/            | directory | -       | -                             |
| 4 | temp/            | directory | -       | -                             |
| 5 | records.json     | file      | 442 KiB | 2022-12-07 11:05:01 (1y ago)  |
| 6 | test.sqlite.db   | file      | 4.8 MiB | 2022-12-14 21:00:48 (1y ago)  |
| 7 | test1.parquet    | file      | 48 KiB  | 2024-03-31 22:54:52 (29d ago) |
| 8 | test_1000.csv    | file      | 99 KiB  | 2024-02-23 09:53:13 (67d ago) |
+---+------------------+-----------+---------+-------------------------------+
Show column-level information:
$ sling conns discover postgres -p public.workspaces --columns
+----------+--------+------------+----+--------------+--------------------------+--------------+
| DATABASE | SCHEMA | TABLE      | ID | COLUMN       | NATIVE TYPE              | GENERAL TYPE |
+----------+--------+------------+----+--------------+--------------------------+--------------+
| postgres | public | workspaces | 1  | id           | bigint                   | bigint       |
| postgres | public | workspaces | 2  | account_id   | bigint                   | bigint       |
| postgres | public | workspaces | 3  | name         | text                     | text         |
| postgres | public | workspaces | 4  | short_name   | varchar                  | string       |
| postgres | public | workspaces | 5  | token        | text                     | text         |
| postgres | public | workspaces | 6  | settings     | jsonb                    | json         |
| postgres | public | workspaces | 7  | created_dt   | timestamp with time zone | timestampz   |
| postgres | public | workspaces | 8  | updated_dt   | timestamp with time zone | timestampz   |
| postgres | public | workspaces | 9  | deleted_dt   | timestamp with time zone | timestampz   |
+----------+--------+------------+----+--------------+--------------------------+--------------+
$ sling conns discover aws_s3 -p test1.parquet --columns
+---------------------------------+----+------------------+----------------+--------------+
| FILE                            | ID | COLUMN           | NATIVE TYPE    | GENERAL TYPE |
+---------------------------------+----+------------------+----------------+--------------+
| s3://my-bucket/test1.parquet    | 1  | id               | INT_64         | bigint       |
| s3://my-bucket/test1.parquet    | 2  | first_name       | UTF8           | string       |
| s3://my-bucket/test1.parquet    | 3  | last_name        | UTF8           | string       |
| s3://my-bucket/test1.parquet    | 4  | email            | UTF8           | string       |
| s3://my-bucket/test1.parquet    | 5  | target           | BOOLEAN        | bool         |
| s3://my-bucket/test1.parquet    | 6  | create_dt        | Timestamp      | datetime     |
| s3://my-bucket/test1.parquet    | 7  | date             | Timestamp      | datetime     |
| s3://my-bucket/test1.parquet    | 8  | rating           | DECIMAL        | decimal      |
| s3://my-bucket/test1.parquet    | 9  | code             | DECIMAL        | decimal      |
| s3://my-bucket/test1.parquet    | 10 | json_data        | UTF8           | string       |
| s3://my-bucket/test1.parquet    | 11 | _sling_loaded_at | INT_64         | bigint       |
+---------------------------------+----+------------------+----------------+--------------+
Credentials Location
Sling Env File (env.yaml)
The Sling Env file is the primary way sling reads connections. It needs to be saved at ~/.sling/env.yaml, where ~ denotes the user's home folder, whose location depends on the operating system (see here for Windows, here for Mac and here for Linux). Sling automatically creates the .sling folder in the user home directory, which is typically:
Linux: /home/<username>/.sling, or /root/.sling if the user is root
Mac: /Users/<username>/.sling
Windows: C:\Users\<username>\.sling
The Sling Env File (named env.yaml) lives in that directory and adheres to the structure below. Running sling for the first time will auto-create it. You can alternatively set the SLING_HOME_DIR environment variable to point sling at a different folder.
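For instance, a minimal sketch of overriding the env-file location with SLING_HOME_DIR (the directory used here is just an illustration):

```shell
# Point sling at a custom directory for its env file (the path below is
# only an example). Sling will then look for env.yaml in this directory
# instead of ~/.sling/env.yaml.
export SLING_HOME_DIR="$HOME/.config/sling"
mkdir -p "$SLING_HOME_DIR"
touch "$SLING_HOME_DIR/env.yaml"
```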
To see what credential keys are necessary/accepted for each type of connector, click below:
File/Storage Connections (see here)
Database Connections (see here)
# Holds all connection credentials for Extraction and Loading
connections:
  marketing_pg:
    url: 'postgres://...'
    ssh_tunnel: 'ssh://...' # optional

  # or dbt profile styled
  marketing_pg:
    type: postgres
    host: [hostname]
    user: [username]
    password: [password]
    port: [port]
    dbname: [database name]
    schema: [dbt schema]
    ssh_tunnel: 'ssh://...' # optional

  finance_bq:
    type: bigquery
    method: service-account
    project: [GCP project id]
    dataset: [the name of your dbt dataset]
    keyfile: [/path/to/bigquery/keyfile.json]

# Global variables for specific settings, available to all connections at runtime (Optional)
variables:
  aws_access_key: '...'
  aws_secret_key: '...'
Environment Variables
Sling also reads environment variables. Simply export a connection URL (or YAML payload) in the current shell environment and sling will pick it up.
To see examples of setting environment variables for each type of connector, click below:
File/Storage Connections (see here)
Database Connections (see here)
Mac / Linux
$ export MY_PG='postgresql://user:mypassw@pg.host:5432/db1'
$ export MY_PG='{type: postgres, host: "pg.host", user: user, database: "db1", password: "mypassw", port: 5432}'
$ export MY_SNOWFLAKE='snowflake://user:mypassw@sf.host/db1'
$ export MY_SNOWFLAKE='{type: snowflake, host: "<host>", user: "<user>", database: "<database>", password: "<password>", role: "<role>"}'
$ export ORACLE_DB='oracle://user:mypassw@orcl.host:1521/db1'
$ export BIGQUERY_DB='{type: bigquery, dataset: public, key_file: /path/to/service.json, project: my-google-project}' # yaml or json form is also accepted
$ sling conns list
+---------------+------------------+-----------------+
| CONN NAME     | CONN TYPE        | SOURCE          |
+---------------+------------------+-----------------+
| MY_PG         | DB - PostgreSQL  | env variable    |
| MY_SNOWFLAKE  | DB - Snowflake   | env variable    |
| ORACLE_DB     | DB - Oracle      | env variable    |
| BIGQUERY_DB   | DB - BigQuery    | env variable    |
+---------------+------------------+-----------------+
Copy $ $ env: MY_PG = 'postgresql://user:mypassw@pg.host:5432/db1'
$ $ env: MY_PG = '{type: postgres, host: "pg.host", user: user, database: "db1", password: "mypassw", port: 5432}'
$ $ env: MY_SNOWFLAKE = 'snowflake://user:mypassw@sf.host/db1'
$ $env:MY_SNOWFLAKE='{type: snowflake, host: "sf.host", user: user, database: db1, password: "mypassw", role: "<role>"}'
$ $ env: ORACLE_DB = 'oracle://user:mypassw@orcl.host:1521/db1'
$ $env:BIGQUERY_DB='{type: bigquery, dataset: public, key_file: /path/to/service.json, project: my-google-project}' # yaml or json form is also accepted
$ sling conns list
+---------------+------------------+-----------------+
| CONN NAME     | CONN TYPE        | SOURCE          |
+---------------+------------------+-----------------+
| MY_PG         | DB - PostgreSQL  | env variable    |
| MY_SNOWFLAKE  | DB - Snowflake   | env variable    |
| ORACLE_DB     | DB - Oracle      | env variable    |
+---------------+------------------+-----------------+
DBT Profiles (~/.dbt/profiles.yml)
Sling also reads dbt profiles connections! If you're already set up with dbt cli locally, you don't need to create additional duplicate connections.
See here for more details.
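As an illustration, a dbt profiles.yml entry like the sketch below would be picked up automatically (the snowcastle profile name, target names, and all credential values are placeholders, not real settings):

```yaml
# ~/.dbt/profiles.yml -- profile/target names and credentials are placeholders
snowcastle:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: my_account
      user: my_user
      password: my_password
      database: dev_db
      schema: analytics
```

Judging from the listing below, sling appears to derive the connection name from the profile and target names (e.g. snowcastle + dev becomes SNOWCASTLE_DEV).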
$ sling conns list
+------------------+------------------+-------------------+
| CONN NAME        | CONN TYPE        | SOURCE            |
+------------------+------------------+-------------------+
| SNOWCASTLE_DEV   | DB - Snowflake   | dbt profiles yaml |
| SNOWCASTLE_PROD  | DB - Snowflake   | dbt profiles yaml |
+------------------+------------------+-------------------+