We recommend installing the Elementary CLI using one of the following methods:

Install Elementary CLI

To install the monitor module run:
pip install elementary-data
Run one of the following commands based on your platform (no need to run all):
pip install 'elementary-data[snowflake]'
pip install 'elementary-data[bigquery]'
pip install 'elementary-data[redshift]'
pip install 'elementary-data[databricks]'
pip install 'elementary-data[athena]'
## Postgres doesn't require this step
Run edr --help to ensure the installation was successful. If you receive "command not found: edr", please check our troubleshooting guide.
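If edr is not found, a common cause is that pip's scripts directory is missing from your PATH. A hedged example for a Linux user install (the exact directory depends on your Python setup):

```shell
# Confirm the package was installed and see its install location.
python -m pip show elementary-data

# On Linux, user-installed console scripts often land in ~/.local/bin;
# add that directory to PATH so the shell can find edr.
export PATH="$HOME/.local/bin:$PATH"
```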

Install from source

Install Elementary directly from GitHub source:
git clone https://github.com/elementary-data/elementary
pip install ./elementary

Validate installation

To validate the installation and get usage instructions for the CLI run:
edr --help

CLI configuration

The Elementary CLI requires a connection profile to connect to your data warehouse. Additional configuration is set in config.yml:
  • Required: HOME_DIR/.dbt/profiles.yml - Connection details to the data warehouse, in the format of dbt connection profile named elementary.
  • Optional: HOME_DIR/.edr/config.yml - Elementary configuration.
(These default paths and names can be changed using CLI options.)
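For example, assuming your edr version supports the --profiles-dir and --config-dir options (the flag names here are an assumption; check edr --help on your version):

```shell
# Point edr at non-default locations for the dbt profiles file
# and the Elementary config directory.
edr monitor --profiles-dir /path/to/dbt-profiles --config-dir /path/to/edr-config
```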

Configuring the Elementary Profile

To connect, Elementary needs a connection profile in a file named profiles.yml. The CLI uses this profile to connect to the data warehouse and find the dbt package tables. The easiest way to generate the profile is to run the following command within the dbt project where you deployed the elementary dbt package (works in dbt Cloud as well):
dbt run-operation elementary.generate_elementary_cli_profile
Copy the output, fill in your password and any other missing fields, and add the profile to your local profiles.yml file. An example Snowflake profile is shown below.

Elementary profile details and format

  • Path: HOME_DIR/.dbt/profiles.yml
  • Profile name: elementary
  • Schema name: The schema of elementary models, default is <your_dbt_project_schema>_elementary
## SNOWFLAKE ##
## By default, edr expects the profile name 'elementary'.      ##
## Configure the database and schema of elementary models.     ##
## Check where 'elementary_test_results' is to find it.        ##

elementary:
  outputs:
    default:
      type: snowflake
      account: [account id]

      ## User/password auth ##
      user: [username]
      password: [password]

      role: [user role]
      database: [database name]
      warehouse: [warehouse name]
      schema: [schema name]_elementary
      threads: 4

If you are a dbt user, you already have a profiles.yml file that you can use. The default path of the dbt profiles file is the ~/.dbt/ directory.
  • For further details, please refer to dbt’s documentation about the file.
  • Note: for BigQuery, Elementary requires ‘location’ (details), while in dbt it is optional.
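For illustration, a BigQuery profile using dbt's service-account connection method might look like the following sketch (field values are placeholders; note the explicit location field):

```yaml
elementary:
  outputs:
    default:
      type: bigquery
      method: service-account
      project: [GCP project id]
      dataset: [dataset name]_elementary
      keyfile: /path/to/service-account-key.json
      location: US   # required by Elementary, optional in dbt
      threads: 4
```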
Elementary requires connection details and credentials to connect to the data warehouse. These are configured in a connection profile, a file named profiles.yml. We recommend creating a dedicated database, or at least a dedicated schema. Elementary leverages dbt's connection profile format. If you are a dbt-core user, just add a profile with the relevant details and name it elementary.
The provided credentials need to have permission to:
  • Read - The information schema and the tables configured for monitoring.
  • Write - To the database configured in the elementary profile, including creating new schemas.
Only create this file if you need it for configuration; it is not mandatory.
Create a new directory and yml file under: HOME_DIR/.edr/config.yml
Here is the format of the yml itself:
config.yml
# alerts destination #
slack:
  notification_webhook: <your_slack_webhook_url>
  # optional #
  workflows: false
We want to keep building and improving, and for that we need to understand how users work with Elementary (and data is fun!). To that end, we added anonymous tracking of events using Posthog (open-source product analytics, highly recommended). We only track start, end, platform, number of queries and the size of the graph. No credentials, query contents, table names or anything private (not now and not ever). By default, this completely anonymous tracking is turned on. You can opt out at any time by adding the following to your config.yml file:
anonymous_usage_tracking: False
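Putting both settings together, a complete config.yml that sends Slack alerts and opts out of tracking would look like:

```yaml
# HOME_DIR/.edr/config.yml

# alerts destination #
slack:
  notification_webhook: <your_slack_webhook_url>
  # optional #
  workflows: false

anonymous_usage_tracking: False
```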

We are available on Slack, reach out for any kind of help!