Mirror of https://github.com/open-metadata/OpenMetadata.git (synced 2025-10-27 00:31:42 +00:00)

GitBook: [#124] add env to snowflake connector docs

parent 184ce9219c, commit a71509c82f
@@ -811,19 +811,72 @@ python3 --version
Here’s an overview of the steps in this procedure. Please follow the steps relevant to your use case.
1. Create a configuration file using template JSON
2. Configure service settings
3. Configure data filters (optional)
4. Configure sample data (optional)
5. Configure DBT (optional)
6. Confirm sink settings
7. Confirm metadata\_server settings
8. Install the Python module for this connector
9. Run ingestion workflow

1. Prepare a Python virtual environment
2. Install the Python module for this connector
3. Create a configuration file using template JSON
4. Configure service settings
5. Configure data filters (optional)
6. Configure sample data (optional)
7. Configure DBT (optional)
8. Confirm sink settings
9. Confirm metadata\_server settings
10. Run ingestion workflow
### 1. Create a configuration file using template JSON
### **1. Prepare a Python virtual environment**
In this step, we’ll create a Python virtual environment. Using a virtual environment enables us to avoid conflicts with other Python installations and packages on your host system.

In a later step, you will install the Python module for this connector and its dependencies in this virtual environment.

#### **1.1 Create a directory for openmetadata**

Throughout the docs, we use a consistent directory structure for OpenMetadata services and connector installation. If you have not already done so by following another guide, please create an openmetadata directory now and change into that directory in your command line environment.

```
mkdir openmetadata; cd openmetadata
```

#### **1.2 Create a virtual environment**

Run the following command to create a Python virtual environment called `env`. You can try multiple connectors in the same virtual environment.

```
python3 -m venv env
```

#### **1.3 Activate the virtual environment**

Run the following command to activate the virtual environment.

```
source env/bin/activate
```
Once activated, you should see your command prompt change to indicate that your commands will now be executed in the environment named `env`.
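For example, with the default prompt in bash or zsh, the environment name appears as a prefix (the exact prompt varies by shell and configuration), and the virtual environment’s interpreter takes precedence on your `PATH`. A quick way to confirm:

```bash
# After `source env/bin/activate`, the prompt is typically prefixed with (env).
# Confirm that the interpreter now in use comes from the virtual environment:
which python3    # expected: .../openmetadata/env/bin/python3
```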
#### **1.4 Upgrade pip and setuptools to the latest versions**

Ensure that you have the latest version of pip by running the following command. If you have followed the steps above, this will upgrade pip in your virtual environment.

```bash
pip3 install --upgrade pip setuptools
```

### **2. Install the Python module for this connector**

Once the virtual environment is set up and activated as described in Step 1, run the following command to install the Python module for this connector.

```bash
pip3 install 'openmetadata-ingestion[snowflake]'
```

### 3. Create a configuration file using template JSON
Create a new file called `snowflake.json`. 
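If you prefer to start from the command line, the sketch below creates a bare skeleton with the three top-level sections configured in the remaining steps (`source`, `sink`, and `metadata_server`). The field values shown under `source` are illustrative assumptions; copy the actual template JSON from this page for the complete set of fields.

```bash
# Rough skeleton only -- not the full template. "type": "snowflake" and
# "service_name" are assumptions for illustration; use the template JSON from
# this guide as the source of truth.
cat > snowflake.json <<'EOF'
{
  "source": {
    "type": "snowflake",
    "config": {
      "service_name": "snowflake"
    }
  },
  "sink": {},
  "metadata_server": {}
}
EOF
```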
@@ -983,7 +1036,7 @@ Use this method in production OpenMetadata deployments in which you plan to conf

### 2. Configure service settings

### 4. Configure service settings
In this step we will configure the Snowflake service settings required for this connector. Please follow the instructions below to ensure that you've configured the connector to read from your Snowflake service as desired.
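After editing the service settings, and again after any of the optional steps below, it can help to confirm that `snowflake.json` is still well-formed JSON; a stray trailing comma is the most common editing mistake. A quick check using only the Python standard library:

```bash
# Exits non-zero and reports the offending line if snowflake.json contains a syntax error.
python3 -m json.tool snowflake.json > /dev/null && echo "snowflake.json is valid JSON"
```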
@@ -1057,7 +1110,7 @@ To specify a single database to ingest metadata from, provide the name of the da

### 3. Configure data filters (optional)

### 5. Configure data filters (optional)
#### include\_views (optional)
@@ -1139,7 +1192,7 @@ The syntax and semantics for `schema_filter_pattern` are the same as for [`table

### 4. Configure sample data (optional)

### 6. Configure sample data (optional)
#### generate\_sample\_data (optional)
@@ -1167,7 +1220,7 @@ Note: `generate_sample_data` is set to `true` by default.

### 5. Configure DBT (optional)

### 7. Configure DBT (optional)
DBT provides transformation logic that creates tables and views from raw data. OpenMetadata includes an integration for DBT that enables you to see the models used to generate a table from that table's details page in the OpenMetadata user interface. The image below provides an example.
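As a sketch of how this ties into the configuration file, the DBT integration is driven by artifact paths under `source.config`. `dbt_catalog_file` is the field documented below; `dbt_manifest_file`, the `target/` paths, and the use of `jq` here are assumptions for illustration, and you can just as well edit the file by hand.

```bash
# Add DBT artifact paths to source.config in snowflake.json (requires jq).
# catalog.json and manifest.json are the standard DBT build outputs; adjust the paths.
jq '.source.config += {
      "dbt_catalog_file": "./dbt/target/catalog.json",
      "dbt_manifest_file": "./dbt/target/manifest.json"
    }' snowflake.json > snowflake.json.tmp && mv snowflake.json.tmp snowflake.json
```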
@@ -1197,7 +1250,7 @@ Use the field `source.config.dbt_catalog_file` to specify the location of your D

### 6. Confirm `sink` settings

### 8. Confirm `sink` settings
You need not make any changes to the fields defined for `sink` in the template code you copied into `snowflake.json` in Step 3. This part of your configuration file should be as follows.
@@ -1210,7 +1263,7 @@ You need not make any changes to the fields defined for `sink` in the template c

### 7. Confirm `metadata_server` settings

### 9. Confirm `metadata_server` settings
You need not make any changes to the fields defined for `metadata_server` in the template code you copied into `snowflake.json` in Step 3. This part of your configuration file should be as follows.
@@ -1226,17 +1279,7 @@ You need not make any changes to the fields defined for `metadata_server` in the

### 8. Install the Python module for this connector

Run the following command to install the Python module for the Snowflake connector.

```bash
pip3 install --upgrade 'openmetadata-ingestion[snowflake]'
```

### 9. Run ingestion workflow <a href="#run-manually" id="run-manually"></a>

### 10. Run ingestion workflow <a href="#run-manually" id="run-manually"></a>
Your `snowflake.json` configuration file should now be fully configured and ready to use in an ingestion workflow.
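With the virtual environment from Step 1 still activated, the workflow is run with the `metadata` CLI installed by `openmetadata-ingestion`. A minimal sketch, assuming `snowflake.json` sits in your current working directory:

```bash
# Run the metadata ingestion workflow against the configuration created above.
metadata ingest -c ./snowflake.json
```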
@@ -1269,13 +1312,13 @@ This error usually occurs due to an older version of pip. Try upgrading pip as f

```
pip3 install --upgrade pip setuptools
```

Then re-run the install command in Step 8.

Then re-run the install command in Step 2.
### requests.exceptions.ConnectionError
If you encounter the following error when attempting to run the ingestion workflow in Step 9, this is probably because there is no OpenMetadata server running at http://localhost:8585.

If you encounter the following error when attempting to run the ingestion workflow in Step 10, this is probably because there is no OpenMetadata server running at http://localhost:8585.

```
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8585):
```

@@ -1286,7 +1329,7 @@ Failed to establish a new connection: [Errno 61] Connection refused'))

To correct this problem, please follow the steps in the [Run OpenMetadata](https://docs.open-metadata.org/v/main/try-openmetadata/run-openmetadata) guide to deploy OpenMetadata in Docker on your local machine.

Then re-run the metadata ingestion workflow in Step 9.

Then re-run the metadata ingestion workflow in Step 10.
{% endtab %}
{% endtabs %}