
---
title: Run the MSSQL Connector Externally
slug: /connectors/database/mssql/yaml
---
{% connectorDetailsHeader name="MSSQL" stage="PROD" platform="OpenMetadata" availableFeatures=["Metadata", "Query Usage", "Data Profiler", "Data Quality", "dbt", "Lineage", "Column-level Lineage", "Stored Procedures", "Sample Data", "Reverse Metadata (Collate Only)"] unavailableFeatures=["Owners", "Tags", "SSIS packages"] / %}
In this section, we provide guides and references to use the MSSQL connector.
Configure and schedule MSSQL metadata and profiler workflows from the OpenMetadata UI:
- Requirements
- Metadata Ingestion
- Query Usage
- Lineage
- Data Profiler
- Data Quality
- dbt Integration

{% collateContent %}
- Reverse Metadata
{% /collateContent %}

{% partial file="/v1.8/connectors/external-ingestion-deployment.md" /%}
## Requirements
The MSSQL user must be granted the `SELECT` privilege to fetch the metadata of tables and views.

```sql
-- Create a new user
-- More details https://learn.microsoft.com/en-us/sql/t-sql/statements/create-user-transact-sql?view=sql-server-ver16
CREATE USER Mary WITH PASSWORD = '********';
-- Grant SELECT on table
GRANT SELECT TO Mary;
```
### Usage & Lineage considerations

To perform the query analysis for Usage and Lineage computation, we fetch the query logs from the `sys.dm_exec_cached_plans`, `sys.dm_exec_query_stats` & `sys.dm_exec_sql_text` system tables. To access these tables, your user must have the `VIEW SERVER STATE` privilege.

```sql
GRANT VIEW SERVER STATE TO YourUser;
```
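To give a sense of what the usage workflow relies on, here is a pared-down sketch of the kind of DMV join that recovers executed query text from those system tables. The exact columns selected are an assumption for illustration, not the connector's actual statement.

```python
# Illustrative only: the general shape of a query-log lookup against the
# DMVs mentioned above. The column list is an assumption, not the
# connector's actual statement.
USAGE_QUERY_SKETCH = """
SELECT t.text AS query_text,
       s.execution_count,
       s.last_execution_time
FROM sys.dm_exec_query_stats AS s
CROSS APPLY sys.dm_exec_sql_text(s.sql_handle) AS t
ORDER BY s.last_execution_time DESC;
"""
```

Without `VIEW SERVER STATE`, queries like this return no rows (or fail), which is why the grant above is required for Usage and Lineage.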
### Python Requirements
{% partial file="/v1.8/connectors/python-requirements.md" /%}
To run the MSSQL ingestion, you will need to install:

```bash
pip3 install "openmetadata-ingestion[mssql]"
```
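As a quick sanity check after installing, you can verify the package is importable in the active environment. This is only a hedged smoke test: it assumes the package ships its usual top-level `metadata` module, and it does not verify the MSSQL driver extras.

```python
from importlib.util import find_spec

def ingestion_available() -> bool:
    # openmetadata-ingestion is commonly imported as the top-level
    # "metadata" package; this checks importability only, not that the
    # mssql driver extras (pymssql/pyodbc/pytds) are present.
    return find_spec("metadata") is not None

print(ingestion_available())
```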
## Metadata Ingestion
All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to MSSQL.
In order to create and run a Metadata Ingestion workflow, we will follow the steps to create a YAML configuration able to connect to the source, process the Entities if needed, and reach the OpenMetadata server.
The workflow is modeled around the following JSON Schema.

### 1. Define the YAML Config
This is a sample config for MSSQL:
{% codePreview %}
{% codeInfoContainer %}
#### Source Configuration - Service Connection
{% codeInfo srNumber=1 %}
scheme: Defines how to connect to MSSQL. We support `mssql+pytds`, `mssql+pyodbc`, and `mssql+pymssql`.
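For reference, the scheme simply becomes the driver prefix of the resulting SQLAlchemy-style connection URL. A minimal sketch (the helper name and values are placeholders, not part of the connector):

```python
# Hypothetical helper showing how the scheme maps into a
# SQLAlchemy-style connection URL; values are placeholders.
def build_url(scheme: str, user: str, password: str,
              host_port: str, database: str) -> str:
    return f"{scheme}://{user}:{password}@{host_port}/{database}"

print(build_url("mssql+pymssql", "Mary", "secret", "localhost:1433", "master"))
# mssql+pymssql://Mary:secret@localhost:1433/master
```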
{% /codeInfo %}
{% codeInfo srNumber=2 %}
username: Specify the User to connect to MSSQL. It should have enough privileges to read all the metadata.
{% /codeInfo %}
{% codeInfo srNumber=3 %}
password: Password to connect to MSSQL.
{% /codeInfo %}
{% codeInfo srNumber=4 %}
hostPort: Enter the fully qualified hostname and port number for your MSSQL deployment in the Host and Port field.
{% /codeInfo %}
{% codeInfo srNumber=5 %}
database: The initial database to establish a connection to the data source.
{% /codeInfo %}
{% codeInfo srNumber=6 %}
ingestAllDatabases: If you need to ingest multiple databases - aside from the initial one above - you can enable this option.
{% /codeInfo %}
{% codeInfo srNumber=7 %}
uriString: Connection URI string, used in case of a `pyodbc` connection.
{% /codeInfo %}
{% partial file="/v1.8/connectors/yaml/database/source-config-def.md" /%}
{% partial file="/v1.8/connectors/yaml/ingestion-sink-def.md" /%}
{% partial file="/v1.8/connectors/yaml/workflow-config-def.md" /%}
#### Advanced Configuration
{% codeInfo srNumber=8 %}
Connection Options (Optional): Enter the details for any additional connection options that can be sent to the database during the connection. These details must be added as Key-Value pairs.
{% /codeInfo %}
{% codeInfo srNumber=9 %}
Connection Arguments (Optional): Enter the details for any additional connection arguments such as security or protocol configs that can be sent to the database during the connection. These details must be added as Key-Value pairs.
- In case you are using Single-Sign-On (SSO) for authentication, add the `authenticator` details in the Connection Arguments as a Key-Value pair as follows: `"authenticator" : "sso_login_url"`
{% /codeInfo %}
{% /codeInfoContainer %}
{% codeBlock fileName="filename.yaml" %}
```yaml
source:
  type: mssql
  serviceName: "<service name>"
  serviceConnection:
    config:
      type: Mssql
      username: <username>
      password: <password>
      hostPort: <hostPort>
      # database: <database>
      # connectionOptions:
      #   key: value
      # connectionArguments:
      #   key: value
```
{% partial file="/v1.8/connectors/yaml/database/source-config.md" /%}
{% partial file="/v1.8/connectors/yaml/ingestion-sink.md" /%}
{% partial file="/v1.8/connectors/yaml/workflow-config.md" /%}
{% /codeBlock %}
{% /codePreview %}
{% partial file="/v1.8/connectors/yaml/ingestion-cli.md" /%}
{% partial file="/v1.8/connectors/yaml/query-usage.md" variables={connector: "mssql"} /%}
{% partial file="/v1.8/connectors/yaml/lineage.md" variables={connector: "mssql"} /%}
{% partial file="/v1.8/connectors/yaml/data-profiler.md" variables={connector: "mssql"} /%}
{% partial file="/v1.8/connectors/yaml/auto-classification.md" variables={connector: "mssql"} /%}
{% partial file="/v1.8/connectors/yaml/data-quality.md" /%}
## dbt Integration
You can learn more about how to ingest dbt models' definitions and their lineage here.