Fix Doc links (#7734)

* Fix Broken links

* Fix symlink

Co-authored-by: Pere Miquel Brull <peremiquelbrull@gmail.com>
Sriharsha Chintalapani 2022-09-28 14:05:51 -07:00 committed by GitHub
parent fe12e966ae
commit 612c083612
21 changed files with 42 additions and 41 deletions

View File

@ -0,0 +1 @@
../src/metadata/examples/workflows

View File

@ -28,7 +28,7 @@ You can find [this](https://github.com/open-metadata/OpenMetadata/blob/main/open
## 2. Update OM Server code
Once we have updated the JSON Schema, we can start implementing our Secrets Manager, extending the `ThirdPartySecretsManager.java` abstract class located [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/catalog/secrets/ThirdPartySecretsManager.java). For example:
Once we have updated the JSON Schema, we can start implementing our Secrets Manager, extending the `ThirdPartySecretsManager.java` abstract class located [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/secrets/ThirdPartySecretsManager.java). For example:
```java
public abstract class AwesomeSecretsManager extends ThirdPartySecretsManager {
@ -50,7 +50,7 @@ public abstract class AwesomeSecretsManager extends ThirdPartySecretsManager {
}
```
After this, we can update `SecretsManagerFactory.java` which is a factory class. We can find this file [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/catalog/secrets/SecretsManagerFactory.java).
After this, we can update `SecretsManagerFactory.java` which is a factory class. We can find this file [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/secrets/SecretsManagerFactory.java).
```java
...
@ -87,4 +87,4 @@ Similar to what we did in step 2, we have to add our implementation to the facto
```
<p/><p/>
If you need support while implementing your Secret Manager client, do not hesitate to reach out to us on [Slack](https://slack.open-metadata.org/).

View File

@ -31,7 +31,7 @@ All OpenMetadata supported types are defined under [`OpenMetadata/openmetadata-s
The API request objects are defined under [`OpenMetadata/openmetadata-spec/src/main/resources/json/schema/api`](https://github.com/open-metadata/OpenMetadata/tree/main/openmetadata-spec/src/main/resources/json/schema/api).
## API
OpenMetadata uses the [Dropwizard](https://www.dropwizard.io/) Java framework to build REST APIs. You can locate defined APIs in the directory [`OpenMetadata/openmetadata-service/src/main/java/org/openmetadata/catalog/resources`](https://github.com/open-metadata/OpenMetadata/tree/main/openmetadata-service/src/main/java/org/openmetadata/catalog/resources). OpenMetadata uses [Swagger](https://swagger.io/) to generate API documentation following OpenAPI standards.
OpenMetadata uses the [Dropwizard](https://www.dropwizard.io/) Java framework to build REST APIs. You can locate defined APIs in the directory [`OpenMetadata/openmetadata-service/src/main/java/org/openmetadata/service/resources`](https://github.com/open-metadata/OpenMetadata/tree/main/openmetadata-service/src/main/java/org/openmetadata/service/resources). OpenMetadata uses [Swagger](https://swagger.io/) to generate API documentation following OpenAPI standards.
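For a feel of what these resources expose, here is a minimal sketch of calling the REST API from Python. It assumes a locally running server on the default port 8585 and an auth setup that does not require a token; add the appropriate `Authorization` header otherwise, and double-check paths and parameters against the Swagger docs of your version.
```python
import requests

BASE_URL = "http://localhost:8585/api"  # assumed local deployment

# List a handful of Table entities; /v1/tables is one of the resources
# implemented under openmetadata-service .../resources.
response = requests.get(f"{BASE_URL}/v1/tables", params={"limit": 5})
response.raise_for_status()

for table in response.json().get("data", []):
    print(table["fullyQualifiedName"])
```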
## System and Components
@ -40,20 +40,20 @@ OpenMetadata uses the [Dropwizard](https://www.dropwizard.io/) Java framework to
### Events
OpenMetadata captures changes to entities as `events` and stores them in the OpenMetadata server database. OpenMetadata also indexes change events in Elasticsearch to make them searchable.
The event handlers are defined under [`OpenMetadata/openmetadata-service/src/main/java/org/openmetadata/catalog/events`](https://github.com/open-metadata/OpenMetadata/tree/main/openmetadata-service/src/main/java/org/openmetadata/catalog/events) and are applied globally to any outgoing response using the `ContainerResponseFilter`.
The event handlers are defined under [`OpenMetadata/openmetadata-service/src/main/java/org/openmetadata/service/events`](https://github.com/open-metadata/OpenMetadata/tree/main/openmetadata-service/src/main/java/org/openmetadata/service/events) and are applied globally to any outgoing response using the `ContainerResponseFilter`.
### Database
OpenMetadata uses MySQL for the metadata catalog. The catalog code is located in the directory [`OpenMetadata/openmetadata-service/src/main/java/org/openmetadata/catalog/jdbi3`](https://github.com/open-metadata/OpenMetadata/tree/main/openmetadata-service/src/main/java/org/openmetadata/catalog/jdbi3).
OpenMetadata uses MySQL for the metadata catalog. The catalog code is located in the directory [`OpenMetadata/openmetadata-service/src/main/java/org/openmetadata/service/jdbi3`](https://github.com/open-metadata/OpenMetadata/tree/main/openmetadata-service/src/main/java/org/openmetadata/service/jdbi3).
The database entity tables are created using the command [`OpenMetadata/bootstrap/bootstrap_storage.sh`](https://github.com/open-metadata/OpenMetadata/blob/main/bootstrap/bootstrap_storage.sh). [Flyway](https://flywaydb.org/) is used for managing the database table versions.
### Elasticsearch
OpenMetadata uses Elasticsearch to store the Entity change events and makes them searchable through a search index. The [`OpenMetadata/openmetadata-service/src/main/java/org/openmetadata/catalog/elasticsearch/ElasticSearchEventHandler.java`](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/catalog/elasticsearch/ElasticSearchEventHandler.java) is responsible for capturing the change events and updating Elasticsearch.
OpenMetadata uses Elasticsearch to store the Entity change events and makes them searchable through a search index. The [`OpenMetadata/openmetadata-service/src/main/java/org/openmetadata/catalog/elasticsearch/ElasticSearchEventHandler.java`](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/elasticsearch/ElasticSearchEventHandler.java) is responsible for capturing the change events and updating Elasticsearch.
Elasticsearch indices are created when the [`OpenMetadata/ingestion/pipelines/metadata_to_es.json`](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/pipelines/metadata_to_es.json) ingestion connector is run.
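Once those indices exist, the same data can be queried through the search API. The sketch below is hedged: the `table_search_index` name and the query parameters follow the API as documented around this release and may differ in your version.
```python
import requests

BASE_URL = "http://localhost:8585/api"  # assumed local deployment

# Full-text search against the Elasticsearch-backed index.
response = requests.get(
    f"{BASE_URL}/v1/search/query",
    params={"q": "fact_sales", "index": "table_search_index", "size": 5},
)
response.raise_for_status()

for hit in response.json()["hits"]["hits"]:
    print(hit["_source"]["fullyQualifiedName"])
```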
### Authentication/Authorization
OpenMetadata uses Google OAuth for authentication. All incoming requests are filtered by validating the JWT token using the Google OAuth provider. Access control is provided by [`Authorizer`](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/catalog/security/Authorizer.java).
OpenMetadata uses Google OAuth for authentication. All incoming requests are filtered by validating the JWT token using the Google OAuth provider. Access control is provided by [`Authorizer`](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/security/Authorizer.java).
See the configuration file `OpenMetadata` [`/conf/openmetadata.yaml`](https://github.com/open-metadata/OpenMetadata/blob/main/conf/openmetadata.yaml) for the authentication and authorization configurations.
@ -89,4 +89,4 @@ See the directory [`OpenMetadata/ingestion/examples/airflow/dags`](https://githu
**JSON Schema Python typings**
You can generate Python types for the OpenMetadata models defined with JSON Schema by running the `make generate` command of the [`Makefile`](https://github.com/open-metadata/OpenMetadata/blob/main/Makefile/README.md). Generated files are located in the directory `OpenMetadata/ingestion/src/metadata/generated`.
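Once generated, these models are plain Pydantic classes that can be imported and instantiated directly. A small sketch, assuming the generated layout described above (the minimal set of fields below follows the corresponding JSON Schema, so treat it as an assumption):
```python
# Generated Pydantic models mirror the JSON Schema definitions.
from metadata.generated.schema.entity.data.table import Column, Table

table = Table(
    id="9f4f7f69-9e84-4f5c-b8c8-2d34cf2d2a4e",  # any valid UUID
    name="dim_customer",
    columns=[Column(name="customer_id", dataType="BIGINT")],
)
print(table.json(exclude_none=True))
```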

View File

@ -37,7 +37,7 @@ Once you have generated the sources, you should be able to run the tests and the
### Quality tools
When working on the Ingestion Framework, you might want to take into consideration the following style-check tooling:
- [pylint](https://www.pylint.org/) is a Static Code Analysis tool to catch errors, align coding standards and help us follow conventions and apply improvements.
- [pylint](https://pylint.pycqa.org/en/latest/) is a Static Code Analysis tool to catch errors, align coding standards and help us follow conventions and apply improvements.
- [black](https://black.readthedocs.io/en/stable/) can be used to both autoformat the code and validate that the codebase is compliant.
- [isort](https://pycqa.github.io/isort/) helps us not lose time trying to find the proper combination of importing from `stdlib`, requirements, project files…

View File

@ -14,7 +14,7 @@ OpenMetadata being a full stack project, we use the following for development:
- [Python 3.7 or higher](https://www.python.org/downloads/)
- [Node >=10.0.0](https://nodejs.org/en/download/)
- [Yarn ^1.22.0](https://classic.yarnpkg.com/lang/en/docs/install/)
- [Rpm (Optional, only to run RPM profile with maven)](https://brewinstall.org/install-rpm-on-mac-with-brew/)
- [Rpm (Optional, only to run RPM profile with maven)](https://macappstore.org/rpm/)
- Antlr 4.9.2 - `sudo make install_antlr_cli`
- Here is a snapshot of a working environment on a Macbook.
@ -76,4 +76,4 @@ We use Java for developing OpenMetadata backend server. Following are the key te
- [jsonschema2pojo](https://www.jsonschema2pojo.org/) for Java code generation
- [Dropwizard](https://www.dropwizard.io/en/latest/) for the web service application
- [JDBI3](http://jdbi.org/) for database access

View File

@ -63,4 +63,4 @@ Once the account is created, you can see the fields in the exported JSON file fr
IAM & Admin > Service Accounts > Keys
```
You can validate the whole Google service account setup [here](deployment/security/google).
You can validate the whole Google service account setup [here](/deployment/security/google).

View File

@ -31,7 +31,7 @@ pip3 install "openmetadata-ingestion[powerbi]"
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/dashboard/powerbiConnection.json)
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/dashboard/powerBIConnection.json)
you can find the structure to create a connection to PowerBI.
In order to create and run a Metadata Ingestion workflow, we will follow
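As a hedged sketch of what that looks like end to end, the snippet below builds a workflow configuration in Python and runs it with the `Workflow` class used in the Airflow SDK examples; the connection fields follow the `powerBIConnection` JSON Schema linked above, and all values are placeholders to adapt to your environment.
```python
from metadata.ingestion.api.workflow import Workflow

# Placeholder values throughout; adjust to your PowerBI tenant and server.
config = {
    "source": {
        "type": "powerbi",
        "serviceName": "local_powerbi",
        "serviceConnection": {
            "config": {
                "type": "PowerBI",
                "clientId": "client_id",
                "clientSecret": "client_secret",
                "tenantId": "tenant_id",
            }
        },
        "sourceConfig": {"config": {"type": "DashboardMetadata"}},
    },
    "sink": {"type": "metadata-rest", "config": {}},
    "workflowConfig": {
        "openMetadataServerConfig": {"hostPort": "http://localhost:8585/api"}
    },
}

workflow = Workflow.create(config)
workflow.execute()
workflow.raise_from_status()
workflow.print_status()
workflow.stop()
```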

View File

@ -31,7 +31,7 @@ pip3 install "openmetadata-ingestion[powerbi]"
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/dashboard/powerbiConnection.json)
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/dashboard/powerBIConnection.json)
you can find the structure to create a connection to PowerBI.
In order to create and run a Metadata Ingestion workflow, we will follow

View File

@ -33,7 +33,7 @@ pip3 install "openmetadata-ingestion[azuresql]"
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/azuresqlConnection.json)
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/azureSQLConnection.json)
you can find the structure to create a connection to AzureSQL.
In order to create and run a Metadata Ingestion workflow, we will follow
@ -434,7 +434,7 @@ workflowConfig:
#### Source Configuration
- You can find all the definitions and types for the `serviceConnection` [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/azuresqlConnection.json).
- You can find all the definitions and types for the `serviceConnection` [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/azureSQLConnection.json).
- The `sourceConfig` is defined [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/databaseServiceProfilerPipeline.json).
Note that the filter patterns support regex as includes or excludes. E.g.,
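a hedged sketch of what the filter patterns can look like, written here as the Python dict the YAML configuration maps to (the key names come from the pipeline JSON Schema; the regex values are purely illustrative):
```python
# Includes/excludes entries are regular expressions.
source_config = {
    "config": {
        "schemaFilterPattern": {"includes": ["dbo", "sales_.*"]},
        "tableFilterPattern": {"excludes": [".*_temp", ".*_staging"]},
    }
}
```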

View File

@ -59,7 +59,7 @@ pip3 install "openmetadata-ingestion[bigquery-usage]"
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/bigqueryConnection.json)
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/bigQueryConnection.json)
you can find the structure to create a connection to BigQuery.
In order to create and run a Metadata Ingestion workflow, we will follow
@ -604,7 +604,7 @@ workflowConfig:
#### Source Configuration
- You can find all the definitions and types for the `serviceConnection` [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/bigqueryConnection.json).
- You can find all the definitions and types for the `serviceConnection` [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/bigQueryConnection.json).
- The `sourceConfig` is defined [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/databaseServiceProfilerPipeline.json).
Note that the filter patterns support regex as includes or excludes. E.g.,

View File

@ -59,7 +59,7 @@ pip3 install "openmetadata-ingestion[bigquery-usage]"
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/bigqueryConnection.json)
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/bigQueryConnection.json)
you can find the structure to create a connection to BigQuery.
In order to create and run a Metadata Ingestion workflow, we will follow
@ -561,7 +561,7 @@ workflowConfig:
#### Source Configuration
- You can find all the definitions and types for the `serviceConnection` [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/bigqueryConnection.json).
- You can find all the definitions and types for the `serviceConnection` [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/bigQueryConnection.json).
- The `sourceConfig` is defined [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/databaseServiceProfilerPipeline.json).
Note that the filter patterns support regex as includes or excludes. E.g.,

View File

@ -32,7 +32,7 @@ pip3 install "openmetadata-ingestion[deltalake]"
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/deltalakeConnection.json)
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/deltaLakeConnection.json)
you can find the structure to create a connection to Deltalake.
In order to create and run a Metadata Ingestion workflow, we will follow

View File

@ -32,7 +32,7 @@ pip3 install "openmetadata-ingestion[deltalake]"
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/deltalakeConnection.json)
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/deltaLakeConnection.json)
you can find the structure to create a connection to Deltalake.
In order to create and run a Metadata Ingestion workflow, we will follow

View File

@ -13,7 +13,7 @@ slug: /openmetadata/connectors/database
- [DB2](/openmetadata/connectors/database/db2)
- [DeltaLake](/openmetadata/connectors/database/deltalake)
- [Druid](/openmetadata/connectors/database/druid)
- [DynamoDB](/openmetadata/connectors/database/dynamocb)
- [DynamoDB](/openmetadata/connectors/database/dynamodb)
- [Glue](/openmetadata/connectors/database/glue)
- [Hive](/openmetadata/connectors/database/hive)
- [MariaDB](/openmetadata/connectors/database/mariadb)

View File

@ -33,7 +33,7 @@ pip3 install "openmetadata-ingestion[mariadb]"
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/mariadbConnection.json)
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/mariaDBConnection.json)
you can find the structure to create a connection to MariaDB.
In order to create and run a Metadata Ingestion workflow, we will follow
@ -429,7 +429,7 @@ workflowConfig:
#### Source Configuration
- You can find all the definitions and types for the `serviceConnection` [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/mariadbConnection.json).
- You can find all the definitions and types for the `serviceConnection` [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/mariaDBConnection.json).
- The `sourceConfig` is defined [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/databaseServiceProfilerPipeline.json).
Note that the filter patterns support regex as includes or excludes. E.g.,

View File

@ -33,7 +33,7 @@ pip3 install "openmetadata-ingestion[mariadb]"
## Metadata Ingestion
All connectors are defined as JSON Schemas.
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/mariadbConnection.json)
[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/mariaDBConnection.json)
you can find the structure to create a connection to MariaDB.
In order to create and run a Metadata Ingestion workflow, we will follow
@ -382,7 +382,7 @@ workflowConfig:
#### Source Configuration
- You can find all the definitions and types for the `serviceConnection` [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/mariadbConnection.json).
- You can find all the definitions and types for the `serviceConnection` [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/mariaDBConnection.json).
- The `sourceConfig` is defined [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/databaseServiceProfilerPipeline.json).
Note that the filter patterns support regex as includes or excludes. E.g.,

View File

@ -21,14 +21,14 @@ the following docs to connect using Airflow SDK or with the CLI.
icon="air"
title="Ingest with Airflow"
text="Configure the ingestion using Airflow SDK"
link="/openmetadata/connectors/database/postgresql/airflow"
link="/openmetadata/connectors/database/postgres/airflow"
size="half"
/>
<Tile
icon="account_tree"
title="Ingest with the CLI"
text="Run a one-time ingestion using the metadata CLI"
link="/openmetadata/connectors/database/postgresql/cli"
link="/openmetadata/connectors/database/postgres/cli"
size="half"
/>
</TileContainer>
@ -75,7 +75,7 @@ Select PostgreSQL as the service type and click Next.
<div className="w-100 flex justify-center">
<Image
src="/images/openmetadata/connectors/postgresql/select-service.png"
src="/images/openmetadata/connectors/postgres/select-service.png"
alt="Select Service"
caption="Select your service from the list"
/>
@ -95,7 +95,7 @@ from.
<div className="w-100 flex justify-center">
<Image
src="/images/openmetadata/connectors/postgresql/add-new-service.png"
src="/images/openmetadata/connectors/postgres/add-new-service.png"
alt="Add New Service"
caption="Provide a Name and description for your Service"
/>
@ -111,7 +111,7 @@ desired.
<div className="w-100 flex justify-center">
<Image
src="/images/openmetadata/connectors/postgresql/service-connection.png"
src="/images/openmetadata/connectors/postgres/service-connection.png"
alt="Configure service connection"
caption="Configure the service connection by filling the form"
/>

View File

@ -93,7 +93,7 @@ from.
<div className="w-100 flex justify-center">
<Image
src="/images/openmetadata/connectors/gluepipeline/add-new-service.png"
src="/images/openmetadata/connectors/glue/add-new-service.png"
alt="Add New Service"
caption="Provide a Name and description for your Service"
/>
@ -109,7 +109,7 @@ desired.
<div className="w-100 flex justify-center">
<Image
src="/images/openmetadata/connectors/gluepipeline/service-connection.png"
src="/images/openmetadata/connectors/glue/service-connection.png"
alt="Configure service connection"
caption="Configure the service connection by filling the form"
/>

View File

@ -120,7 +120,7 @@ processor:
value: [value]
- ...
```
The processor type should be set to `"orm-test-runner"`. For accepted test definition names and parameter value names, refer to the [tests page](/content/openmetadata/ingestion/workflows/data-quality/tests.md).
The processor type should be set to `"orm-test-runner"`. For accepted test definition names and parameter value names, refer to the [tests page](/openmetadata/ingestion/workflows/data-quality/tests).
`sink` and `workflowConfig` will have the same settings as the ingestion and profiler workflows.

View File

@ -11,7 +11,7 @@ Configure Great Expectations to integrate with OpenMetadata and ingest your test
### OpenMetadata Requirements
You will need to have OpenMetadata version 0.10 or later.
To deploy OpenMetadata, follow the procedure [Try OpenMetadata in Docker](quick-start/local-deployment) or follow the [Prefect Integration](/openmetadata/integrations/prefect) guide.
To deploy OpenMetadata, follow the procedure [Try OpenMetadata in Docker](/quick-start/local-deployment) or follow the [Prefect Integration](/openmetadata/integrations/prefect) guide.
Before ingesting your test results from Great Expectations, you will need to have your table metadata ingested into OpenMetadata. Follow the instructions in the [Connectors](/openmetadata/connectors) section to learn more.
@ -108,6 +108,6 @@ alt="Run Great Expectations checkpoint"
/>
### List of Supported Great Expectations Tests
We currently only support a certain number of Great Expectations tests. The full list can be found in the [Tests](/openmetadata/data-quality/tests) section.
We currently only support a certain number of Great Expectations tests. The full list can be found in the [Tests](/openmetadata/ingestion/workflows/data-quality/tests) section.
If a test is not supported, there is no need to worry about the execution of your Great Expectations test. We will simply skip the tests that are not supported and continue the execution of your test suite.

View File

@ -34,7 +34,7 @@ There are some common questions with lifecycle tools, such as _Where (URL) is th
This information, together with the rest of the features brought by OpenMetadata, lets us also manage topics such as schema changes in the sources or feature drifts, with the corresponding alerting systems.
While we can already extract certain pieces of information automatically via our Connectors (e.g., [Mlflow](/connectors/mlmodel/mlflow)), there are attributes that we'll need to fill in by ourselves. Thanks to the [Solution Design](/developers/architecture/design) of OpenMetadata and the [Python SDK](/sdk/python), this is going to be a rather easy task that will unlock the full power of your **organization's metadata**.
While we can already extract certain pieces of information automatically via our Connectors (e.g., [Mlflow](/connectors/ml-model/mlflow)), there are attributes that we'll need to fill in by ourselves. Thanks to the [Solution Design](/main-concepts/high-level-design) of OpenMetadata and the [Python SDK](/sdk/python), this is going to be a rather easy task that will unlock the full power of your **organization's metadata**.
## Properties
Now that we have a clearer view of what we are trying to achieve, let's jump into a deeper view on the `MlModel` Entity [definition](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/data/mlmodel.json):
@ -142,4 +142,4 @@ In this doc, we have seen the role that OpenMetadata serves from a Machine Learn
We have shown how to use the Python API to enrich the models' metadata, how to add lineage information linking them to related Entities, and how versioning responds to changes.
For further information on the properties, do not hesitate to review the [JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/data/mlmodel.json) and for examples and usages with the Python API, you can take a look at our [integration tests](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/tests/integration/ometa/test_ometa_model_api.py).
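For orientation, a hedged sketch of creating an `MlModel` through the Python SDK could look like the following; the connection settings assume a local server without authentication, and the required fields depend on your server version, so validate them against the `mlmodel.json` schema linked above.
```python
from metadata.generated.schema.api.data.createMlModel import CreateMlModelRequest
from metadata.generated.schema.entity.services.connections.metadata.openMetadataConnection import (
    OpenMetadataConnection,
)
from metadata.ingestion.ometa.ometa_api import OpenMetadata

# Assumed local server without authentication; adjust for your deployment.
server_config = OpenMetadataConnection(hostPort="http://localhost:8585/api")
metadata = OpenMetadata(server_config)

# Create (or update) an MlModel with manually curated properties. Attributes
# such as mlFeatures, mlHyperParameters or the dashboard reference can be
# added later following the same schema.
create_model = CreateMlModelRequest(
    name="sales_forecast",
    algorithm="Gradient Boosting",
    description="Weekly sales forecasting model, documented by hand.",
)
ml_model = metadata.create_or_update(create_model)
```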