docs(ingestion-framework): use code block (#15040)
Put all the `make` commands in a dedicated code block under the `Python Setup` section. The `Generated Sources` section now contains explanations for curious individuals.
This commit is contained in: parent 8e860a56bf, commit 115c62b210
@@ -21,16 +21,19 @@ has not been tested with Python 3.10 due to some libraries not supporting that a
{% /note %}
For the instructions below, there are a few commands you'll need to run first to prepare your environment:
```shell
make install_dev
sudo make install_antlr_cli
make generate
```
### Generated Sources
The backbone of OpenMetadata is the series of JSON schemas defining the Entities and their properties.
All the different parts of the code rely on those definitions. The first step to start developing new connectors is to properly set up your local environment to interact with the Entities.
In the Ingestion Framework, this process is handled with `datamodel-code-generator`, which reads the JSON schemas and automatically generates `pydantic` models representing the input definitions. Please make sure to run `make generate` from the project root to populate the `ingestion/src/metadata/generated` directory with the required models.
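
To make that concrete, here is a minimal sketch assuming a hypothetical schema: the JSON definition and the `SampleEntity` model below are illustrative only and not part of the actual OpenMetadata schemas, but they show the shape of the `pydantic` models that `datamodel-code-generator` emits.

```python
# Illustrative sketch only: the schema and names here are hypothetical,
# not actual OpenMetadata definitions.
#
# Given a JSON schema such as:
#
#   {
#     "title": "SampleEntity",
#     "type": "object",
#     "properties": {
#       "id": {"type": "string"},
#       "name": {"type": "string"},
#       "description": {"type": "string"}
#     },
#     "required": ["id", "name"]
#   }
#
# datamodel-code-generator emits a pydantic model along these lines:
from typing import Optional

from pydantic import BaseModel


class SampleEntity(BaseModel):
    id: str
    name: str
    description: Optional[str] = None
```

The real generated models end up under `ingestion/src/metadata/generated` and are imported from the `metadata.generated` package.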
Once you have generated the sources, you should be able to run the tests and the `metadata` CLI. You can test your setup by running `make coverage` and checking that it completes without errors.
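
As a complementary check, here is a minimal sketch, assuming the `ingestion` package is installed in your environment and that the module path below still exists in your version: import one of the generated models directly.

```python
# Minimal sanity check: if this import succeeds, `make generate` populated the
# generated sources. The exact module path may vary between versions.
from metadata.generated.schema.entity.data.table import Table

print(Table.__name__)  # prints "Table" when the generated models are importable
```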