* Reflection cache for BigQuery and Redshift
* Overrode a few sqlalchemy packages
* Added Geography Support
* Reformatted files
* Implemented error handling for DBT models
* Geography type added as a custom sqlalchemy datatype
* GEOGRAPHY and VARIANT added as custom sql types
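Registering a type like GEOGRAPHY as a custom sqlalchemy datatype typically means subclassing `UserDefinedType`. A minimal sketch of the pattern, assuming plain SQLAlchemy (this is illustrative, not the connector's actual implementation):

```python
from sqlalchemy.types import UserDefinedType


class GEOGRAPHY(UserDefinedType):
    """Illustrative custom type so reflection can map GEOGRAPHY columns."""

    def get_col_spec(self, **kw):
        # Emitted in DDL and used when rendering the column type
        return "GEOGRAPHY"

    def bind_processor(self, dialect):
        # Pass bound values through unchanged; a real dialect might
        # serialize to WKT/GeoJSON here
        def process(value):
            return value

        return process
```

The same shape works for VARIANT: only the string returned by `get_col_spec` changes.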
* Implemented file formatting using black
* Fix#1994: Add support for marking dataset entities as deleted
* Use entity list from mixin
* Add entity reference helper
* Add tests for retrieving the entity reference
* Add missing space
* Fix shadowing
* Use get entity ref
* Added entity version mixin logic to the ometa API
* remove logging in line 385 used for testing
* Fixed black error + ran isort
* remove extra underscore in
* Added integration tests for OpenMetadata versions methods
* Fixed linting errors in versionMixin.py
* Fix#1952: Airflow Openmetadata lineage allow config to be read from env variable
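Allowing the Airflow lineage backend to read its configuration from an environment variable usually comes down to preferring the env value over the static file value. A minimal sketch under that assumption (helper name and variable prefix are hypothetical, not OpenMetadata's actual API):

```python
import os


def lineage_config_value(key: str, default=None):
    """Hypothetical helper: prefer an AIRFLOW_LINEAGE_* environment
    variable over a static default, so the lineage backend can be
    configured without editing airflow.cfg."""
    return os.environ.get(f"AIRFLOW_LINEAGE_{key.upper()}", default)
```

This keeps containerized deployments configurable at runtime while local setups fall back to the file-based default.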
- Support Delete Action for S3
- Add Example s3.json
- Create AWSClient util
- Use AWSClient util in S3 ingestion source
- Remove ambiguity in policy filters by removing arrays with mixed types
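A shared AWSClient util for ingestion sources typically wraps the credentials and region once and hands out service clients on demand. A sketch assuming boto3 is available (class and field names are illustrative, not the project's actual util):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class AWSClientConfig:
    """Illustrative config holder; field names are assumptions."""
    aws_region: str
    aws_access_key_id: Optional[str] = None
    aws_secret_access_key: Optional[str] = None


class AWSClient:
    """Sketch of a shared AWS client factory for ingestion sources."""

    def __init__(self, config: AWSClientConfig):
        self.config = config

    def get_client(self, service_name: str):
        import boto3  # imported lazily so the module loads without boto3

        return boto3.client(
            service_name,
            region_name=self.config.aws_region,
            aws_access_key_id=self.config.aws_access_key_id,
            aws_secret_access_key=self.config.aws_secret_access_key,
        )
```

The S3 ingestion source would then call `AWSClient(config).get_client("s3")` instead of building its own boto3 session.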
* Fix typo
* Clean setup
* Update the local ingestion image to be barebones with respect to connector dependencies
* Prepare ingestion connectors base image
* Add system dependencies
* Prepare docker CLI
* Add docker provider
* Prepare entrypoint for the image
* Remove DBT pipeline as per Issue 1658
* Add TODO for ingestion build
* Bind docker socket
* Update comment
* Update README
* Use DockerOperator in sample data
* Build images with latest tag
* Prepare symlink to pass the volume to the DockerOperator
* Update README
* Prepare Base image for CI
* COPY multiple files into dir
* Remove DBT source, as it is now part of table ingestion
* Build docker base in run_local_docker
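Binding the Docker socket (as in the steps above) is what lets the DockerOperator inside the Airflow container launch ingestion containers on the host daemon. An illustrative compose fragment, not the project's actual compose file; the service and image names are assumptions:

```yaml
services:
  ingestion:
    image: openmetadata/ingestion-base:latest   # assumed tag
    volumes:
      # Expose the host Docker daemon so DockerOperator tasks can spawn
      # connector containers alongside this one
      - /var/run/docker.sock:/var/run/docker.sock
```

This is also why the symlink/volume preparation matters: paths referenced by the DockerOperator must resolve identically on the host daemon and inside the Airflow container.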
Implementation details
I have decided to rename `schema_name` to `database` and make it mandatory: without a database, there is an error while scanning all available tables. The connector does not support multiple databases at the moment. Password authentication still has to be tested; note that Trino requires SSL when passwords are used. Impersonation also has to be tested. I have removed `quote_plus` because I don't think it is needed.
- [x] Support username
- [ ] There is an integration test
- [ ] Support impersonation
- [ ] Support passwords
- [ ] Support tokens
- [ ] Support multiple databases
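The rename described above makes sense once you see where the catalog sits in the connection URL: it is part of the path, so leaving it out breaks table scanning. A sketch of URL construction under that assumption (function name is hypothetical, not the connector's actual code):

```python
from typing import Optional


def build_trino_url(host: str, port: int, database: str,
                    username: str, password: Optional[str] = None) -> str:
    """Illustrative sketch: the Trino catalog (`database`) is part of the
    SQLAlchemy URL path, which is why it must be mandatory.

    Note: Trino requires SSL when password authentication is used, so a
    real implementation would also enforce an HTTPS scheme with passwords.
    """
    auth = username if password is None else f"{username}:{password}"
    return f"trino://{auth}@{host}:{port}/{database}"
```

Token auth and impersonation (the unchecked items above) would extend this with extra connection arguments rather than changing the URL shape.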
* Prepare infra
* Store experiment information in MySQL & MinIO
* Use CreateMlModelEntityRequest instead of MlModel for PUT operations
* Update MlFlow infra
* Prepare MlFlow source
* Prepare Mlflow workflow
* Simplify test and prepare README
* Revert compose
* Fix compose
* Prepare warnings and fix features
* Use non-default port for integration test
* Use mlflow-skinny for the client application
- Fix broken JSON schema policies (use `minItems` for arrays instead of `minLength`)
- Amend s3 ingestion to create policies
- Amend PolicyResource and PolicyRepository to support Policy Rules
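For context on the schema fix: `minLength` constrains string length, while `minItems` is the array-cardinality keyword, so the original schema silently failed to enforce a non-empty rules list. An illustrative fragment of the corrected shape (property names are assumptions, not the actual policy schema):

```json
{
  "rules": {
    "type": "array",
    "minItems": 1
  }
}
```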
* Fix#1665: Oracle connector add an option to configure oracle service name
* Fixed removal of semicolon
Fixes an SQL command warning while ingesting
Co-authored-by: Akash Jain <15995028+akash-jain-10@users.noreply.github.com>
* Fix#1658: Ingestion changes to add dbtModel as part of Table entity
* Fixes#1652 Remove DBTModel as a top-level entity and capture information from DBT in the existing Table entity
Co-authored-by: sureshms <suresh@getcollate.io>