* Power BI Dashboard and Tiles added
* Added Power BI dependency
* Modified Power BI debug logs
* Bump up _version.py
* Modified Power BI - added failure status, resolved comments
* Fix#2811: Log level option for running metadata
* Fix#2819: Kafka connector security settings along with all configs to be retrieved
* Simplify sample data DAG
* Remove mkdir
* Generate sources before running compose
* Generate sources to install models to ingestion image
* Add python-on-whales for docker --start
* Remove python-on-whales from base
* Install venv
* Setup python
* Add logic to initialize relationships from seed data during application startup
* Remove ingestion related code for access control policies
* Move PolicyEvaluator init to PolicyResource
* Add RBAC for PATCH APIs
* Expand scope to all resources except a few (Policy, User, Role, Team, and resources that don't support PATCH)
* Fix code smells
* Fix#1994: Add support for marking dataset entities as deleted
- Support Delete Action for S3
- Add Example s3.json
- Create AWSClient util
- Use AWSClient util in S3 ingestion source (see the sketch after this list)
- Remove ambiguity in policy filters by removing the array with different types
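
A minimal sketch of what such an AWSClient util could look like, assuming boto3 as the underlying SDK; the class shape, field names, and the `list_buckets` usage are illustrative assumptions rather than the actual implementation:

```python
from typing import Optional

import boto3


class AWSClient:
    """Hypothetical helper that centralizes boto3 client creation."""

    def __init__(
        self,
        aws_access_key_id: Optional[str] = None,
        aws_secret_access_key: Optional[str] = None,
        region_name: Optional[str] = None,
        endpoint_url: Optional[str] = None,
    ):
        self.aws_access_key_id = aws_access_key_id
        self.aws_secret_access_key = aws_secret_access_key
        self.region_name = region_name
        self.endpoint_url = endpoint_url

    def get_client(self, service_name: str):
        # boto3 falls back to its default credential chain when keys are None
        return boto3.client(
            service_name,
            aws_access_key_id=self.aws_access_key_id,
            aws_secret_access_key=self.aws_secret_access_key,
            region_name=self.region_name,
            endpoint_url=self.endpoint_url,
        )


# Example: the S3 ingestion source would reuse the same helper
s3 = AWSClient(region_name="us-east-1").get_client("s3")
buckets = s3.list_buckets().get("Buckets", [])
```

Centralizing client creation this way lets the S3 source and any future AWS-backed sources share credential and endpoint handling instead of duplicating it.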
* Fix typo
* Clean setup
* Update the local ingestion image to be barebones regarding connector dependencies
* Prepare ingestion connectors base image
* Add system dependencies
* Prepare docker CLI
* Add docker provider
* Prepare entrypoint for the image
* Remove DBT pipeline as per Issue 1658
* Add TODO for ingestion build
* Bind docker socket
* Update comment
* Update README
* Use DockerOperator in sample data
* Build images with latest tag
* Prepare symlink to pass the volume to the DockerOperator (see the sketch after this list)
* Update README
* Prepare Base image for CI
* COPY multiple files into dir
* Remove DBT source as it is now part of table ingestion
* Build docker base in run_local_docker
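
For the DockerOperator items above, here is a minimal sketch of how a sample-data task might be wired, assuming the Airflow docker provider is installed and the host Docker socket is bound into the Airflow container; the image name, command, and paths are illustrative assumptions, not the project's actual values:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator
from docker.types import Mount

with DAG(
    dag_id="sample_data_docker",          # hypothetical DAG id
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    ingest_sample_data = DockerOperator(
        task_id="ingest_sample_data",
        image="openmetadata/ingestion-connector-base:latest",  # assumed image name
        command="python main.py",                               # assumed entrypoint
        docker_url="unix://var/run/docker.sock",                # the bound host socket
        mounts=[
            Mount(
                source="/tmp/openmetadata/examples",  # assumed host path (symlinked)
                target="/opt/workflows",
                type="bind",
            )
        ],
        tty=False,
    )
```

The bind mount relies on the symlink noted above so that the path handed to the Docker daemon resolves on the host as well as inside the Airflow container.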
Implementation details
I have decided to rename schema_name to database and to make it mandatory: without a database, scanning all available tables fails with an error. The connector doesn't support multiple databases at the moment. Password support still has to be tested, and Trino requires SSL when passwords are used. Impersonation also has to be tested. I have removed quote_plus because I don't think it's needed. See the connection sketch after the checklist below.
- [x] Support username
- [ ] There is an integration test
- [ ] Support impersonation
- [ ] Support passwords
- [ ] Support tokens
- [ ] Support multiple databases
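
A minimal sketch of the resulting connection, assuming the SQLAlchemy Trino dialect; the names and values are illustrative, and the untested password/impersonation paths from the checklist are only noted in comments:

```python
from sqlalchemy import create_engine, text

username = "analyst"          # only username auth is supported so far
host_port = "localhost:8080"  # illustrative host:port
database = "hive"             # mandatory: the catalog whose tables are scanned

# schema_name was renamed to database, and quote_plus was dropped from URL building.
url = f"trino://{username}@{host_port}/{database}"
engine = create_engine(url)

with engine.connect() as conn:
    schemas = conn.execute(text("SHOW SCHEMAS")).fetchall()

# Passwords would require an SSL (https) Trino endpoint, and impersonation would
# extend connect_args; both are still untested, as noted in the checklist above.
```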
* Prepare infra
* Store experiment information in MySQL & MinIO
* Use CreateMlModelEntityRequest instead of MlModel for PUT operations
* Update MLflow infra
* Prepare MLflow source
* Prepare MLflow workflow
* Simplify test and prepare README
* Revert compose
* Fix compose
* Prepare warnings and fix features
* Use non-default port for integration test
* Use mlflow-skinny for the client application
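
The items above cover the MLflow setup; a minimal sketch of a client run against a tracking server whose backend store is MySQL and whose artifact store is MinIO, assuming mlflow-skinny on the client side and with all URIs, credentials, and names as illustrative assumptions:

```python
import os

import mlflow

# The tracking server (not the client) talks to MySQL; the client only needs its URL.
mlflow.set_tracking_uri("http://localhost:5000")

# Only needed if the client uploads artifacts directly to the MinIO bucket.
os.environ.setdefault("MLFLOW_S3_ENDPOINT_URL", "http://localhost:9000")

mlflow.set_experiment("sample-experiment")

with mlflow.start_run(run_name="demo"):
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("rmse", 0.73)
```

Keeping the client on mlflow-skinny avoids pulling the server, UI, and SQL storage dependencies into the ingestion environment while still exercising the full tracking API.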