Upgrading from 1.0 to 1.1 can be done directly on your instances. This page lists a few general details you should take into consideration when running the upgrade.
## Deprecation Notice
- The 1.1 Release will be the last one with support for Python 3.7, since it is already [EOL](https://devguide.python.org/versions/). OpenMetadata 1.2 will support Python versions 3.8 to 3.10.
- In 1.2 we will completely remove support for Bots configured with SSO. Only JWT authentication will be available then. Please upgrade your bots if you haven't done so; note that the UI already does not allow creating bots with SSO.
- 1.1 is the last release that will allow ingesting Impala from the Hive connector. In the next release we will
only support the Impala scheme from the Impala Connector.
## Breaking Changes for 1.1 Stable Release
### OpenMetadata Helm Chart Values
With `1.1.0` we are moving away from `global.*` helm values under the OpenMetadata helm charts to `openmetadata.config.*`. This change was introduced because Helm reserves global chart values across all the charts in a release, which conflicted with the use of the OpenMetadata helm charts alongside other helm charts for organizations relying on common helm values YAML files.
For example, with `1.0.X` Application version Releases, helm values would look like below -
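(The keys shown here, such as `openmetadata.host`, `authorizer`, and `authentication`, are only illustrative; the same pattern applies to any values you currently set under `global.*`.)

```yaml
# values.yaml used with 1.0.X OpenMetadata Helm Charts (illustrative keys)
global:
  openmetadata:
    host: openmetadata
    port: 8585
  authorizer:
    className: "org.openmetadata.service.security.DefaultAuthorizer"
  authentication:
    provider: "basic"
```

With `1.1.0` Application version Releases, the equivalent values move under `openmetadata.config.*`:

```yaml
# equivalent values.yaml with 1.1.0: the same keys, nested under openmetadata.config
openmetadata:
  config:
    openmetadata:
      host: openmetadata
      port: 8585
    authorizer:
      className: "org.openmetadata.service.security.DefaultAuthorizer"
    authentication:
      provider: "basic"
```

If your values file contains only OpenMetadata settings, a `yq` (v4) invocation along the following lines can perform the move; this is a sketch, the filename is a placeholder, and you should back up the file first:

```bash
# rewrite the file in place, nesting everything that was under `global` below `openmetadata.config`
yq -i '{"openmetadata": {"config": .global}}' openmetadata-values.yaml
```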
The above command rewrites the `global.*` values into the `openmetadata.config.*` structure. Please note that this approach is only recommended for users with a custom helm values file dedicated to the OpenMetadata Helm Charts.
For more information, visit the official helm docs for [global chart values](https://helm.sh/docs/chart_template_guide/subcharts_and_globals/#global-chart-values).
### Elasticsearch and OpenSearch
We now support Elasticsearch versions up to 7.16. However, this means that we need to handle the internals a bit differently
for Elasticsearch and OpenSearch. In the server configuration, we added the following key:
```yaml
elasticsearch:
  searchType: ${SEARCH_TYPE:- "elasticsearch"} # or opensearch
```
If you use Elasticsearch, there's nothing to do. However, if you use OpenSearch, you will need to set the new
parameter to `opensearch`.
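For example, an OpenSearch-backed deployment could hardcode the value in the server YAML, or keep the substitution and export `SEARCH_TYPE=opensearch` in the server environment; a minimal sketch:

```yaml
elasticsearch:
  # either set the value directly, or export SEARCH_TYPE=opensearch and keep the ${SEARCH_TYPE:-...} substitution
  searchType: "opensearch"
```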
### Pipeline Service Client Configuration
If reusing an old YAML configuration file, make sure to add the `secretsManagerLoader` key inside `pipelineServiceClientConfiguration`, as shown below with its default value (the exact environment variable name may differ depending on your deployment):
```yaml
pipelineServiceClientConfiguration:
  # ...
  # Secrets Manager Loader: specify to the Ingestion Framework how to load the SM credentials from its env
  secretsManagerLoader: ${PIPELINE_SERVICE_CLIENT_SECRETS_MANAGER_LOADER:-"noop"}
```
### Entity Changes
- **Pipeline Entity**: the `pipelineUrl` and `taskUrl` fields of the pipeline entity have now been renamed to `sourceUrl` (see the example after this list).
- **Chart Entity**: the `chartUrl` field of the chart entity has now been renamed to `sourceUrl`.
- **Dashboard Entity**: the `dashboardUrl` field of the dashboard entity has now been renamed to `sourceUrl`.
- **Table Entity**: a `sourceUrl` field has been added to the table entity, which refers to the URL of the data source portal (if one exists). For instance, in the case of BigQuery, the `sourceUrl` field stores the URL of the table details page in the GCP BigQuery portal.
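For illustration, the rename on an abbreviated pipeline entity payload would look roughly like this (hypothetical pipeline name and URL):

```diff
 {
   "name": "daily_orders_etl",
-  "pipelineUrl": "https://airflow.example.com/dags/daily_orders_etl"
+  "sourceUrl": "https://airflow.example.com/dags/daily_orders_etl"
 }
```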
### Other changes
- Glue now supports custom database names via `databaseName`.
- Snowflake supports the `clientSessionKeepAlive` parameter to keep the session open for long processes (see the sketch after this list).
- Databricks now supports the `useUnityCatalog` parameter to extract the metadata from Unity Catalog instead of the Hive Metastore.
- Kafka and Redpanda now expose the `saslMechanism` option, restricted to the enum values `["PLAIN", "GSSAPI", "SCRAM-SHA-256", "SCRAM-SHA-512", "OAUTHBEARER"]`.
- OpenMetadata Server Docker Image now installs the OpenMetadata Libraries under the `/opt/openmetadata` directory.
- Bumped up the Elasticsearch version for Docker and Kubernetes OpenMetadata Dependencies Helm Chart to `7.16.3`.
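For example, turning on the new Snowflake session flag in an ingestion workflow YAML could look like the following sketch; the service name, account, warehouse, and credentials are placeholders, and only `clientSessionKeepAlive` is the new option:

```yaml
source:
  type: snowflake
  serviceName: snowflake_prod            # hypothetical service name
  serviceConnection:
    config:
      type: Snowflake
      username: ingestion_user           # placeholder credentials
      password: "<password>"
      account: my_account                # placeholder account identifier
      warehouse: COMPUTE_WH
      clientSessionKeepAlive: true       # new in 1.1: keep the session open for long-running ingestions
  sourceConfig:
    config:
      type: DatabaseMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  loggerLevel: INFO
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
    securityConfig:
      jwtToken: "<jwt-token>"            # placeholder bot token
```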
With the 1.1.0 release we are migrating existing test cases defined in a test suite to the corresponding table. Because of this restructuring, the existing pipelines are removed from Test Suites, so you might need to recreate the pipelines for your test suites. More details about the new data quality can be found [here](/how-to-guides/data-quality-observability/quality).
As a user, you will need to redeploy the data quality workflows. You can go to `Quality > By Tables` to view the tables with test cases that need a workflow to be set up.