# Processor
The **Processor** is an optional component in the workflow. It can be used to modify records coming from sources. The Processor receives a record from the source, may modify or enrich it, and re-emits the event back to the workflow.
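To illustrate where the Processor sits in the chain, here is a toy sketch of how a workflow might pull records from a source, pass them through a processor, and hand them to a sink. `ToySource`, `ToyProcessor`, and `ToySink` are simplified stand-ins for this sketch, not the framework classes.

```python
# Hedged sketch of the workflow chain: source -> processor -> sink.
# These toy classes are assumptions for illustration only.

class ToySource:
    def next_record(self):
        yield {"name": "users"}
        yield {"name": "orders"}

class ToyProcessor:
    def process(self, record):
        # Enrich the record before re-emitting it to the workflow.
        record["processed"] = True
        return record

class ToySink:
    def __init__(self):
        self.records = []

    def write_record(self, record):
        self.records.append(record)

source, processor, sink = ToySource(), ToyProcessor(), ToySink()
for record in source.next_record():
    sink.write_record(processor.process(record))
```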
## API
```python
@dataclass
class Processor(Closeable, metaclass=ABCMeta):
    ctx: WorkflowContext

    @classmethod
    @abstractmethod
    def create(cls, config_dict: dict, metadata_config_dict: dict, ctx: WorkflowContext) -> "Processor":
        pass

    @abstractmethod
    def process(self, record: Record) -> Record:
        pass

    @abstractmethod
    def get_status(self) -> ProcessorStatus:
        pass

    @abstractmethod
    def close(self) -> None:
        pass
```
**create** is called during workflow instantiation and returns an instance of the processor.

**process** is called for each record coming down the workflow chain and can be used to modify or enrich the record.

**get_status** reports the status of the processor, e.g. how many records were processed and any failures or warnings.

**close** is called before the workflow stops. It can be used to clean up connections and other resources.
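Putting the four methods together, the following is a minimal, self-contained processor sketch that upper-cases a field on each record and counts what it has seen. The `Record`, `WorkflowContext`, and `ProcessorStatus` classes below are simplified stand-ins, assumed for this sketch; in a real connector they are imported from the ingestion framework.

```python
from abc import ABCMeta, abstractmethod
from dataclasses import dataclass, field

# Simplified stand-ins for framework types (assumptions for this sketch).
@dataclass
class WorkflowContext:
    workflow_id: str = "demo"

@dataclass
class Record:
    name: str

@dataclass
class ProcessorStatus:
    records: int = 0
    failures: list = field(default_factory=list)

class Processor(metaclass=ABCMeta):
    def __init__(self, ctx: WorkflowContext):
        self.ctx = ctx

    @classmethod
    @abstractmethod
    def create(cls, config_dict: dict, metadata_config_dict: dict,
               ctx: WorkflowContext) -> "Processor": ...

    @abstractmethod
    def process(self, record: Record) -> Record: ...

    @abstractmethod
    def get_status(self) -> ProcessorStatus: ...

    @abstractmethod
    def close(self) -> None: ...

class UppercaseNameProcessor(Processor):
    """Hypothetical processor that upper-cases each record's name."""

    def __init__(self, ctx: WorkflowContext):
        super().__init__(ctx)
        self.status = ProcessorStatus()

    @classmethod
    def create(cls, config_dict: dict, metadata_config_dict: dict,
               ctx: WorkflowContext) -> "UppercaseNameProcessor":
        return cls(ctx)

    def process(self, record: Record) -> Record:
        record.name = record.name.upper()
        self.status.records += 1
        return record

    def get_status(self) -> ProcessorStatus:
        return self.status

    def close(self) -> None:
        pass

processor = UppercaseNameProcessor.create({}, {}, WorkflowContext())
out = processor.process(Record(name="users"))
```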
## Example
Below is an example implementation, `PiiProcessor`, which scans each column of an incoming table for PII indicators and attaches the matching tags.
```python
class PiiProcessor(Processor):
    config: PiiProcessorConfig
    metadata_config: MetadataServerConfig
    status: ProcessorStatus
    client: REST

    def __init__(self, ctx: WorkflowContext, config: PiiProcessorConfig,
                 metadata_config: MetadataServerConfig):
        super().__init__(ctx)
        self.config = config
        self.metadata_config = metadata_config
        self.status = ProcessorStatus()
        self.client = REST(self.metadata_config)
        self.tags = self.__get_tags()
        self.column_scanner = ColumnNameScanner()
        self.ner_scanner = NERScanner()

    @classmethod
    def create(cls, config_dict: dict, metadata_config_dict: dict, ctx: WorkflowContext):
        config = PiiProcessorConfig.parse_obj(config_dict)
        metadata_config = MetadataServerConfig.parse_obj(metadata_config_dict)
        return cls(ctx, config, metadata_config)

    def __get_tags(self) -> dict:
        user_tags = self.client.list_tags_by_category("user")
        tags_dict = {}
        for tag in user_tags:
            tags_dict[tag.name.__root__] = tag
        return tags_dict

    def process(self, table_and_db: OMetaDatabaseAndTable) -> Record:
        for column in table_and_db.table.columns:
            pii_tags = []
            pii_tags += self.column_scanner.scan(column.name.__root__)
            pii_tags += self.ner_scanner.scan(column.name.__root__)
            tag_labels = []
            for pii_tag in pii_tags:
                if snake_to_camel(pii_tag) in self.tags.keys():
                    tag_entity = self.tags[snake_to_camel(pii_tag)]
                else:
                    logging.debug("Failed to tag column {} with tag {}".format(column.name, pii_tag))
                    continue
                tag_labels.append(TagLabel(tagFQN=tag_entity.fullyQualifiedName,
                                           labelType='Automated',
                                           state='Suggested',
                                           href=tag_entity.href))
            column.tags = tag_labels

        return table_and_db

    def close(self):
        pass

    def get_status(self) -> ProcessorStatus:
        return self.status
```