The July 2024 release of Data Governance and Catalog and Metadata Command Center includes the following new features.
Watch the following What's New video to learn about the new features and enhancements in the July 2024 release:
Link catalog sources to generate lineage
Due to technological limitations or security constraints, you might not always see complete lineage after metadata extraction. You can use Metadata Command Center to link catalog sources and construct data lineage based on rules and other criteria. You can choose source and target catalog sources to link and create lineage. You can also choose source and target schemas to restrict lineage inference to specific subsets of data objects within the data sources.
The linked assets and generated lineage links are auto-accepted by default and appear on the Catalog Source Links page in Data Governance and Catalog. Stakeholders of the source and target catalog sources can reject the auto-accepted lineage links from the Action menu. If stakeholders initially reject the generated lineage links and later accept them, they are marked as accepted in Data Governance and Catalog. Stakeholders can also view the generated lineage on the Lineage tab of the asset.
Important: You can link only relational database source systems, such as Oracle, to generate lineage.
The following image shows the Rule Definition tab with Name Matching as the rule type:
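Conceptually, a Name Matching rule proposes lineage links between objects in the source and target catalog sources whose names match. The following Python sketch illustrates the idea only; it is not the product's implementation, and all object names are hypothetical:

```python
def infer_lineage_by_name(source_objects, target_objects):
    """Propose lineage links by pairing source and target objects
    whose names match, ignoring case. Purely illustrative."""
    targets = {name.lower(): name for name in target_objects}
    return [
        (source, targets[source.lower()])
        for source in source_objects
        if source.lower() in targets
    ]

# Example: two catalog sources that share one object name
links = infer_lineage_by_name(["CUSTOMERS", "ORDERS"], ["customers", "invoices"])
```

In the product, the generated links are then auto-accepted and surfaced to stakeholders, who can reject or re-accept them.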
IDMC metadata synchronizes metadata from Application Integration design-time objects
You can enable IDMC metadata in Metadata Command Center to synchronize metadata from Application Integration design-time objects into the catalog.
IDMC metadata synchronizes metadata from the following design-time objects in Application Integration:
•Process
•Process Object
•Service Connector
•Connection
•Guide
•Human Task
For information about IDMC metadata, see IDMC Metadata.
Export audit history of assets
You can now export the audit history of a single asset or multiple assets.
To export the audit history of a single asset, go to the History tab of the asset and click the Export option under Settings. Before you export the audit history of an asset, you can apply filters on Audit History, include or exclude hierarchy, and select columns you want to include in the audit history report.
The following image shows the Export option in the History tab:
To export the audit history of multiple assets and asset types, click New > Others > Export Audit History. When you export the audit history, you can select the assets or asset types to export, the hierarchy, the date range, the operations that users performed on the assets, and the users who performed those operations.
The following image shows the Export Audit History option in the Others tab:
For more information about exporting the audit history of assets, see Audit history of assets.
Associate an AI Model asset with glossary assets
You can now create a relationship between an AI Model asset and one of the following glossary assets:
•Glossary Business Term
•Glossary Domain
•Glossary Metric
•Glossary Subdomain
The following image shows the Relationships tab of an AI Model asset:
•You can now configure a tickets widget to visualize the ticket status based on the following new group options:
- Created By
- Updated By
- Task Assignee User
- Task Assignee Role
The following image shows the new options in the Group By field of the Tickets widget:
For more information about tickets widgets, see Tickets widget.
•You can now create a data quality widget to view the data quality scores for your saved searches, all assets, or specific assets. Based on your selection, you can choose the visualization for the widget.
The following image shows the options that you can select to view data quality scores in the Data Quality widget:
•In addition to web pages, you can use Informatica QuickLook to quickly search for text or keywords within Adobe PDF and Microsoft Office files that you open from a web browser.
For more information about supported file types, see Supported files.
View Data Marketplace information on data lineage
When you view the data lineage for a technical asset or business asset, you can now enable a new overlay to see the assets that are added to a data collection in Data Marketplace.
The following image shows the Overlays menu:
For more information about how you can view data lineage, see View data lineage.
Search for assets linked to Data Marketplace data collections
In a search result, you can now use the Data Marketplace filter to view the assets that are added to one or more data collections in Data Marketplace.
The following image shows the results for the keyword Glados:
For more information about how you can search for assets, see Search for assets.
View the confidence score for glossary acceptance
On the Overview page of a data element, you can now view the confidence score that determines whether glossary assets are automatically recommended, accepted, or rejected for a technical asset.
The following image shows the Overview page of an asset:
Email notifications for data quality score changes
Configure email notifications for data quality score changes. If you are a stakeholder of a data quality rule occurrence, you receive email notifications when the data quality score of the rule occurrence drops from good to not acceptable, from good to acceptable, or from acceptable to not acceptable.
You can configure the Data Quality notifications from the Notification Settings in Data Governance and Catalog.
The email notification that you receive for data quality score change contains the following information:
•A link to the data quality rule occurrence that is run on the asset.
•If a rule fails, a visual indicator of the data quality score in the form of a donut chart, along with insights such as the data quality score trend.
View accepted data element classifications and glossaries in the asset tooltip
When you hover over data elements in a lineage, you can see the accepted data element classifications and glossaries in the asset tooltip without selecting Glossary or Sensitivity overlays.
The following image shows the Data Element Classifications section for a data element in a lineage:
View source code extracted from database script-based source systems
On the Code tab for technical assets in Data Governance and Catalog, you can view the source code extracted from the following Databricks Pipeline assets:
•Live Table
•Live View
•Streaming Table
•Streaming View
Rename a catalog source
You can rename a catalog source. To apply the change to all associated objects, you must rerun the metadata extraction job.
Databricks
This release includes the following enhancements:
- To resolve mount points during metadata extraction from Databricks sources, you can specify the environment initialization file path when you configure the Databricks catalog source. Specify the absolute path to the Python code file that defines the mount points and other environment properties related to the Databricks source.
- You can extract metadata related to workflows from Run Job Tasks.
- You can add metadata extraction filters based on Delta Live Table pipelines.
- You can extract metadata from the following Delta Live Table objects:
▪ Live Tables
▪ Live Views
▪ Pipeline Definitions
▪ Streaming Tables
▪ Streaming Views
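The environment initialization file used for mount point resolution is a Python file whose exact contents depend on your Databricks setup. The following is a minimal hypothetical sketch, assuming mount points are declared as a mapping from /mnt paths to cloud storage URIs; all paths and storage account names are illustrative:

```python
# Hypothetical environment initialization file, for example /dbfs/init/mounts.py.
# Declares the mount points so that /mnt/... paths seen during metadata
# extraction can be resolved to their backing cloud storage locations.

MOUNT_POINTS = {
    "/mnt/raw": "abfss://raw@examplestore.dfs.core.windows.net",
    "/mnt/curated": "abfss://curated@examplestore.dfs.core.windows.net",
}

def resolve_mount(path: str) -> str:
    """Translate a /mnt/... path to its backing storage URI."""
    for mount, target in MOUNT_POINTS.items():
        if path == mount or path.startswith(mount + "/"):
            return target + path[len(mount):]
    return path  # not under a known mount point; return unchanged
```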
dbt
This release includes the following enhancements:
- You can extract metadata from tests.
- You can add metadata extraction filters based on tests.
- You can specify multiple manifest.json files from dbt projects.
- When you perform connection assignment, you can assign Databricks and Amazon Athena as endpoint catalog sources.
Greenplum
This release includes the following enhancements:
- You can extract metadata from stored procedures.
- You can add metadata extraction filters based on stored procedures.
IBM Db2 for LUW
You can extract metadata from an IBM Db2 for LUW database hosted on Amazon RDS for Db2.
IBM Netezza
This release includes the following enhancements:
- You can extract metadata from stored procedures.
- You can add metadata extraction filters based on stored procedures.
Informatica Intelligent Cloud Services
This release includes the following enhancements:
- You can extract metadata from Snowflake stored procedures included in a Data Integration mapping.
- You can extract metadata from Data Integration mappings that use Access Policy transformation.
Microsoft Azure Blob Storage
You can configure partition pruning when you configure the catalog source. Partition pruning detects the latest partitions and schemas in source systems and improves the performance of the catalog source because updates to partitions and schemas are verified incrementally.
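Conceptually, incremental verification means that only partitions that changed since the last run are re-examined. A rough Python sketch of the idea (not the product's implementation; all names are illustrative):

```python
def detect_partition_changes(cataloged: set, current: set) -> dict:
    """Compare the partitions recorded in the catalog with the partitions
    currently in the source, so only the differences need re-verification."""
    return {
        "added": sorted(current - cataloged),    # new partitions to extract
        "removed": sorted(cataloged - current),  # stale partitions to drop
        "skipped": len(cataloged & current),     # unchanged, not re-verified
    }
```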
Microsoft Azure Data Factory
You can extract metadata from the AppendVariable activity. You can also extract supported activities that include variables with the Array data type.
Oracle
You can add metadata extraction filter conditions based on packages.
PostgreSQL
This release includes the following enhancements:
- You can extract metadata from stored procedures.
- You can add metadata extraction filters based on stored procedures.
Salesforce
You can view the lookup relationships between fields and objects.
SAP Enterprise Resource Planning (ERP)
You can extract subpackages and their assets when you extract metadata from a package.
SAP HANA Database
You can perform connection assignment to view the lineage between an SAP HANA Database table or view and an SAP BW/4HANA DataSource or SAP BW DataSource.
Tableau
This release includes the following enhancements:
- You can extract metadata from Tableau workbooks with stored procedures.
- When you perform connection assignment, you can assign PostgreSQL as an endpoint catalog source.
Extract group elements from hierarchical JSON files
You can extract group elements from hierarchical JSON files using the Extract Group Elements from Hierarchical Files property for the following catalog sources:
Choose the JDK version to load metadata using Java SDK
When you configure a custom catalog source to load metadata into the catalog using the Java SDK, you can choose to run the JAR file with either JDK version 17 or JDK version 11. The default is JDK version 17.
Build custom JAR files with libraries that are compatible with JDK version 17. If the libraries used in the custom JAR file are incompatible with JDK version 17, choose JDK version 11.