MappingAdvancedStats
With the MappingAdvancedStats REST API, you can view mapping summary and detailed statistics, such as the job throughput for mappings that run in the native or Hadoop environment.
GET Request
To request information from the MappingAdvancedStats resource, use the following URL:
<RESTOperationsHubService_Host>:<RESTOperationsHubService_Port>/RestOperationsHub/services/v1/MappingService/MappingAdvancedStats('jobId')
The following table describes the attributes in the MappingAdvancedStats Get URL:
| Field | Type | Description |
|---|---|---|
| userName | String | Required. User name to connect to the domain. You can pass the input value as a header. |
| encryptedpassword | String | Required. Password for the user. Encrypt the password with the pmpasswd command line program. You can pass the input value as a header. |
| securityDomain | String | Optional. The security domain to which the domain user belongs. You can pass the input value as a header. Default is Native. |
| jobId | String | Required. The argument of the entity that contains the ID for the mapping. You can pass the property as part of the URL. |
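As a sketch, the request can be issued from a Python script. The host, port, job ID, and credential values below are placeholders for your environment, and the header names follow the userName, encryptedpassword, and securityDomain attributes described above:

```python
import urllib.request

def build_request(host, port, job_id):
    """Build the MappingAdvancedStats GET URL and request headers."""
    url = (f"http://{host}:{port}/RestOperationsHub/services/v1"
           f"/MappingService/MappingAdvancedStats('{job_id}')")
    headers = {
        "userName": "Administrator",         # domain user
        "encryptedpassword": "<encrypted>",  # output of the pmpasswd program
        "securityDomain": "Native",          # optional; defaults to Native
    }
    return url, headers

def fetch_stats(host, port, job_id):
    """Send the GET request and return the raw JSON response body."""
    url, headers = build_request(host, port, job_id)
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

if __name__ == "__main__":
    # Placeholder host, port, and job ID; replace with real values.
    print(fetch_stats("rohs.example.com", 6080, "_TNoO9ELEeiimY76kFyfuw"))
```

The returned body is the JSON document described in the response table below.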
Get Response
Returns MappingAdvancedStats information for the specified job ID.
The following table describes the MappingAdvancedStats attributes in the body of the response for the native or Hadoop environment:
| Field | Type | Description |
|---|---|---|
| jobId | String | The argument of the entity that contains the ID for the mapping. |
| mappingStat | n/a | Container for mapping run information, such as the mapping name, start time, and end time. |
| status | String | State of the job run. |
| mappingName | String | Name of the mapping. |
| applicationName | String | Name of the application. |
| serviceName | String | Name of the service. |
| logFileName | String | Location of the log file. |
| startTime | Integer | Start time of the job. |
| endTime | Integer | End time of the job. |
| executorType | String | Type of the run-time environment where you run the mapping. |
| executingNode | String | The Data Integration Service node where the job ran. |
| userName | String | User who started the job. |
| securityDomain | String | The security domain to which the domain user belongs. |
| detailedStats | n/a | Container for detailed mapping statistics. |
| memoryData | String | Metrics for memory usage. |
| createdTime | Integer | Time when the job was created. |
| cpuData | Integer | Metrics for CPU usage. |
| sourceTargetStats | n/a | Container for source and target statistics. |
| instanceName | String | Object name for the source or target. |
| isSource | Boolean | Indicates whether the object is a source. Returns true for a source and false for a target. |
| bytes | Integer | Average number of bytes read per second for the source or target. |
| rows | Integer | Number of rows read for the source or target. |
| lastPurgeTime | Integer | Time of the last purge task. |
| summaryStats | n/a | Container for the summary statistics. |
| processStatSummary | n/a | Container for the CPU and memory usage. |
| avgCpuUsage | Integer | Average CPU usage. |
| avgMemUsage | Integer | Average memory usage. |
| sourceTxStats | n/a | Container for the source statistics. |
| groupStats | n/a | Container for processed and throughput statistics. |
| processedBytes | Integer | Number of bytes processed for the source or target. |
| processedRows | Integer | Number of rows processed for the source or target. |
| bytesThrougput | Integer | Throughput in bytes for the source or target. |
| rowsThroughput | Integer | Throughput in rows for the source or target. |
| sourceThroughput | Integer | Mean rate at which messages are ingested from the source. |
| targetThroughput | Integer | Mean rate at which messages are written to the target. |
| errorRows | Integer | Number of error rows for the source or target. |
| errorBytes | Integer | Number of error bytes for the source or target. |
| groupName | String | Name of the group for the source or target. |
| firstRowTime | Integer | Time taken to process the first row for the source or target. |
| targetTxStats | n/a | Container for the target statistics. |
Sample Retrieve Advanced Mapping Statistics
This sample use case retrieves details of the advanced mapping statistics for a mapping that runs in the Spark environment.
You can use the REST API to retrieve advanced mapping statistics with the following request URL for a mapping with the job ID _TNoO9ELEeiimY76kFyfuw:
<RESTOperationsHubService_Host>:<RESTOperationsHubService_Port>/RestOperationsHub/services/v1/MappingService/MappingAdvancedStats('_TNoO9ELEeiimY76kFyfuw')
Advanced Mapping Statistics Output
```json
{
  "@odata.context": "$metadata#MappingAdvancedStats/$entity",
  "jobId": "_TNoO9ELEeiimY76kFyfuw",
  "mappingStat": {
    "status": "COMPLETED",
    "mappingName": "HDFSTgtAppend_MultiPartition_SparkMode",
    "applicationName": "HDFSTargetAppend",
    "serviceName": "DIS_HDP_2.6",
    "logFileName": "/data/Informatica/10.2.2_252/logs/node252/services/DataIntegrationService/disLogs/ms/DEPLOYED_MAPPING_HDFSTargetAppend_HDFSTgtAppend_MultiPartition_SparkMode-TNoO9ELEeiimY76kFyfuw_20181016_115325_006.log",
    "startTime": 1539671005830,
    "endTime": 1539671244752,
    "executorType": "SPARK",
    "executingNode": "node252",
    "userName": "Administrator",
    "securityDomain": "Native"
  },
  "detailedStats": {
    "memoryData": [],
    "createdTime": [
      1539671058384
    ],
    "cpuData": [],
    "sourceTargetStats": [
      {
        "instanceName": "Read_students_5",
        "isSource": true,
        "bytes": [
          -1
        ],
        "rows": [
          10
        ]
      },
      {
        "instanceName": "Read_students_HDFS_src",
        "isSource": true,
        "bytes": [
          -1
        ],
        "rows": [
          10
        ]
      },
      {
        "instanceName": "Read_student",
        "isSource": true,
        "bytes": [
          -1
        ],
        "rows": [
          10
        ]
      },
      {
        "instanceName": "Write_HDFSAppendTarget",
        "isSource": false,
        "bytes": [
          -1
        ],
        "rows": [
          28
        ]
      }
    ],
    "lastPurgeTime": 0
  },
  "summaryStats": {
    "processStatSummary": {
      "avgCpuUsage": 0,
      "avgMemUsage": 0
    },
    "sourceTxStats": [
      {
        "instanceName": "Read_students_5",
        "groupStats": [
          {
            "processedBytes": -1,
            "processedRows": 10,
            "bytesThrougput": -1,
            "rowsThroughput": 10,
            "errorRows": -1,
            "errorBytes": -1,
            "groupName": "Read_students_5",
            "firstRowTime": 0
          }
        ]
      },
      {
        "instanceName": "Read_student",
        "groupStats": [
          {
            "processedBytes": -1,
            "processedRows": 10,
            "bytesThrougput": -1,
            "rowsThroughput": 10,
            "errorRows": -1,
            "errorBytes": -1,
            "groupName": "Read_student",
            "firstRowTime": 0
          }
        ]
      },
      {
        "instanceName": "Read_students_HDFS_src",
        "groupStats": [
          {
            "processedBytes": -1,
            "processedRows": 10,
            "bytesThrougput": -1,
            "rowsThroughput": 5,
            "errorRows": -1,
            "errorBytes": -1,
            "groupName": "Read_students_HDFS_src",
            "firstRowTime": 0
          }
        ]
      }
    ],
    "targetTxStats": [
      {
        "instanceName": "Write_HDFSAppendTarget",
        "groupStats": [
          {
            "processedBytes": -1,
            "processedRows": 28,
            "bytesThrougput": -1,
            "rowsThroughput": 14,
            "errorRows": -1,
            "errorBytes": -1,
            "groupName": "Write_HDFSAppendTarget",
            "firstRowTime": 0
          }
        ]
      }
    ]
  }
}
```
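A response like the sample output above can be post-processed in a script, for example to total the processed row counts per direction. A minimal sketch, assuming the response body has already been fetched; a trimmed subset of the sample output is embedded here for illustration:

```python
import json

# Trimmed subset of the sample MappingAdvancedStats response above.
SAMPLE = """
{
  "jobId": "_TNoO9ELEeiimY76kFyfuw",
  "summaryStats": {
    "sourceTxStats": [
      {"instanceName": "Read_students_5",
       "groupStats": [{"processedRows": 10, "rowsThroughput": 10}]},
      {"instanceName": "Read_student",
       "groupStats": [{"processedRows": 10, "rowsThroughput": 10}]},
      {"instanceName": "Read_students_HDFS_src",
       "groupStats": [{"processedRows": 10, "rowsThroughput": 5}]}
    ],
    "targetTxStats": [
      {"instanceName": "Write_HDFSAppendTarget",
       "groupStats": [{"processedRows": 28, "rowsThroughput": 14}]}
    ]
  }
}
"""

def total_rows(stats, key):
    """Sum processedRows over every group in sourceTxStats or targetTxStats."""
    return sum(group["processedRows"]
               for tx in stats["summaryStats"][key]
               for group in tx["groupStats"])

stats = json.loads(SAMPLE)
print(total_rows(stats, "sourceTxStats"))  # 30 rows read across the three sources
print(total_rows(stats, "targetTxStats"))  # 28 rows written to the target
```

The same traversal works on a full response because summaryStats, sourceTxStats, targetTxStats, and groupStats are the container fields described in the response table.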