[ERROR] Exception: java.io.IOException: Too many open files
When you run a mapping on a Linux machine to read a large file, the mapping might fail with the following error:
[ERROR] Exception: java.io.IOException: Too many open files
To resolve this issue, perform the following steps:
1. Increase the value of file-max, which is the maximum number of file descriptors enforced at the kernel level.
To change the file descriptor setting, edit the kernel parameter file /etc/sysctl.conf and add fs.file-max=[new value] to it.
For example:
# vi /etc/sysctl.conf
fs.file-max = 400000
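After you save the file, you can load the new kernel setting without a reboot and then verify it. A minimal sketch, assuming you have root privileges:
# sysctl -p
fs.file-max = 400000
# cat /proc/sys/fs/file-max
400000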
2. Set the ulimit. The ulimit value must be less than the file-max value.
To change the ulimit setting, edit the file /etc/security/limits.conf and set the hard and soft limits in it.
For example:
# vi /etc/security/limits.conf
* soft nofile 40000
* hard nofile 40000
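The new limits take effect for new login sessions. To verify the setting, check the soft and hard open-file limits in a fresh shell, for example:
# ulimit -n
40000
# ulimit -Hn
40000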
When I write a JSON file, the mapping task fails with a Java heap space error.
When you write a JSON file of size 1 GB or more, the task fails with a Java heap space error.
To resolve this issue, increase the -Xms and -Xmx values in the JVM options of type DTM in the system configuration details of the Secure Agent.
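For example, the DTM entries might look like the following after the change. The property names JVMOption1 and JVMOption2 and the sizes shown are illustrative; use whichever JVMOption slots are free in your Secure Agent configuration and size the heap for your data volume:
JVMOption1 = -Xms1024m
JVMOption2 = -Xmx4096m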
When I use the option to create a new target at runtime to write an Avro file, the schema is created with primitive data types and does not provide an option to include null values.
You must manually edit the schema to allow null values as required. For example:
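A minimal sketch of such an edit; the field name is illustrative. A generated field definition with a primitive type:
{"name": "address", "type": "string"}
can be changed to a union type so that the field also accepts null values:
{"name": "address", "type": ["string", "null"]}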