Flat File Sources

A mapping that is running in a Hadoop environment can read a flat file source from a native environment.
Consider the following limitations when you configure the mapping to read a flat file source:

Generate the Source File Name

You can generate the source file name for the flat file data object. The content of the file name column remains consistent across different modes of execution.
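To illustrate the concept only (this is not how Big Data Management implements the feature, which you configure on the flat file data object in the Developer tool), the following minimal PySpark sketch shows a file name column being generated for each row read from a flat file on the Spark engine. The path is a hypothetical placeholder; substitute a file that exists in your cluster.

    # Minimal sketch, assuming PySpark is available. Only an illustration of a
    # generated file name column; the actual column is configured on the flat
    # file data object, not written by hand.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import input_file_name

    spark = SparkSession.builder.appName("filename-column-sketch").getOrCreate()

    # Hypothetical HDFS source file; replace with a real path.
    source_path = "hdfs://<host name>:<port>/hive/warehouse/ff.txt"

    df = (spark.read
          .option("header", "false")
          .csv(source_path)
          .withColumn("FileName", input_file_name()))  # path of the file each row came from

    df.show(truncate=False)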
When you push processing to a specific run-time engine, the file name column returns the path in the following formats, depending on the file type:
- Hive engine, HDFS source files: <staged path><HDFS file path>
  For example, hdfs://host name:port/hive/warehouse/ff.txt
- Hive engine, flat files in the local system: <local file path>
  For example, /home/devbld/Desktop/ff.txt
- Blaze engine, flat files in the local system: <staged path><local file path>
  For example, hdfs://host name:port/hive/warehouse/home/devbld/Desktop/ff.txt
- Spark engine, HDFS source files: hdfs://<host name>:<port>/<file name path>
  For example, hdfs://host name:port/hive/warehouse/ff.txt
- Spark engine, flat files in the local system: <local file path>
  For example, /home/devbld/Desktop/ff.txt
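As a sketch of how the staged formats above compose, the following Python snippet strips a known staged path prefix from the returned value to recover the original file path. The prefix and paths are hypothetical, taken only from the examples in this list; the real staged path depends on how the staging or warehouse directory is configured in your cluster.

    # Minimal sketch of recovering the original file path from a staged value.
    # The prefix below is only the example value from the list above.
    STAGED_PREFIX = "hdfs://host name:port/hive/warehouse"

    def original_path(file_name_value: str) -> str:
        """Return the original source path if the value is staged, else the value itself."""
        if file_name_value.startswith(STAGED_PREFIX):
            return file_name_value[len(STAGED_PREFIX):]
        return file_name_value

    # Blaze, local flat file staged to HDFS: recovers /home/devbld/Desktop/ff.txt
    print(original_path("hdfs://host name:port/hive/warehouse/home/devbld/Desktop/ff.txt"))
    # Spark, local flat file: the value is already the local path
    print(original_path("/home/devbld/Desktop/ff.txt"))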
For a high availability cluster, the file name column returns the path in the following format: hdfs://<host name>/<file name path>
For example, hdfs://irldv:5008/hive/warehouse/ff.txt
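If a downstream transformation only needs the file name itself, one hedged approach in plain Python (not a Big Data Management API) works for every format above, because HDFS URIs and local paths both use forward slashes:

    import posixpath

    # Extracts the file name from HDFS URIs and local paths alike,
    # including the high availability format shown above.
    for value in (
        "hdfs://irldv:5008/hive/warehouse/ff.txt",   # example from this section
        "/home/devbld/Desktop/ff.txt",               # local flat file path
    ):
        print(posixpath.basename(value))             # -> ff.txt in both cases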