
Hadoop Files V2 connector administration

Before you use Hadoop Files V2 Connector in tasks, an administrator must complete the following prerequisite tasks:

Access a Kerberos-enabled Hadoop cluster

Configure the /etc/hosts file and copy the Kerberos configuration file for HDFS instances that use Kerberos authentication.
  1. On the Linux machine that hosts the Secure Agent, open the /etc/hosts file.
  2. To configure the Secure Agent to work with the Kerberos Key Distribution Center (KDC), add entries for the KDC hosts in the /etc/hosts file. See the first sketch after this list.
  3. Copy the krb5.conf configuration file from the /etc directory on the Hadoop cluster node to the following location:
     <Secure Agent installation directory>/apps/jdk/zulu<latest_version>/jre/lib/security
     If the Secure Agent is already installed, copy to: <Secure Agent installation directory>/apps/jdk/jre/lib/security
  4. If the cluster is SSL enabled, import the certificate alias file to the following location (see the keytool sketch after this list):
     <Secure Agent installation directory>/jdk/jre/lib/security/cacerts
     If the Secure Agent is already installed, import to: <Secure Agent installation directory>/apps/jdk/zulu<latest_version>/jre/lib/security/cacerts
  5. Restart the Secure Agent.
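
The following is a minimal sketch of steps 2 and 3 run on the Secure Agent machine. The IP address, KDC hostname, cluster node hostname, and agent install directory are assumptions; substitute the values for your environment, and replace zulu<latest_version> with the JDK version shipped with your agent.

  # Assumed Secure Agent install location; adjust to your environment
  AGENT_HOME=/opt/infaagent
  # Step 2: add an entry for the KDC host (example IP and hostname)
  echo "10.0.0.15  kdc01.example.com  kdc01" | sudo tee -a /etc/hosts
  # Step 3: copy krb5.conf from the cluster node into the agent's JDK security directory
  scp root@hadoop-master.example.com:/etc/krb5.conf \
      "$AGENT_HOME/apps/jdk/zulu<latest_version>/jre/lib/security/"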
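
For step 4, the certificate can be imported with the JDK keytool utility. A sketch, assuming the cluster certificate was exported to cluster-cert.cer, an alias of hadoop-cluster, and the default truststore password changeit; all three are examples:

  # Import the cluster's SSL certificate into the agent's JDK truststore
  AGENT_HOME=/opt/infaagent   # assumed install location, as above
  keytool -importcert -trustcacerts \
      -alias hadoop-cluster \
      -file cluster-cert.cer \
      -keystore "$AGENT_HOME/apps/jdk/zulu<latest_version>/jre/lib/security/cacerts" \
      -storepass changeit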

Access a non-Kerberos-enabled Hadoop cluster

If you have a Cloudera, Amazon EMR, Hortonworks, or Microsoft HDInsight instance that does not use Kerberos authentication and runs in a Hadoop cluster environment, perform the following steps:
  1. Copy the configuration files from the /etc/hadoop/conf directory on the Hadoop cluster node and place them in a directory on the Secure Agent machine with full permissions, as shown in the sketch below.
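
A minimal sketch of this step, assuming a cluster node hostname of hadoop-master.example.com and a target directory of /opt/infaagent/hadoop-conf; both are examples, not fixed values:

  # Copy the Hadoop client configuration files from the cluster node
  mkdir -p /opt/infaagent/hadoop-conf
  scp root@hadoop-master.example.com:/etc/hadoop/conf/* /opt/infaagent/hadoop-conf/
  # Grant full permissions so the Secure Agent can read the files
  chmod -R 777 /opt/infaagent/hadoop-conf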