Cannot connect to HDFS/Hadoop with Kerberos

What is the problem you are having with rclone?

When using rclone to connect to an HDFS remote with Kerberos authentication, I get the following error:

rclone -vv ls hdfsremote:/
2021/05/19 16:54:24 DEBUG : Using config file from "/home/<user>/.config/rclone/rclone.conf"
2021/05/19 16:54:24 DEBUG : rclone: Version "v1.55.1" starting with parameters ["rclone" "-vv" "ls" "hdfsremote:/"]
2021/05/19 16:54:24 DEBUG : Creating backend with remote "hdfsremote:/"
2021/05/19 16:54:24 Failed to create file system for "hdfsremote:/": no available namenodes: SASL handshake: [Root cause: KDC_Error] KDC_Error: TGS Exchange Error: kerberos error response from KDC when requesting for hdfs/<namenode>@<realm>: KRB Error: (7) KDC_ERR_S_PRINCIPAL_UNKNOWN Server not found in Kerberos database - LOOKING_UP_SERVER

I have run kinit as the user in the rclone config, and my krb5.conf works for other Kerberos commands. I have also ensured rclone is reading it by setting the environment variable KRB5_CONFIG=/etc/krb5.conf. Any ideas what may be causing this?
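One way to narrow this down outside of rclone (a sketch; <namenode> and <realm> are the same placeholders as above, and this assumes the MIT Kerberos client tools are installed) is to request a service ticket for the exact SPN with kvno, which should fail with the same KDC error if the principal is not registered:

```shell
# Confirm there is a valid TGT for the expected user and realm
klist

# Ask the KDC for a service ticket for the exact SPN rclone will use.
# If that SPN is not in the Kerberos database, this fails with
# "Server not found in Kerberos database", matching the rclone error.
kvno hdfs/<namenode>@<realm>
```

If kvno succeeds for the literal namenode hostname but rclone still fails, the SPN rclone sends to the KDC probably differs from the one you tested.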

What is your rclone version (output from rclone version)

rclone v1.55.1
- os/type: linux
- os/arch: amd64
- go/version: go1.16.3
- go/linking: static
- go/tags: none

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 20.04, 64-bit

Which cloud storage system are you using? (eg Google Drive)

HDFS (Hadoop, with Kerberos)
The command you were trying to run (eg rclone copy /tmp remote:tmp)

See above

The rclone config contents with secrets removed.

type = hdfs
namenode = <namenode>:8020
username = <user>
service_principal_name = hdfs/_HOST@<realm>
data_transfer_protection = authentication
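The `_HOST` placeholder in the SPN is a Hadoop Java client convention, and I'm not certain the Go HDFS library rclone uses performs that substitution; if it does not, the KDC is asked for the literal principal `hdfs/_HOST@<realm>`, which would produce exactly this KDC_ERR_S_PRINCIPAL_UNKNOWN error. One thing to try (a sketch, keeping the placeholders from the config above) is spelling out the namenode's hostname in the SPN instead of relying on `_HOST`:

```ini
[hdfsremote]
type = hdfs
namenode = <namenode>:8020
username = <user>
; literal hostname instead of the Hadoop-style _HOST placeholder;
; it must match the SPN registered in the KDC exactly (usually the FQDN)
service_principal_name = hdfs/<namenode>@<realm>
data_transfer_protection = authentication
```

Note that the KDC typically registers the SPN against the fully-qualified hostname, so the value here may need to be the FQDN even if a short name resolves.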

A log from the command with the -vv flag

See above

Take a look at this issue on the library the hdfs backend uses

It has some ideas to try.

It might be worth asking for help there if you still can't get it going.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.