Sync Hubic backup with local folder

Hello everyone.
I would like to use rclone to synchronize the backup folders I have in the Hubic cloud with my local folders. I only need to recover a few gigabytes of data.
Could you tell me which command I should use?
Using the command

rclone lsd remote:

I managed to see the full list of folders, but I don't want to get the sync command wrong — it should copy from Hubic to my local server.

Thank you all.


I used this command as a test:

rclone copy adalab:HubiC-DeskBackup_VPN_CLIENTI /Users/andrea/hubiC/adalab/

but this is the error:

2017/07/29 18:15:42 Unsolicited response received on idle HTTP channel starting with "HTTP/1.0 408 Request Time-out\r\nCache-Control: no-cache\r\nConnection: close\r\nContent-Type: text/html\r\n\r\n408 Request Time-out\nYour browser didn't send a complete request in time.\n\n"; err=

and this error with the check command:

rclone check adalab:HubiC-DeskBackup_Documenti_Adalab /Users/andrea/hubiC/adalab/
2017/07/29 18:33:29 ERROR : : error listing: HTTP Error: 500: 500 Internal Error
2017/07/29 18:33:29 ERROR : Hubic Swift container HubiC-DeskBackup_Documenti_Adalab: Error building file list: error listing: Hubic Swift container HubiC-DeskBackup_Documenti_Adalab: HTTP Error: 500: 500 Internal Error
2017/07/29 18:33:29 Failed to check: error listing: Hubic Swift container HubiC-DeskBackup_Documenti_Adalab: HTTP Error: 500: 500 Internal Error


I've received the same error. I had to refresh the Hubic token via rclone config.

Also, with Hubic you need to prefix the path with the default container. You will notice this if you access Hubic through the web UI.
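For example, applying that to the command from the first post, it might look like this (assuming the folder actually lives under the default container — check the web UI to be sure where yours is):

```shell
# Copy from Hubic to local, prefixing the remote path with the "default" container.
rclone copy adalab:default/HubiC-DeskBackup_VPN_CLIENTI /Users/andrea/hubiC/adalab/
```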

For example, I have a logs folder in the root of my hubic.
The command to sync this folder locally is:

rclone copy hubic:default/logs/ .

Where hubic is the name of the remote configured in ~/.rclone.conf
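If you want a true one-way sync rather than a copy (i.e. also deleting local files that no longer exist on the remote), the equivalent would be something like the following — running with --dry-run first shows what would change without touching anything:

```shell
# Preview what sync would do before making any changes.
rclone sync hubic:default/logs/ ./logs --dry-run

# Then run it for real.
rclone sync hubic:default/logs/ ./logs
```

Note that sync makes the destination match the source exactly, so prefer copy if you don't want local deletions.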

I'm glad I'm not the only one left using Hubic! It seems pretty much dead and unmaintained by OVH these days :frowning:


Can you help me contact Hubic support?

I wrote an email to them, but no one answered me.

Do you think OVH virtual machines have fast access to the Hubic cloud?

I must download my data from Hubic because it is very important for my work, but there are 500 GB to download, spread across many folders.
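For a transfer that large I was thinking of running something like this, to limit parallelism and retry on the errors above (I'm not sure these are the right values — the flags are standard rclone options):

```shell
# Large copy from Hubic: few parallel transfers, aggressive retries
# to survive the 408/500 errors the server keeps returning.
rclone copy hubic:default/ /local/backup/ \
  --transfers 4 \
  --retries 10 \
  --low-level-retries 20
```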


Hubic used to work with other containers - I have some in my account. Now you just get 500 errors trying to create them.

I don’t know why they haven’t fixed the creating containers problem!

Oh, I did not know it was working with other containers.

I have read in the OVH forums (in French) that OVH was migrating the current Hubic backend to a new, stable API written in Golang (!)
That might be the reason for this broken feature.

Maybe you could ask on the forum to see if other container support is coming back?

Hi, has anyone managed to get this working? I have been copying files for the past week without any issues, and this same problem only started happening recently (via ACD copy and local drive copy).