Questions? Ask them here! [full now - please start a new topic per question]

If you’ve got questions about rclone then ask them here.

Really love rclone, especially being able to automate backing up to my Amazon Drive.

I have been trying rclone crypt, but I keep running out of space on my drive. Does it encrypt the entire source path on a local drive before uploading? If so, is there a way to specify where that local path is? And shouldn't it be possible to upload encrypted files and remove them before moving on to encrypting new files, especially if using the don't-encrypt-filenames option?

Thanks, and once again great program.

Rclone doesn't make a local copy of the files it is uploading, except when uploading crypted files to B2.

In that case rclone stores the files in a temporary directory first. You can specify exactly where by setting the TMPDIR environment variable.
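For example (a sketch - the remote name b2crypt, the bucket and the paths are all hypothetical):

    TMPDIR=/mnt/bigdisk/tmp rclone copy /path/to/files b2crypt:bucket

That way the temporary copies go to /mnt/bigdisk/tmp instead of the default temp location.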

Hope that helps!

Nick

Hi, I'm trying to find a tool that supports two-factor authentication with Google Drive. Does rclone support it?
Thanks

I assume you mean 2-factor auth?

I think the oauth flow rclone uses to get its token will support 2fa but I haven’t tried it.

If you try it, please report back what happens!

Hi there!

First of all, thanks much for making this great tool. You are doing a great job.

Here are my questions (using rclone on Ubuntu 16.04):

  1. I have successfully configured an encrypted backup (rclone sync) to Yandex using a crypt remote. All looks OK (I have checked by accessing Yandex Disk on the web). What is not clear to me is how to get all the encrypted files in the cloud back onto my local disk in unencrypted form. Should I run crypt again? How?

  2. I guess I can set up a cron job to run rclone automatically. Right?

Thanks for your support.

Cheers.

Hi, I'm trying to install rclone to test 2-factor auth. I ran:
~ go get github.com/ncw/rclone

but the output I got from the terminal was:

# golang.org/x/crypto/poly1305
go/src/golang.org/x/crypto/poly1305/poly1305_amd64.s:8 6a: No such file or directory: textflag.h
# golang.org/x/net/context/ctxhttp
go/src/golang.org/x/net/context/ctxhttp/ctxhttp_pre17.go:36: req.Cancel undefined (type *http.Request has no field or method Cancel)
# golang.org/x/oauth2/jws
go/src/golang.org/x/oauth2/jws/jws.go:75: undefined: base64.RawURLEncoding
go/src/golang.org/x/oauth2/jws/jws.go:93: undefined: base64.RawURLEncoding
go/src/golang.org/x/oauth2/jws/jws.go:113: undefined: base64.RawURLEncoding
go/src/golang.org/x/oauth2/jws/jws.go:124: undefined: base64.RawURLEncoding
go/src/golang.org/x/oauth2/jws/jws.go:151: undefined: base64.RawURLEncoding
go/src/golang.org/x/oauth2/jws/jws.go:174: undefined: base64.RawURLEncoding
# google.golang.org/api/googleapi
go/src/google.golang.org/api/googleapi/googleapi.go:289: u.RawPath undefined (type *url.URL has no field or method RawPath)
# golang.org/x/sys/unix
go/src/golang.org/x/sys/unix/asm.s:8 6a: No such file or directory: textflag.h

Hey guys, I'm totally new to rclone and I have it running on a seedbox. However, I'm pretty rubbish with syntax and Linux in general.
I see the option to copy only a SINGLE file at a time instead of the default 4 in the available commands, I'm just not sure how to input it. How exactly would I edit this for a single-file copy?

rclone copy /media/sdv1/********/private/deluge/data/forsync acd:TVShows

Thanks in advance.
Dave

Never mind. Figured it out :slight_smile:

rclone --transfers=1 copy /media/sdv1/xxxxxxxx/private/deluge/data/forsync acd:TVShows

That’s it! You fixed it before I could reply.

  1. If you copied stuff to the cloud with

    rclone copy /path/to/files remote:path

then you just copy them back by reversing the syntax:

    rclone copy remote:path /path/to/restore

That works for any remote, not just crypt.

  2. Yes, rclone works fine in crontab.
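For example, a crontab entry for a nightly sync might look like this (a sketch - the schedule, paths and remote name are all hypothetical; using the full path to the rclone binary is a good idea under cron):

    # sync /path/to/files to the remote every night at 2am
    0 2 * * * /usr/local/bin/rclone sync /path/to/files remote:path >> /home/user/rclone.log 2>&1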

I would guess you are using an old version of Go - pre 1.5.

Either use the precompiled binaries at http://rclone.org/downloads or upgrade your Go - the latest, 1.7.1, is recommended.
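You can check which version you have installed with:

    go version

If it reports anything before 1.5, that is very likely the cause of those compile errors.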

Badly:
➜ rclone-v1.33-linux-amd64 ./rclone ls google-epfl:
2016/10/06 13:35:56 Failed to create file system for "google-epfl:": couldn't read info about Drive: googleapi: Error 403: The domain policy has disabled third-party Drive apps, domainPolicy
➜ rclone-v1.33-linux-amd64 ./rclone lsd google-epfl:
2016/10/06 13:36:06 Failed to create file system for "google-epfl:": couldn't read info about Drive: googleapi: Error 403: The domain policy has disabled third-party Drive apps, domainPolicy

Thanks! Worked perfectly.

What exactly is the difference between sync and copy? As sync is one direction only, isn't it basically just copying?

sync will delete files on the destination to make it match the source. copy never deletes files.

Otherwise they are identical. So in general use copy to recover files and only use sync to make the local and remote directories identical.
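A quick sketch of the difference (the paths and remote name are hypothetical):

    # copy: uploads new and changed files, never deletes anything on the destination
    rclone copy /home/user/photos remote:photos

    # sync: makes remote:photos an exact mirror of /home/user/photos,
    # deleting any remote files that no longer exist locally
    rclone sync /home/user/photos remote:photos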


Probably not a lot rclone can do about that :frowning:

Here is what Stack Overflow has to say

which indicates it might be something your administrator can enable.

I am facing some issues with Hubic. While everything worked great with Yandex, with Hubic I cannot see the files I copied.
remotehubic is the name of my remote for Hubic.
I copied a /home/E5540/test1 folder to remotehubic/backup. Everything was copied with crypt, so I was expecting to see some encrypted files and folders in the cloud.
Below is the output: from what I understand there are some subfolders under /backup, but a check on hubic.com does not return anything. I see the /backup folder but without any subfolders.

E5540:~$ rclone lsd remotehubic:
           0 0001-01-01 00:00:00         0 Documents
     3038805 0001-01-01 00:00:00        10 backup
      533642 0001-01-01 00:00:00         4 default
2016/10/06 15:44:27 
Transferred:      0 Bytes (0 Bytes/s)
Errors:                 0
Checks:                 0
Transferred:            0
Elapsed time:          2s
E5540:~$ rclone ls remotehubic:
2016/10/06 15:45:11 
Transferred:      0 Bytes (0 Bytes/s)
Errors:                 0
Checks:                 0
Transferred:            0
Elapsed time:       800ms

Edit:

command

rclone ls crypthubic:backup

returns:

E5540:~$ rclone ls crypthubic:backup
   533642 Vademecum_viaggiare_sicuri.pdf
    73879 sub1/Hockenheim.jpg
2016/10/06 15:58:06 
Transferred:      0 Bytes (0 Bytes/s)
Errors:                 0
Checks:                 0
Transferred:            0
Elapsed time:        3.9s

So that means the files I copied are there, but I cannot see them on hubic.com.

What did you put in the remote = line of your crypt remote? At a guess it says remotehubic:backup, which means that when you do rclone ls crypthubic:backup you are listing a directory inside the backup container - and that directory name will be encrypted too.

So just using crypthubic: instead of crypthubic:backup will probably fix it.
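For example, listing from the root of the crypt remote should show your files with their decrypted names:

    rclone ls crypthubic: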

I had a similar issue on Amazon Cloud Drive.
I made the mistake of configuring the crypt remote to point at
AmazonCloud: instead of AmazonCloud:someSubfolder.
Once I had corrected it, I had a folder on my Amazon Drive named someSubfolder containing all the files I had uploaded to my crypt remote.
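In other words, the corrected crypt section would look something like this (a sketch - cryptamazon is a hypothetical name, and the password lines are omitted):

    [cryptamazon]
    type = crypt
    remote = AmazonCloud:someSubfolder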

Still, something weird happens. I was able to upload to Hubic following your advice, so I copied a big (encrypted) file. rclone did not report any errors, but on hubic.com I cannot see that file, and it does not count against my used space. Thoughts?

What does your .rclone.conf look like? It should look something like this. Note the : in the remote = line and the bucket. If there isn’t a : in the remote line then you will just be copying files to your local disk.

[crypthubic]
type = crypt
remote = remotehubic:secretbucket
filename_encryption = off
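With a config like that you can sanity-check both layers (a sketch using the names above): the first command lists the containers on the underlying remote, and the second lists the files through the crypt layer, decrypting as it goes:

    rclone lsd remotehubic:
    rclone ls crypthubic: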