How do I copy files from public Gdrive links to my Gdrive account?

Can you post the command and the debug output?

As for server-side, it worked for me:

```
2020/12/22 15:01:58 DEBUG : rclone: Version "v1.54.0-beta.4772.61a166d90.fix-3625-drive-copyid" starting with parameters ["C:\data\rclone\versions\rclone-v1.54.0-beta.4772.61a166d90.fix-3625-drive-copyid-windows-amd64\rclonec.exe" "backend" "copyid" "gdrive-a1b2:" "1tLBl3rO-kvtDPPXBGomhU2DBGDkLouuc" "gdrive-a1b2:test/" "-vv"]
2020/12/22 15:01:58 DEBUG : Using config file from "c:\data\rclone\scripts\rclone.conf"
2020/12/22 15:01:59 DEBUG : Creating backend with remote "gdrive-a1b2:test/"
2020/12/22 15:01:59 DEBUG : fs cache: renaming cache item "gdrive-a1b2:test/" to be canonical "gdrive-a1b2:test"
2020/12/22 15:02:00 DEBUG : teste.rar: MD5 = 8e4d0ee4f6512742052010edbbe33b15 OK
2020/12/22 15:02:00 INFO : teste.rar: Copied (server side copy)
```

well, that makes you a fellow rcloner, as we all do that once, but never twice :upside_down_face:


```
DEBUG : rclone: Version "v1.54.0-beta.4969.5ae5e1dd5" starting with parameters ["rclone" "-P" "backend" "copyid" "KGB:" "1kN4-4o-Y6hBIzBbsuJVI0_YvBFvYBAXm" "/patch/to/dir/drive/Shareddrives/Marv/" "-v" "--drive-server-side-across-configs" "-vv"]
2020-12-22 17:56:00 DEBUG : Using config file from "C:\Users\Uchiha\.config\rclone\rclone.conf"
2020-12-22 17:56:00 DEBUG : Creating backend with remote "/patch/to/dir/drive/Shareddrives/Marv/"
2020-12-22 17:56:00 DEBUG : fs cache: renaming cache item "/patch/to/dir/drive/Shareddrives/Marv/" to be canonical "//?/C:/patch/to/dir/drive/Shareddrives/Marv"
2020-12-22 17:56:00 INFO : Writing sparse files: use --local-no-sparse or --multi-thread-streams 0 to disable
2020-12-22 17:56:00 DEBUG : test2.rar: Starting multi-thread copy with 4 parts of size 473.938M
2020-12-22 17:56:00 DEBUG : test2.rar: multi-thread copy: stream 4/4 (1490878464-1987615966) size 473.726M starting
2020-12-22 17:56:00 DEBUG : test2.rar: multi-thread copy: stream 1/4 (0-496959488) size 473.938M starting
2020-12-22 17:56:00 DEBUG : test2.rar: multi-thread copy: stream 3/4 (993918976-1490878464) size 473.938M starting
2020-12-22 17:56:00 DEBUG : test2.rar: multi-thread copy: stream 2/4 (496959488-993918976) size 473.938M starting
Transferred: 31.668M / 1.851 GBytes, 2%, 6.232 MBytes/s, ETA 4m59s
Transferred: 0 / 1, 0%
Elapsed time: 5.6s
Transferring:
  •                                 test2.rar:  1% /1.851G, 6.315
```

Server side only works between two remotes using the same backend.

In your case, you are downloading from gdrive to local, so that does not apply.
There is no --magic flag to get around that.

Sorry, I'm not sure I understand you. I can't use server side, is that it?

Based on your command, you cannot use server side.

if source = gdrive and dest = gdrive then server side possible

if source = gdrive and dest = local then server side not possible
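The rule above can be sketched as a tiny check (purely illustrative, not an rclone feature — the function name and backend labels here are made up for the example):

```shell
# Illustrative only: server-side copy requires the same backend type
# on both ends of the transfer.
server_side_possible() {
  [ "$1" = "$2" ]   # $1 = source backend, $2 = destination backend
}

server_side_possible gdrive gdrive && echo "server side possible"      # gdrive -> gdrive
server_side_possible gdrive local  || echo "server side not possible"  # gdrive -> local
```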

Got it. I think I'd better let it go. I wanted to copy other people's files to my drive using rclone, as I participate in a file-sharing forum. And it is very annoying to copy files using Google Drive itself, because all the files end up with "Copy of ..." prepended to their names.

I appreciate your help. I learned some things that I didn't know.

Thanks!

I'm trying to copy a folder with --drive-server-side-across-configs, but it keeps giving 403 Forbidden.

I'm copying from a public link to my own teamdrive, and I have all the permissions for that.

If I use "Add to my Drive" first, then --drive-server-side-across-configs works fine.

```
rclone copy --drive-root-folder-id 1yvNxQ1J3xbeXmJ-REDACTED 1: 1:processing/tmp --stats 2s -vv
```

My rclone remote is called 1:. I'm using rclone beta v1.54

Can you post the debug log?

The --drive-root-folder-id applies to both the source 1: and the dest 1:. You can work around this by making a copy of the 1: remote called 1copy:, then using:

```
export RCLONE_CONFIG_1COPY_ROOT_FOLDER_ID=1y....XXX
rclone copy 1copy: 1:processing/tmp --stats 2s -vv
```

Hello, I need some help with the same issue. The remote name is testdrive, which is my main Google Drive remote, and it gives me an error. Kindly help.

rclone version used: rclone v1.54.0-beta.5062.db2c38b21

```
DEBUG : rclone: Version "v1.54.0-beta.5062.db2c38b21" starting with parameters ["rclone" "backend" "copyid" "testdrive:" "1rE6r6XXXXXXXXXXXXMpH5r" "testdrive:000000/" "-P" "-vv"]
2021-01-15 05:37:40 DEBUG : Using config file from "C:\Users\shash\.config\rclone\rclone.conf"
Transferred: 0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors: 1 (retrying may help)
Elapsed time: 0.8s
2021/01/15 05:37:40 DEBUG : 4 go routines active
2021/01/15 05:37:40 Failed to backend: command "copyid" failed: failed copying "1rE6rXXXXXXXXXXXXXXMpH5r" to "testdrive:000000/": can't copy directory use: rclone copy --drive-root-folder-id 1rE6r6XXXXXXXXXXXXXXXMpH5r testdrive: testdrive:000000/
```

You can't copy a directory with copyid - it told you what to run instead in the error:

```
rclone copy --drive-root-folder-id 1rE6r6XXXXXXXXXXXXXXXMpH5r testdrive: testdrive:000000/
```

I tried that too, but that was giving me some sort of error:

```
rclone copy --drive-root-folder-id 1rE6rXXXXXXXXXXXXMpH5r testsdbackup: testsdbackup:000000/ -vv

2021/01/16 22:43:19 DEBUG : rclone: Version "v1.54.0-beta.5062.db2c38b21" starting with parameters ["rclone" "copy" "--drive-root-folder-id" "1rE6rXXXXXXXXXXXXMpH5r" "testsdbackup:" "testsdbackup:000000/" "-vv"]
2021/01/16 22:43:19 DEBUG : Using config file from "C:\Users\shash\.config\rclone\rclone.conf"
2021/01/16 22:43:19 DEBUG : Creating backend with remote "testsdbackup:"
2021/01/16 22:43:19 DEBUG : Creating backend with remote "testsdbackup:000000/"
2021/01/16 22:43:20 DEBUG : fs cache: renaming cache item "testsdbackup:000000/" to be canonical "testsdbackup:000000"
2021/01/16 22:43:28 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2021/01/16 22:43:28 DEBUG : pacer: Rate limited, increasing sleep to 1.685336408s
2021/01/16 22:43:29 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2021/01/16 22:43:29 DEBUG : pacer: Rate limited, increasing sleep to 2.06032354s
2021/01/16 22:43:30 DEBUG : pacer: low level retry 3/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2021/01/16 22:43:30 DEBUG : pacer: Rate limited, increasing sleep to 4.42899473s
2021/01/16 22:43:35 DEBUG : pacer: Reducing sleep to 0s
2021/01/16 22:43:43 NOTICE: Building Web Applications with Node.js and Express 4.0 (UPDATE).rar: Duplicate object found in source - ignoring
2021/01/16 22:43:43 NOTICE: C# Fundamentals with C# 5.0.rar: Duplicate object found in source - ignoring
2021/01/16 22:43:43 NOTICE: SANS 660 - Kubuntu - 7.10 - Gutsy.tar.gz: Duplicate object found in source - ignoring
2021/01/16 22:43:43 NOTICE: SANS 660 - Kubuntu 6.10 - Edgy.tar.gz: Duplicate object found in source - ignoring
2021/01/16 22:43:43 NOTICE: SANS 710 - Red Hat 8.0 - Psyche.tar.gz: Duplicate object found in source - ignoring
2021/01/16 22:43:43 NOTICE: SANS 760 - Kubuntu - 12.04 - Precise Pangolin.tar.gz: Duplicate object found in source - ignoring
2021/01/16 22:43:43 DEBUG : Google drive root '000000': Waiting for checks to finish
2021/01/16 22:43:43 DEBUG : Google drive root '000000': Waiting for transfers to finish
2021/01/16 22:43:45 ERROR : Linux for Network Engineers - Part 1.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:43:49 ERROR : 0-Day Week of 2018.07.25.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:43:50 ERROR : 1 Hour HTML.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:43:51 ERROR : 1 Hour JavaScript.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:43:55 ERROR : 1. Google Certified Professional - Cloud Architect - Part 1.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:44:00 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2021/01/16 22:44:00 DEBUG : pacer: Rate limited, increasing sleep to 1.494162261s
2021/01/16 22:44:00 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2021/01/16 22:44:00 DEBUG : pacer: Rate limited, increasing sleep to 2.916039647s
2021/01/16 22:44:02 DEBUG : pacer: Reducing sleep to 0s
2021/01/16 22:44:05 ERROR : 1. Programming Foundations - Fundamentals.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:44:09 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2021/01/16 22:44:09 DEBUG : pacer: Rate limited, increasing sleep to 1.644867452s
2021/01/16 22:44:10 DEBUG : pacer: low level retry 2/10 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2021/01/16 22:44:10 DEBUG : pacer: Rate limited, increasing sleep to 2.034379908s
2021/01/16 22:44:11 DEBUG : pacer: Reducing sleep to 0s
2021/01/16 22:44:11 ERROR : 10. Jenkins Quick Start.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:44:14 ERROR : 10. PHP Essential Training.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:44:15 ERROR : 11. Azure IoT Essentials.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:44:18 ERROR : 11. PHP with MySQL Essential Training - 1 The Basics.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:44:19 ERROR : 12. AWS Backup Strategies.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:44:20 INFO :
Transferred: 0 / 1.864 TBytes, 0%, 0 Bytes/s, ETA -
Errors: 11 (retrying may help)
Transferred: 0 / 2336, 0%
Elapsed time: 1m1.7s
Transferring:
    1. PHP with MySQL Ess…ng - 2 Build a CMS.rar: 0% /52.384M, 0/s, -
    1. Google Cloud Platf…m for the AWS User.rar: 0% /351.227M, 0/s, -
    1. Programming Founda…ons - Web Security.rar: 0% /30.209M, 0/s, -
  •      14. Implementing Azure Functions.rar:  0% /1.061G, 0/s, -

2021/01/16 22:44:20 ERROR : 12. PHP with MySQL Essential Training - 2 Build a CMS.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:44:21 ERROR : 13. Google Cloud Platform for the AWS User.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
2021/01/16 22:44:25 ERROR : 13. Programming Foundations - Web Security.rar: Failed to copy: failed to make directory: googleapi: Error 403: Forbidden, forbidden
```
...and it keeps going like that, with the error count increasing:

Errors: 65 (retrying may help)

It has copied some things, but others are getting 403 errors.

I think this error is because rclone is trying to recurse into directories it doesn't have permission to recurse into.

What can be done now? I just have to copy stuff from a public link to my gdrive.
If there is any other command, please let me know.

Please read:

You can only use copyid to copy a file, so point it at a file rather than a directory.

What if there are so many files? Like around 1000.
What I want to know is a command that can copy any publicly shared folder.
Or if there is any other solution, please let me know.

Write a small script to run through them.
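For example, a minimal sketch of such a script. The list file `ids.txt`, the remote name `testdrive:` and the destination path `000000/` are placeholders taken from this thread; substitute your own:

```shell
# Minimal sketch: read one Drive file ID per line from the file given as $1
# and server-side copy each into the destination remote path given as $2.
# "testdrive:" is a placeholder remote name; use whatever your rclone.conf defines.
copy_ids() {
  ids_file=$1; dest=$2
  while IFS= read -r id; do
    [ -n "$id" ] || continue   # skip blank lines
    rclone backend copyid "testdrive:" "$id" "$dest" -v
  done < "$ids_file"
}

# usage: copy_ids ids.txt testdrive:000000/
```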

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.