Copy files using drive file ID

What is the problem you are having with rclone?

I'm trying to copy files from a share to my private drive. The share has multiple files and sub-directories, but the folders themselves are not shared, only the files are.
I was going to write a Python script to scrape the file IDs and folder structure and then use those to copy the files to my drive with a server-side copy, but I couldn't find any way to do this using a file ID in a remote.
I can't use --drive-root-folder-id= either, since the folders themselves aren't shared.

Is there any way I could pass rclone a file with a bunch of file IDs and destination paths and have it copy the files?

What is your rclone version (output from rclone version)

rclone v1.52.3-339-gb6d3cad7-beta
- os/arch: linux/amd64
- go version: go1.15

Are the files showing up in your shared-with-me folder?

If I open them manually then yes, but there are over 3K files and I wouldn't want to do that by hand.

If rclone doesn't support this I might just use Python and the Drive API, since it supports copying by ID. I just need to figure out what additional arguments to add so it copies to the correct folder rather than the root folder:
https://developers.google.com/drive/api/v3/reference/files/copy

Did you try --drive-shared-with-me? Does that show the files?

No, I meant: does something like rclone ls --drive-shared-with-me <your drive remote>: list the files? Note that the drive remote should point at your My Drive without any specific root folder set.

I tried it myself, based on what I understood of your question, and the files do show up for me.

If they do and you have the file names, then you can filter and copy from your My Drive remote (with the shared_with_me flag set to true) to the destination remote via a server-side copy.
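For example, a rough sketch of that approach, assuming a drive remote named gdrive:, a second remote gdrive-shared: configured with shared_with_me = true, and a file names.txt listing the wanted file names one per line (all of those names are hypothetical):

# list what shows up under "Shared with me"
rclone ls gdrive-shared:

# server-side copy only the listed files into your own My Drive
rclone copy gdrive-shared: gdrive:from-share \
  --include-from names.txt \
  --drive-server-side-across-configs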

Yeah, I know, but the problem is that it's a public share with the file IDs hidden in an API request and base64-encoded, and nothing is shared directly to my email. So the files wouldn't show up in Shared with me unless I opened each one manually with my account, which rules out --drive-shared-with-me.

The folder IDs of the parent folders containing those files aren't shared with anyone either, so you have to access each file manually by its file ID or use the site's UI to navigate and open it (I think it's a WordPress Google Drive plugin).

Got it. I don't think there is a solution for that right now. You can follow this issue to get notified when that feature is added:


Okay, so this works:

curl --request POST \
  'https://www.googleapis.com/drive/v3/files/[FILEID]/copy?key=[YOUR_API_KEY]' \
  --header 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
  --header 'Accept: application/json' \
  --header 'Content-Type: application/json' \
  --data '{"parents":["folderid"]}' \
  --compressed

Will probably just script this later 🙂
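For reference, a minimal sketch of what such a script could look like, assuming a plain-text file ids.txt holding one "FILEID FOLDERID" pair per line and an ACCESS_TOKEN variable with a valid OAuth token for the Drive scope (both names are placeholders, not anything from the thread):

#!/bin/sh
# Server-side copy each file by ID into the matching destination folder
# using the same files.copy endpoint as the curl call above.
while read -r file_id folder_id; do
  curl --request POST \
    "https://www.googleapis.com/drive/v3/files/${file_id}/copy" \
    --header "Authorization: Bearer ${ACCESS_TOKEN}" \
    --header 'Accept: application/json' \
    --header 'Content-Type: application/json' \
    --data "{\"parents\":[\"${folder_id}\"]}"
done < ids.txt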


This would be very easy to roll up into an rclone backend command for drive. Fancy having a go?

I got busy and never got around to it, but it looks like @ncw has already added a commit for this.

I did, yes. See the latest beta and the docs:

copyid

Copy files by ID

rclone backend copyid remote: [options] [<arguments>+]

This command copies files by ID

Usage:

rclone backend copyid drive: ID path
rclone backend copyid drive: ID1 path1 ID2 path2

It copies the drive file with ID given to the path (an rclone path which
will be passed internally to rclone copyto). The ID and path pairs can be
repeated.

The path should end with a / to indicate copy the file as named to
this directory. If it doesn't end with a / then the last path
component will be used as the file name.

If the destination is a drive backend then server side copying will be
attempted if possible.

Use the -i flag to see what would be copied before copying.
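Applied to the original question, a file of ID/path pairs can be fed straight to this command, since copyid accepts repeated pairs. A hedged sketch, assuming a hypothetical ids.txt with one "ID destination-path" pair per line and no spaces in the paths:

# expand every "ID path" pair in ids.txt into a single copyid invocation
xargs rclone backend copyid drive: < ids.txt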

