How to copy multiple files from directory using API

What is the problem you are having with rclone?

I can't copy multiple files using API from SFTP to GCP bucket.

If I try to run operations/copyfile on a directory, it tells me that it is a directory and not a file:

curl --request POST --url http://localhost:5572/operations/copyfile --header 'content-type: application/json' --data '{"srcFs": "TestFS:", "srcRemote": "To_Test/", "dstFs": "Test-nonProd:", "dstRemote": "test-sftp-test/test/", "_config": { "DryRun": true }}'                                                         
{
	"error": "is a directory not a file",
	"input": {
		"_config": {
			"DryRun": true
		},
		"dstFs": "Test-nonProd:",
		"dstRemote": "test-sftp-test/test/",
		"srcFs": "TestFS:",
		"srcRemote": "To_Test/"
	},
	"path": "operations/copyfile",
	"status": 500
}

I also can't use sync/copy to the bucket, as it fails with an invalid bucket name error:

curl --request POST --url http://localhost:5572/sync/copy --header 'content-type: application/json' --data '{"srcFs": "TestFS:", "srcRemote": "To_Test", "dstFs": "Test-nonProd:", "dstRemote": "test-sftp-test/test/", "_filter": { "IncludeRule": [ "*csv" ] }}'
ERROR : To_Test/mytest.csv: Failed to copy: googleapi: Error 400: Invalid bucket name: 'To_Test', invalid
ERROR : rc: "sync/copy": error: googleapi: Error 400: Invalid bucket name: 'To_Test', invalid

Run the command 'rclone version' and share the full output of the command.

~ rclone version
rclone v1.62.2
- os/version: darwin 13.0.1 (64 bit)
- os/kernel: 22.1.0 (x86_64)
- os/type: darwin
- os/arch: amd64
- go/version: go1.20.2
- go/linking: dynamic
- go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Google Cloud Storage (destination) and an SFTP server (source)

The command you were trying to run (eg rclone copy /tmp remote:tmp)

curl --request POST --url http://localhost:5572/operations/copyfile --header 'content-type: application/json' --data '{"srcFs": "TestFS:", "srcRemote": "To_Test/", "dstFs": "Test-nonProd:", "dstRemote": "test-sftp-test/test/", "_config": { "DryRun": true }}'                                                         

The rclone config contents with secrets removed.

Paste config here

A log from the command with the -vv flag

{
	"error": "is a directory not a file",
	"input": {
		"_config": {
			"DryRun": true
		},
		"dstFs": "Test-nonProd:",
		"dstRemote": "test-sftp-test/test/",
		"srcFs": "TestFS:",
		"srcRemote": "To_Test/"
	},
	"path": "operations/copyfile",
	"status": 500
}

Use sync/copy to copy a directory; operations/copyfile is only for copying single files.
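For a single file, a per-file operations/copyfile call of roughly this shape should work. This is a sketch: the file name mytest.csv is taken from the error log above, and the server address assumes the default rclone rcd port. The JSON body is built into a variable first so it can be inspected before sending.

```shell
# operations/copyfile needs srcRemote and dstRemote to each name a
# file, not a directory. DryRun is kept from the original command.
PAYLOAD='{"srcFs": "TestFS:", "srcRemote": "To_Test/mytest.csv", "dstFs": "Test-nonProd:", "dstRemote": "test-sftp-test/test/mytest.csv", "_config": {"DryRun": true}}'
echo "$PAYLOAD"
# Requires an rclone rc server listening on localhost:5572:
# curl --request POST --url http://localhost:5572/operations/copyfile \
#      --header 'content-type: application/json' --data "$PAYLOAD"
```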

Google Cloud Storage needs a bucket name as the first path element, and bucket names must be lowercase. sync/copy only takes srcFs and dstFs (it ignores srcRemote and dstRemote), so with both Fs strings pointing at the remote roots, rclone tried to create a destination bucket called To_Test to mirror the source directory, which is an illegal name.
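A corrected sync/copy call would therefore put the directory paths inside srcFs and dstFs themselves. This is a sketch based on the paths in the original post (the bucket test-sftp-test is already lowercase and valid); the body is built into a variable so it can be checked before sending.

```shell
# Directory copy via sync/copy: paths go in the Fs strings, since
# sync/copy has no srcRemote/dstRemote parameters.
PAYLOAD='{"srcFs": "TestFS:To_Test", "dstFs": "Test-nonProd:test-sftp-test/test/", "_filter": {"IncludeRule": ["*csv"]}}'
echo "$PAYLOAD"
# Requires an rclone rc server listening on localhost:5572:
# curl --request POST --url http://localhost:5572/sync/copy \
#      --header 'content-type: application/json' --data "$PAYLOAD"
```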

Yes, sync/copy works for uploading with a filter. Thank you!


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.