A Google Cloud VM to transfer files from Dropbox to GCS

What is the problem you are having with rclone?

I am trying to set up a transfer from a Dropbox Business account folder to a GCS bucket. I want to transfer film projects from our live storage on Dropbox to an archive bucket once they are complete. They run into the TBs. This one is 1.7 TB.

I originally managed to set this up on my Mac, but the transfer didn't happen 'server-side', so I have now set up a virtual machine on the Google Cloud console. I installed the latest version of rclone and set everything up. I ran the copy command and it did copy a few items and folders, but then I just get access denied messages and the transfer fails: Failed to copy: googleapi: Error 403: Access denied., forbidden
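For reference, the actual copy command I am running is along these lines (the folder names below are just stand-ins for the real project folders):

# example only - the real Dropbox folder and destination path are different
rclone copy "dropbox:Projects/Completed Film" "gcs:bhm_bu_bucket/Completed Film" --progress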

I am a complete novice at this and just looking for a solution that lets us move footage from one platform to another for long-term archival storage without tying up a local machine and bandwidth.

Is this a problem with the Google VM or something with rclone?
The VM is an e2-standard-4 (4 vCPUs, 16 GB memory), Intel Broadwell.

Any help or advice would be greatly appreciated but please remember I am a novice at this!

Run the command 'rclone version' and share the full output of the command.

rclone version
rclone v1.69.1

  • os/version: debian 12.9 (64 bit)
  • os/kernel: 6.1.0-31-cloud-amd64 (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.24.0
  • go/linking: static
  • go/tags: none

Are you on the latest version of rclone? You can validate by checking the version listed here: Rclone downloads

Which cloud storage system are you using? (eg Google Drive)

Dropbox
Google Cloud Storage

The command you were trying to run (eg rclone copy /tmp remote:tmp)


touch testfile.txt
rclone copy testfile.txt gcs:bhm_bu_bucket --progress

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[dropbox]
type = dropbox
token = XXX

[gcs]
type = google cloud storage
bucket_policy_only = true
env_auth = true
project_number = XXX



A log from the command that you were trying to run with the '-vv' flag


welcome to the forum,

since it worked on the mac, the problem should not be with the vm or with rclone.
need to compare the setup on the mac to the setup on the vm.

for testing, do not use env_auth = true, instead put the variables into the rclone config file.
then post rclone config redacted
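for example, the gcs section in the config file could end up looking something like this. the key file path is just a placeholder and service_account_file is one option, not the only one:

[gcs]
type = google cloud storage
project_number = XXX
bucket_policy_only = true
# example path - point this at a real service account key file
service_account_file = /home/youruser/gcs-key.json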


that is a server error, not much rclone can do about that.


need to post a full rclone debug log

How would I be able to compare the mac setup to the one on the VM?

When you say a server error, is that to do with the Google Cloud console side of things? That is the main issue I am having when I try to submit a transfer from Dropbox to GCS. rclone tries to do the transfer but keeps reporting an error after each file and saying that it will retry.

How do I get a full debug log?

Sorry, as mentioned this is all well outside of my skillset.

from the mac, post the output of
rclone version
rclone config redacted
rclone copy testfile.txt gcs:bhm_bu_bucket -vv
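and for the debug log, add -vv and --log-file to the failing command on the vm, something like this. the source folder and log path are just examples:

rclone copy dropbox:yourfolder gcs:bhm_bu_bucket -vv --log-file=/tmp/rclone-debug.log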

rclone v1.69.1

  • os/version: darwin 15.3.1 (64 bit)
  • os/kernel: 24.3.0 (arm64)
  • os/type: darwin
  • os/arch: arm64 (ARMv8 compatible)
  • go/version: go1.24.0
  • go/linking: dynamic
  • go/tags: cmount

[dropbox]
type = dropbox
token = XXX

[gcs]
type = google cloud storage
token = XXX
client_id = XXX
project_number = XXX
client_secret = XXX
bucket_policy_only = true

2025/03/08 14:32:42 DEBUG : rclone: Version "v1.69.1" starting with parameters ["rclone" "copy" "testfile.txt" "gcs:bhm_bu_bucket" "-vv"]
2025/03/08 14:32:42 DEBUG : Creating backend with remote "testfile.txt"
2025/03/08 14:32:42 DEBUG : Using config file from "/Users/xxxxxxxx/.config/rclone/rclone.conf"
2025/03/08 14:32:42 DEBUG : fs cache: renaming child cache item "testfile.txt" to be canonical for parent "/Users/xxxxxxxxxx"
2025/03/08 14:32:42 DEBUG : Creating backend with remote "gcs:bhm_bu_bucket"
2025/03/08 14:32:43 DEBUG : testfile.txt: Need to transfer - File not found at Destination
2025/03/08 14:32:43 DEBUG : testfile.txt: md5 = 5dd39cab1c53c2c77cd352983f9641e1 OK
2025/03/08 14:32:43 INFO : testfile.txt: Copied (new)
2025/03/08 14:32:43 INFO :
Transferred: 20 B / 20 B, 100%, 0 B/s, ETA -
Transferred: 1 / 1, 100%
Elapsed time: 0.1s

you are using two different configs for remote gcs

i made mention of that when i wrote
"for testing, do not use env_auth = true, instead put the variables into the rclone config file"


the rclone config file is a simple text file in .ini format. you have two options.

  1. copy the entire rclone config file from the mac to the vm (see the example after this list)
  2. copy and paste just the text for the gcs remote from the mac config file into the vm config file
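for example, option 1 could be done from the mac with the gcloud tool, something along these lines. the instance name and zone are placeholders:

# example only - replace the instance name and zone with the real ones
gcloud compute ssh YOUR_VM_NAME --zone=YOUR_ZONE --command="mkdir -p ~/.config/rclone"
gcloud compute scp ~/.config/rclone/rclone.conf YOUR_VM_NAME:~/.config/rclone/rclone.conf --zone=YOUR_ZONE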

ok, thank you. How do I do that?

to find the config file, there are two ways.

  1. from a debug log - Using config file from "/Users/xxxxxxxx/.config/rclone/rclone.conf"
  2. rclone config file (example output below)
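for option 2, running rclone config file on the vm prints the location. the output looks something like this, though the exact path will differ:

Configuration file is stored at:
/home/youruser/.config/rclone/rclone.conf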

The VM is saying access denied when I try and copy this. I think at this point I should just give up! It is so far beyond me now. I was hoping I could just set up a virtual machine and use rclone to copy folders from Dropbox to GCS, but this is too difficult. I think the issue is at the Google Cloud end rather than anything else.

It is so strange, as I can access the rclone config and change the settings in there, but the moment I try and copy anything I get refused. I even got this message earlier: Attempt 1/3 failed with 1 errors and: googleapi: Error 403: The billing account for the owning project is disabled in state closed, accountDisabled even though this is all set up!

I have tried, with the help of Gemini, to copy the config file from the Mac to the VM, but the public key won't work! It is all a mystery. Thank you for trying to help me. I really appreciate it, but I think I'm definitely at the end of the road with this one.

really, that should not matter, as i shared two ways?

and a third way:
on the vm, use rclone config to create a new remote. use the values from the working config on the mac.
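that can also be done in one go with rclone config create instead of the interactive menu, something like this. the values are placeholders and use the service_account_file option mentioned earlier rather than the oauth token from the mac:

rclone config create gcs "google cloud storage" project_number=XXX bucket_policy_only=true service_account_file=/home/youruser/gcs-key.json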

I have tried that too. I have even deleted the VM and started again! I still have the same issue.

Error 403: The billing account for the owning project is disabled in state closed, accountDisabled

That's probably the core of the error. It seems to be from Google and needs to be resolved with Google. I am not a Google Cloud user, so I have no idea how or why.
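A quick check on the Google side might be something like the command below, though I cannot vouch for it myself and the project id is a placeholder:

gcloud billing projects describe YOUR_PROJECT_ID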