Saving a public Backblaze B2 link to Google Drive

What is the problem you are having with rclone?

I am looking for a way to save a public Backblaze B2 link like this one, https://f000.backblazeb2.com/file/xrplfhdb/ledger.db, to a Google Drive folder without first downloading it to local storage (my computer).

Taking inspiration from this comment from Nick, I could run:

curl -s 'https://f000.backblazeb2.com/file/xrplfhdb/ledger.db' | rclone -v rcat drive-social:all/ledger.db -P

First question: when I do that, I do not see the file in Google Drive while curl is running. Must the download complete first? Does the upload to Google Drive happen only after it finishes?
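
A crude way to check this, from a second shell (a sketch using the remote configured below):

# Poll the destination folder while the transfer runs; on Drive the
# entry generally only appears once the upload has been finalised.
while true; do rclone ls drive-social:all/ --include 'ledger.db'; sleep 2; done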

Second question: since rclone has a Backblaze B2 backend, is there a better way to handle public B2 links than going through curl?
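
(One curl-free idea, untested: rclone's on-the-fly :http: backend can point at the B2 download host directly. This is only a sketch; whether it behaves well against B2's public URLs is exactly what I'm unsure about.)

# --http-url sets the base URL and the path after :http: selects the
# object; copyto names the destination file explicitly, so no directory
# listing (which these public URLs don't offer) is needed.
rclone copyto --http-url https://f000.backblazeb2.com \
    :http:file/xrplfhdb/ledger.db drive-social:all/ledger.db -P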

Third question: I tried that command on a Google Cloud Platform "f1-micro" instance and am wondering where the file is stored while curl is running. That instance has only 0.6 GB of RAM and no physical SSD, so I am a bit clueless about where the data from the curl process lives before it gets uploaded to Google Drive.
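
My working assumption (from rclone's docs, not verified on the f1-micro) is that the data only ever sits in rclone's in-memory upload buffers, whose size for Google Drive is set by --drive-chunk-size:

# Nothing touches local disk: curl writes into the pipe and rclone
# uploads from RAM in --drive-chunk-size sized pieces (8M is the
# default, spelled out here only for clarity).
curl -s 'https://f000.backblazeb2.com/file/xrplfhdb/ledger.db' \
    | rclone rcat drive-social:all/ledger.db --drive-chunk-size 8M -P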

What is your rclone version (output from rclone version)

rclone v1.52.3
- os/arch: darwin/amd64
- go version: go1.14.6

Which OS you are using and how many bits (eg Windows 7, 64 bit)

macOS 10.15.6 (19G2021), 64 bit.

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

curl -s 'https://f000.backblazeb2.com/file/xrplfhdb/ledger.db' | rclone -v rcat drive-social:all/ledger.db -P

The rclone config contents with secrets removed.

[drive-social]
type = drive
scope = drive
token = redacted

A log from the command with the -vv flag

NA

Actually, you could use this command to copy from Backblaze to Google Drive on the fly:

rclone copyurl 'https://f000.backblazeb2.com/file/xrplfhdb/ledger.db' remote:path -a -P

(Screenshots of the example result and of the bandwidth during the transfer were attached here.)
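
Spelled out with the remote from the original post (the exact target folder is my assumption), -a / --auto-filename takes the file name from the URL:

# With -a the destination is treated as a directory and the name comes
# from the URL, so this should land as drive-social:all/ledger.db.
rclone copyurl 'https://f000.backblazeb2.com/file/xrplfhdb/ledger.db' drive-social:all/ -a -P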

Thanks so much for this tip! It seems to behave similarly to piping curl's output to the remote, though. Or is there something different about this mechanism? :thinking:

I tried to copy a 1 GB file and couldn't see it on Google Drive until the process completed. This makes me wonder whether the file is stored locally somewhere in the meantime (on my computer if rclone runs locally, or on the Google Cloud Platform VM instance when run there). But that again raises the question of how the Google Cloud Platform VM instance handles a 1 GB file when it has no SSD attached.

I also tried copying to a Backblaze B2 remote instead of Google Drive, and that works as well. This, however, raises two other questions:

  • Where is the file stored before it has been completely downloaded from the source, i.e. before it's fully uploaded to Backblaze B2? (See the sketch after this list.)
  • I also wonder what the egress charges from Google Cloud Platform would be for the upload to Backblaze B2. (Something for me to look into tomorrow morning.)
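
On the first bullet, a quick Linux-side check while one of the transfers above is running (a sketch; the watch/ps incantation assumes a stock GCP Linux image):

# Nothing should accumulate under /tmp while the copy runs; only the
# rclone process's resident memory (RSS) should move.
watch -n 1 'df -h /tmp; ps -o rss=,comm= -C rclone'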

It would be nice to see server-side copy functionality for Backblaze B2, like what's available on Google Drive and OneDrive.

I haven't used GCP or Backblaze B2, but I believe rclone downloads in chunks, and the file only shows up in your cloud storage after all chunks are complete. The process in the command above is on the fly: you are not saving the file to your hard drive; it passes through memory and then goes on to Google Drive.

Backblaze B2 --> server running rclone --> Google Drive.

You could add the -vv flag to the command above to see the process in more detail.
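
For example:

# Same copy as above, with -vv for debug-level logging of each chunk.
rclone copyurl 'https://f000.backblazeb2.com/file/xrplfhdb/ledger.db' drive-social:all/ -a -P -vv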

The file isn't stored; it is streamed from source to destination.

B2 does have server-side copy, so if you have access to the source bucket you can use rclone copy and it will do a server-side copy. If you are doing this with large files (>5 GB), you will need the latest beta.
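
A sketch of what that looks like (the remote and bucket names are placeholders; both sides must be on the same B2 remote for the copy to stay server-side):

# rclone drives B2's b2_copy_file API here, so the bytes never leave
# Backblaze and nothing is downloaded to the machine running rclone.
rclone copy b2remote:source-bucket/ledger.db b2remote:destination-bucket/ -P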

