Extremely low speed between 1Fichier and Google Drive

What is the problem you are having with rclone?

Hello, I use rclone to copy files between 1Fichier and Google Drive, but the transfer speed is extremely low (3.5 MiB/s at most).

Run the command 'rclone version' and share the full output of the command.

rclone v1.59.0

os/version: Microsoft Windows 10 Home 21H1 (64 bit)
os/kernel: 10.0.19043.1826 (x86_64)
os/type: windows
os/arch: amd64
go/version: go1.18.3
go/linking: static
go/tags: cmount

Which cloud storage system are you using? (eg Google Drive)

1Fichier, Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy -P "1Fichier:/folder/myfile" "gdrive:/folder2" --drive-chunk-size 256M -vv

The rclone config contents with secrets removed.

[gdrive]
type = drive
scope = drive
token = {"access_token":"removed"}
team_drive = 
client_id = removed
client_secret = removed

[1Fichier]
type = fichier
api_key = removed

A log from the command with the -vv flag

2022/08/09 16:56:42 DEBUG : Creating backend with remote "1Fichier:/folder/myfile"
2022/08/09 16:56:42 DEBUG : Using config file from "\rclone.conf"
2022/08/09 16:56:44 DEBUG : fs cache: adding new entry for parent of "1Fichier:/folder/myfile", "1Fichier:folder"
2022/08/09 16:56:44 DEBUG : Creating backend with remote "gdrive:/folder2"
2022/08/09 16:56:44 DEBUG : gdrive: detected overridden config - adding "{A6J6b}" suffix to name
2022/08/09 16:56:44 DEBUG : Google drive root 'folder2': 'root_folder_id = 0AIOokVZo4U6hUk9PVA' - save this in the config to speed up startup
2022/08/09 16:56:44 DEBUG : fs cache: renaming cache item "gdrive:/folder2" to be canonical "gdrive{A6J6b}:folder2"
2022-08-09 16:56:45 DEBUG : myfile: Need to transfer - File not found at Destination
2022-08-09 16:56:56 INFO : myfile: Copied (new)
Transferred: 2.352 MiB / 2.352 MiB, 100%, 241.270 KiB/s, ETA 0s
Transferred: 1 / 1, 100%
Elapsed time: 13.2s
2022/08/09 16:56:56 INFO :
Transferred: 2.352 MiB / 2.352 MiB, 100%, 241.270 KiB/s, ETA 0s
Transferred: 1 / 1, 100%
Elapsed time: 13.2s

2022/08/09 16:56:56 DEBUG : 5 go routines active

hi,
you need to test the speed of each remote individually.

How can I do this?

test downloading.
rclone copy 1Fichier:folder/myfile /path/to/local/dir -P -vv
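can do the same for gdrive, and also test the upload direction, something like this (the folder and file names below are just placeholders, adjust to your setup):
rclone copy gdrive:folder2/somefile /path/to/local/dir -P -vv
rclone copy /path/to/local/dir/somefile gdrive:folder2 -P -vv
that gives a baseline speed for each remote on its own.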

Currently rclone is running and I can't stop it.

2022/08/09 18:46:22 DEBUG : rclone: Version "v1.59.0" starting with parameters ["rclone" "copy" "1Fichier:myfile" "C:\Users\xxxx\Desktop\test" "-P" "-vv"]
2022/08/09 18:46:22 DEBUG : Creating backend with remote "1Fichier:myfile"
2022/08/09 18:46:22 DEBUG : Using config file from "C:\Users\xxxx\.config\rclone\rclone.conf"
2022/08/09 18:46:23 DEBUG : fs cache: adding new entry for parent of "1Fichier:myfile", "1Fichier:"
2022/08/09 18:46:23 DEBUG : Creating backend with remote "C:\Users\xxxx\Desktop\test"
2022/08/09 18:46:23 DEBUG : fs cache: renaming cache item "C:\Users\xxxx\Desktop\test" to be canonical "//?/C:/Users/xxxx/Desktop/test"
2022-08-09 18:46:23 DEBUG : myfile: Need to transfer - File not found at Destination
2022-08-09 18:46:35 DEBUG : myfile: whirlpool = ef3ef19114b3b2ff43b6f45528a94d15a54432a0c674e2acc0a6845e7e1e8f8c0c0e5397fac654566b2b8e423f5853fda3877321e2cd66805ab1e242c4c9c8d8 OK
2022-08-09 18:46:35 INFO : myfile: Copied (new)
Transferred: 55.387 MiB / 55.387 MiB, 100%, 3.849 MiB/s, ETA 0s
Transferred: 1 / 1, 100%
Elapsed time: 12.9s
2022/08/09 18:46:35 INFO :
Transferred: 55.387 MiB / 55.387 MiB, 100%, 3.849 MiB/s, ETA 0s
Transferred: 1 / 1, 100%
Elapsed time: 12.9s

2022/08/09 18:46:35 DEBUG : 3 go routines active

2022/08/09 18:55:27 DEBUG : rclone: Version "v1.59.0" starting with parameters ["rclone" "copy" "gdrive:/myfile" "C:\Users\xxxx\Desktop\test" "-P" "-vv"]
2022/08/09 18:55:27 DEBUG : Creating backend with remote "gdrive:/myfile"
2022/08/09 18:55:27 DEBUG : Using config file from "C:\Users\xxxx\.config\rclone\rclone.conf"
2022/08/09 18:55:27 DEBUG : Google drive root 'myfile': 'root_folder_id = 0AIOokVZo4U6hUk9PVA' - save this in the config to speed up startup
2022/08/09 18:55:28 DEBUG : fs cache: adding new entry for parent of "gdrive:/myfile", "gdrive:"
2022/08/09 18:55:28 DEBUG : Creating backend with remote "C:\Users\xxxx\Desktop\test"
2022/08/09 18:55:28 DEBUG : fs cache: renaming cache item "C:\Users\xxxx\Desktop\test" to be canonical "//?/C:/Users/xxxx/Desktop/test"
2022-08-09 18:55:28 DEBUG : myfile: Need to transfer - File not found at Destination
2022-08-09 18:55:42 DEBUG : myfile: md5 = 431a7e7e88e8079ccdcbca59d835c0c6 OK
2022-08-09 18:55:42 INFO : myfile: Copied (new)
Transferred: 63.955 MiB / 63.955 MiB, 100%, 4.234 MiB/s, ETA 0s
Transferred: 1 / 1, 100%
Elapsed time: 15.1s
2022/08/09 18:55:42 INFO :
Transferred: 63.955 MiB / 63.955 MiB, 100%, 4.234 MiB/s, ETA 0s
Transferred: 1 / 1, 100%
Elapsed time: 15.1s

2022/08/09 18:55:42 DEBUG : 5 go routines active

--- all the speeds are very close to one another.
--- as a test, you might want to transfer multiple files; you might get better results (see the example below).
--- what is the result of a speedtest, such as speedtest.net?
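--- for example, copy a whole folder instead of a single file, so rclone can run several transfers at once (the folder names below are just placeholders; --transfers defaults to 4):
rclone copy "1Fichier:/folder" "gdrive:/folder2" --transfers 4 -P -vv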

--- as a side note:
it might be that, with 1fichier,
rclone has to calculate the hash before it can start the upload,
and that time is included in the speed calculation.

so the calculated speed will be lower than the actual upload speed.

Speedtest: https://www.speedtest.net/result/13517898788.png

--- at best, your download speed from 1fichier would be approx 6.75 MiB/s.
i have never used 1fichier, so no idea what speeds to expect.

--- the rclone speeds seem ok to me, given that you are downloading a single small file.

--- as a test, download multiple files so that rclone is downloading more than one file at a time.
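--- for example (paths below are just placeholders):
rclone copy 1Fichier:folder /path/to/local/dir --transfers 4 -P -vv
with several files in the source folder, rclone will download up to --transfers of them at the same time.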

Okay, so I understand that the transfer speed depends entirely on my PC's connection.
Is there a way to increase the speed considerably without going through the PC? Looking at other users' logs I see lightning speeds. Any best practices or suggestions?

Your upload speed is 16.7 Mbit/s, which is 2.1 MByte/s.
Your download speed is 53.6 Mbit/s, which is 6.7 MByte/s.

For a cloud to cloud transfer, the maximum speed you can expect is the minimum of the two, so 2.1 MByte/s.
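(Rough arithmetic: 16.7 Mbit/s ÷ 8 ≈ 2.1 MByte/s up, 53.6 Mbit/s ÷ 8 ≈ 6.7 MByte/s down. Every byte has to come down from 1Fichier and go back up to Drive through the same connection, so the transfer is limited to min(2.1, 6.7) ≈ 2.1 MByte/s.)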

Can you do the transfer from a VM with much better connectivity in a datacenter? You could rent one from Hetzner for a few hours or days to do the transfer at very little cost, and it would probably be able to do 100 MByte/s transfers.

you can rent a cheap vm/seedbox in the cloud and run rclone on that vm.
i do this on a regular basis.

i have one blachost vm and two hetzner vms.
both providers are very reliable.

fwiw, with a vm in the cloud, you should encrypt the rclone config file.
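one way to do that (double-check the docs for your rclone version): run rclone config and pick the option to set a configuration password - in v1.59 the main menu shows it as "s) Set configuration password". after that, rclone prompts for the password on each run, or you can supply it via --password-command or the RCLONE_CONFIG_PASS environment variable.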

Thank you, I solved it with: GitHub - TheCaduceus/Multi-Cloud-Transfer-Tool: The most advanced yet simple multi-cloud tool to transfer/manage your data from any cloud to any cloud remotely, based on Rclone and other engines. ⚡
Maximum transfer rate achieved (1Fichier --> Google Drive): 87.080 MiB/s.
