Copying large files from gdrive on Mac Mojave

rclone copy gdrive: gdrive2: --dry-run -v --fast-list --drive-server-side-across-configs

Transferred: 33.294T / 33.294 TBytes, 100%, 2.892 TBytes/s, ETA 0s
Transferred: 33717 / 33717, 100%
Elapsed time: 36.5s

The transfer was successful.

What is the next step?
I hope you don't mind all these questions; I also want to learn how it works.

That command was a dry-run test; it would not copy any files.

Yes, and it was successful.
Which command should I use if I want to actually copy the files?

Remove --dry-run from the command.
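For example, taking the dry-run command from above and just dropping that flag (remote names as in your config):

rclone copy gdrive: gdrive2: -v --fast-list --drive-server-side-across-configs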

Keep in mind that gdrive has a lot of limits on how much data you can copy in a 24-hour period.
Not sure exactly how that works with server-side copying.
You might want to add this: https://rclone.org/drive/#drive-stop-on-upload-limit
and a debug log.
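As a rough sketch based on your command, that would look something like this (the log file name is just an example):

rclone copy gdrive: gdrive2: --fast-list --drive-server-side-across-configs --drive-stop-on-upload-limit --log-level=DEBUG --log-file=log.txt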

I saw they have a 700GB limit a day, not sure how accurate that is.

Well, you will soon find out...

I am getting an Error 404: File not found.

Post the command with the debug output.

2021/03/28 02:35:08 ERROR : Copied/Series/American Gods S1-2 1080p/American Gods Season 2 Mp4 1080p/American Gods S02E04.mp4: Failed to copy: googleapi: Error 404: File not found:

2021/03/28 02:38:07 ERROR : Copied/Series/24.S01-S09.and.Movie.COMPLETE.1080p.BluRay.WEB-DL.DD5.1.H.264 (464.2GB)/24 S1-3 Bluray 1080p/24.S01.1080p.BluRay.x264-SHORTBREHD[rartv]/24.S01E18.1080p.BluRay.x264-SHORTBREHD.mkv: Failed to copy: googleapi: Error 404: File not found: 1qgE7csx2eoDePso_vVpZ8GAiZo6VPNVm., notFound

I cannot see into your computer...

Post the command with the debug log.

I am getting a list of every file, each listed like this; I am not sure if this is what you want to see.

2021/03/28 02:45:53 ERROR : Copied/4K-Remux/21.Bridges.2019.1080p.BluRay.REMUX.AVC.DTS-HD.MA.5.1-FGT.mkv: Failed to copy: googleapi: Error 404: File not found: 1YEa6mHblnIuWmOvmn4gXbaLE_toq_1QE., notFound

I have also checked my second shared drive; it seems to have folders for the files being copied over, but the folders are empty.

Add -vv to the command for debug output,
or
use a debug file with --log-level=DEBUG and --log-file=log.txt.
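For example, either of these, based on your copy command (log.txt is just a placeholder name):

rclone copy gdrive: gdrive2: --fast-list --drive-server-side-across-configs -vv

rclone copy gdrive: gdrive2: --fast-list --drive-server-side-across-configs --log-level=DEBUG --log-file=log.txt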

Post the top 50 lines of the output.
When you post that text, use three backticks before and after so it will be formatted like this:

2021/03/14 18:54:32 DEBUG : rclone: Version "v1.54.1" starting with parameters ["C:\\data\\rclone\\scripts\\rclone.exe" "copy" "v:\\EN07\\en07.rcloner\\keepass" "aws01_iam_vserver03_en07_rcloner:vserver03.en07.rcloner/en07.rcloner/rclone/backup/keepass" "--immutable" "--stats=0" "--include=/zip/**" "--fast-list" "--bind=192.168.62.234" "--s3-chunk-size=256M" "--s3-upload-concurrency=8" "--dry-run" "--log-level=DEBUG" "--log-file=C:\\data\\rclone\\logs\\en07.rcloner_aws01_iam_vserver03_en07_rcloner_keepass\\20210314.185330\\rclone.log"]
2021/03/14 18:54:32 DEBUG : Creating backend with remote "v:\\EN07\\en07.rcloner\\keepass"
2021/03/14 18:54:32 DEBUG : Using config file from "C:\\data\\rclone\\scripts\\rclone.conf"
2021/03/14 18:54:32 DEBUG : fs cache: renaming cache item "v:\\EN07\\en07.rcloner\\keepass" to be canonical "//?/v:/EN07/en07.rcloner/keepass"
2021/03/14 18:54:32 DEBUG : Creating backend with remote "aws01_iam_vserver03_en07_rcloner:vserver03.en07.rcloner/en07.rcloner/rclone/backup/keepass"
2021/03/14 18:54:32 DEBUG : fc: Excluded
2021/03/14 18:54:32 DEBUG : S3 bucket vserver03.en07.rcloner path en07.rcloner/rclone/backup/keepass: Waiting for checks to finish
2021/03/14 18:54:32 DEBUG : zip/keepass.20191025.200507.7z: Size and modification time the same (differ by 0s, within tolerance 100ns)
2021/03/14 18:54:32 DEBUG : zip/keepass.20191025.200507.7z: Unchanged skipping
2021/03/14 18:54:32 DEBUG : zip/keepass.20191102.153738.7z: Size and modification time the same (differ by 0s, within tolerance 100ns)
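On a Mac, something like this would grab the first 50 lines of the log file (assuming the log.txt name from above):

head -n 50 log.txt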
