[BUG] Sync dropbox to gdrive failing for large files > 50GB (error: unexpected EOF)


#17

:frowning:

Can you try this to copy the file to local, but with a bwlimit? I want to see if that gives the same unexpected EOF error:

 rclone copy dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08 /storage/ -vv --bwlimit 2.5M

#18

@ncw it's some small file that got copied instantly

2019/02/10 07:12:59 DEBUG : rclone: Version "v1.45-175-gb7de616e-fix-dropbox-list-beta" starting with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "-vv" "--bwlimit" "2.5M" "--log-file=/storage/rclonetest.log"]
2019/02/10 07:12:59 DEBUG : Using config file from "/home/plex/.config/rclone/rclone.conf"
2019/02/10 07:12:59 INFO  : Starting bandwidth limiter at 2.500MBytes/s
2019/02/10 07:13:00 DEBUG : Dropbox root 'crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08': Using root namespace "1856006048"
2019/02/10 07:13:02 DEBUG : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Size and modification time the same (differ by 0s, within tolerance 1s)
2019/02/10 07:13:02 DEBUG : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Unchanged skipping
2019/02/10 07:13:02 INFO  :
Transferred:             0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors:                 0
Checks:                 1 / 1, 100%
Transferred:            0 / 0, -
Elapsed time:        2.6s

2019/02/10 07:13:02 DEBUG : 5 go routines active
2019/02/10 07:13:02 DEBUG : rclone: Version "v1.45-175-gb7de616e-fix-dropbox-list-beta" finishing with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "-vv" "--bwlimit" "2.5M" "--log-file=/storage/rclonetest.log"]

#19

Changed it to a bigger file in that folder:

rclone copy dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08 /storage/ -vv --bwlimit 2.5M --log-file=/storage/rclonetest.log

Doing the copy now:

DEBUG : rclone: Version "v1.45-175-gb7de616e-fix-dropbox-list-beta" starting with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "-vv" "--bwlimit" "2.5M" "--log-file=/storage/rclonetest.log"]
2019/02/10 07:15:13 DEBUG : Using config file from "/home/plex/.config/rclone/rclone.conf"
2019/02/10 07:15:13 INFO  : Starting bandwidth limiter at 2.500MBytes/s
2019/02/10 07:15:14 DEBUG : Dropbox root 'crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08': Using root namespace "1856006048"
2019/02/10 07:15:14 DEBUG : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Couldn't find file - need to transfer
2019/02/10 07:16:14 INFO  :
Transferred:      145.340M / 48.363 GBytes, 0%, 2.371 MBytes/s, ETA 5h47m5s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:      1m1.2s
Transferring:
 * a0joarrmo3e080n6pkdgst…59tgfh873a192i7hboog08:  0% /48.363G, 2.516M/s, 5h27m9s

2019/02/10 07:17:14 INFO  :
Transferred:      295.340M / 48.363 GBytes, 1%, 2.435 MBytes/s, ETA 5h36m58s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:      2m1.2s
Transferring:
 * a0joarrmo3e080n6pkdgst…59tgfh873a192i7hboog08:  0% /48.363G, 2.500M/s, 5h28m8s

Until it's figured out, can I use --exclude /crypt/0ibcad98vheukutqb6r435td0c/ to exclude the 4K movies, since they're what's getting my sync stuck?


#20

@ncw something is beyond weird

rclone ls dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/
      67474 1upgpnmsd7o1bmgpf3e219t7q8lil2aiugo93qvlmdiljan58snam6f5k8e0nsk46ede3cotdvjgqum1dgnckdq0krrghoceebpva30
51929126633 a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08

cmd:
rclone copy dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08 /storage/ -vv --bwlimit 2.5M --log-file=/storage/rclonetest.log

The file is around 50GB, but the last entry in the log shows over 100GB transferred, with no errors.

2019/02/10 19:06:14 INFO  :
Transferred:      104.102G / 141.281 GBytes, 74%, 2.499 MBytes/s, ETA 4h13m56s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:  11h51m1.2s
Transferring:
 * a0joarrmo3e080n6pkdgst…59tgfh873a192i7hboog08: 23% /48.363G, 2.500M/s, 4h13m49s

Full log: https://ufile.io/fqook (I have not stopped the copy yet)

Will run it overnight with 1M:

2019/02/10 19:18:23 DEBUG : rclone: Version "v1.45-175-gb7de616e-fix-dropbox-list-beta" starting with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "--bwlimit" "1M" "--log-level=DEBUG" "--log-file=/storage/rclonetest.log"]
2019/02/10 19:18:23 DEBUG : Using config file from "/home/plex/.config/rclone/rclone.conf"
2019/02/10 19:18:23 INFO  : Starting bandwidth limiter at 1MBytes/s
2019/02/10 19:18:24 DEBUG : Dropbox root 'crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08': Using root namespace "1856006048"
2019/02/10 19:18:25 DEBUG : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Couldn't find file - need to transfer
2019/02/10 19:19:24 INFO  :
Transferred:       60.090M / 48.363 GBytes, 0%, 1002.894 kBytes/s, ETA 14h1m44s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:      1m1.3s
Transferring:
 * a0joarrmo3e080n6pkdgst…59tgfh873a192i7hboog08:  0% /48.363G, 1.014M/s, 13h33m16s

#21

OK, I have a cunning plan…

If the source returns an error then this beta will re-open it at the right place and carry on. So what I'm hoping is that this fixes your problems.

It should write debug messages when it kicks in.
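
Roughly, the idea looks like this - just a minimal Go sketch of the approach, not the actual rclone code; openAt here is an assumed stand-in for re-opening the Dropbox download at a byte offset (e.g. with an HTTP Range request):

package resume

import (
	"fmt"
	"io"
)

// openAt stands in for whatever re-opens the source at a byte offset,
// e.g. an HTTP Range request against Dropbox. It is an assumption for
// this sketch, not a real rclone function.
type openAt func(offset int64) (io.ReadCloser, error)

// reopeningReader wraps a download so that a mid-stream read error
// (like "unexpected EOF") triggers a re-open at the current offset
// instead of failing the whole copy.
type reopeningReader struct {
	open     openAt
	rc       io.ReadCloser
	offset   int64 // bytes successfully read so far
	maxTries int
}

func newReopeningReader(open openAt, maxTries int) (*reopeningReader, error) {
	rc, err := open(0)
	if err != nil {
		return nil, err
	}
	return &reopeningReader{open: open, rc: rc, maxTries: maxTries}, nil
}

func (r *reopeningReader) Read(p []byte) (int, error) {
	for try := 1; ; try++ {
		n, err := r.rc.Read(p)
		r.offset += int64(n)
		if err == nil || err == io.EOF || try > r.maxTries {
			return n, err
		}
		// Read failed mid-stream: note it and re-open at the current offset.
		fmt.Printf("Reopening on read failure after %d bytes: retry %d/%d: %v\n",
			r.offset, try, r.maxTries, err)
		_ = r.rc.Close()
		rc, openErr := r.open(r.offset)
		if openErr != nil {
			return n, openErr
		}
		r.rc = rc
		if n > 0 {
			return n, nil // hand back what we got; the next Read continues from the new reader
		}
	}
}

func (r *reopeningReader) Close() error { return r.rc.Close() }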

https://beta.rclone.org/branch/v1.46.0-006-g4fc90dd7-fix-2108-resume-downloads-beta/ (uploaded in 15-30 mins)

I think what has happened is you've had a high-level retry and it has retried the sync from the start.


#22

I had default flags; when I cancelled the transfer, the file on the HDD was 5GB even though close to 50GB had been transferred.

Testing now with new version
rclone copy dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08 /storage/ --bwlimit 9M --log-level=DEBUG --log-file=/storage/rclonetest.log

First part of log:

2019/02/11 07:31:09 DEBUG : rclone: Version "v1.46.0-006-g4fc90dd7-fix-2108-resume-downloads-beta" starting with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "--bwlimit" "9M" "--log-level=DEBUG" "--log-file=/storage/rclonetest.log"]
    2019/02/11 07:31:09 DEBUG : Using config file from "/home/plex/.config/rclone/rclone.conf"
    2019/02/11 07:31:09 INFO  : Starting bandwidth limiter at 9MBytes/s
    2019/02/11 07:31:10 DEBUG : Dropbox root 'crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08': Using root namespace "1856006048"
    2019/02/11 07:31:11 DEBUG : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Couldn't find file - need to transfer
    2019/02/11 07:32:10 INFO  :
    Transferred:      500.746M / 48.363 GBytes, 1%, 8.171 MBytes/s, ETA 1h39m59s
    Errors:                 0
    Checks:                 0 / 0, -
    Transferred:            0 / 1, 0%
    Elapsed time:      1m1.2s
    Transferring:
     * a0joarrmo3e080n6pkdgst…59tgfh873a192i7hboog08:  1% /48.363G, 8.989M/s, 1h30m53s

Will post updates later today


#23

Every low-level retry will potentially send more data too.

Thank you :smile: Looking forward to hearing how it went!


#24

@ncw It worked :slight_smile:
Version "v1.46.0-006-g4fc90dd7-fix-2108-resume-downloads-beta"

cmd:
rclone copy dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08 /storage/ --bwlimit 9M --log-level=DEBUG --log-file=/storage/rclonetest.log

Last part of log:

2019/02/11 09:03:07 INFO  : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Copied (new)
2019/02/11 09:03:07 INFO  : 
Transferred:   	   48.363G / 48.363 GBytes, 100%, 8.974 MBytes/s, ETA 0s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            1 / 1, 100%
Elapsed time:  1h31m58.4s

2019/02/11 09:03:07 DEBUG : 6 go routines active
2019/02/11 09:03:07 DEBUG : rclone: Version "v1.46.0-006-g4fc90dd7-fix-2108-resume-downloads-beta" finishing with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "--bwlimit" "9M" "--log-level=DEBUG" "--log-file=/storage/rclonetest.log"]

Full log: https://ufile.io/agvri


#25

Excellent :smile:

I can see the workaround doing its thing here

2019/02/11 09:01:21 DEBUG : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Reopening on read failure after 50948210688 bytes: retry 1/10: unexpected EOF

Which is where it would have bombed out previously.

I’ll merge this fix in a moment.


#26

I’ve merged this to master now which means it will be in the latest beta in 15-30 mins and released in v1.47


#27

@ncw it is working, and so far I have had these errors:

2019/02/13 12:23:26 ERROR : crypt/qbu2chuhnvlm4gjmeofgbfhjfk/8netjqhdfmldpa0f0eqghn90ug/o9ug765ouan7qg9rtlpn404cds/bt5h0iaa8kfnegtq7qq8jm56i8aqvdvldg0k63gndfn8n3186os3e15bh3rkug0nsgmdd31msg7aq: Failed to copy: failed to open source object: path/not_found/...

2019/02/13 15:08:23 DEBUG : pacer: low level retry 1/10 (error googleapi: got HTTP response code 502 with body: <!DOCTYPE html>
<title>Error 502 (Server Error)!!1</title>
<p><b>502.</b> <ins>That’s an error.</ins>
<p>The server encountered a temporary error and could not complete your request.<p>Please try again in 30 seconds.  <ins>That’s all we know.</ins>

Otherwise it's working OK; latest log:

Transferred:   	    1.058T / 1.570 TBytes, 67%, 8.998 MBytes/s, ETA 16h35m12s
Errors:                 1 (retrying may help)
Checks:             77511 / 77511, 100%
Transferred:          374 / 568, 66%
Elapsed time:  34h14m0.6s
Transferring:
 * crypt/qbu2chuhnvlm4gjm…6dcla2cqdr31j7qks8q9a4: 16% /3.315G, 2.336M/s, 20m16s
 * crypt/qbu2chuhnvlm4gjm…7v582opt2q0ptnbtovves4: 22% /4.318G, 2.361M/s, 24m16s
 * crypt/qbu2chuhnvlm4gjm…kmoluhi69b7agfg49e21ei: 61% /1.400G, 2.080M/s, 4m23s
 * crypt/qbu2chuhnvlm4gjm…u7n6toq6nfii5ks8dal1rg: 93% /2.849G, 2.217M/s, 1m20s

#28

It is looking much better than before :slight_smile:

Strange… Did the file disappear during the sync?

That appears to be the Google server being overloaded and returning an HTML 502 error! It looks like rclone retried successfully and carried on.
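
For context, a low-level retry is roughly this shape - a hand-wavy Go sketch under my own assumptions, not rclone's actual pacer code: if the call comes back with something transient like that 502, wait a bit and make the same call again, up to --low-level-retries times.

package pacer

import (
	"fmt"
	"net/http"
	"time"
)

const maxLowLevelRetries = 10 // rclone exposes this as --low-level-retries

// retriable reports whether an HTTP status looks transient (5xx), like the 502 above.
func retriable(status int) bool {
	return status >= 500 && status < 600
}

// callWithRetry runs do() and, on a transient failure, waits a little
// (doubling the sleep each time) before making the same call again.
func callWithRetry(do func() (*http.Response, error)) (*http.Response, error) {
	sleep := 10 * time.Millisecond
	var resp *http.Response
	var err error
	for try := 1; try <= maxLowLevelRetries; try++ {
		resp, err = do()
		if err == nil && !retriable(resp.StatusCode) {
			return resp, nil // success, or a failure that retrying won't fix
		}
		if err == nil {
			err = fmt.Errorf("got HTTP response code %d", resp.StatusCode)
			resp.Body.Close()
		}
		fmt.Printf("pacer: low level retry %d/%d (error %v)\n", try, maxLowLevelRetries, err)
		time.Sleep(sleep)
		sleep *= 2
	}
	return nil, err
}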


#29

No more 502 errors; this one was a new one:

2019/02/14 05:13:10 DEBUG : pacer: low level retry 1/10 (error Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink&uploadType=resumable&upload_id=AEnB2UqQBUYEQHyRw6WrnCRpx52UMTdv7ULihMghpp5JCmZ09lAkLwOA6atxHp-KH82NlAJ021J_S2I72eKbNuqlcp3IyguuWA: read tcp 19X.XXX.XXX.XXX:39434->172.217.18.74:443: read: connection reset by peer)

46h into the sync:

Transferred:   	    1.445T / 1.564 TBytes, 92%, 8.996 MBytes/s, ETA 3h50m32s
Errors:                 5 (retrying may help)
Checks:             77511 / 77511, 100%
Transferred:          549 / 564, 97%
Elapsed time:  46h48m0.6s
Transferring:
 * crypt/hm2ka8or5qp1pefh…46jbfo1r0uqn09dfbo9elo:  5% /3.211G, 2.294M/s, 22m35s
 * crypt/hm2ka8or5qp1pefh…eunu530ccavd54n1ic2482:  4% /9.549G, 2.194M/s, 1h11m15s
 * crypt/hm2ka8or5qp1pefh…ljiert7g5d8iae31fa5drc: 45% /17.774G, 2.207M/s, 1h14m18s
 * crypt/hm2ka8or5qp1pefh…vq49e8cion3e5hp2gn18qg: 22% /2.896G, 2.287M/s, 16m39s

#30

That looks like a Google Drive error - it rejected the upload for some reason.

Looks like you have 5 files which will need to be retried out of 564 which is certainly much better than before. Rclone will retry them automatically at the end of the sync.
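
The retry at the end of the sync is the high-level one, controlled by the --retries flag. Very roughly (an illustrative Go sketch only, not the real implementation; runOnePass is a made-up stand-in for a full check + transfer pass):

package sync

import "fmt"

const maxAttempts = 10 // in real rclone this comes from the --retries flag

// syncWithRetries re-runs the whole sync pass until nothing fails or it runs
// out of attempts. runOnePass is assumed to re-check everything and re-copy
// only what still failed, reporting how many files could not be transferred.
func syncWithRetries(runOnePass func() (failed int, err error)) error {
	for attempt := 1; attempt <= maxAttempts; attempt++ {
		failed, err := runOnePass()
		if failed == 0 && err == nil {
			return nil // everything checked and transferred
		}
		fmt.Printf("Attempt %d/%d failed with %d errors and: %v\n",
			attempt, maxAttempts, failed, err)
	}
	return fmt.Errorf("sync did not complete after %d attempts", maxAttempts)
}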


#31

Yup, it's working; last info in the log (and it's still going on):

2019/02/14 19:51:55 INFO  : 
Transferred:   	    1.780T / 1.780 TBytes, 100%, 8.591 MBytes/s, ETA 0s
Errors:                 1 (retrying may help)
Checks:            337775 / 337775, 100%
Transferred:          679 / 679, 100%
Elapsed time:  60h21m0.6s

#32

Finished

Transferred:   	    1.844T / 1.844 TBytes, 100%, 7.535 MBytes/s, ETA 0s
Errors:                 2 (retrying may help)
Checks:            780831 / 780831, 100%
Transferred:          701 / 701, 100%
Elapsed time:  71h16m0.6s


2019/02/15 06:47:23 DEBUG : Google drive root '': failed to copy 2 directories
2019/02/15 06:47:23 DEBUG : Google drive root '': copied 31 directories
2019/02/15 06:47:23 ERROR : Google drive root '': not deleting files as there were IO errors
2019/02/15 06:47:23 ERROR : Google drive root '': not deleting directories as there were IO errors
2019/02/15 06:47:23 ERROR : Attempt 10/10 failed with 2 errors and: googleapi: Error 404: File not found: 1k9WoFcE5klx5_W2_3_aWrmIm8c8gZ1oR., notFound
2019/02/15 06:47:23 Failed to sync: googleapi: Error 404: File not found: 1k9WoFcE5klx5_W2_3_aWrmIm8c8gZ1oR., notFound

Full log: https://ufile.io/cn0v1


#33

That looked pretty good. There were a couple of files which were “notFound” - these were the ones that wouldn’t transfer. Did they disappear in the source?


#34

Yes, it's more than likely one of the replaced/upgraded movies or TV shows.

The next sync finished fast and without errors:

2019/02/15 09:24:50 INFO  : Google drive root '': Waiting for checks to finish
2019/02/15 09:24:50 INFO  : Google drive root '': Waiting for transfers to finish
2019/02/15 09:24:50 INFO  : Waiting for deletions to finish
2019/02/15 09:24:51 INFO  : syncit: Deleted
2019/02/15 09:24:54 INFO  : 
Transferred:   	   27.324G / 27.324 GBytes, 100%, 4.131 MBytes/s, ETA 0s
Errors:                 0
Checks:             78181 / 78181, 100%
Transferred:            8 / 8, 100%
Elapsed time:  1h52m53.3s

So there was some critical error in Dropbox when downloading large files.

Hopefully we can find someone to help with Dropbox polling so it can get on par with gdrive.


#35

Hooray!

Do you know anyone who would like to hack on the dropbox backend?


#36

No, I was looking on GitHub to see who has been doing any kind of dropbox fixes on rclone.
I was hoping jake-stripe would be around, but it seems he is inactive.