[BUG] Sync dropbox to gdrive failing for large files > 50GB (error: unexpected EOF)

CMD
/usr/bin/rclone sync dropbox: gdrive: --retries=5 --transfers=4 --checkers=4 --tpslimit=1 --drive-pacer-min-sleep=100ms --bwlimit=9M --dropbox-chunk-size 128M --drive-chunk-size=128M --log-level=INFO --log-file=$LogFile

ERROR

2019/02/03 16:26:11 INFO  : 
Transferred:   	  339.190G / 1.271 TBytes, 26%, 8.824 MBytes/s, ETA 31h1m9s
Errors:                 0
Checks:             75765 / 75765, 100%
Transferred:            2 / 176, 1%
Elapsed time:  10h56m0.6s
Transferring:
 * crypt/0ibcad98vheukutq…59tgfh873a192i7hboog08:  0% /48.363G, 0/s, -
 * crypt/0ibcad98vheukutq…hfbku4sdjjnk61lubuelo8: 25% /47.144G, 58.364k/s, 174h31m39s
 * crypt/0ibcad98vheukutq…ob7lqjtei3kqgavdt2hkkg:  0% /69.469G, 0/s, -
 * crypt/0ibcad98vheukutq…of2dpfmbm5m77f5fmm7psk:  0% /56.085G, 138.821k/s, 117h36m58s

2019/02/03 16:26:49 ERROR : crypt/0ibcad98vheukutqb6r435td0c/2r2tlt1gvp4or8q22p6mcs7a6jlbmd2qnllug2tk4hsngc2akvrnocq8afjcp5bgpdei85gcm9ije/5psgvnk663e7s4ht1ta13efiskpn5nf91hplv1rcfqf3eanbqi4og9gjq7bspqv8bl7fbvogpeftvklh5hfbku4sdjjnk61lubuelo8: Failed to copy: Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink&uploadType=resumable&upload_id=AEnB2UqlbojsxUB39JeSDUzfX2EzvntoVT7T-K_H1vWNSN64ojNlpMMVM3Lkzm9488Ft8wAFLlSmFPAL9bTNw_801A-waW4bOQ: unexpected EOF
2019/02/03 16:27:11 INFO  : 
Transferred:   	  339.315G / 1.237 TBytes, 27%, 8.814 MBytes/s, ETA 29h55m20s
Errors:                 1 (retrying may help)
Checks:             75781 / 75781, 100%
Transferred:            2 / 175, 1%
Elapsed time:  10h57m0.6s
Transferring:
 * crypt/0ibcad98vheukutq…59tgfh873a192i7hboog08:  0% /48.363G, 0/s, -
 * crypt/0ibcad98vheukutq…ob7lqjtei3kqgavdt2hkkg:  0% /69.469G, 3.877M/s, 5h5m17s
 * crypt/0ibcad98vheukutq…of2dpfmbm5m77f5fmm7psk:  0% /56.085G, 2.889k/s, 5651h50m43s
 * crypt/0ibcad98vheukutq…qrop0aap34r59ehtli4sgg:  0% /41.127G, 0/s, -

Full log: https://ufile.io/e35wg

p.s. Running it with debug now and will post the full log later today.

Thanks - I think I’ll need that :slight_smile:

@ncw full debug log https://ufile.io/an995

Not sure what else I could change with the sync flags, since I used the “safest” settings with only 1 tps and a 100ms pacer.

Thanks for the log.

I’m not sure what is going on exactly. It might be that the uploads are timing out somehow at Google’s end. The error “unexpected EOF” means that rclone was trying to read data but didn’t get it. However, this is happening on the upload to Google rather than the read from Dropbox.

You could try reducing --drive-chunk-size=128M but I suspect increasing --low-level-retries to 20 or 50 will solve the problem.

Dropped the tps limit to 1 and the chunk size to 32M, and added 50 low-level retries.

/usr/bin/rclone sync dropbox: gdrive: \
                              --bwlimit=9M \
                              --transfers=3 \
                              --checkers=3 \
                              --retries=10 \
                              --low-level-retries=50 \
                              --tpslimit=1 \
                              --drive-pacer-min-sleep=100ms \
                              --dropbox-chunk-size 32M \
                              --drive-chunk-size=32M \
                              --log-level=DEBUG \
                              --log-file=$LogFile

@ncw Even with 50 low-level retries it’s failing

2019/02/05 06:46:35 DEBUG : pacer: low level retry 49/50 (error Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink&uploadType=resumable&upload_id=AEnB2Uq2HGtNgF0-7qSler825OjGxJIb6SkaZRbmV5VanYw-TOcb4ROuV7-W0gVlz93yGByecKEIqH7pn717iM-GetilIm0PSsDGNGlla2KktNzuIkS5_YQ: unexpected EOF)

2019/02/05 06:46:38 DEBUG : pacer: low level retry 50/50 (error Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink&uploadType=resumable&upload_id=AEnB2UqLhYJ05DrrMV0qgtDM45Ts7yg5f8q3RR0uwVTFPnOUjzF5FCXgv6Z0glQzOjXtU-FmuWhkTGzXh1cufay3VkrwGgsYBH18BcrZpvGw8ShatF-rOBo: unexpected EOF)

2019/02/05 06:46:38 DEBUG : crypt/0ibcad98vheukutqb6r435td0c/2r2tlt1gvp4or8q22p6mcs7a6jlbmd2qnllug2tk4hsngc2akvrnocq8afjcp5bgpdei85gcm9ije/5psgvnk663e7s4ht1ta13efiskpn5nf91hplv1rcfqf3eanbqi4og9gjq7bspqv8bl7fbvogpeftvklh5hfbku4sdjjnk61lubuelo8: Received error: Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink&uploadType=resumable&upload_id=AEnB2UqLhYJ05DrrMV0qgtDM45Ts7yg5f8q3RR0uwVTFPnOUjzF5FCXgv6Z0glQzOjXtU-FmuWhkTGzXh1cufay3VkrwGgsYBH18BcrZpvGw8ShatF-rOBo: unexpected EOF - low level retry 1/50

2019/02/05 06:46:40 DEBUG : pacer: low level retry 47/50 (error Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink&uploadType=resumable&upload_id=AEnB2UoRIv8k6sS65HlZKnDdUUVMt23wzL5Ql3RZAJRzsMPiMMDmPdiHS8x9I1i0OBhea6pKw6kDu22qg_0Y2i3LwtOJTHMC5ffgMXTNGJHdI9fyWPxIPDo: unexpected EOF)

2019/02/05 06:46:41 DEBUG : pacer: low level retry 50/50 (error Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink&uploadType=resumable&upload_id=AEnB2Uq2HGtNgF0-7qSler825OjGxJIb6SkaZRbmV5VanYw-TOcb4ROuV7-W0gVlz93yGByecKEIqH7pn717iM-GetilIm0PSsDGNGlla2KktNzuIkS5_YQ: unexpected EOF)

2019/02/05 06:46:41 DEBUG : crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Received error: Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink&uploadType=resumable&upload_id=AEnB2Uq2HGtNgF0-7qSler825OjGxJIb6SkaZRbmV5VanYw-TOcb4ROuV7-W0gVlz93yGByecKEIqH7pn717iM-GetilIm0PSsDGNGlla2KktNzuIkS5_YQ: unexpected EOF - low level retry 1/50

2019/02/05 06:46:45 DEBUG : pacer: low level retry 1/50 (error googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded)
2019/02/05 06:46:49 DEBUG : pacer: low level retry 48/50 (error Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink&uploadType=resumable&upload_id=AEnB2UoRIv8k6sS65HlZKnDdUUVMt23wzL5Ql3RZAJRzsMPiMMDmPdiHS8x9I1i0OBhea6pKw6kDu22qg_0Y2i3LwtOJTHMC5ffgMXTNGJHdI9fyWPxIPDo: unexpected EOF)

Starting again with

/usr/bin/rclone sync dropbox: gdrive: \
                              --bwlimit=9M \
                              --transfers=1 \
                              --checkers=1 \
                              --retries=10 \
                              --low-level-retries=100 \
                              --tpslimit=1 \
                              --drive-pacer-min-sleep=1000ms \
                              --dropbox-chunk-size=32M \
                              --drive-chunk-size=32M \
                              --log-level=DEBUG \
                              --log-file=$LogFile

If that didn’t fix it then maybe it is the read from dropbox which is failing part of the way through.

Can you download these 50GB files to local storage OK? What if you put, say, a --bwlimit 2.5M on it to slow it down to the speed you are getting for dropbox -> google?

Will wait a bit more before changing flags.

I did even get a 100/100 low-level retry, but so far no errors in the transfer

2019/02/05 09:45:54 DEBUG : pacer: low level retry 100/100 (error Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2Ctrashed%2CmodifiedTime%2CcreatedTime%2CmimeType%2Cparents%2CwebViewLink&uploadType=resumable&upload_id=AEnB2UogRTkQDKWpjTfFaxYTiYl5im13iC_IDzJsHIxFy5axENYzwW9VjONMMaBBZ1QkTV9d0dPigfOlzebBTuxMaXnypk9tHD3Tn_zj65JzstDQj5O5z2c: unexpected EOF)

Transferred:   	  120.467G / 1.000 TBytes, 12%, 5.633 MBytes/s, ETA 45h37m44s
Errors:                 0
Checks:             52788 / 52788, 100%
Transferred:            5 / 187, 3%
Elapsed time:    6h5m0.6s
Transferring:
 * crypt/0ibcad98vheukutq…hfbku4sdjjnk61lubuelo8:  5% /47.144G, 6.466M/s, 1h56m59s

p.s. @ncw how can I reverse the filename so I can try to copy it to the local disk?

Use rclone cryptdecode to turn the encrypted file name into a non-encrypted file name.
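A minimal sketch of that. The crypt remote name `dropboxcrypt:` here is an assumption — use whatever your crypt remote is actually called in your rclone.conf:

```shell
# Decode an encrypted leaf name back to its plain name.
# "dropboxcrypt:" is a placeholder for your actual crypt remote.
rclone cryptdecode dropboxcrypt: a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08
```

(`rclone cryptdecode remote: file1 ...` takes the encrypted names as arguments; with `--reverse` it encodes plain names instead.)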

Last sync was still failing (log: https://ufile.io/sv7bs )

I copied the file from Dropbox to local successfully (a straight copy of the encrypted file to a local folder)

2019/02/06 05:43:36 INFO  : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Copied (new)
2019/02/06 05:43:36 INFO  :
Transferred:       48.363G / 48.363 GBytes, 100%, 22.841 MBytes/s, ETA 0s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            1 / 1, 100%
Elapsed time:     36m8.1s

2019/02/06 05:43:36 DEBUG : 4 go routines active
2019/02/06 05:43:36 DEBUG : rclone: Version "v1.45-123-gb26276b4-beta" finishing with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "-vv"]

I am lowering the bwlimit, setting the pacer to 5000ms, and trying again with

/usr/bin/rclone sync dropbox: gdrive: \
                              --bwlimit=2M \
                              --transfers=1 \
                              --checkers=1 \
                              --retries=10 \
                              --low-level-retries=100 \
                              --tpslimit=1 \
                              --drive-pacer-min-sleep=5000ms \
                              --dropbox-chunk-size=32M \
                              --drive-chunk-size=32M \
                              --log-level=DEBUG \
                              --log-file=$LogFile

@ncw something is really wrong with large-file sync; I hope someone else can test this as well

Hmm… Can you try it again with a --bwlimit

rclone copy dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08 /storage/ -vv --bwlimit 2.5M

This will simulate the google drive transfer rate better.

Can you also try uploading that file from local -> gdrive with a --bwlimit 2.5M to see if that works?
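Something like this could exercise the upload path in isolation, since the file was already downloaded to /storage earlier in the thread. The destination folder `gdrive:upload-test/` is just an assumption — pick any scratch path on the drive remote:

```shell
# Re-upload the previously downloaded file, throttled to roughly the rate
# the failing sync was achieving, to see if the upload leg fails on its own.
# "gdrive:upload-test/" is a hypothetical scratch destination.
rclone copy /storage/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08 \
    gdrive:upload-test/ -vv --bwlimit 2.5M --log-file=/storage/uploadtest.log
```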

That sync failed because of this:

2019/02/05 17:23:17 ERROR : crypt/hm2ka8or5qp1pefhtc1e62jn50/d8dda88lkipp5er4clrcasp147hqqck23rg14igto7jbdkead2vg: error reading source directory: 

Your previous log had the same message in a different place.

2019/02/04 10:54:14 ERROR : crypt/hm2ka8or5qp1pefhtc1e62jn50/38vca4osshln3gmmdaf9fl829vp68ul3s9khb3u9ll6mu5ms448g: error reading source directory: 

I don’t know what is causing that - the error message is blank!

Here is a work-around - can you try this?

https://beta.rclone.org/branch/v1.45-175-gb7de616e-fix-dropbox-list-beta/ (uploaded in 15-30 mins)

Eh, I noticed that my rclone config on the download/upload server did not use its own API key. I copied the Plex rclone.conf and will test if it’s working. I need about 5-6h to see if it will fail.


The previous test failed badly. Even with its own API key it seems it was just looping forever, e.g. no errors, but over 700GB uploaded while no transfers were completed (the first 2 transfers were some small files that I use for checking whether a sync is needed).

Here is full log: https://ufile.io/52yba

I will use your fix-dropbox beta and am running it now with:

/usr/bin/rclone sync dropbox: gdrive: \
                              --bwlimit=2.5M \
                              --transfers=1 \
                              --checkers=1 \
                              --retries=10 \
                              --low-level-retries=100 \
                              --tpslimit=1 \
                              --drive-pacer-min-sleep=100ms \
                              --log-level=DEBUG \
                              --log-file=$LogFile

Great - let me know how it goes.

No luck, check the logs (bottom part, as I did not truncate before switching to the new version)

Can I exclude the 4K movie folder from the sync?

:frowning:

Can you try this to copy the file to local but with a bwlimit? I want to see if that gives the same Unexpected EOF error

 rclone copy dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08 /storage/ -vv --bwlimit 2.5M

@ncw it’s some small file that got instantly copied

2019/02/10 07:12:59 DEBUG : rclone: Version "v1.45-175-gb7de616e-fix-dropbox-list-beta" starting with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "-vv" "--bwlimit" "2.5M" "--log-file=/storage/rclonetest.log"]
2019/02/10 07:12:59 DEBUG : Using config file from "/home/plex/.config/rclone/rclone.conf"
2019/02/10 07:12:59 INFO  : Starting bandwidth limiter at 2.500MBytes/s
2019/02/10 07:13:00 DEBUG : Dropbox root 'crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08': Using root namespace "1856006048"
2019/02/10 07:13:02 DEBUG : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Size and modification time the same (differ by 0s, within tolerance 1s)
2019/02/10 07:13:02 DEBUG : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Unchanged skipping
2019/02/10 07:13:02 INFO  :
Transferred:             0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors:                 0
Checks:                 1 / 1, 100%
Transferred:            0 / 0, -
Elapsed time:        2.6s

2019/02/10 07:13:02 DEBUG : 5 go routines active
2019/02/10 07:13:02 DEBUG : rclone: Version "v1.45-175-gb7de616e-fix-dropbox-list-beta" finishing with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "-vv" "--bwlimit" "2.5M" "--log-file=/storage/rclonetest.log"]

Change it to a bigger file in that folder

rclone copy dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08 /storage/ -vv --bwlimit 2.5M --log-file=/storage/rclonetest.log

Doing the copy now

DEBUG : rclone: Version "v1.45-175-gb7de616e-fix-dropbox-list-beta" starting with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "-vv" "--bwlimit" "2.5M" "--log-file=/storage/rclonetest.log"]
2019/02/10 07:15:13 DEBUG : Using config file from "/home/plex/.config/rclone/rclone.conf"
2019/02/10 07:15:13 INFO  : Starting bandwidth limiter at 2.500MBytes/s
2019/02/10 07:15:14 DEBUG : Dropbox root 'crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08': Using root namespace "1856006048"
2019/02/10 07:15:14 DEBUG : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Couldn't find file - need to transfer
2019/02/10 07:16:14 INFO  :
Transferred:      145.340M / 48.363 GBytes, 0%, 2.371 MBytes/s, ETA 5h47m5s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:      1m1.2s
Transferring:
 * a0joarrmo3e080n6pkdgst…59tgfh873a192i7hboog08:  0% /48.363G, 2.516M/s, 5h27m9s

2019/02/10 07:17:14 INFO  :
Transferred:      295.340M / 48.363 GBytes, 1%, 2.435 MBytes/s, ETA 5h36m58s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:      2m1.2s
Transferring:
 * a0joarrmo3e080n6pkdgst…59tgfh873a192i7hboog08:  0% /48.363G, 2.500M/s, 5h28m8s

Until it’s figured out, can I use --exclude /crypt/0ibcad98vheukutqb6r435td0c/ to exclude the 4K movies, since they stall my sync?
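Note that rclone filter patterns match files, so a bare directory path with a trailing slash will not exclude anything. A sketch of the flag that should skip everything under that folder (add it to the existing sync command):

```shell
# Exclude the whole 4K folder from the sync: the /** suffix matches all
# files beneath the directory (a bare trailing / matches nothing).
rclone sync dropbox: gdrive: \
    --exclude "/crypt/0ibcad98vheukutqb6r435td0c/**"
```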

@ncw something is beyond weird

rclone ls dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/
      67474 1upgpnmsd7o1bmgpf3e219t7q8lil2aiugo93qvlmdiljan58snam6f5k8e0nsk46ede3cotdvjgqum1dgnckdq0krrghoceebpva30
51929126633 a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08

cmd:
rclone copy dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08 /storage/ -vv --bwlimit 2.5M --log-file=/storage/rclonetest.log

The file is around 50GB; however, the last entry in the log shows over 100GB transferred, but no errors.

2019/02/10 19:06:14 INFO  :
Transferred:      104.102G / 141.281 GBytes, 74%, 2.499 MBytes/s, ETA 4h13m56s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:  11h51m1.2s
Transferring:
 * a0joarrmo3e080n6pkdgst…59tgfh873a192i7hboog08: 23% /48.363G, 2.500M/s, 4h13m49s

Full log: https://ufile.io/fqook (I did not stop the copy yet)

Will run it overnight with 1M:

2019/02/10 19:18:23 DEBUG : rclone: Version "v1.45-175-gb7de616e-fix-dropbox-list-beta" starting with parameters ["rclone" "copy" "dropbox:/crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08" "/storage/" "--bwlimit" "1M" "--log-level=DEBUG" "--log-file=/storage/rclonetest.log"]
2019/02/10 19:18:23 DEBUG : Using config file from "/home/plex/.config/rclone/rclone.conf"
2019/02/10 19:18:23 INFO  : Starting bandwidth limiter at 1MBytes/s
2019/02/10 19:18:24 DEBUG : Dropbox root 'crypt/0ibcad98vheukutqb6r435td0c/4e7hr02dubchi1dsakobuu894l33gg5vf0uodu58b3687p2vdp3m7pcg08hokc1cu3fcahm0pf80m/a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08': Using root namespace "1856006048"
2019/02/10 19:18:25 DEBUG : a0joarrmo3e080n6pkdgst6f8u0dk7549ic68cpu5rdkjautfgra9adtqkqjl4g6rsk61ek6f980pk4lg59tgfh873a192i7hboog08: Couldn't find file - need to transfer
2019/02/10 19:19:24 INFO  :
Transferred:       60.090M / 48.363 GBytes, 0%, 1002.894 kBytes/s, ETA 14h1m44s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            0 / 1, 0%
Elapsed time:      1m1.3s
Transferring:
 * a0joarrmo3e080n6pkdgst…59tgfh873a192i7hboog08:  0% /48.363G, 1.014M/s, 13h33m16s