Chunked file reassembly issue in chunker

I am facing the same issue as mentioned in the following post: https://forum.rclone.org/t/chunked-files-reassemble-issue-in-chunker/38204

rclone v1.62.2

  • os/version: ubuntu 20.04 (64 bit)
  • os/kernel: 5.15.0-1033-aws (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.20.2
  • go/linking: static
  • go/tags: none

I am using ownCloud (primary storage: AWS S3).

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy  sporders/'SPS Orders.csv' overlay: --progress --dump-headers

The rclone config contents with secrets removed.

[destination]
type = webdav
url = xxxxxx
vendor = owncloud
user = xxxxx
pass = xxxxxxx

[overlay]
type = chunker
remote = xxxxxx
chunk_size = 500Mi

A log from the command with the -vv flag

2023-05-16 07:33:00 INFO  : SPS Orders.csv: Copied (new)
Transferred:           10 GiB / 10 GiB, 100%, 266.870 MiB/s, ETA 0s
Checks:                11 / 11, 100%
Renamed:               11
Transferred:            1 / 1, 100%
Elapsed time:      3m18.2s

Can you copy a single file and post a full debug log?
Perhaps without --progress --dump-headers.
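
For example, something along these lines writes a complete debug log to a file you can attach (the log path is just an example):

# -vv enables debug logging; --log-file sends it to a file instead of the terminal
rclone copy sporders/'SPS Orders.csv' overlay: -vv --log-file=/tmp/rclone-debug.log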

What are you expecting to see and what did you actually see, @jatinder?

rclone copy sporders/'SPS Orders.csv' overlay: --log-level DEBUG
2023/05/17 05:25:47 DEBUG : rclone: Version "v1.62.2" starting with parameters ["rclone" "copy" "sporders/SPS Orders.csv" "overlay:" "--log-level" "DEBUG"]
2023/05/17 05:25:47 DEBUG : Creating backend with remote "sporders/SPS Orders.csv"
2023/05/17 05:25:47 DEBUG : Using config file from "/home/ubuntu/.config/rclone/rclone.conf"
2023/05/17 05:25:47 DEBUG : fs cache: renaming cache item "sporders/SPS Orders.csv" to be canonical "/home/ubuntu/rclone-v1.62.2-linux-amd64/sporders/SPS Orders.csv"
2023/05/17 05:25:47 DEBUG : Creating backend with remote "overlay:"
2023/05/17 05:25:47 DEBUG : Creating backend with remote "primary_owncloud:Owncloud"
2023/05/17 05:25:47 DEBUG : found headers:
2023/05/17 05:25:47 DEBUG : Reset feature "ListR"
2023/05/17 05:25:47 DEBUG : Chunked 'overlay:': Waiting for checks to finish
2023/05/17 05:25:47 DEBUG : Chunked 'overlay:': Waiting for transfers to finish
2023/05/17 05:25:47 DEBUG : SPS Orders.csv: skip slow MD5 on source file, hashing in-transit
2023/05/17 05:26:47 INFO :
Transferred: 1.953 GiB / 5 GiB, 39%, 39.444 MiB/s, ETA 1m19s
Transferred: 0 / 1, 0%
Elapsed time: 1m0.2s
Transferring:

  •                            SPS Orders.csv: 39% /5Gi, 39.444Mi/s, 1m19s

2023/05/17 05:27:47 INFO :
Transferred: 3.456 GiB / 5 GiB, 69%, 20.922 MiB/s, ETA 1m15s
Transferred: 0 / 1, 0%
Elapsed time: 2m0.2s
Transferring:

  •                            SPS Orders.csv: 69% /5Gi, 20.922Mi/s, 1m15s

2023/05/17 05:28:42 INFO : SPS Orders.csv.rclone_chunk.001_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.001
2023/05/17 05:28:43 INFO : SPS Orders.csv.rclone_chunk.002_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.002
2023/05/17 05:28:44 INFO : SPS Orders.csv.rclone_chunk.003_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.003
2023/05/17 05:28:44 INFO : SPS Orders.csv.rclone_chunk.004_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.004
2023/05/17 05:28:45 INFO : SPS Orders.csv.rclone_chunk.005_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.005
2023/05/17 05:28:46 INFO : SPS Orders.csv.rclone_chunk.006_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.006
2023/05/17 05:28:46 INFO : SPS Orders.csv.rclone_chunk.007_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.007
2023/05/17 05:28:47 INFO :
Transferred: 8.418 GiB / 8.418 GiB, 100%, 188.501 MiB/s, ETA 0s
Checks: 7 / 8, 88%
Renamed: 7
Transferred: 0 / 1, 0%
Elapsed time: 3m0.2s
Checking:

  •    SPS Orders.csv.rclone_chunk.008_skht9r:  0% /500Mi, 0/s, -, moving

Transferring:

  •                            SPS Orders.csv:100% /5Gi, 15.965Mi/s, 0s

2023/05/17 05:28:47 INFO : SPS Orders.csv.rclone_chunk.008_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.008
2023/05/17 05:28:48 INFO : SPS Orders.csv.rclone_chunk.009_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.009
2023/05/17 05:28:48 INFO : SPS Orders.csv.rclone_chunk.010_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.010
2023/05/17 05:28:49 INFO : SPS Orders.csv.rclone_chunk.011_skht9r: Moved (server-side) to: SPS Orders.csv.rclone_chunk.011
2023/05/17 05:28:51 DEBUG : SPS Orders.csv: md5 = 54c76af885fd467ce313852dba819b28 OK
2023/05/17 05:28:51 INFO : SPS Orders.csv: Copied (new)
2023/05/17 05:28:51 INFO :
Transferred: 10 GiB / 10 GiB, 100%, 263.086 MiB/s, ETA 0s
Checks: 11 / 11, 100%
Renamed: 11
Transferred: 1 / 1, 100%
Elapsed time: 3m4.0s

2023/05/17 05:28:51 DEBUG : 5 go routines active

Hi Nick,

I can see the chunked files listed in the ownCloud directory. I am expecting all the chunks to be merged back into a single original file, so that I can download that single file from the ownCloud portal.

That isn't the way chunker works.

If you upload and download with chunker, then it will seamlessly break your contents into chunks and join them back together again.
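
For example, a round trip through the chunker remote configured above could look like this (paths are only illustrative):

# upload via the chunker remote: the file is split into 500 MiB chunks transparently
rclone copy sporders/'SPS Orders.csv' overlay:

# list via the chunker remote: the chunks are hidden and only the logical file is shown
rclone lsf overlay:

# download via the chunker remote: the chunks are joined back into one file on the fly
rclone copy overlay:'SPS Orders.csv' /tmp/restore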

I want to upload files of 5 GB or more in size from S3 to ownCloud using rclone.
Rclone only lets me transfer files smaller than about 1.2 GB.

How can I upload files larger than 1.2 GB?
Could you suggest a way around this?

Thanks

Why? Is that an owncloud limitation? What happens?

I did not see an S3 remote in the config file you posted.

The remote is missing; what does xxxxxx point to?

Can you post a full debug log that shows the issue?

When I upload a large file from the ownCloud portal, it divides the file into chunks during the upload process and then merges those chunks back into the original file once all chunks have been transferred to the server.

In other words, to upload large files, ownCloud breaks the file into chunks and then merges them back. I want to do the same using rclone, but I get the chunks, not the original file.

S3 Configuration
type = s3
provider = AWS
env_auth = true
region = XXXX
acl = private
server_side_encryption = XXXX
storage_class = STANDARD
location_constraint = XXX
upload_concurrency = 4
upload_chunk_size = 16M

rclone copy S3:escalon-rclone-data/'Luna Magic, Inc'/'5. Month End'/2021/'SPS Orders.csv' primary_owncloud:'Luna Magic, Inc' -vv
2023/06/01 11:59:13 DEBUG : rclone: Version "v1.62.2" starting with parameters ["rclone" "copy" "S3:escalon-rclone-data/Luna Magic, Inc/5. Month End/2021/SPS Orders.csv" "primary_owncloud:Luna Magic, Inc" "-vv"]
2023/06/01 11:59:13 DEBUG : Creating backend with remote "S3:escalon-rclone-data/Luna Magic, Inc/5. Month End/2021/SPS Orders.csv"
2023/06/01 11:59:13 DEBUG : Using config file from "/home/ubuntu/.config/rclone/rclone.conf"
2023/06/01 11:59:13 DEBUG : fs cache: adding new entry for parent of "S3:escalon-rclone-data/Luna Magic, Inc/5. Month End/2021/SPS Orders.csv", "S3:escalon-rclone-data/Luna Magic, Inc/5. Month End/2021"
2023/06/01 11:59:13 DEBUG : Creating backend with remote "primary_owncloud:Luna Magic, Inc"
2023/06/01 11:59:13 DEBUG : found headers:
2023/06/01 11:59:13 DEBUG : SPS Orders.csv: Need to transfer - File not found at Destination
2023/06/01 11:59:15 ERROR : SPS Orders.csv: Failed to copy: 413 Request Entity Too Large (nginx/1.23.4)
2023/06/01 11:59:15 ERROR : Attempt 1/3 failed with 1 errors and: 413 Request Entity Too Large
2023/06/01 11:59:15 DEBUG : SPS Orders.csv: Need to transfer - File not found at Destination
2023/06/01 11:59:16 ERROR : SPS Orders.csv: Failed to copy: 413 Request Entity Too Large (nginx/1.23.4)
2023/06/01 11:59:16 ERROR : Attempt 2/3 failed with 1 errors and: 413 Request Entity Too Large
2023/06/01 11:59:17 DEBUG : SPS Orders.csv: Need to transfer - File not found at Destination
2023/06/01 11:59:18 ERROR : SPS Orders.csv: Failed to copy: 413 Request Entity Too Large (nginx/1.23.4)
2023/06/01 11:59:18 ERROR : Attempt 3/3 failed with 1 errors and: 413 Request Entity Too Large
2023/06/01 11:59:18 INFO :
Transferred: 996 KiB / 996 KiB, 100%, 199.199 KiB/s, ETA 0s
Errors: 1 (retrying may help)
Elapsed time: 5.7s

2023/06/01 11:59:18 DEBUG : 8 go routines active
2023/06/01 11:59:18 Failed to copy: 413 Request Entity Too Large (nginx/1.23.4)

So your config should contain the following remotes:

[s3] <-- this one works fine, as I can see

[primary_owncloud] <-- which is your ownCloud remote


[chunker]
type = chunker
remote = primary_owncloud:
chunk_size = 500Mi

then your command will be:

rclone copy S3:escalon-rclone-data/'Luna Magic, Inc'/'5. Month End'/2021/'SPS Orders.csv' chunker:'Luna Magic, Inc' -vv

When using the chunker remote, files are split/combined on the fly:


rclone lsf chunker:
OneBigFile.csv

but when using primary_owncloud you will see all chunks:

rclone lsf primary_owncloud:
OneBigFile.csv.rclone_chunk.001
OneBigFile.csv.rclone_chunk.002
OneBigFile.csv.rclone_chunk.003

If you want to copy the combined file back from primary_owncloud, you have to use the chunker remote, e.g.:

rclone copy chunker:OneBigFile.csv .
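
As a quick sanity check (a sketch only; the paths below are illustrative), the reassembled file can be verified through the chunker remote:

# list the logical file with its full size as seen through the chunker remote
rclone lsl chunker:'Luna Magic, Inc'

# optionally compare sizes against the source directory (checks every file in both)
rclone check S3:escalon-rclone-data/'Luna Magic, Inc'/'5. Month End'/2021 chunker:'Luna Magic, Inc' --size-only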

Thank you kapitainsky,

The steps you mentioned are working fine except the last command.
[overlay]
type = chunker
remote = primary_owncloud:Owncloud
chunk_size = 100Mi

Here is the output of the last step:

rclone copy overlay:'SPS Orders.csv' . -vv
2023/06/01 13:46:51 DEBUG : rclone: Version "v1.62.2" starting with parameters ["rclone" "copy" "overlay:SPS Orders.csv" "." "-vv"]
2023/06/01 13:46:51 DEBUG : Creating backend with remote "overlay:SPS Orders.csv"
2023/06/01 13:46:51 DEBUG : Using config file from "/home/ubuntu/.config/rclone/rclone.conf"
2023/06/01 13:46:51 DEBUG : Creating backend with remote "primary_owncloud:Owncloud/SPS Orders.csv"
2023/06/01 13:46:51 DEBUG : found headers:
2023/06/01 13:46:51 DEBUG : fs cache: adding new entry for parent of "primary_owncloud:Owncloud/SPS Orders.csv", "primary_owncloud:Owncloud"
2023/06/01 13:46:51 DEBUG : Reset feature "ListR"
2023/06/01 13:46:51 DEBUG : Creating backend with remote "."
2023/06/01 13:46:51 DEBUG : fs cache: renaming cache item "." to be canonical "/home/ubuntu/rclone-v1.62.2-linux-amd64"
2023/06/01 13:46:52 DEBUG : SPS Orders.csv: Size and modification time the same (differ by 0s, within tolerance 1s)
2023/06/01 13:46:52 DEBUG : SPS Orders.csv: Unchanged skipping
2023/06/01 13:46:52 INFO :
Transferred: 0 B / 0 B, -, 0 B/s, ETA -
Elapsed time: 0.8s

2023/06/01 13:46:52 DEBUG : 4 go routines active

It is working as well; your local file is simply the same as the remote one, so there is no need to copy it (Unchanged skipping).

Run it in some other folder to test.
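
For example (the directory is just an example), copying into an empty directory forces a fresh download through the chunker remote:

# an empty target directory guarantees the file is actually downloaded and reassembled
mkdir -p /tmp/chunker-test
rclone copy overlay:'SPS Orders.csv' /tmp/chunker-test -vv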

I want to upload the original full file to the primary_owncloud remote.

If you cannot, because for whatever reason your primary_owncloud remote has some size limitation, the solution is to use the chunker remote to split the file into chunks of an acceptable size.

Please put it in another thread, otherwise it gets too messy.

ownCloud as such does not have a file upload size limit, but you are using ownCloud via WebDAV. Depending on your ownCloud WebDAV configuration, this interface can have a maximum file size limit.

There is nothing rclone can do about it.

Either change your ownCloud configuration to allow bigger files via WebDAV, or use the rclone chunker as a workaround.
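
The 413 errors in the earlier log are returned by nginx, so as a rough sketch (file locations and values are illustrative and depend on your installation), the server-side limits would typically be raised along these lines and nginx/PHP-FPM reloaded afterwards:

# nginx server block that fronts ownCloud: raise the request body limit (413 comes from this check)
client_max_body_size 10G;

# php.ini for the ownCloud PHP pool: raise the matching PHP upload limits
upload_max_filesize = 10G
post_max_size = 10G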
