I had made a backup of my web site to an rclone crypt, and then decided to explicitly move the files & folders into a /var/www/html subdirectory, matching the source path they had come from. (FWIW, this is fairly easy on OSX: one can generally just do a 'new folder with selection'. Alas, that doesn't work with crypted dirs on pCloud, and rclone makes this IMO surprisingly unintuitive.)

There were over a dozen directories and files. Most seem to have been moved successfully (though slower than I would have expected if server-side moves were supported?), but two of the folders failed, spewing quite a few errors in the process that I found unexpected: claims of duplicate directories & objects, though none are visible on the source VPS, and 'rclone lsf' of the directories doesn't show anything that I found peculiar (though FWIW it once showed a couple of timeouts & bumped the wait up to 40ms). FWIW, also tried 'rclone dedupe', which failed.
Run the command 'rclone version' and share the full output of the command.
~# rclone version
rclone v1.60.1-DEV
os/version: debian 10.13 (64 bit)
os/kernel: 4.19.0 (x86_64)
os/type: linux
os/arch: amd64
go/version: go1.19.4
go/linking: dynamic
go/tags: none
Which cloud storage system are you using? (eg Google Drive)
pCloud
The command you were trying to run (eg rclone copy /tmp remote:tmp)
Very basic / effectively:
[pCloud_WebDav]
type = webdav
url = https://webdav.pcloud.com
vendor = other
user = anon@ymo.us
pass = xxx
[BackupCrypt]
type = crypt
remote = pCloud_WebDav:BackupCrypt
password = xxx
...
A log from the command with the -vv flag
2022/12/29 09:43:06 NOTICE: .htaccess: Duplicate directory found in destination - ignoring
2022/12/29 09:43:06 NOTICE: VueCode: Duplicate directory found in destination - ignoring
2022/12/29 09:43:06 NOTICE: VueCode: Duplicate directory found in destination - ignoring
2022/12/29 09:43:06 NOTICE: ZGPxFN2.jpg: Duplicate directory found in destination - ignoring
2022/12/29 09:43:06 NOTICE: ZGPxFN2.jpg: Duplicate directory found in destination - ignoring
2022/12/29 09:43:06 NOTICE: blah.html: Duplicate directory found in destination - ignoring
2022/12/29 09:43:06 NOTICE: blah.html: Duplicate directory found in destination - ignoring
2022/12/29 09:43:06 NOTICE: cache: Duplicate directory found in destination - ignoring
2022/12/29 09:43:06 NOTICE: classes: Duplicate directory found in destination - ignoring
... [dozens of files]
2022/12/29 09:43:39 ERROR : subfolder/subfolders/map.js: Couldn't move: Copy call failed: <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL /remoteName/m9jtchcfgdaon9vsqamh2nuqhcheo534th4cv2hkql131h26b9s0/uu8b5nooa2vl0e5p4k2j3h3710/n3irja8sbc472psi588ggj3sak/jfkucpbmi382a3g6m8ni2l0t18/48ju8r1m12cr3nqf1red3ckdo0 was not found on this server.</p>
<hr>
<address>Apache/2.4.10 (Debian) Server at webdav.pcloud.com Port 443</address>
</body></html>: 404 Not Found
[...]
2022/12/29 09:43:39 NOTICE: subfolder/subfolders/xxxfile.js: Duplicate object found in source - ignoring
2022/12/29 09:43:39 NOTICE: subfolder/subfolders/country_code.js: Duplicate object found in source - ignoring
2022/12/29 09:43:39 NOTICE: subfolder/subfolders/default_country.js: Duplicate object found in source - ignoring
... [dozens of files]
Couldn't move: Copy call failed: <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL /remoteName/m9jtchcfgdaon9vsqamh2nuqhcheo534th4cv2hkql131h26b9s0/uu8b5nooa2vl0e5p4k2j3h3710/63osstuopln0rle58e4ajlf6n8/oghffmle3do0up78hi8u577u7g was not found on this server.</p>
<hr>
<address>Apache/2.4.10 (Debian) Server at webdav.pcloud.com Port 443</address>
</body></html>: 404 Not Found
... [dozens of files]
I've just tried that again. I'm having difficulty with the http://localhost:53682/ redirection stuff, as this is a headless machine; the only browser that I have on it is lynx, and I've read elsewhere that rclone's auth page requires javascript for some reason!? It's a PITA to set up iptables for this (reportedly pCloud likely uses dpts 8398:8399 too?). When I go to the suggested link, http://127.0.0.1:53682/auth?state=, all I get is an effectively blank page with a few back-arrows on it.
[Edit: after trying other things that unintuitively didn't work (such as opening iptables on 53682 to remote access), I managed to route around that & get the auth token by using the 'ssh -L localhost:53682:localhost:53682 nnn@xxx' trick]
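For anyone else stuck on a headless box: that trick generalizes. A minimal sketch, assuming the placeholder login 'nnn@xxx' from above and rclone's default local auth port 53682; it only prints the tunnel command, so nothing here actually connects to a VPS:

```shell
#!/bin/sh
# Headless-auth workaround sketch: forward rclone's local OAuth
# callback port from the machine that has a real browser to the
# headless VPS. 'nnn@xxx' is a placeholder login, not a real host.
PORT=53682
TUNNEL="ssh -L localhost:${PORT}:localhost:${PORT} nnn@xxx"
echo "$TUNNEL"
# Inside that ssh session, run the remote's auth step (e.g.
# 'rclone config'); the http://127.0.0.1:53682/auth?... link it
# prints can then be opened in the *local* browser, because the
# tunnel relays the callback back to the VPS.
```

Once the browser completes the OAuth dance, the token lands in the rclone.conf on the headless machine.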
After getting the pCloud API working, and setting the crypt to originate from there instead, the same 'rclone move' commands at first do appear to complete without any obvious errors shown; however, an 'rclone lsf' shows that the directories have not actually moved (!?). An 'rclone copy' of a test file does appear to have uploaded it...

Running 'rclone move' with the -vv option shows a bunch of errors:
2022/12/29 12:06:39 DEBUG : rclone: Version "v1.60.1-DEV" starting with parameters ["... "-vv"]
2022/12/29 12:06:39 DEBUG : Creating backend with remote ".../"
2022/12/29 12:06:39 DEBUG : Using config file from "/root/.config/rclone/rclone.conf"
2022/12/29 12:06:39 DEBUG : Creating backend with remote "pCloud_API:remoteName/m9jtchcfgdaon9vsqamh2nuqhcheo534th4cv2hkql131h26b9s0"
2022/12/29 12:06:39 DEBUG : Creating backend with remote "remoteName:/var/www/html/"
2022/12/29 12:06:39 DEBUG : Creating backend with remote "pCloud_API: remoteName/ro1ena11ndg5kbu66gc96b5jd4/i9gi6co62s72sfub2qtb4lu52g/mekiifutqusrmopc8j6lhp9gcs"
2022/12/29 12:06:40 DEBUG : Encrypted drive 'remoteName:/var/www/html/': Using server-side directory move
2022/12/29 12:06:40 INFO : Encrypted drive 'remoteName:/var/www/html/': Server side directory move failed - fallback to file moves: can't copy directory - destination already exists
2022/12/29 12:06:40 DEBUG : .DS_Store: Skipping undecryptable file name: illegal base32 data at input byte 0
2022/12/29 12:06:40 DEBUG : u8d14nr3adrk0dhqp5e2ure80k/.davfs.tmp111f09: Skipping undecryptable file name: illegal base32 data at input byte 0
2022/12/29 12:06:40 DEBUG : u8d14nr3adrk0dhqp5e2ure80k/.davfs.tmp38d211: Skipping undecryptable file name: illegal base32 data at input byte 0
[... quite a few files]
2022/12/29 12:06:51 DEBUG : uu8b5nooa2vl0e5p4k2j3h3710/d5kue6bcct381aq5sh8sl0fkno/ull248avr2bkh7gj7mshkd3tlk/fcj3m1qn4h0a5jsl8aq4qc5u8g/94r53jitgdsp3pfm0lj90c931k/tint00hsoj31ehb03eostg8dn4/.davfs.tmp2c40c9: Skipping undecryptable file name: illegal base32 data at input byte 0
2022/12/29 12:06:51 DEBUG : Encrypted drive 'remoteName:/var/www/html/': Waiting for checks to finish
2022/12/29 12:06:51 DEBUG : Encrypted drive 'remoteName:/var/www/html/': Waiting for transfers to finish
2022/12/29 12:06:51 INFO : There was nothing to transfer
2022/12/29 12:06:51 INFO :
Transferred: 0 B / 0 B, -, 0 B/s, ETA -
Elapsed time: 12.7s
2022/12/29 12:06:51 DEBUG : 4 go routines active
And it's claiming that the destination directory already exists, though it doesn't appear in lsf!?

I did not ask for any of these files to be created (.DS_Store is automatically created by OSX when a folder is mounted, i.e. via the pCloud fuse mount, and IMO should probably be ignored). I am wondering if there may be some sort of cleanup command which could remove any leftover davfs.tmp files, at least before switching over to the pCloud API? I doubt that'll fix things such as the ~phantom directory, though. Methinks something is currently ~horked.
something like rclone ls remote: --include='*.davfs.tmp*'
if that output is good, then change ls to delete --dry-run
and if that output looks good, then remove --dry-run
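That progression can be spelled out in one place. A sketch, assuming the 'remote:' name and filter from the post above, that prints each stage for review rather than running anything against the remote:

```shell
#!/bin/sh
# Progressive cleanup sketch: list first, then a dry-run delete,
# then the real delete. The commands are printed, not executed, so
# each stage can be eyeballed before it touches the remote.
FILTER="--include=*.davfs.tmp*"
for STAGE in "ls" "delete --dry-run" "delete"; do
    printf 'rclone %s remote: %s\n' "$STAGE" "$FILTER"
done
```

Run (or copy-paste) each printed command in order, and only move on to the next stage once the previous one's output looks right.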
fwiw, if that backup is not critical, then i would start over with a clean source and new crypt remote as dest
--noappledouble Ignore Apple Double (._) and .DS_Store files (supported on OSX only) (default true)
Well, I manually recursively removed those files in the OSX fuse mount (& did a sync & purge, even tried --cache-db-purge, and your ~suggestion of rclone delete remoteName: --include='*.DS_*'), but rclone still appears to be seeing them, and it's also still running into the ~phantom directory (!?) [which doesn't even show up with rclone ls ... --max-depth 1].
2022/12/29 13:26:30 DEBUG : rclone: Version "v1.60.1-DEV" starting with parameters ["rclone" "move" "remoteName:folderName/" "remoteName:/var/www/html/" "-vv"]
2022/12/29 13:26:30 DEBUG : Creating backend with remote "remoteName: folderName/"
2022/12/29 13:26:30 DEBUG : Using config file from "/root/.config/rclone/rclone.conf"
2022/12/29 13:26:30 DEBUG : Creating backend with remote "pCloud_API:remoteName/m9jtchcfgdaon9vsqamh2nuqhcheo534th4cv2hkql131h26b9s0"
2022/12/29 13:26:31 DEBUG : Creating backend with remote "remoteName:/var/www/html/"
2022/12/29 13:26:31 DEBUG : Creating backend with remote "pCloud_API:remoteName/ro1ena11ndg5kbu66gc96b5jd4/i9gi6co62s72sfub2qtb4lu52g/mekiifutqusrmopc8j6lhp9gcs"
2022/12/29 13:26:31 DEBUG : Encrypted drive 'remoteName:/var/www/html/': Using server-side directory move
2022/12/29 13:26:31 INFO : Encrypted drive 'remoteName:/var/www/html/': Server side directory move failed - fallback to file moves: can't copy directory - destination already exists
2022/12/29 13:26:31 DEBUG : .DS_Store: Skipping undecryptable file name: illegal base32 data at input byte 0
2022/12/29 13:26:31 DEBUG : ocd5ludaf3umvnuc492s5fj8kk/.DS_Store: Skipping undecryptable file name: illegal base32 data at input byte 0
2022/12/29 13:26:32 DEBUG : ocd5ludaf3umvnuc492s5fj8kk/tpckrffakja5e6ha088mp72ov0/.DS_Store: Skipping undecryptable file name: illegal base32 data at input byte 0
...
I am running rclone on a Linux system which doesn't support fuse (with the pCloud drive occasionally mounted on OSX via fuse for other files), so apparently those flags won't work for me unless/until rclone makes them available ~everywhere.
FWIW, I think I generally would prefer to use tools that assume that all backups should be considered ~critical, and provide options for cleanup & recovery.
After I had already performed the rclone copy to make the backup, the pCloud drive was also mounted on OSX via the 'pCloud Drive' app (which installs macFUSE).
No, the apparent ~'phantom' directory is somewhere within the crypt directory structure, which was entirely created by rclone.
I had installed the pCloud API remote ~at your request/recommendation, which doesn't seem to have helped much, if anything; I think the issue of failing to perform simple copying of files & directories within a crypt should be reproducible with the original webdav setup.
FWIW, my VPS doesn't have much spare room to perform fancy manipulations of files (and rclone seems to be incapable of performing something as simple as piping via rcat without making a duplicate of the data in /tmp before sending it).
You're talking about issues that came up after I installed the 'pCloud API' version (& had also tried mounting the thing to take a look at it via OSX/macFUSE)...
The original errors that I received [and I did not have -vv on], after getting up in the morning, finding that my directory structure hadn't been reproduced in the copy, and attempting to move the files into the subdirectory, were 'DirMove MOVE call failed ... EOF's:
for f in $(cat HB_listA.txt); do rclone move remoteName:$f remoteName:var/www/html/$f; done;
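A more robust sketch of that loop, for reference: 'while IFS= read -r' avoids the word-splitting that an unquoted $(cat ...) does on paths containing spaces, and the echo makes it a preview (drop the echo to actually run the moves). The list file here is a hypothetical sample standing in for the real HB_listA.txt, so the sketch is self-contained:

```shell
#!/bin/sh
# Sample list file (a stand-in for the real HB_listA.txt, so this
# sketch won't touch the actual list).
printf '%s\n' 'folderA' 'folder B/with space' > HB_list.sample.txt

# Preview the moves: 'while IFS= read -r' keeps paths with spaces
# intact, where the unquoted $(cat ...) form would split them into
# separate words. Drop the 'echo' to actually run the moves.
while IFS= read -r f; do
    echo rclone move "remoteName:$f" "remoteName:var/www/html/$f"
done < HB_list.sample.txt
```

With the echo in place it prints one rclone move command per listed path, including the one with a space kept whole.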
I managed to get that working for webdav and sftp, and lo & behold, my OSX 'CyberDuck' was finally able to move those two directories into /var/www/html. However, there appear to be some conflicts, as some of the folders had apparently moved over in previous efforts (leaving folders without contents behind); attempting to delete those fails, claiming ".DS_Store: Skipping undecryptable file name: illegal base32 data at input byte 0", yet attempting to remove the files or folders with e.g. "rclone delete ... --include='*.DS_*'" isn't working, and neither "rclone ls ... --include='*.DS_*'" nor other such variants even show that such a file is there.
Well, 'rclone purge' seems to be about the only thing able to delete the folders with ~invisible files; everything else gives "...rmdir failed: pcloud error: Folder is not empty...". However, the change doesn't seem to propagate to the server, even if I hit 'refresh' multiple times, unless I completely disconnect & reconnect. This might take a while at this rate (and doesn't seem reasonable to expect people to do)...
Bizarre; I'm having a similar issue after I copied the 'pSDKBuild' directory into another of mine on pCloud itself (downloading subsequently via an unencrypted source using the 'WebDav' interface; all transfers since you asked have been via the 'pCloud API'), and then attempted to 'rclone copy' the libs folder: