Rename of .xlsx files results in native equivalent being moved to bin

What is the problem you are having with rclone?

In an attempt to change ownership of all files from multiple users to the current rclone mount owner, I'm using find to locate all files larger than zero bytes and piping them to a script that copies/moves each one in a series of steps, ending up with a new owner but the original timestamp (a subsequent phase needs all owners to manually change ownership of the native folders/docs).
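
Roughly, the approach looks like this (a simplified sketch; the real script keeps the marker before the extension rather than appending it, and paths are illustrative):

$ find . -type f -size +0c -print0 |
  while IFS= read -r -d '' f; do
      cp -p "$f" "$f.changedowner"   # new copy is created (and so owned) by the mount user, timestamp preserved
      mv "$f" "$f.original"          # keep the original aside
      mv "$f.changedowner" "$f"      # the new-owner copy takes the original name
  done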

Everything works fine except in folders where a .xlsx file has also been "saved as" a native gdoc. In those cases the script results in the gdoc file being moved to the bin.

Web view of test folder:

What is your rclone version (output from rclone version)

$ rclone version
rclone v1.55.1

  • os/type: linux
  • os/arch: amd64
  • go/version: go1.16.3
  • go/linking: static
  • go/tags: none

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Ubuntu 20.04, 64 bit

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

The basic steps of the script, run manually on the command line:

$ ls -nl
total 184
-rw-rw-r-- 1 1000 1000      0 May 10 18:46 'Copy of x.xlsx'
-rw-rw-r-- 1 1000 1000      0 May 10 18:27  NewBlank.xlsx
-rw-rw-r-- 1 1000 1000 187894 Dec 16  2016  x.xlsx
$
$ cp -p x.xlsx x.changedowner.xlsx
$ mv x.xlsx x.original.xlsx
$ mv x.changedowner.xlsx x.xlsx

At this point the native gdoc file is moved to the bin. I've also tried --no-clobber on the mv.
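
That is, something like this for the final rename:

$ mv --no-clobber x.changedowner.xlsx x.xlsx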

$ ls -nl
total 367
-rw-rw-r-- 1 1000 1000      0 May 10 18:46 'Copy of x.xlsx'
-rw-rw-r-- 1 1000 1000      0 May 10 18:27  NewBlank.xlsx
-rw-rw-r-- 1 1000 1000 187894 Dec 16  2016  x.original.xlsx
-rw-rw-r-- 1 1000 1000 187894 Dec 16  2016  x.xlsx
$

Is there a way to prevent this behaviour, or alternatively to make both the .xlsx file and its native equivalent visible, so that the script can test whether a native one exists and skip it?

I don't use it day to day, but I checked the Google-supplied Drive app on a Windows PC and noticed that its file manager shows .xlsx for the MS file and .gsheet for the gdoc equivalent, which got me thinking this might be possible and I just don't know the right switches/options.

The rclone config contents with secrets removed.

[SoulOfxxxxxxxxx]
type = drive
scope = drive
token = {"access_token":"xxxx","token_type":"Bearer","refresh_token":"xxxx","expiry":"2021-05-11T16:16:59.262402082+01:00"}
root_folder_id = xxxx

A log from the command with the -vv flag

No rclone log to show; only Unix commands were used.

What is happening is that you've effectively got a duplicate file name, and rclone gets confused about which one you mean...

The Google Doc x is being exported as x.xlsx, and you've also got the actual x.xlsx.

Well that gave me an idea...

You can change how the file is exported, so if you picked a different export format - say --drive-export-formats link.html - then I think it should work.
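
Something like this when mounting (mount point is just an example):

$ rclone mount SoulOfxxxxxxxxx: /mnt/gdrive --drive-export-formats link.html

The native sheet x should then appear as x.link.html rather than x.xlsx, so it no longer collides with the real x.xlsx.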

Many thanks - I realised the reason for it being masked was the effective duplicate.

I'll give your export format idea a go and let you know how I get on.

I'm not too worried if it takes a few passes to achieve the end result, as this is a one-off cleanup exercise helping out a friend.

Using the different export format did indeed allow me to identify all the "duplicates", so many thanks for the suggestion.

Basic steps

  1. Mounted the drive with --drive-export-formats link.html
  2. Used find to create a list of files with link.html suffix
  3. For each hit, strip the link.html suffix, add a * wildcard and ls (sketched below)
  4. Review the list looking for potential duplicates, i.e. base names that appear with both .xlsx and link.html suffixes.
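
A rough sketch of steps 2 and 3 (mount point and list file names are illustrative):

$ find /mnt/gdrive -type f -name '*.link.html' > gdoc-list.txt
$ while IFS= read -r f; do
      ls -l "${f%.link.html}"*    # strip the suffix, add the wildcard, list matching names
  done < gdoc-list.txt > review-list.txt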

I didn't take this too far, as I knew there were only a few instances (though scattered over a huge tree structure), so I manually reviewed the generated list, which of course also picked up files that simply had "longer" names.

If a huge number had been identified I would have programmatically honed down the list, but for me it wasn't worth the time and effort for a one-off, so it was a mixture of a bit of scripting and manual review.
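
If it had come to that, something along these lines would have narrowed the list to genuine duplicates (paths again illustrative):

$ find /mnt/gdrive -type f -name '*.link.html' |
  while IFS= read -r f; do
      [ -e "${f%.link.html}.xlsx" ] && echo "duplicate: ${f%.link.html}"
  done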

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.