Google photos API


#21

Maybe rclone should be dealing with duplicates in the drive interface… Google drive objects (and photo objects) have a unique ID so maybe rclone should stick it on the end if it finds a duplicate… So

file.txt
file.txt

Would become

file.123898127391823791287.txt
file.698173981729817239187.txt

That would break syncing though if it stuck an ID on all of them, so maybe it should choose the oldest to keep its plain name and add the suffix to all the others, so

file.txt
file.698173981729817239187.txt
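The naming rule above could be sketched like this (plain Python for illustration, not actual rclone code; the name/ID/mtime tuples are made up): keep the oldest copy under its original name and insert each remaining copy's object ID before the extension.

```python
import os

def disambiguate(entries):
    """Given (name, object_id, mtime) tuples sharing one name,
    keep the oldest under the plain name and insert the object ID
    before the extension for all the others."""
    ordered = sorted(entries, key=lambda e: e[2])  # oldest first
    result = []
    for i, (name, obj_id, _) in enumerate(ordered):
        if i == 0:
            result.append(name)  # oldest keeps the plain name
        else:
            root, ext = os.path.splitext(name)
            result.append(f"{root}.{obj_id}{ext}")
    return result

# Hypothetical duplicates: same name, different IDs and mtimes
dups = [("file.txt", "698173981729817239187", 2),
        ("file.txt", "123898127391823791287", 1)]
print(disambiguate(dups))
# → ['file.txt', 'file.698173981729817239187.txt']
```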

Interesting, thank you

:smile:


#22

Something like this would be good. I was going to suggest a flag to download duplicates because of the photos/drive interface but didn’t bring it up. Because Photos is ‘supposed to’ have duplicates, I found that I couldn’t download them: rclone would list them, of course, but since a single destination (a Linux filesystem) can’t hold two files with the same name, it would just politely tell me there were some.

It would be nice to have a flag to rename them in much the same way as ‘dedupe’ does. I could pass a value of ‘skip|first|newest|oldest|rename’ and it would do that operation on the copy. In my use case I didn’t want to dedupe, I wanted to download specific slices of a ‘duplicate’. I do also like the idea (as part of rename) of using either the hash or the ID, though.
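One way those suggested modes could behave (a hypothetical sketch, not an rclone implementation; the mode names come straight from the post above, everything else is invented for illustration):

```python
import os

def resolve(entries, mode):
    """entries: (name, object_id, mtime) tuples sharing one name.
    Returns the names that would be transferred under each
    hypothetical duplicates mode."""
    if mode == "skip":
        return []                                    # ignore duplicates entirely
    if mode == "first":
        return [entries[0][0]]                       # whichever was listed first
    if mode == "newest":
        return [max(entries, key=lambda e: e[2])[0]]
    if mode == "oldest":
        return [min(entries, key=lambda e: e[2])[0]]
    if mode == "rename":
        # keep every copy, disambiguating with the object ID
        return [f"{os.path.splitext(name)[0]}.{obj_id}{os.path.splitext(name)[1]}"
                for name, obj_id, _ in entries]
    raise ValueError(f"unknown duplicates mode: {mode}")

dups = [("file.txt", "123898127391823791287", 1),
        ("file.txt", "698173981729817239187", 2)]
print(resolve(dups, "newest"))  # → ['file.txt']
print(resolve(dups, "rename"))
# → ['file.123898127391823791287.txt', 'file.698173981729817239187.txt']
```

With `rename`, all copies survive under unique names; the other modes collapse the set down to at most one file, which is why they suit a sync but not a "download every slice" use case.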


#23

That is a really nice idea! So a --drive-duplicates option. There should probably be one more mode, error, which is effectively the default behaviour at the moment.

Do you fancy making a new issue on github about this?


#24

The default should absolutely just be error.



#26

Hello,

Is it possible you explain in lay terms how you managed to use the new GPhoto API with RClone?

Thanks a bunch!