Downloading one copy from Google Photos?

What is the problem you are having with rclone?

Downloading from Google Photos to local drive copies files multiple times under different directories.

My question: is there a way (e.g. a specific path or paths) that I can give rclone so that it copies each photo only once?

I found the advice here: Rclone don't stop sync process [Google Photos] - #3 by glemag, which recommends using gphotos:media/by-year/, but from looking at the copied images I have on my local drive so far (the command hasn't finished after running for a few days now, and has copied 255 GB against the 45.3 GB of space reported by Google) I'm not 100% sure it will cover all my images exactly once, without missing any.
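
For example, here is what I'm considering instead, on the assumption (which I haven't verified) that the media/all branch lists each media item exactly once, while by-year/by-month/by-day and album re-expose the same items under other paths:

# assumes media/all holds one entry per photo/video;
# the destination subdirectory is just an example
rclone copy --inplace --progress google-photos:media/all ./google-photo/media-all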

I'd also like to keep track of albums (collections of photos) without storing multiple copies of the images included in them.
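
One idea for the albums (untested, just a sketch) is to keep album membership as plain listings instead of downloading the photos a second time:

# record which photos each album contains, without copying any file contents
# (albums-manifest.txt is a name I made up)
rclone lsf -R google-photos:album > albums-manifest.txt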

Run the command 'rclone version' and share the full output of the command.

rclone v1.66.0
- os/version: darwin 14.4.1 (64 bit)
- os/kernel: 23.4.0 (arm64)
- os/type: darwin
- os/arch: arm64 (ARMv8 compatible)
- go/version: go1.22.1
- go/linking: dynamic
- go/tags: none

This is the latest version on the website at this time.

Which cloud storage system are you using? (eg Google Drive)

Google Photos to local disk

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone copy --inplace --progress google-photos:/ ./google-photo

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[google-drive]
type = drive
scope = drive
token = XXX
team_drive =

[google-photos]
type = google photos
read_only = true
token = XXX

[s3-backup]
type = s3
provider = AWS
access_key_id = XXX
secret_access_key = XXX
region = us-east-1
acl = private

A log from the command that you were trying to run with the -vv flag

Transferred:   	   13.603 GiB / 13.603 GiB, 100%, 0 B/s, ETA 0s
Errors:               829 (retrying may help)
Checks:            105681 / 105681, 100%
Transferred:         2145 / 4923, 44%
Elapsed time:   1h30m11.5s
Transferring:
 *          media/all/PXL_20240314_015523617.mp4: transferring
 *          media/all/PXL_20240314_023139629.jpg: transferring
 *          media/all/PXL_20240314_061013168.jpg: transferring
 *          media/all/PXL_20240314_061127929.jpg: transferring^C

Look, I would advise you to download your images from Google Photos through Google Takeout, since as far as I know the Google Photos API will give rclone your photos at reduced quality; that is, if a video is 1 GB, the version downloaded by rclone could be 500 MB or less. I could be wrong, but if so, a more experienced colleague here on the forum will give you a better answer.

Thank you, @edsonsbj, I've been using Takeout over the years and am still hoping that rclone could help me automate the process.
As it is now, with Takeout and my Google 2FA security, I have to manually authenticate and download every archive file of the Takeout from my home laptop with limited Internet bandwidth. If I manage to get rclone to do what I need, then I could script it on an AWS EC2 instance and upload the data straight to S3.
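
Roughly, the script I have in mind would boil down to something like this (the bucket name and prefix are placeholders, and it assumes the single-copy question above gets resolved in favour of media/all):

# one-shot copy from Google Photos straight into the s3-backup remote from my config
rclone copy --progress google-photos:media/all s3-backup:my-photos-backup/google-photos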

But there is a way: you can have Google Takeout store your photos in Google Drive or OneDrive, and then use rclone to copy them to the desired location.
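
For example, if your Takeout export lands in the usual Takeout folder in Drive, something along these lines should work (the folder, bucket, and prefix names are only illustrative):

rclone copy --progress google-drive:Takeout s3-backup:my-photos-backup/takeout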

Yes, I remembered that right after I sent my previous reply. I'm trying this.

The slight disadvantage of that is that I duplicate the photo storage I use on Google by re-exporting the photos into Google Drive (I don't have accounts on the other alternatives, and I don't see much point in creating one just to transfer to the final destination of AWS S3).

Along the way, I'll take advantage of the fact that I already have a substantial (but incomplete) copy of the photos downloaded to my laptop directly with rclone, and check whether the files from Takeout and from the direct rclone copy are similar.
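
To compare them I'll probably start with a size-only check between the two local trees, since the hashes will almost certainly differ if the API re-encoded anything (the directory names are just examples):

# --size-only compares file sizes instead of hashes
rclone check --size-only ./google-photo ./takeout-extracted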

Well, if you don't care about the quality of your photos, then keep using it this way with rclone. The suggestion to create a copy within Google Drive was meant to be temporary; that is, you would download it and then delete it.
I've already racked my brains over this, and from my own experience I don't recommend using rclone with Google Photos. For years I let my photos go to Google Photos at the minimum quality, so a photo taken at 50 MP was stored at 3 MP, and when I downloaded it through rclone it was even smaller. The last straw was when I needed to develop some photos to give as gifts: they came out horrible, all distorted by the terrible quality. So I stopped using Google Photos and built a local NAS server so that my photos are uploaded in their original quality, and it has now been 3 years without Google Photos. I don't blame the tool, but Google, because of its API.

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.