I have terrible upload speed at home. However, I do have occasional access to a place I can upload from (work).
To start off and populate my remote initially (many TB), I'd like to have my home connection uploading in the background, but also grab a listing of the files rclone hasn't touched yet, copy those to an external drive, bring the drive to work, and upload them from there with `rclone copy`.
What I'm trying to accomplish is a script that lists the files on the remote, lists the files on local storage, compares the two, and gives me an output of what's different. That way I can pick a specific folder (or set of folders) to copy to an external drive, bring to work, and `rclone copy` from there, so my home setup doesn't have to finish the job over my terrible upload speed.
I know there's a really good way of doing this; I just don't know what it is. I've tried `rclone lsf` and `rclone ls` piped through `cut` to list the files on the remote, then `ls` locally, but I don't know what kind of fancy Linux-fu gets both outputs into the same format so I can compare or diff them.
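Here's a sketch of the kind of thing I'm imagining, in case it helps clarify. It assumes a remote called `remote:backup` and local data under `/mnt/data` (both placeholders for my real names). `rclone lsf -R --files-only` prints bare relative paths, and GNU `find -printf '%P\n'` can be made to print the same shape; once both lists are sorted, `comm -23` keeps the lines that only exist locally:

```shell
#!/usr/bin/env sh
# Sketch: find files that exist locally but are missing on the remote.
# "remote:backup" and "/mnt/data" are placeholder names.

# Generate listings in a matching format (bare relative paths, sorted):
#   rclone lsf -R --files-only remote:backup | sort > remote.list
#   (cd /mnt/data && find . -type f -printf '%P\n') | sort > local.list

# comm -23 suppresses lines unique to the second file and lines common
# to both, leaving only paths present in local.list but not remote.list.
missing_on_remote() {
    comm -23 "$1" "$2"
}

# Usage:
#   missing_on_remote local.list remote.list > to-copy.list
```

From reading the docs, it also looks like `rclone check /mnt/data remote:backup --missing-on-dst missing.txt` might write the not-yet-uploaded paths in one step, which could skip the listing-and-diff dance entirely — haven't verified that myself yet.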
Any fancy scripters able to help me out?