Hmm ok, that clarified it some - although I'm still not too clear on what the ideal way to handle these "files with changed extensions" would be. Are they identical aside from the filename? If so then --track-renames function would be useful to identify these and just perform a rename operation rather than having to reupload the file - thus both being efficient and solving the problem of leaving an old outdated file behind.
Otherwise it seems like you could probably get some use out of filtering here, so I would advise looking into that:
https://rclone.org/filtering/
I think this would save you from doing most, if not all, of that scripting. Not that scripting is a problem, but it's generally preferable to use well-tested code where possible, as it's less likely to have bugs.
This could allow you to perform a sync, but only on specific folders (and much, much more, as the filtering system is very flexible).
If these folders are static - then you can just define them in a simple filter-file.
If they are dynamic and change all the time, then we will have to generate them on demand.
Thankfully rclone can do most of that work for you.
Let's say, for example, we are operating on subfolders of gdrive:/media as compared to subfolders in /home/media ...
we could then run
rclone lsf /home/media --dirs-only > outputfile.txt
(now we should have a list of all the top-level local folders, not including subfolders, since -R (recursive) is not the default for lsf)
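For illustration (the folder names here are just made up), the resulting outputfile.txt would look something like this - note that lsf marks directories with a trailing slash by default:

```
movies/
music/
photos/
```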
Then you can feed this into...
rclone sync /home/media gdrive:/media --files-from outputfile.txt
Since filters apply to both sides, this should sync only the folders that exist locally and ignore the others entirely (i.e. not delete them).
This seems to be what you want to do here, right? Scripting this would only need 2 lines, so it simplifies things greatly. Note that this is a rough example and it's likely we may need to tweak the format somewhat to get exactly the result we want here, but I can probably assist you with that if you get stuck.
Good practice is to use --dry-run while testing such things until you are sure it's working as intended - especially on any sync commands that could result in files being unintentionally deleted.
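Putting the above together, the whole script could be sketched roughly like this (using the example paths from earlier - swap in your own, and note this assumes your remote is actually named gdrive):

```shell
#!/bin/sh
# List the top-level local folders only (lsf is not recursive by
# default, and --dirs-only skips plain files)
rclone lsf /home/media --dirs-only > outputfile.txt

# Sync only those folders to the remote. --dry-run makes rclone
# report what it WOULD do without touching anything - remove it
# once you've confirmed the output looks right.
rclone sync /home/media gdrive:/media --files-from outputfile.txt --dry-run
```

Run it with --dry-run a few times first; the dry-run output will show exactly which transfers and deletions would happen.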
Does this seem like a solution to you, or have I missed the mark?
EDIT: Changed from --include-from to --files-from. These are subtly but importantly different, and I sometimes confuse them myself. I recommend understanding the difference if you use either. Feel free to ask for such clarifications if needed.
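To illustrate the difference roughly (file contents below are hypothetical): --files-from takes a plain list of exact paths to transfer, while --include-from takes filter rules, where glob patterns like * and ** are interpreted:

```
# files-from.txt  - exact paths only, no pattern matching
movies/
music/

# include-from.txt - filter rules, patterns are interpreted
movies/**
music/**
```

Feeding one format to the other flag tends to produce confusing results, which is exactly why mixing them up is such an easy mistake.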