Best way to maintain backups? - Newbie

Greetings - I managed to get rclone-v1.65.2 working on my QNAP NAS.

I am using Mega for storage and have issued the command rclone copy /local_share remote:backup

I see that data is being copied. It's been running for a few hours and about 13GB has been copied so far.

I'm backing up my music collection which is a few TB, so I understand that it will take days/weeks to finish.

My question is: Once the initial copy has completed, what's the best method/command(s) to keep the remote share on Mega up to date as I continue to add more music to my local share?

I don't want files to disappear from Mega if they are somehow lost locally. That would defeat the purpose, as the whole point of copying everything to Mega is to have a cloud backup in case of a catastrophe at home.

I know this might seem like a silly question, but it seems like there are lots of different options and I was wondering what would be best/most simple?

Any suggestions will be greatly appreciated! Thanks!!

I'm guessing that you will want to continue to use the copy command rather than the sync command.

Sync will delete files from the destination (your Mega storage) if they are not on the source, and you don't want anything deleted from there.

The copy command will not delete anything, and it should not re-copy anything that already exists in the destination.

You can always stop and restart, and rclone will pick up where it left off; nothing will be duplicated.
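To make the difference concrete (using the same paths and remote name as your first post):

rclone copy /local_share remote:backup   # adds new/changed files, never deletes anything on Mega
rclone sync /local_share remote:backup   # would ALSO delete Mega files that are gone locally - not what you want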

Check out the -P flag when copying. It will give you interactive statistics about the transfer that is underway: how much has been done, how much there is to go, how fast it is going, etc.

You can also use the --backup-dir flag to create the closest thing to a "backup-like" solution using rclone. It will protect you to some extent from unrecoverable propagated deletions and changes. Depending on your usage, restoring such a backup to a specific date in the past might be very cumbersome, though.
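A rough sketch of how that could look (the archive path here is just an example; --backup-dir must be on the same remote and must not overlap the destination):

rclone copy /local_share remote:backup -P --backup-dir remote:archive/2024-02-18

Any file that would be overwritten in remote:backup is then moved into remote:archive/2024-02-18 instead of being lost.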

In general, rclone is not a substitute for proper backup software. It is a great tool for transferring files to/from almost any cloud, but a real backup requires much more than that.

For a proper solution, use something designed for the job. There are many free and paid options available. For open source, I would recommend looking at restic/rustic or kopia.

1 Like

Thanks for the info on these backup products. New to me.

If you would like a backup app with a nice GUI, check out FreeFileSync. I combine the two regularly to do my backups.

Ok, so the initial copy ran for a couple of days, copying about 200GB, until it hit a snag.

Seeing the following error: Can't follow symlink without -L/--copy-links

It wasn't progressing any further, so I killed it.

So, I guess I kick off a copy again and use -L ?

Will it automatically skip stuff that's already been uploaded or do I need to use some flag for that?

Any other suggestions?

Yes. And it is quite normal when running a transfer for the first time to discover some details which have to be ironed out. The next full transfer should be much easier :)

In terms of links, I suggest you read the documentation first to decide what you want to do and how to handle them.
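For reference, these are the main link-handling options from the rclone docs, shown here with placeholder paths:

rclone copy /local_share remote:backup -L            # follow symlinks and upload what they point at
rclone copy /local_share remote:backup -l            # translate symlinks into .rclonelink files instead
rclone copy /local_share remote:backup --skip-links  # skip symlinks without warning about them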

Thanks LeoW and kapitaninsky!

I re-ran the same command with -P and it picked up right where it left off and shows status, which is awesome!

So I am thinking that once this initial copy is done, I'll just set up a cron job to run the same command once a week or something...

2 Likes

If you use only MEGA, then use MEGA CMD (MEGA CMD - MEGA); there is no need for rclone at all. The CMD has a backup method integrated.

I tried MEGA-CMD before I found rclone.

Didn't have a lot of success with Mega-CMD. It just froze/hung after a while.

Having much more success with rclone.

1 Like

Once you put rclone into a cron job, make sure you add logging. Use:

--log-file=/where/to/save/file.log --log-level=INFO

I write shell scripts for backups and then point the crontab at those. It is easier to make changes, and a shell script is also useful for adding custom dates to the file name and doing other interesting things at the same time.
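A minimal sketch of that pattern (the paths, remote name, and schedule below are just examples to adapt):

#!/bin/sh
# backup_music.sh - copy new/changed files to Mega, logging to a dated file
LOG=/share/logs/rclone-$(date +%Y-%m-%d).log
rclone copy /local_share remote:backup -L --log-file="$LOG" --log-level=INFO

and a crontab entry to run it weekly, e.g. Sundays at 03:00:

0 3 * * 0 /share/scripts/backup_music.sh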

I use rclone extensively for backup. It's great to have a historic log file to work from too.

Got just over 1TB uploaded before hitting a snag.

Apparently there is some corrupt file that is causing problems, and I can't delete it.

When I ssh into my QNAP NAS, it shows up like this:

? d????????? ? ? ? ? ? .@__thumb/

Can't delete it no matter what I try.

Suggestions?

So, I was able to move the directory with the problem file out of the path being copied by rclone to Mega and I am no longer seeing errors relating to that file.

But there's still a problem - it appears that rclone thinks it has copied my entire Music directory up to Mega, based on what the copy command returns when I run it again (see screenshot).

But it has only copied about 1/3 of my collection, up to the letter N in the Artists list.

Here are the details of the Music directory (screenshot: Music folder, 2024.02.18).

And the Mega webpage shows it only has 1.10TB.

Nothing seems to happen when I run rclone cleanup.

So, why does rclone think that everything has been copied when it hasn't and how do I get back on track?

Nobody can guess without you providing all the info. rclone copies everything you ask it to. Most likely there is some problem with your share/access rights, etc. Maybe instead of using Windows and shares, install rclone directly on your NAS and run it from there.

In your pictures you are definitely comparing different things - your Music properties show 103039 files, while the directory you run rclone copy from has 116366 files.

Also, when possible, do not post pictures - they are hard to read, and it is not easy to pull any data from them to give you suggestions. Copy/paste text in the future.
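For example, the output of these three commands would already say a lot (remote and path names here follow your first post; adjust to your actual setup):

rclone size /local_share        # file count and total size rclone sees locally
rclone size remote:backup       # file count and total size rclone sees on Mega
rclone lsd remote:              # top-level directories actually present on the remote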

Appreciate your response!

I discovered a few more interesting things (and btw, I am running rclone directly from the NAS via ssh)

The output from the rclone command does show 116399 files; however, checking the folder info from the Mega website shows only 53735 files. That's why I believe rclone thinks it copied everything, but it didn't.

Another interesting thing is that as a test I decided to copy a random folder containing a few albums. Total size 1.71GB.

The rclone output appears to indicate that it was successful. However, it completed much too quickly - 6.8 s.

Then when I look at my drive on Mega, the folder is not shown. However, when I search for the folder name on Mega, it does find the subdirectories but the sizes are all 0B. These folders are only found when a search is done, they do not show up when I browse the parent folder.

Hopefully the above explanation is sufficient. Including some screenshots below as well to help illustrate.

Try to do the same but add crypt to your setup. Maybe mega does not like some copyrighted content.

Now that you have found a single folder you can replicate the issue with, it shouldn't take long to test.
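For reference, a crypt setup in rclone.conf looks roughly like this (remote names and the wrapped path are just examples; run rclone config to create it and to generate the obscured passwords):

[mega]
type = mega
user = you@example.com
pass = *** obscured ***

[secret]
type = crypt
remote = mega:encrypted
password = *** obscured ***

You then copy to secret: instead of mega:, and file names and contents are encrypted before upload.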

1 Like

Ok, I added crypt (I think I did it correctly) and then copied another folder, but the behavior appears the same. It completed in 0.4s, does not show on Mega when browsing, but can be found using search, and shows as 0B.

I think you may have been correct before when you suggested that there may be an issue with my share/permissions, unfortunately.

I probably need to get that sorted before going further down the rclone rabbit hole.

rclone was working perfectly for a while...

I first started having issues with Roon, having to restore the database from backup several times, etc.

Then things were working again, but rclone seemed to terminate when a particular file was scanned. And that same file is the one that I cannot delete from my NAS no matter what I try. I was able to move it to a different folder which isn't in the path being copied by rclone, though. But since then, things have not been the same.

The file shows as follows from ssh session to the NAS

? d????????? ? ? ? ? ? .@__thumb/

When I try to delete it I get errors like "directory not empty" and "structure needs cleaning", etc.

So, there seems to be some file/db corruption on the NAS? I am just wondering if this happened because I started using rclone. Maybe a Roon db backup and an rclone copy happening simultaneously? Maybe I could've/should've done things differently?

One more thing:

I made a new remote: and tried copying a folder there.

Same deal. It appears to be moving much too quickly, and it seems to show everything as 0B?

Is it saying that it transferred over 16GB in 1m19.5s? And that the ETA for over 308GB is 21m35s?

  1. Stop posting pictures - it is not Instagram - use copy/paste for text output
  2. Post all details including your remotes setup and logs

Otherwise it is just a story, maybe interesting to read, but with no value for troubleshooting.

rclone copy NAS remote: does not change anything on your system, so that is close to impossible. Most likely, by being very strict and double-checking everything, it just reveals problems with your storage (NAS).
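For what it's worth, ls output that is all question marks, together with errors like "structure needs cleaning", usually points to filesystem-level corruption, so a filesystem check may be worth trying. A very rough sketch (the device and mount point are just examples - QNAP volume names differ, and the volume must be unmounted first; check QNAP's docs):

umount /share/CACHEDEV1_DATA        # unmount the data volume
e2fsck -f /dev/mapper/cachedev1     # force a full check of the ext4 volume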

Sorry that the pictures are annoying you, kaptin. I'll refrain from posting more.

I wasn't trying to imply that rclone changed anything on my system. I was just thinking that maybe something could have gotten corrupted if it was being accessed by another program while rclone was doing a copy or a scan? Not trying to blame rclone.

Anyway, whatever happened is now preventing me from copying the rest of my collection to Mega using rclone, which is a bummer.

I think one clue is that when I try to copy my collection again, it shows 116386 / 116386 files checked, but Mega only shows 46832 files in my backup directory.

The number of files on Mega is lower than before because I deleted a bunch of @__thumb files which I thought might be causing problems.

Below is the output from the command:

/rclone copy /share/Multimedia/Music/ backup/backup: -P -L

Transferred: 0 B / 0 B, -, 0 B/s, ETA -
Checks: 116386 / 116386, 100%
Elapsed time: 9.1s

My read is that it checks the whole local directory, thinks everything has already been copied, and then quits. But how do I fix it?
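One way to pin down the discrepancy is to have rclone compare the two sides directly; a minimal sketch, assuming the remote is actually named backup: and the path on it is backup (adjust both to the real setup):

rclone check /share/Multimedia/Music/ backup:backup --missing-on-dst missing.txt

which writes every local file that is absent on the remote into missing.txt.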