SQL Server backup to S3 (vfs-cache-mode)

What is the problem you are having with rclone?

Hello all!
I'm trying to back up my SQL Server databases directly to S3 to reduce my storage costs. Today I have 50+ servers, each of which needs free disk space of at least twice the size of its database just to back up locally before uploading to S3.
In my scenario I have a server with 500GB of total disk, 50GB free, and a 100GB database. If I use the mount command with --vfs-cache-mode off and try to create the backup on the mounted drive, I get the error ERROR : home/my-bucket/mybackup.bak: WriteFileHandle.Write: can't seek in file without --vfs-cache-mode >= writes. But if I change --vfs-cache-mode to minimal or writes, the backup fails because I don't have enough disk space to create the cache. So does anyone have any ideas/solutions on how to solve this?
I'm totally open to any kind of ideas.

Thank you!

What is your rclone version (output from rclone version)

rclone v1.52.2

  • os/arch: windows/amd64
  • go version: go1.14.4

Which OS you are using and how many bits (eg Windows 7, 64 bit)

Windows Server 2016 Datacenter
Version 10.0.14393 Build 14393
System Type x64-based PC

Which cloud storage system are you using? (eg Google Drive)

Amazon S3

The command you were trying to run (eg rclone copy /tmp remote:tmp)

rclone.exe mount mybucket-backup:mybucket-backup/ S: --vfs-cache-mode off --fuse-flag --VolumePrefix=\backupserver\s3

or

rclone.exe mount mybucket-backup:mybucket-backup/ S: --vfs-cache-mode writes --fuse-flag --VolumePrefix=\backupserver\s3

The rclone config contents with secrets removed.

[my-backup]
type = s3
provider = AWS
access_key_id = XXXXXXXXXXXXXXXX
secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXX
acl = bucket-owner-full-control
server_side_encryption = AES256

A log from the command with the -vv flag

Paste log here

Disclaimer: I know nothing about SQL server

What is the tool that you use to back up SQL server?

If this were MySQL I would use mysqldump and stream the output into rclone rcat, something like this (with gzip added):

mysqldump -u user -ppassword database_name | gzip -9 | rclone rcat remote:db_backup.sql.gz

The rclone rcat streams the output to the cloud storage.

Is there a tool which can stream the backup to stdout for you?

Hi Nick, thank you for the fast response.

Disclaimer: I am also not an MSSQL expert

I'm using Microsoft SQL Server Management Studio, but that is just the interface.
The command I use is BACKUP DATABASE MyDatabaseName TO DISK = '\\backupserver\s3\mybackup.bak'.
I'm following the Microsoft documentation on their site called "BACKUP (Transact-SQL)".
I understood your idea and I'm going to find out if there's any tool that streams the backup the way mysqldump does.

Thank you!

for the 500GB drive, not sure what else you have stored on it.
you could use windows' built-in transparent compression to free up some space.
i have been using such compression on microsoft products since DOS 6.22.


I found this which might help

good find, but not for the backup of a production database.

the MS SQL backup is very powerful: full/differential/incremental, can verify the backups, can encrypt the backups and, perhaps most important, back up the transaction logs.

so perhaps the OP should buy a cheap external USB hard drive for $30.00, back up the database to that and then rclone move it.
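that last step could look roughly like this (the local folder and remote path below are placeholders, not taken from the OP's setup; rclone move deletes the local copy after a verified transfer):

```
rclone move E:\sqlbackups mybucket-backup:mybucket-backup/sqlbackups -P --delete-empty-src-dirs
```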


Thank you!
I'll try it to save some space.

Hi Nick!

I'm trying a similar approach now. SQL Server has a command to back up a database to a VIRTUAL_DEVICE, and I found this on GitHub: https://github.com/eesquibel/VdiStreamS3.
This code creates a virtual device and executes the command BACKUP DATABASE [mydatabase] TO VIRTUAL_DEVICE='{0}' WITH STATS = 1. Apparently this is how the stream is created, and in this case the stream is passed directly to S3. I'll run some tests to verify whether this works.
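For anyone curious what the streaming side of that approach amounts to, here is a rough stdlib-only Python sketch of the pattern (the chunk size and function name are my assumptions, not taken from VdiStreamS3): read a non-seekable stream sequentially in fixed-size parts, which is what lets a multipart upload proceed without the full backup ever touching local disk.

```python
import io

# S3 multipart uploads require parts of at least 5 MiB (except the last one).
CHUNK_SIZE = 5 * 1024 * 1024


def stream_in_parts(source, chunk_size=CHUNK_SIZE):
    """Read a non-seekable stream sequentially and yield fixed-size parts.

    Each part can be handed to a multipart upload as soon as it is read,
    so only one chunk is ever held in memory and nothing is written to disk.
    """
    while True:
        part = source.read(chunk_size)
        if not part:
            break
        yield part


# Demo with an in-memory stream standing in for the virtual-device pipe:
# 12 KiB of data read in 5 KiB parts yields parts of 5, 5, and 2 KiB.
data = b"x" * (12 * 1024)
parts = list(stream_in_parts(io.BytesIO(data), chunk_size=5 * 1024))
```

The key property is that `read()` is only ever called sequentially, which is exactly what a VIRTUAL_DEVICE backup provides and what a mounted drive with --vfs-cache-mode off cannot (SQL Server seeks while writing .bak files).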

Thank you!


I am actually a SQL Server DBA in real life. My recommendation is to enable compression for your backup and also enable trace flag 3042 in SQL Server to use a different backup pre-allocation algorithm that uses less disk space. See https://docs.microsoft.com/en-us/sql/t-sql/database-console-commands/dbcc-traceon-trace-flags-transact-sql?view=sql-server-ver15 for additional info. Backup compression is simply an argument to the BACKUP command, so it is very easy to implement.
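Putting those two together might look like this (database name and path are placeholders from the OP's earlier post; WITH COMPRESSION and trace flag 3042 are documented options, but check that your SQL Server edition supports backup compression):

```sql
-- Trace flag 3042: back up in smaller increments instead of
-- pre-allocating the estimated final size of the backup file.
DBCC TRACEON (3042, -1);

-- Compressed backup: the .bak file is typically much smaller
-- than the database, easing both disk pressure and upload time.
BACKUP DATABASE MyDatabaseName
TO DISK = '\\backupserver\s3\mybackup.bak'
WITH COMPRESSION, STATS = 10;
```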

The streaming-to-S3 solution @rafaelchei posted is interesting, but it wouldn't use rclone, as it interacts with S3 storage directly instead of going through the rclone-mounted volume.

You should also be aware that SQL Server has native support for Azure Blob Storage directly from the BACKUP command. This would be simple to implement and costs roughly the same as AWS object storage, but it is built into SQL Server and requires no additional software or configuration. Something to think about if you are just looking for a cloud storage destination for SQL Server backups.
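For reference, the Azure route looks roughly like this (the storage account, container, and database names below are placeholders; a credential matching the container URL must be created first from a Shared Access Signature, as described in Microsoft's "SQL Server Backup to URL" docs):

```sql
-- Assumes a SAS-based credential for the container already exists.
BACKUP DATABASE MyDatabaseName
TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/mybackup.bak'
WITH COMPRESSION, STATS = 10;
```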


Would it be possible to adapt that solution so it could be piped to rclone somehow with the rclone rcat command?
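As published, VdiStreamS3 talks to S3 itself, so this is purely hypothetical: if the tool were adapted to write the virtual-device stream to stdout (the --stdout flag below does not exist today), the pipe could look like:

```
VdiStreamS3.exe --database MyDatabaseName --stdout | rclone rcat mybucket-backup:mybucket-backup/mybackup.bak
```

rclone rcat reads stdin sequentially, so no local cache or seeking would be needed.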

Thank you for this post. Good luck.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.