SQL best practices
I'm coming from SolarWinds Backup & Recovery and am a new CloudBerry user. A customer of mine has a 100 GB SQL Server database for their veterinary practice software. I have a backup job running frequently to local external storage, and that's working fine. I'm using Wasabi for cloud storage.
What is the best way to back up the database to the cloud? I'm thinking of regular fulls with differentials, but I'm concerned about the time it will take to send the 82 GB backup file each time the full backup runs.
SolarWinds handled the local and cloud backups with the same backup job, so it was simpler in that regard.
Any input would be greatly appreciated.
First, do not run separate backups for local and cloud. If you do, and you're using differential or transaction log backups, you may end up with restore issues. Second, SQL Server will compress the database backup if you enable that option (and you're not using an edition of SQL Server, such as Express, that does not support compressed backups). If the full is really 82 GB, that should not be a problem even for slower broadband connections. You did not post your upstream speed, but unless you're saddled with DSL speeds, you should be fine running the full backups at night when there is no other activity. Third, consider performing hybrid backups from CloudBerry so your local and cloud backups are identical and are performed in a single pass (like SolarWinds). Lastly, you can reduce your full backup frequency if your differential backups stay small relative to the full backup size. Consider reducing the fulls to every 2 or 4 weeks, with differentials on the other nights.
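For reference, a compressed full plus differential schedule looks like this in T-SQL (a minimal sketch; the database name and backup paths below are placeholders, not from the thread):

```sql
-- Weekly (or every 2-4 weeks) full backup, compressed to shrink the upload
BACKUP DATABASE [VetPractice]
TO DISK = N'D:\Backups\VetPractice_full.bak'
WITH COMPRESSION, CHECKSUM, INIT;

-- Nightly differential on the nights between fulls;
-- each differential contains all changes since the last full
BACKUP DATABASE [VetPractice]
TO DISK = N'D:\Backups\VetPractice_diff.bak'
WITH DIFFERENTIAL, COMPRESSION, CHECKSUM, INIT;
```

CloudBerry drives these backups for you, but the same compression and differential options apply whether the job is built in the GUI or run natively.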
You did not mention transaction log backups. It's possible the database is running in Simple Recovery and they are not needed, or the software that uses SQL Server does not support them. But if the database is in Full Recovery, I would run t-log backups throughout the day based on your customer's restore-point needs. In other words, how much data are they prepared to lose if the database needs to be restored? If it's an hour, then run those t-log backups every hour.
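You can confirm the recovery model before deciding on log backups. A sketch, again with a placeholder database name and path:

```sql
-- Log backups only apply under FULL (or BULK_LOGGED) recovery
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = N'VetPractice';

-- Hourly log backup, matching a one-hour restore-point objective
BACKUP LOG [VetPractice]
TO DISK = N'D:\Backups\VetPractice_log.trn'
WITH COMPRESSION, CHECKSUM;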
The database is running in Simple Recovery. Also, I had selected "Copy-Only" for the backups I've run to date.
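For anyone following along: a copy-only full backup is taken out-of-band and does not reset the differential base, so it won't disturb an existing full + differential chain. A sketch with the same placeholder names:

```sql
-- COPY_ONLY leaves the differential base untouched,
-- so the regular full + differential chain keeps working
BACKUP DATABASE [VetPractice]
TO DISK = N'D:\Backups\VetPractice_copyonly.bak'
WITH COPY_ONLY, COMPRESSION, CHECKSUM;
```

That's why copy-only is the safe choice for ad-hoc backups, but the scheduled job that the differentials depend on should be a normal (non-copy-only) full.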
I will re-create the backup job and use the hybrid method.
I am also doing an image backup of the server, both locally and in the cloud, so it's backing up the physical database files as well (not a proper SQL backup, of course). Is there a way to exclude those from the image-based backup, since I'm also doing a SQL-level backup? Though I guess I wouldn't want to exclude them if I had to restore the server from the image.
You can exclude the folders containing the database and log files from the image backup if size is a concern.
© 2022 MSP360 Forum