• Randy Wright
    0
    Coming from SolarWinds Backup and Recovery, and a new CloudBerry user. A customer of mine has a 100 GB SQL database for their veterinary practice software. I have a backup job running frequently to local external storage, and that's working fine. I am using Wasabi for cloud-based storage.

    What is the best way to back up the database to the cloud? I am thinking of a regular full with differentials, but am concerned about the time it will take to send the 82 GB database backup file each time the full backup runs.

    SolarWinds handled the local and cloud backups using the same backup job, so it was simpler in that regard.

    Any input would be greatly appreciated.
  • David Gugick
    61
    First, do not run separate backup jobs for local and cloud. If you do, and you're using differential backups or taking transaction log backups, you may end up with restore issues. Second, SQL Server will compress the database backup if you enable that option (and you're not using an edition of SQL Server that lacks compressed-backup support). If the full is really 82 GB, that should not be a problem even for slower broadband connections. You did not post the upstream speed, but unless you're saddled with DSL speeds, you should be fine running the full backups at night when there is no other activity. Third, consider performing Hybrid backups from CloudBerry so your local and cloud backups are identical and are performed in a single pass (like SolarWinds). Lastly, you can reduce your full backup frequency if your differential backups remain small compared to your full backup size. Consider reducing the full backup frequency to every 2 or 4 weeks, with differentials on the other nights.
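    For anyone who wants to see what the full + differential pattern looks like in native T-SQL (CloudBerry drives this from its own UI, so this is just an illustrative sketch; the database name VetPractice and the backup paths are assumed):

    ```sql
    -- Compressed full backup: the baseline every differential refers back to
    BACKUP DATABASE VetPractice
    TO DISK = N'D:\Backups\VetPractice_full.bak'
    WITH COMPRESSION, CHECKSUM, INIT;

    -- Nightly differential: only the extents changed since the last full,
    -- so it stays small as long as daily churn is low
    BACKUP DATABASE VetPractice
    TO DISK = N'D:\Backups\VetPractice_diff.bak'
    WITH DIFFERENTIAL, COMPRESSION, CHECKSUM, INIT;
    ```

    The trade-off David describes falls out of this directly: each differential is restored on top of the last full, so the fulls can be spaced weeks apart as long as the differentials remain small.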

    You did not mention transaction log backups. It's possible the database is running in Simple Recovery and they are not needed, or the software that uses SQL Server does not support their use. But if not, I would consider using them and running the t-log backups throughout the day based on your customer's restore-point needs. In other words: how much data are they prepared to lose if the database needs to be restored? If it's an hour, then run those t-log backups every hour.
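    If the database does turn out to be in Full Recovery, the hourly log backups would look roughly like this in native T-SQL (a sketch only; database name and path are assumed):

    ```sql
    -- Check the recovery model first: log backups require FULL (or BULK_LOGGED),
    -- and will fail under SIMPLE
    SELECT name, recovery_model_desc
    FROM sys.databases
    WHERE name = N'VetPractice';

    -- Hourly transaction log backup (scheduled by the backup tool or SQL Agent);
    -- each one captures the log since the previous log backup
    BACKUP LOG VetPractice
    TO DISK = N'D:\Backups\VetPractice_log.trn'
    WITH COMPRESSION, CHECKSUM;
    ```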
  • Randy Wright
    0
    The database is running in Simple Recovery. Also, I had selected "Copy-Only" for the backups I ran to date.
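    For anyone else reading: a copy-only full is useful here because it does not reset the differential base, so an ad-hoc backup will not break a scheduled full + differential chain. As a sketch (database name assumed):

    ```sql
    -- Ad-hoc full that leaves the existing full/differential chain untouched
    BACKUP DATABASE VetPractice
    TO DISK = N'D:\Backups\VetPractice_adhoc.bak'
    WITH COPY_ONLY, COMPRESSION, CHECKSUM;
    ```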
    I will re-create the backup job and use the hybrid method.

    Thanks.
  • Randy Wright
    0
    I am also doing an image backup of the server locally and in the cloud, so it's backing up the physical database files as well (not a proper SQL backup, of course). Since I am doing a SQL-based backup too, is there any way to exclude those files from the image-based backup? Though I guess I would not want to exclude them if I had to restore the server from the image.
  • David Gugick
    61
    You can exclude the folders containing the database and log files from the image backup if size is a concern and you do not want those files included.