• jeferson
    1
    Hello people,
    I want some help to build a really functional backup strategy, if someone could share their knowledge and have the patience to read.

    This is my scenario:

    SERVER
    - 1 x 8 TB HD for the Electromechanics department (almost 7 TB used), around 1 TB of data changed weekly
    - 1 x 6 TB HD for 5 departments, around 400 GB of data changed weekly
    - 1 x 3 TB HD for the Substation, around 300 GB of data changed weekly

    BACKUP SERVER

    My internet connection is a 50 Mbps dedicated link (it will change to 100 Mbps this week, but in Brazil the price of internet is very, very high, around USD 650.00 monthly). I will use 50 Mbps for backup and 50 Mbps for users. After business hours I will use 80 Mbps for backup and 20 Mbps for the users who work remotely.

    This client is an engineering company. Because of the large amount of data changed weekly, an online backup can't keep up with all the changes.

    15 days ago a catastrophe with one HD forced me to recover from backup, but the local backup took several hours restructuring before the recovery even started (I believe this problem was caused by block-level in the local backup).

    What is the best way to make a reliable local backup (without versioning, only the last version, for a fast recovery) + an online backup with block-level for fast uploads and versioning?
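
    As a rough sanity check (assuming the ~1.7 TB of weekly changes listed above, uploaded in full with no compression or deduplication), this sketch estimates how long one week of changes would take to upload:

        # Rough upload-time estimate for the weekly change set.
        # Assumption: ~1.7 TB changed per week (1 TB + 400 GB + 300 GB from the
        # list above), uploaded in full with no compression or deduplication.
        weekly_change_bits = (1.0 + 0.4 + 0.3) * 1e12 * 8

        for label, mbps in [("50 Mbps (business hours)", 50),
                            ("80 Mbps (after hours)", 80)]:
            seconds = weekly_change_bits / (mbps * 1e6)
            print(f"{label}: about {seconds / 3600:.0f} hours of saturated upload")

        # Roughly 76 hours at 50 Mbps and 47 hours at 80 Mbps -- several days of a
        # fully used uplink, which is why block-level/incremental uploads matter.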
  • Matt
    91
    What type of plans are you trying to implement? Only file backups? Only image-based ones, or a mix of both?
  • jeferson
    1
    Hi!
    Yes, a file-only backup.

    I forgot to mention: we have a local backup file server with the same HD capacities.
  • David Gugick
    57
    I think only keeping one version is risky and the only value it provides is reduced storage. I would encourage you to keep multiple versions in case someone makes a mistake in a file and needs to go back to an older version.

    Block-level can help greatly with the speed in backing up changes to large files, but for local restores where bandwidth is not an issue, you can certainly disable that option and back up entire files should they change.

    To do what you want, you'll need two backup plans: one for local and one for cloud. If you decide to enable block-level for all backups, then you can use a single Hybrid backup plan to the local and cloud targets.

    If you want to speak with support about the recovery-time issue you had with the local restore (I do not think block-level should have caused a performance issue with local storage), then please reach out. You can send logs from the computer in question from the Tools | Diagnostic toolbar option.
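
    For illustration only (this is not how the product is implemented internally), here is a minimal sketch of the idea behind block-level backups: split a file into fixed-size blocks, hash each one, and re-upload only the blocks whose hashes changed since the previous run.

        import hashlib
        from pathlib import Path

        BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks (illustrative size, not the product's)

        def block_hashes(path: Path) -> list[str]:
            """One SHA-256 digest per fixed-size block of the file."""
            hashes = []
            with path.open("rb") as f:
                while chunk := f.read(BLOCK_SIZE):
                    hashes.append(hashlib.sha256(chunk).hexdigest())
            return hashes

        def changed_blocks(path: Path, previous: list[str]) -> list[int]:
            """Indexes of blocks that differ from the previous backup run."""
            current = block_hashes(path)
            return [i for i, digest in enumerate(current)
                    if i >= len(previous) or previous[i] != digest]

        # Only the blocks returned by changed_blocks() would need to be uploaded,
        # which is what keeps cloud backups of large, slowly changing files fast.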
  • Dave Schierenbeck
    0
    So, I want to explore this further.

    I'm setting up a backup plan for a small office. I have 3 laptops that are in the office 60% of the time, 3 workstations that are in the office 100% of the time but get shut down at night, and 1 server that runs 24/7.

    On the server, I've added 6 TB of open storage that each of the workstations/laptops on the local domain has access to.

    I am planning on backing up the 3 laptops and 3 workstations to the server storage every day, since that is quick, local, and will get me the quickest recovery in case of disaster. Obviously, the problem is that this storage is still "local" to the office, so that is where my next question comes in:

    1) Should I do a cloud backup of each of these machines, staggering them throughout the day to minimize data dumping over my internet connection?

    2) Or should I just back up the server to the cloud daily, along with the local data store, performing this backup at night when there is no load on the internet connection?

    I have a 100 Mbps connection (up & down).

    Would love to hear pros and cons of both strategies from the experts out there . . .

    Thanks!
  • Matt
    91
    It's always good to have an off-site copy of the data just in case, and most of our customers prefer to simply run backups during non-business hours for each machine. I'd say the main problem here is the laptops, since they are not available 100% of the time.

    Be sure to set up a bandwidth schedule when running these plans; that will help with the load on your network. If you're confident in your local setup, you can also perform cloud backups of just the most important data on those machines to minimize the time it takes to transfer the files. Just make sure those plans are not running simultaneously.
  • Dave Schierenbeck
    0
    Matt:

    Thanks for the feedback. I agree that we need off-site copies of the data - that is not the issue.

    The issue is the best way to get that done. The current process is that ALL workstations and laptops are shut down at night, both for security (the cleaning crew comes in after hours, etc.) and because it makes no sense to keep them running for 16 hours when no one is here.

    So, back to the original question: Should I just copy everything to the server, and back up the server data (including the backups of the workstations/laptops) to the cloud at night (I keep the server running 24/7), or back up each machine individually to the cloud during regular business hours?
  • Matt
    91
    I don't recommend doing a backup of a backup, if that's what you're referring to by "copy everything to the server". Since I'm not familiar enough with your setup, it's difficult to recommend any particular steps. If you're backing up important data on those laptops, it would be more efficient to run the backups from them directly and set up a bandwidth limit; this should be enough.
  • Steve Putnam
    11
    Have you considered using redirected folders for the workstations? There is no need to back up the individual PCs, since the data is in their user folders on the server. We keep a base image of a standard workstation build for each client if they have special software installed, but using redirected folders saves us a lot of money and time.
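
    Folder redirection itself is configured through Group Policy, but if you want a quick way to confirm on a workstation that Documents really is redirected to the server, you can read the shell-folder mapping from the registry (the UNC share in the comment is only a placeholder):

        import winreg

        # Where does Windows currently map this user's Documents folder?
        # "Personal" is the registry value name for Documents; with folder
        # redirection in place it should resolve to a UNC path on the server,
        # e.g. \\fileserver\redirect\<user>\Documents (placeholder share name).
        key_path = r"Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, key_path) as key:
            documents, _ = winreg.QueryValueEx(key, "Personal")

        print("Documents resolves to:", documents)
        print("Redirected to a network share:", documents.startswith(r"\\"))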