• Retention Policy Problem with V7
    Thanks. No need for incrementals, as these are Disaster Recovery images or VHDx files. A week- or month-old image is fine to get someone back up and running, using it as a base plus the daily file-based backups.
  • Retention Policy Problem with V7
    Very simple. We used to do full backups monthly with a 3-day retention. Once the newest full backup completed, the previous full got purged. Worked great.
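    For what it's worth, the rule we lost is simple enough to sketch. This is just pseudologic of the old behavior described above, not MSP360 code:

    ```python
    # Sketch of the legacy purge rule: once a new full completes, every
    # backup older than it becomes purgeable. Illustration only.
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Backup:
        kind: str            # "full" or "incremental"
        finished: datetime
        completed: bool

    def purgeable(backups: list[Backup]) -> list[Backup]:
        """Return every backup older than the newest *completed* full."""
        fulls = [b for b in backups if b.kind == "full" and b.completed]
        if not fulls:
            return []        # no completed full yet, nothing is safe to purge
        newest = max(fulls, key=lambda b: b.finished)
        return [b for b in backups if b.finished < newest.finished]
    ```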
  • Retention Policy Problem with V7
    This is becoming a large problem. While there is absolutely no reason we need to keep two fulls of our image backups in cloud storage, there is no way to keep only the latest one. So our cloud storage footprint has doubled from 17TB to 35TB. That equates to an additional $85/month, which might not seem like a lot, but we are a small MSP and did not anticipate this added expense.
    Have not heard anything back about whether this will be remedied in a future release, so am posting here to hopefully get an update.
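    For anyone checking the numbers, the delta works out to roughly the going object-storage rate; the $5/TB/month figure below is my assumption (ballpark B2 pricing), not a quote:

    ```python
    # Rough cost check. $5/TB/month is an assumed ballpark rate, not a quote.
    extra_tb = 35 - 17                  # the second full we can't purge
    price_per_tb = 5.0                  # USD per TB per month (assumption)
    print(f"Extra spend: ${extra_tb * price_per_tb:.0f}/month")  # ~ the $85 above
    ```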
  • Change Legacy File Backup to only purge
    Whenever we replace a server and choose to re-upload all the data, we put a note on the calendar for 90 days out to remove the old server's backup data. We use Cloudberry Explorer to do the deletions.
    If you are not replacing the machine, but simply want to stop doing backups while keeping the same server/PC, then simply identify and delete the folders that you no longer need (after your retention period expires). You can do this kind of deletion from the server console (once you turn on the Organization: Companies: Agent option that allows deletions from the console).
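    If you would rather script the 90-day cleanup than click through Cloudberry Explorer, something like this works against any S3-compatible bucket. The bucket and prefix names below are made up; verify the prefix carefully before running anything that deletes:

    ```python
    # Hypothetical cleanup of a retired server's backup prefix from an
    # S3-compatible bucket (Wasabi, Amazon, B2's S3 endpoint, etc.).
    # Bucket and prefix names are examples only -- double-check them!
    import boto3

    s3 = boto3.resource("s3")              # credentials come from env/config
    bucket = s3.Bucket("msp-backups")      # assumed bucket name
    old_prefix = "CBB_OLDSERVER01/"        # assumed per-machine prefix

    # Deletes every object under the prefix, in batches.
    bucket.objects.filter(Prefix=old_prefix).delete()
    ```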
  • Backup Plans run at computer startup - even when option not selected
    Yes, they did finally get back to me (thanks to you, I suspect).
    I keep having to explain to clients why their backup runs in the morning and why it fails.
    Thanks.
  • Incremental Forever and Synthetic Full
    As an MSP customer, I whole-heartedly endorse what David G is saying.
    The new format is great for cloud Image and VHDx backups, as long as you are using a cloud vendor that supports synthetic fulls. We previously used Google Nearline storage for the Image/VHDx backups, but it does not support synthetic fulls, so we moved them to Backblaze B2 (the native B2 API, not Backblaze's S3-compatible endpoint) and it has been absolutely fantastic!
    Full Image/VHDx uploads to the cloud that used to take three days are done in 12 hours or less.
    For standard file-based backups (cloud and local), we plan to keep using the legacy format so that there is no need to re-upload the entire data set.
  • Searching for a file or files in backup history
    Ok. Take away the search bar. Don’t tease us.
  • Searching for a file or files in backup history
    Look. It does not work in the portal, and it never has. It is important that MSP360 understands that when you have hundreds of clients, you can't log into their servers to look things up. The portal should offer a basic wildcard search to find out which files got backed up and which files had a problem. Shouldn't be rocket science, especially after seven-plus years.
  • Searching for a file or files in backup history
    BTW - Would love you to prove me wrong!
  • Searching for a file or files in backup history
    This has been broken for years.
    Multiple tickets submitted and it STILL does not work the way we want it to.
    The only way I know to see file-level backup history is to go to RMM, Plan list, Legacy Plan list page and select Backup history, but the Search function DOES NOT WORK! It only returns results if you provide the entire path of a particular file.
    How can we get a list of *.pst files that got backed up from a particular device over the past week? - Answer me that!
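    To be concrete about what I mean by "basic wildcard search": if the portal could hand us a CSV export of backup history, this would be the whole feature. The export file and its column names below are hypothetical:

    ```python
    # What a basic wildcard search should look like. Assumes a CSV export
    # of backup history with "path" and "backed_up" (ISO timestamp)
    # columns -- both the export and the column names are hypothetical.
    import csv
    from datetime import datetime, timedelta
    from fnmatch import fnmatch

    cutoff = datetime.now() - timedelta(days=7)

    with open("backup_history.csv", newline="") as f:
        for row in csv.DictReader(f):
            when = datetime.fromisoformat(row["backed_up"])
            if when >= cutoff and fnmatch(row["path"].lower(), "*.pst"):
                print(row["path"], when)
    ```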
  • Client Side deduplication in Version 7 - Results of testing
    Additional thoughts:
    The new format demands a periodic full backup of everything, much like tape backups used to.
    So all of the files, even if they have never changed, will get backed up to the cloud again during a full backup.

    If you are using a cloud storage vendor that supports synthetic full in-cloud copying (Wasabi, Amazon, or Backblaze B2 native, not the B2 S3-compatible endpoint), the periodic "full" backups will not be a problem. But if you are using Google Nearline or Coldline, you would need to re-upload EVERYTHING every month, which would take a VERY long time.
    So I recommend that for local and Google backups you stick with the legacy format, and if you use Backblaze B2, Wasabi, or Amazon, switch to the Version 7 new format going forward for new backup plans.
    For existing clients, I would not recommend re-uploading everything just to get to the new format.
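    For context on why vendor support matters here: a synthetic full is assembled with in-cloud, server-side copies, so unchanged blocks never cross your uplink. Below is roughly the S3 primitive involved; the object keys are made up and this is not MSP360's actual storage layout:

    ```python
    # Illustration of the in-cloud copy a synthetic full relies on:
    # unchanged blocks are copied server-side (CopyObject), so only the
    # changed blocks get uploaded. Keys here are invented for the example.
    import boto3

    s3 = boto3.client("s3")
    BUCKET = "msp-backups"   # assumed bucket name

    # Server-side copy: data moves inside the provider, not over your uplink.
    s3.copy_object(
        Bucket=BUCKET,
        Key="full-2024-02/block-000001",
        CopySource={"Bucket": BUCKET, "Key": "full-2024-01/block-000001"},
    )
    ```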
  • How to Delete entire Folder from Cloud Backup Storage
    Just be sure that you have “Allow Data Deletion in Backup Agent” selected under Companies: Agent Options. You should then be able to remove unwanted folders.
  • Expected Data Usage Compared to Drive Size
    You want an image backup of a machine to be able to restore the OS and all of the installed apps. If the user folders are not huge, then the image will also include a copy of all of the data files. But if the user data folders are large (greater than 200GB in our model), we exclude them from the image backups. The data is already being backed up separately via daily file backups with a 90-day retention period.
    The image backups are for disaster recovery - not file recovery.
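    If you want to see which user folders blow past a cutoff like our 200GB one before setting up the exclusions, a quick scan does it. The root path and threshold below are just examples:

    ```python
    # Quick scan to flag user folders above an exclusion threshold
    # (200GB in our model). The root path is an example -- adjust it.
    import os

    THRESHOLD = 200 * 1024**3          # 200 GB in bytes
    ROOT = r"C:\Users"                 # example root

    for entry in os.scandir(ROOT):
        if not entry.is_dir():
            continue
        total = sum(
            os.path.getsize(os.path.join(dirpath, name))
            for dirpath, _, files in os.walk(entry.path)
            for name in files
        )
        flag = "EXCLUDE from image" if total > THRESHOLD else "keep in image"
        print(f"{entry.name}: {total / 1024**3:.1f} GB -> {flag}")
    ```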
    I would be happy to work with you to get things set up properly; let me know if you want help. (I don't charge anything.)
  • Expected Data Usage Compared to Drive Size
    Get yourself a NAS device with 4TB usable (and room to add drives later if needed). That should be plenty for your situation, and they are fairly inexpensive.
    To attempt to answer your question:
    For standard (legacy format) file backups of 200GB with a retention period such as yours, I would expect you to consume between 190GB and 240GB depending on the data change rate. How could it be less? Compression. We typically get ~20% compression overall. Most data never changes (PDFs, pictures, etc.), and if you are backing up QuickBooks, Word, and Excel docs, you get a high compression rate and the block-level incrementals are small.
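    Here is the back-of-the-envelope version of that 190-240GB estimate; the daily change rates and retention window below are assumptions for illustration, not measurements:

    ```python
    # Rough consumption estimate for 200GB of source data. Change rates
    # and the retention window are assumptions for illustration.
    data_gb = 200
    base = data_gb * 0.80                        # ~20% compression -> 160 GB stored
    change_low, change_high = 1.0, 2.5           # GB/day of compressed changes (assumed)
    retention_days = 30                          # assumed incremental retention window

    low = base + change_low * retention_days     # ~190 GB
    high = base + change_high * retention_days   # ~235 GB
    print(f"Expect roughly {low:.0f}-{high:.0f} GB in storage")
    ```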
    There are some data types (like app specific backups) that can generate a new GB+ file each day, such that keeping a month’s worth will consume more.
    If you plan on doing local Image backups for six devices, you will consume a lot more space, but daily block incrementals tend to be small, and you can exclude the data folders from the image backup since you are already backing up the files separately.
    Remember that with a once-per-month full image, you will always have two months' worth of backups in storage, since you can't purge the previous month's chain until its last block incremental is 30 days old, which happens on day 59.
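    Spelling out that day-59 math, with illustrative dates:

    ```python
    # Why two months always sit in storage with monthly fulls and 30-day
    # retention. Dates are illustrative.
    from datetime import date, timedelta

    full_day = date(2024, 1, 1)                   # monthly full runs (day 0)
    last_incr = full_day + timedelta(days=29)     # last incremental before next full
    retention = timedelta(days=30)

    purge_day = last_incr + retention             # old chain finally purgeable
    print((purge_day - full_day).days)            # -> 59
    ```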
  • Image Restore Followed By File Restore
    Under the Remote Management tab, where the devices are listed, you will see a gear with a drop-down arrow. Select Edit: Edit Options and you can do a repo sync.
  • Image Restore Followed By File Restore
    Either from the server console or the MBS portal, select Options: Repository, then select the cloud/local storage that you want and synchronize it. This will update the repository so that a file restore will get the most current backup data.
  • Prefilled Encryption Key in MBS portal - Big Problem
    Figured out a way: the CBB_Report folder has the same strings associated with the plans, so by looking at the plan run report I can match them up.
  • Prefilled Encryption Key in MBS portal - Big Problem
    So it turns out that the prefilled encryption key is not causing the issue; the original encryption key was wrong, so when I put in the right one, it said the key had changed and would do a new full.
    Looking for a way to tell which other plans have the wrong encryption key. In legacy format, I would simply download a small file using Cloudberry Explorer; if the encryption key is correct, the download works.
    In V7 however, all I see in CB Explorer is a lengthy string for each plan. Is there any way to display the string associated with each plan?
  • Configuring incremental backups with periodic full backup
    That was my plan: stick with legacy for file backups and use the new format for Image/VHDx backups, taking advantage of the synthetic full capability. What worried me was the statement (hopefully since retracted) that there will come a time when the legacy format will no longer be supported. Let's hope that day never comes.