  • Bug with retention policy not sticking in new backup format
    Never mind. I was able to reproduce it. I'm checking with the team on whether this is a known issue with a fix already on the way, or something new. Either way, I will report back when I know more.
  • Client Side deduplication in Version 7 - Results of testing
    Thanks Steve for the feedback (and great news on performance).
  • Bug with retention policy not sticking in new backup format
    Can you verify for me again the exact remote deploy plan settings you have under Retention? Thanks.
  • "Cannot read x files" not reporting in Event Log
    Can you post what you're seeing in the agent or notification email?
  • Slow browsing
    Use the diagnostic menu option to open a case and submit logs automatically, or head over to the Support section of the msp360.com website.
  • CLI: Usefulness and Experiences
    What are you looking to do with the CLI?
  • Slow browsing
    You'll need to open a support case for further analysis by the team.
  • Bare Metal Recovery
    You can open a case from here on mobile if necessary. https://support.msp360.com/
  • Bare Metal Recovery
    I'd suggest you open a support case to get this answered.
  • MSPBackups.com Website Slow
    How is performance today? I know the team made some of the server-side changes I was referencing previously during the last service interval.
  • Bare Metal Recovery
    Does the computer where you're creating the USB have the same prefix as the computer that you're looking to restore?
  • Slow browsing
    I only have the S3 Drive version installed locally, but check the following:
    - Open up Drive - Options
    - Click on Mapped Drives and Edit the Mapped Drive in question
    - Click Advanced Options at the bottom of the dialog
    - Make sure Use File Cache and Optimize for Windows Explorer options are checked
    - Go back to the main Options screen and click on Logging - make sure it's set to Low Level (preferred) or No Log
    - Go back to Options and click on the Connection tab - check your Queue Thread Count - what is it set to?
    - Go back to Options and click on the Advanced tab
    - Check the File Cache Directory location - is it on the optimal, fast drive? If not, relocate
    - Is Limit Cache Size checked? If so, consider unchecking it

    Can you also confirm that the Azure region you are using is local to your location? (There's a quick latency check sketched at the end of this reply.)

    If we cannot identify something obvious from Options, you may need to open a support case.
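
    If it helps, here's a quick way to sanity-check latency from the browsing machine to your Azure Blob endpoint. This is only a rough sketch, not an MSP360 tool, and the account name is a placeholder you'd replace with your own:

    import time
    import urllib.error
    import urllib.request

    ACCOUNT = "mystorageaccount"  # placeholder: your Azure storage account name
    URL = f"https://{ACCOUNT}.blob.core.windows.net/"

    samples_ms = []
    for _ in range(5):
        start = time.perf_counter()
        try:
            urllib.request.urlopen(URL, timeout=10)
        except urllib.error.HTTPError:
            pass  # an unauthenticated 4xx response still measures the round trip
        samples_ms.append((time.perf_counter() - start) * 1000)

    print(f"round trip ms: min={min(samples_ms):.0f} avg={sum(samples_ms) / len(samples_ms):.0f}")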
  • Bare Metal Recovery
    Where are your backups located? What options did you select when creating the bootable USB stick? Backups are not moved to the USB, so the storage accounts need to be available for the restore to take place.
  • Slow browsing
    Can you confirm if the Azure storage class you are using is hot, cool, or archive?
  • Reporting to provide customer
    You could use something like the Group Report in HTML format. Send it via email to yourself, or download it and print to PDF; it's already formatted for presentation. I would be happy to review a more detailed request from you if you feel like putting together an example.
  • Reporting to provide customer
    Actual billing is much tougher because every MSP sells backup services in different ways. Some sell it as part of a base set of services that includes things like security and RMM, whereas others sell backup services at different prices per endpoint, sometimes with free storage rolled in. Can you DM me an example, or post here, of the type of bill you're looking for?
  • Unfinished Large Files
    My understanding is that this was addressed in 7.1.4.29.
  • Trying to understand the new backup approach
    If you're referring to the legacy format, then there is no concept of a full backup. Everything is incremental forever, as the backups are at the file level. If you're referring to the new backup format, then we recommend full backups be run periodically, but that interval is up to you. Currently, the user interface implies they should be run monthly, but that may change with a future update and allow less frequent fulls to be run. The new backup format uses a generational approach with large archive files that contain the individual files being backed up, and there is a chain of backups that may be needed to perform a restore. We always run consistency checks with the new backup format, so if there's a missing backup archive, you'll be notified, and the remedy is to run a new full backup. Full backups are usually synthetic in the cloud, and they tend to run much, much faster than a full backup of the source data. But the new full backup generation does use an equivalent amount of storage. So, in your case, where you only want a single copy of each source file in backup storage, the legacy format is fine.
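
    To make the storage trade-off concrete, here's a rough back-of-the-envelope sketch. All the numbers are hypothetical placeholders; plug in your own data set size, change rate, and schedule:

    # Illustrative only: the legacy format keeps roughly one copy of each source
    # file plus retained versions, while the new format keeps a full copy per
    # generation plus that generation's incrementals.
    source_gb = 500          # hypothetical size of the protected data set
    daily_change_gb = 5      # hypothetical data added or changed per day
    days_retained = 90       # how long backups are kept
    full_interval_days = 30  # how often a new full generation starts (new format)

    legacy_gb = source_gb + daily_change_gb * days_retained

    generations = max(1, days_retained // full_interval_days)
    new_format_gb = generations * (source_gb + daily_change_gb * full_interval_days)

    print(f"legacy format estimate: ~{legacy_gb} GB")
    print(f"new format estimate:    ~{new_format_gb} GB")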
  • Trying to understand the new backup approach
    When you run an incremental backup, locally deleted files are noted in the new backup data. When you start a new full backup (which will be synthetic on most clouds), a new full is created with only the current set of live files. Your retention settings for Keep Backups For and GFS will determine how long those backups stick around. If you're using the legacy format to back up directly to Amazon S3 Glacier, then you can stick with that format if you desire; it's not going away. When you say "a synced version", it sounds like you want a copy of the current data in the cloud and do not really need generations of backups, so stick with the legacy format if it works for you. If you want to move to the new format, you might see much improved backup performance, and you'll most certainly see improved restore performance. But if you're backing up to Glacier, then presumably you're not planning to do restores unless absolutely necessary. If you want to elaborate on your exact use case, I can provide additional guidance.
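
    If it helps to picture how Keep Backups For interacts with full generations, here's a rough sketch. It assumes a generation (a full plus its incrementals) only becomes purgeable once a newer full exists and the whole generation has aged past the retention window; check the retention documentation for the exact rules in your version:

    from datetime import date, timedelta

    keep_backups_for = timedelta(days=60)  # hypothetical Keep Backups For setting
    full_interval = timedelta(days=30)     # hypothetical full backup schedule
    today = date(2022, 6, 1)               # hypothetical "today"

    # Hypothetical history of full generation start dates.
    fulls = [date(2022, 1, 1) + i * full_interval for i in range(5)]

    for i, start in enumerate(fulls):
        # A generation spans from its full until the next full starts.
        end = fulls[i + 1] if i + 1 < len(fulls) else today
        has_newer_full = i + 1 < len(fulls)
        purgeable = has_newer_full and (today - end) > keep_backups_for
        print(f"generation starting {start}: {'purgeable' if purgeable else 'kept'}")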
  • Just installed v3.2.3.148_20211103000805, now it crash dumps repeatedly and apps won't start
    You can either uninstall, restart, and then do a fresh reinstall, or you'll need to reach out to support for log analysis to identify the issue.