  • S3 to Glacier
    Yes, Backblaze B2 does not have a minimum retention requirement. It bills like S3 Standard, based on average monthly usage. Work with Dmitry and see what you find. Good luck.
  • 5TB limit in 2021
    I'll need to check with Support and will reply here once I hear back.
  • S3 to Glacier
    You can target S3 Glacier and S3 Glacier Deep Archive directly if the option to show storage classes is enabled in Settings - Global Agent Options.

    You can also set up Lifecycle Rules at the storage level: in the Managed Backup management console, go to Storage Management for the S3 storage account in question. You should only have to do this once, and the data should transition automatically. You can transition to S3 Glacier after 1 day if you want, or leave the data in S3 Standard a little longer to make restores easier.
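
    For reference, a lifecycle rule like that can also be set programmatically. Here is a minimal sketch using boto3; the bucket name is just a placeholder, and you would adjust the rule to match whatever you configure in the console:

        import boto3

        s3 = boto3.client("s3")

        # Move every object to S3 Glacier one day after creation.
        s3.put_bucket_lifecycle_configuration(
            Bucket="example-backup-bucket",  # placeholder bucket name
            LifecycleConfiguration={
                "Rules": [
                    {
                        "ID": "backups-to-glacier",
                        "Status": "Enabled",
                        "Filter": {"Prefix": ""},  # empty prefix = whole bucket
                        "Transitions": [{"Days": 1, "StorageClass": "GLACIER"}],
                    }
                ]
            },
        )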

    S3 Glacier is a little different from classic Glacier; we use S3 Glacier, which has some benefits over the older design.

    In any case, what you are proposing is not something I would recommend. Glacier is for archival storage. If you're using it just to lower costs, I fear you will not be happy with the performance and cost should you need a restore. I would think you are better off moving to a lower-cost cloud storage provider instead, since they will give you hot storage at a lower price.

    Or, if you stay with Amazon, there are ways to lower storage costs without moving directly to Glacier. You can use something like S3 Infrequent Access or S3 One Zone Infrequent Access as the target for backups. They require that data stay for a minimum of 30 days, and then you can use a lifecycle transition policy to move the data to Glacier after 30 days (or longer if desired). But only move the data to Glacier if you are certain you will not need to restore except in an unusual case or emergency. Restoring from Glacier can be slow and expensive; it's not hot storage and should not be thought of that way. You should do some tests to understand speed and have a look at the AWS Calculator to understand costs: https://calculator.aws/#/createCalculator/S3
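
    If you go that route, the lifecycle rule is the same idea as the sketch above, just with a 30-day delay before the transition (again, the bucket name is a placeholder):

        import boto3

        # The backup plan targets S3-IA; this rule moves objects on to Glacier
        # once they have aged 30 days.
        boto3.client("s3").put_bucket_lifecycle_configuration(
            Bucket="example-backup-bucket",  # placeholder bucket name
            LifecycleConfiguration={
                "Rules": [
                    {
                        "ID": "ia-to-glacier-after-30-days",
                        "Status": "Enabled",
                        "Filter": {"Prefix": ""},
                        "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                    }
                ]
            },
        )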

    S3-IA is about half the cost of S3 Standard, and One Zone IA is a little less than that. A service like Backblaze B2 would cost about $5/TB; Wasabi would cost about $6/TB.
  • Restoration wizard doesn't present option to restore HyperV image backup to Amazon EC2
    Are you saying that on the Restore Wizard on the Restore Type tab, you do not see an option called "Restore to Amazon Web Services"?
  • 5TB limit in 2021
    Are you on the latest version? If not, I'd try updating. You can continue using the old backup format on the 7.x version. If you have already upgraded, I'll have to reach out to Support.
  • 5TB limit in 2021
    Free is limited to 200 GB on cloud storage vendors other than Amazon AWS, which has a 5 TB limit. Pro has a 5 TB limit for all cloud storage vendors.

    You can find a comparison table here: https://www.msp360.com/backup/windows.aspx

    Could you post the relevant details of the email here so I can see what was sent?

    Thanks.
  • MSP 360 Remote Desktop Very Poor Performance
    What CPU are you using? OS? Can you also verify the Remote Desktop versions being used at both ends? Thanks.
  • Deleting Favorites
    Go to Tools - Favorites and you can manage them all from there.
  • Remote Deploy: Image-based backups
    Are you referring to image-based backups or file-folder backups? In either case, I think the problem with staggering is that it may not provide the effect you need, which is limiting the number of backups running in parallel. Any full backup, whether it's the first image backup or the first file-folder backup, can take some time while all of the data is moved to the cloud, so even staggering machines 20 minutes apart may not provide the desired effect. To make staggering work properly with image backups, you would need to stagger the full backups to different days of the week, especially recurring fulls that are not taking advantage of the synthetic full backup option. How do you see staggering working in your environment? Are you using synthetic fulls?
  • Duration for initial backup?
    The incremental should be good, with the exception that some files were not backed up; it doesn't mean that the entire backup failed. If the files in question do not need to be backed up and they were explicitly selected in the backup, you can uncheck them, or you can back them up with a different plan. Or we can find out whether you can run the process under a different user that has broader permissions to access those files. What I would do now is go through a test restore for one of the files that had an incremental backup, just to satisfy yourself that the latest backup, despite the failures on some files, can still restore all the other files that were backed up.
  • Hyper-V backup: no option to force full backup?
    The Force Full Backup is still there. It has just been moved to the drop-down menu on the Play button for the plan in question. I imagine it was moved to avoid customers accidentally clicking that option instead of an incremental backup.
  • New backup format in V7 Cloudberry Backup
    I recommend you stick with the old backup format if you're protecting production data. You can run parallel backups as suggested as a way to both protect and test the new format. We are planning to have it production ready by the end of Q2.
  • Removing Computers
    Can you remove them from the web management console? That should remove them from the iOS app next time you run it.
  • Removing Data from Wasabi Backup
    Two options:
    1 - If you are still using the computer in question for backups, but no longer need some older backups from backup plans you are no longer using, use the Storage tab in the agent, right-click the folder in the left pane that is no longer needed, and select Delete. That removes the backup files and keeps the local repository in sync with the storage.
    2 - If you are no longer backing up the computer in question and the software agent has already been removed, you can delete the folder from the cloud storage provider's management console (just be careful to avoid removing the wrong folder) - or script it, as sketched below.
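
    For option 2, if you'd rather script it than click through the console, here's a rough sketch using boto3 against Wasabi's S3-compatible API. The bucket name, prefix, and credentials are all placeholders, and the actual delete is commented out so you can review the listing first:

        import boto3

        # Wasabi is S3-compatible, so boto3 works when pointed at Wasabi's endpoint.
        s3 = boto3.resource(
            "s3",
            endpoint_url="https://s3.wasabisys.com",    # adjust for your region
            aws_access_key_id="YOUR_WASABI_KEY",        # placeholder credentials
            aws_secret_access_key="YOUR_WASABI_SECRET",
        )
        bucket = s3.Bucket("example-backup-bucket")     # placeholder bucket name
        prefix = "OLD-COMPUTER-NAME/"                   # placeholder backup folder

        # List what would be removed before deleting anything.
        for obj in bucket.objects.filter(Prefix=prefix):
            print(obj.key)

        # Once you are certain the prefix is right, uncomment to delete:
        # bucket.objects.filter(Prefix=prefix).delete()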

    If you're working with a different use case, let me know.

    Thanks.
  • Duration for initial backup?
    I am unaware of such a limitation with managed backup. Stand-alone is not managed and requires a perpetual license after trial. Are you a managed service provider? If not, and you only need a single license, then stand-alone would likely be a better option.
  • Duration for initial backup?
    When you add a storage account in the managed backup UI, you would need to select Amazon S3 as the storage account. Then you can click the gear icon and add a backup destination bucket in the same region where the data already resides. Then you would add that storage account to the customer (or user, as you had done previously) and create a new backup plan that uses the new storage account (or edit the existing one and simply change the storage destination).
  • I/O when using encryption/compression
    It may be related to the Fast NTFS scan option. You can try disabling it and see if the I/O returns to normal numbers.
  • Duration for initial backup?
    You don't need to assign the storage to the user. You can do it that way, and that's the way it was traditionally done, but as I stated above, you can assign the storage to the customer directly if that is more convenient. But it sounds like you figured it out.
  • Duration for initial backup?
    Once you define the storage account, you need to create a backup destination on that same screen, which is really just a bucket name and a region (if the cloud storage provider has multiple regions). Then go into the customer you're working on and add that storage account at the customer (or company, as we call it in the product) level. Then create a backup plan, and in the storage options for that plan, assign the storage destination you just created.

    If, on the other hand, you want a hybrid backup that backs up to the local network and the cloud at the same time, make sure you've created a file system storage account for the network location where you want the backup saved, assign that account to the customer as well, and then select Hybrid as the backup type in the backup plan, with both the local network storage account and the cloud destination selected as targets.