Forum tip: Always check when replies were posted. Technology evolves quickly, so some answers may no longer be accurate.

Comments

  • ID and PIN accessibility
    I don't have a date yet, but it's the next non-bugfix release. I'll ask the team if they know.
  • Purge delay and retention policy
    Where did you check to see what file versions are available? Did you do this from the Storage Tab in the product? If not, please check there and also post a screenshot of the Retention settings tab in the wizard. Please check the post after submitting to make sure the image is visible. Thanks. I'll review again.
  • Retention policy vs Ransomware
    Starting with #2: this is all handled by the software; there's nothing a user needs to do to manage block-level backups and their retention. Even if you are infected, the only backup that might be affected is the most recent block-level one - previous ones remain in place and allow restores to any point in time across those backups. More likely, though, the ransomware will cause the full file to be backed up (since the entire file changes), and in that case previous versions would be flagged for deletion - and that's bad. It's easy to fix, though: keep at least 2 versions of files, and the previous backup set (full file backup plus any block-level backups) will remain in storage. The other thing you should consider is adding a Purge Delay, so files flagged for deletion have their removal delayed a set number of days before it actually happens.

    Moving on to #3: Yes, it's a good idea. Since you're keeping files for at least 90 days, you're going to end up with multiple versions of frequently edited files. But what if a file is not changed during that time, and 90 days have passed between its backup and the ransomware attack? The whole point of backup is flexibility in restoring; keeping more than 1 version is ideal.

    And now #1: You may lose everything in that case without a purge delay while keeping only one version. If the ransomware changes the file extension (the file is effectively renamed), the old file is considered deleted, and the software may remove it if you don't have multiple versions and/or a purge delay in place. If the ransomware simply encrypts the file in place without a rename, you have effectively created a new version, and the old one could be removed depending on when the last backup of that file took place. So add versions and a purge delay.
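    To make the interaction concrete, here is a minimal sketch of how a "keep N versions" rule plus a purge delay protects the pre-attack copy. All names here (`Version`, `purgeable`) are illustrative and not the product's actual retention code:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import List, Optional

@dataclass
class Version:
    backed_up: date
    deleted_on: Optional[date] = None  # set when the source file is renamed or removed

def purgeable(versions: List[Version], today: date,
              keep_versions: int = 2, purge_delay_days: int = 30) -> List[Version]:
    """A version may be purged only if (a) enough newer versions exist and
    (b) its deletion has aged past the purge delay."""
    newest_first = sorted(versions, key=lambda v: v.backed_up, reverse=True)
    candidates = newest_first[keep_versions:]  # the newest N are always kept
    return [v for v in candidates
            if v.deleted_on is not None
            and today - v.deleted_on >= timedelta(days=purge_delay_days)]

# A ransomware rename on Mar 1 marks the old file deleted and backs up the
# "new" (encrypted) file; with 2 versions kept and a 30-day purge delay,
# the clean Feb 1 version is still restorable on Mar 15.
history = [Version(date(2024, 2, 1), deleted_on=date(2024, 3, 1)),
           Version(date(2024, 3, 1))]
print(purgeable(history, today=date(2024, 3, 15)))  # prints []
```

    Either safeguard alone helps, but together they cover both the rename case (deletion + purge delay) and the encrypt-in-place case (new version + version count).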
  • Purge delay and retention policy

    Scenario 1:
    You did not mention how many versions you are keeping in Retention; ideally, you are keeping at least 2. According to your scenario, nothing has changed since Thursday, so nothing new is backed up. Assuming you're keeping all versions, you have the Monday file (good), the Tuesday file (may be block-level, but regardless, it's good), and the Wednesday file (probably not block-level, since the entire file changed, but regardless, it's not good). You can then restore either Monday or Tuesday.

    Since you did not mention how many versions you are keeping, remember that any files backed up using the block-level algorithm are tied to the original full-file backup at the start of your 90-day cycle, so if a file changed every day, you'd end up with 90 versions in storage. If you want to keep fewer versions, you need to run the incremental (full-file) backup more frequently than every 90 days. It only backs up files that changed that day; files previously backed up block-level are backed up in full, and unchanged files are not backed up again after the very first backup.

    Also keep in mind that "most" ransomware tools change file extensions, effectively deleting the original file and creating a new one with the same name plus an added extension. Not all of them work that way, but if that happens, all your files would be flagged for deletion, and in 30 days (since you have a 30-day purge delay) they would be removed from storage. Presumably you'd know pretty fast that there was an issue and could then restore.

    But I would ensure you keep at least 2 versions of files, and to avoid excessive storage use for larger block-level backed-up files, consider changing the incremental to monthly or weekly, as needed.
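    The trade-off described above can be shown with a bit of illustrative arithmetic (a sketch of the idea, not product behavior): each full-file run starts a new chain, and a chain can only be purged as a whole once a newer one exists.

```python
def chains(days: int, full_every: int) -> list:
    """For a file that changes daily, group its backups into chains: each
    chain starts with a full-file backup and carries that period's
    block-level deltas until the next full-file run."""
    return [min(full_every, days - start) for start in range(0, days, full_every)]

print(chains(90, 90))  # prints [90] - one long chain; nothing can be thinned
print(chains(90, 30))  # prints [30, 30, 30] - older chains can be purged whole
```

    With a 90-day cycle, everything hangs off one full backup, so all 90 versions stay in storage; with a monthly full, the two older chains become removable once the newest chain exists.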

    Scenario 2:
    Not correct. Purging applies only to files that have been deleted. If a file is never deleted, it is never removed from storage by that option. If you want to manage this, you need to allow multiple versions to be created, as mentioned above.
  • Purge delay and retention policy
    I'll reply to this shortly. Just wanted you to know we saw the post...
  • Problem downloading files in S3 that have moved to Glacier
    If you’re using Glacier, Amazon takes 3-5 hours by default to get the files ready for access. You can pay extra for Expedited Retrieval, if time is of the essence.
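    If you'd rather script the retrieval than wait on the console, the restore request looks like this. A minimal sketch: the bucket and key would be placeholders for your own, and the actual boto3 call is left commented out so the snippet runs without AWS credentials.

```python
import json

def restore_request(days: int = 1, tier: str = "Expedited") -> dict:
    """Build the S3 RestoreObject payload. Tier options: "Standard"
    (hours), "Expedited" (minutes, extra cost), "Bulk" (cheapest, slowest)."""
    return {"Days": days, "GlacierJobParameters": {"Tier": tier}}

req = restore_request()
print(json.dumps(req))
# With boto3 installed and credentials configured, you would send it as:
# import boto3
# boto3.client("s3").restore_object(Bucket="my-bucket", Key="path/to/file",
#                                   RestoreRequest=req)
```

    `Days` controls how long the restored copy stays available before the object reverts to Glacier-only.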
  • MSP360 on Linux CentOS 7.4
    Nothing big stands out other than that the Linux edition is file/folder backup only. Do you have any more information on what you'll need to do on the Linux machines from a backup and retention perspective? You'll find the UI is very similar to the Mac edition (assuming you back up any Apple macOS computers), and, like Windows, you can use the management console to configure and manage the Linux machines. Let me know if you have specific questions. Linux has many distributions and versions, not to mention installation options and UIs, so if something comes up, you can post here or reach out to Support for assistance.
  • Wake Computer For Backup
    Wake from Sleep is targeted for a future release. I'll add your comments and see if it can be moved up. In the meantime, I'd schedule the backups to run before you normally shut down. You can limit bandwidth utilization during business hours in Options, if needed.
  • SFTP/FTP Pull
    Amazon S3 uses the S3 API for file access. While there are methods to use SFTP as a front-end to S3, it's likely too complicated and expensive for your use case. https://aws.amazon.com/blogs/aws/new-aws-transfer-for-sftp-fully-managed-sftp-service-for-amazon-s3/

    You could use MSP360 Drive to expose the S3 bucket as a local drive or network share and copy the files that way. https://www.msp360.com/drive.aspx
  • Cloudberry PC and MAC Have Different AWS Directory Structures
    Support has your logs. They'll review and be in touch. You can ignore the spam kickback.
  • MSP360 on Linux CentOS 7.4
    Yes. We support CentOS 7.x
  • Cloudberry PC and MAC Have Different AWS Directory Structures
    So the answer is they are compatible. I just checked with the team and the product platforms are designed to be compatible with one another. So, you can restore PC backups on a Mac. I also see Klim answered you on the other post. If the spam kickback is happening on our side, shoot me a direct message on the forum with your email and I'll send it over to the team to have a look.
  • Download old versions
    We don't post old versions, as they often contain bugs we've since fixed and/or lack features we've added. Please download the latest version and use that, and keep a copy locally in case you want to stay on that version.
  • Cloudberry PC and MAC Have Different AWS Directory Structures
    I'm still waiting on final word, but I'm pretty sure the backups are not compatible, hence your restore failure - EDIT: This was incorrect on my part - they are compatible. Unless you are saying that you did the restore from the PC and it failed (the reply was unclear). Please clarify.

    Can you clarify who flagged the email as spam? Was it a spam filter at your end or by your ISP, or are you saying it was kicked back by MSP360?
  • Remote Assistant Logs
    I've reached out to the engineering team and hope to hear back soon.
  • What is the Function of the "~segments" Directory Branch in the Cloud File Store?
    Here's the response: multi-part uploads on some cloud storage platforms store the chunks in the Segments folder, with the file name and pointers to the chunks in the CBB_VOLUMES folder. If you remove files from either folder, there is no easy way to remove the corresponding files from the other. Avoid doing that in the future, and run a repository sync on the storage account now so Backup knows what's available for restore; you may still end up with some orphaned files in storage. Alternatively, you can create a new storage account pointing to a new bucket and re-back up the files, then remove the old bucket once you're certain it won't be needed.
    Here's the response. Multi-part uploads on some cloud storage platforms store the chunks in the Segments folder with the file name and pointers to the chunks in the CBB_VOLUMES folder. If you removed files in either folder, there is no easy way to easily remove the corresponding files from the other. You'll need to avoid doing that in the future and run a repository sync now on the storage account so Backup knows what's available for restore. You may also end up with some orphaned files in storage. The other thing you can do is create a new storage account to a new bucket and re-backup the files. You can remove the old bucket once you're certain it won't be needed.