• Immutable storage without doing full backups?
    You may be misunderstanding what a Full backup is in the context of a File backup. The only data that will be backed up is the data in the folders selected. Periodically, you'll have to run scheduled full backups - at least once a month. But, again, those backups are only going to back up the selected folders in the plan, and the Full backup will also be a synthetic backup - which only runs an incremental backup and then uses the data already in S3 to construct the new Full backup in the cloud.

    What are your retention needs? And what GFS settings do you need (Weekly, Monthly, Annual)? That will help me understand what your settings for Keep Backups For and GFS should be and what your scheduling will have to look like. Immutability is currently tied to GFS, and GFS settings will dictate how frequently you need to run full backups.
  • Use CloudBerry Backup across Windows users
    This is what Support recommends:

    Export Backup Plans (for a backup in case they need to be restored)
    https://www.msp360.com/resources/blog/how-to-copy-backup-plans/

    Stop the backup services (either via the GUI or Services.msc)
    Close the GUI prior to executing the commands below
    Open a command prompt, change directory to the backup software installation folder, and run the following command to switch to Common mode:

    cbb option -usermode common
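
    Putting those steps together, here's a minimal sketch of the whole sequence from an elevated command prompt (the service name and install folder are placeholders - check Services.msc for the exact service name and substitute your actual installation path):

    rem Stop the backup service (exact name varies by version - see Services.msc)
    net stop "<BACKUP_SERVICE_NAME>"
    rem Switch to the folder where the backup client is installed
    cd /d "<INSTALL_FOLDER>"
    rem Switch the product from per-user mode to Common (all users) mode
    cbb option -usermode common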


    That should do it.
  • Suggestion for repository synch
    Unless I'm misunderstanding what you're describing, the backup repository should not need to be manually synchronized as it's updated in real-time with every backup.

    Can you describe in a little more detail the steps you went through to perform the restore?
  • Retention
    The reason nothing's getting deleted is because you're not running any Full backups.
    You can see that on your first screenshot. It's the option right below the incremental backups. Nothing can be deleted because the new backup format uses generational backup sets - a chained set of backups that includes a full and a number of incrementals. In that way, it works similarly to how the legacy image and virtual machine backups work, since it does not manage backup data at the file level.

    Having said that, I don't think the new backup format is what you need. Because in order to remove data, you need to run periodic full backups, and as you've said, with a hundred terabytes worth of data you may not want to use that much backup storage. The B2 full backups will be synthetic, meaning we'll only run a new incremental backup and then use the intelligence in the cloud to create the new full in your B2 account from the data that's already up there. But currently you'll need to run full backups at least once a month (a version that allows less frequent full backups is not out yet), and if you want to keep data for 3 months, that means you'll end up with 4 full backups in storage until the first backup set can be removed. And that means 4 full copies of the hundred terabytes of data, which could greatly increase your cloud storage needs and cost. If that works for you then you can continue using the new backup format. But...

    You may want to use the legacy backup format. The legacy backup format is a true incremental forever, as we only ever back up new and changed files and all cloud objects are managed at the file level. It uses version-based retention, which provides flexibility in how many revisions you want to keep for each file. It does lose some of the advantages of the new backup format, like faster backups, faster restores, client-side deduplication, automatic backup consistency checks, GFS retention, and backup immutability on some cloud platforms.


    It sounds like your need is to have incremental forever, and the legacy backup format is the way to get there. The only thing is, in order to move to the legacy format you're going to have to start a brand new backup using the legacy backup format and reupload the data to B2. You can disable the backup scheduling on the new backup format plan and leave the existing backup data in B2 until it's no longer needed, then use the Backup Storage tab in the client to remove it.

    Let me know if you have any additional questions.
  • Retention
    You'll need to clarify whether you're using the legacy file backup format or the new backup format. You'll also need to post whether or not you have the option to mark files deleted in backup storage when they're deleted locally (if legacy format), and also post all of the settings you have on the retention tab (post the screenshot if that's easier). If you're doing file backups, then they're always incremental forever with the legacy backup format.
    If you're using the new backup format, then also post the schedule for the incremental and full backups. And lastly, let us know if you're backing up to local disk or cloud storage.
  • Use CloudBerry Backup across Windows users
    Let me check with support. An uninstall won’t delete repository or plans but let’s check to see if anything else is needed.
  • Failed to load private key. Error code: 3333
    Well, we support MinIO natively in the product, and we support both the Linux and Windows versions of MinIO. And it is free, so maybe give that a shot if you want to migrate off of FTP. I don't think the error you're getting has anything to do with our support for SFTP, but I don't have enough SFTP experience to help troubleshoot it. And if your backups are all within the same network, then maybe you can get away with using plain FTP or a regular network backup using our product.
  • Failed to load private key. Error code: 3333
    I apologize, but FTP and SFTP have been deprecated for years now, and I don't see any reference to the error you're seeing in our system, so I'm not able to provide much assistance. Someone else on the forum may be able to assist, or you can look to use regular FTP, or try to Google that error with SFTP and see if any good results come back. If I find something, I'll post here.
  • Use CloudBerry Backup across Windows users
    Did you uninstall the product from all user accounts first before upgrading?

    If that doesn't work you may need to reach out to support. I haven't come across many users that had the product installed at the user account level, and I'm not even sure we still support that in the later versions.
  • Use CloudBerry Backup across Windows users
    Someone installed the product before you ran it, and they probably installed it for their user account only. I gave you the solution above on how to move the repository, but you may want to uninstall and reinstall the product first.

    Having said that, the version you are using is long out of support and I would strongly recommend you upgrade to a supported version.

    You can read more about our product lifecycle here: https://www.msp360.com/productlifecycle.aspx
  • Use CloudBerry Backup across Windows users
    If it's in your home folder, then it sounds like the application was installed for that user only. The default location for the repository is not the Users folder; it's normally in ProgramData. You can relocate the repository using the functionality on the Options - Repository page: https://help.msp360.com/cloudberry-backup/options/repository

    I would make a copy of the repository folder first before moving.
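
    If it helps, here's a minimal sketch for taking that copy from an elevated Command Prompt (both paths are placeholders - substitute your actual repository folder and whatever destination you want the copy in):

    rem Copy the repository folder, including all subfolders, before relocating it
    robocopy "<REPOSITORY_FOLDER>" "<COPY_DESTINATION>" /E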
  • Use CloudBerry Backup across Windows users
    Can you verify what version you are running?
  • Use CloudBerry Backup across Windows users
    Was the product installed for All Users? If so, you should see a single view of all plans regardless of who is logged in.
  • Backing Up and Restoring the Recycle Bin
    If you're referring to Image backups, then everything is backed up by default. Image backups are not backed up at the file level, despite the option to have folders / files excluded. You could try excluding the Recycle Bin if you want, but you would have to test to ensure the restore works correctly.

    The Recycle Bin is normally located on every disk, off the root, in the following location:
    <DRIVE>:\$Recycle.Bin
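
    If you want to confirm what's in it on a given drive before excluding it, you can list the hidden folder from a Command Prompt (C: below is just an example drive letter):

    rem The /a switch is needed because $Recycle.Bin is a hidden system folder
    dir /a "C:\$Recycle.Bin"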
    
  • Recommended Backup Exclusions
    For Image backups, you can choose to exclude some folders that may not include data needed for a full restore. However, if you do decide to exclude, you must test to ensure the exclusions do not interfere with a needed restore. Good candidates might be browser cache folders and temp locations.

    But as I stated, if you exclude the wrong folder, you may not be able to properly restore. So please test - there's a quick way to verify these folders from a Command Prompt after the list below.

    Find the User Folder Location from Command Prompt
    echo %USERPROFILE%
    
    Google Chrome Browser Cache (per User)
    \Users\<USERNAME>\AppData\Local\Google\Chrome\User Data\Default\Cache\
    
    Microsoft Edge Browser Cache (per User)
    \Users\<USERNAME>\AppData\Local\Microsoft\Edge\User Data\Default\Cache\
    
    Firefox Browser Cache (per User)
    Use about:cache in Firefox to find the exact location
    \Users\<USERNAME>\AppData\Local\Mozilla\Firefox\Profiles\<PROFILE_NAME>
    
    Windows User TEMP Folder (per User)
    echo %TEMP% from Command Prompt
    \Users\<USERNAME>\AppData\Local\Temp
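
    As a quick sanity check before adding any of these as exclusions, you can resolve and list them for the logged-in user from a Command Prompt (%LOCALAPPDATA% expands to \Users\<USERNAME>\AppData\Local, and the Firefox profile name will vary):

    rem Confirm the cache and temp folders exist before excluding them from the Image backup
    dir "%LOCALAPPDATA%\Google\Chrome\User Data\Default\Cache"
    dir "%LOCALAPPDATA%\Microsoft\Edge\User Data\Default\Cache"
    dir "%LOCALAPPDATA%\Mozilla\Firefox\Profiles"
    dir "%TEMP%"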
    
  • New backup format and incremental forever.......
    No. All file backups using legacy mode are at the file level. We only back up new and changed files, and there is a one-to-one correspondence between source files and objects in backup storage. The only "chaining" that can occur is when you use block-level backups, which back up changes within large files. Even so, you need to schedule periodic backups of those files in full - but those backups are still at the file level. The new backup format uses archives that bundle many files together for easier backup file management and faster backups and restores.
  • File Delete Warnings on Legacy
    Just trying to understand what you're saying.

    As an example:
    * You are backing up C:\Users\UserName
    * You delete a file that has already been backed up called C:\Users\UserName\My Documents\MyFile1.DOCX
    * You get a delete warning the next backup
    * But if you delete an embedded folder C:\Users\UserName\My Documents\HomeStuff that has already been backed up, you do not get a delete warning for the folder or any files in that folder?
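
    If it helps to confirm the behavior cleanly, here's a rough repro sketch from a Command Prompt using a throwaway test folder (the path mirrors the example above - adjust it to a location inside your backup source):

    rem 1. Create a test folder with a file inside the backed-up location
    mkdir "C:\Users\UserName\My Documents\DeleteTest"
    echo test > "C:\Users\UserName\My Documents\DeleteTest\TestFile.txt"
    rem 2. Run a backup so the folder and file get backed up
    rem 3. Delete the whole folder and run another backup
    rmdir /s /q "C:\Users\UserName\My Documents\DeleteTest"
    rem 4. Check whether you get a delete warning for the folder and its file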
  • Fixing compliancy issues
    You would need to report the actual issue you're having to Support. I am not seeing any issues reported in the system, so please report this to support as soon as possible. https://support.msp360.com/
  • New backup format and incremental forever.......
    You could do that, but it's not recommended, because the more incremental backups you have in the chain, the more data loss is possible if there is disk corruption or some other type of backup file data loss. Even if you wanted to keep data "forever", you'd probably want to run periodic full backups. You could do something like use GFS with 10 years of Annual backups, so you're only keeping one generation per year. Or use the legacy format, which I think is the better choice in your case, as it is incremental forever and you can keep locally deleted files in backup storage and also check the option to always keep the last version of every file - that way, you keep everything forever without chained backups, since all backups are managed at the file object level.