• Cloudberry Backup error: The given key was not present in the directory
    That version is long, long out of support. I'd recommend you upgrade to the latest version.

    If your goal really is to continue with this version and try to correct the error, you can try regenerating your access key / secret key, or you can synchronize the repository.

    To synchronize the repository:
    1. Go to Tools > Options > Repository and click the Synchronize Repository button.
    2. In the Synchronize Repository window, select your account from the drop-down menu.
    3. Click the Synchronize Now button.
  • Remote assistant linux
    We recently released the Android version. You may be able to run it on Linux with the appropriate Android compatibility support.
  • CloudBerry creating recursive folders - can we turn off including Junctions/Symlinks?
    Just keep in mind that the option I am referring to only accepts a list of folder names, not full paths. It was designed to eliminate common folder names you almost never want to back up, like "Temp". So the symlinked folder will need a unique name, or you run the risk of excluding all folders with that name. I would add the folders to exclude, save the backup plan, and then edit / review the plan again in the wizard to ensure the folders you typed in are still listed the way they were entered.
  • CloudBerry Freeware
    Can you verify what version you are using? I am going to check with Support to see what you can do to trigger the new, improved AWS storage limit.
  • CloudBerry creating recursive folders - can we turn off including Junctions/Symlinks?
    I think you could exclude the symlinked folder in the backup plan using the Skip Folders option - but it should have a unique name to avoid skipping all folders with that name. There is no option currently to skip symlinks. I'll add this as a request.
  • Free or Trial?
    Just download again and you'll get a new activation code which you can write down.
  • Free or Trial?
    We do not have anything in the product that does what you're asking.
  • 1TB limit
    If you're an MSP, then you are presumably reselling Managed Backup - which is designed for MSPs. You posted in the stand-alone backup section (I can move the post if needed). If you are using Managed Backup there are no storage limits on any of the licenses.
  • Free or Trial?
    The stand-alone Remote Desktop product is free for any use. The address book can store up to 5 entries, as I recall; you can still connect to additional remote systems beyond that, but you cannot store their machine names / IDs.

    Managed Remote Desktop is the paid product and is licensed per technician.
  • 1TB limit
    There is no subscription for stand-alone products. They are one-time purchases of perpetual licenses with optional maintenance. Ultimate Edition is $90 more than SQL Server ($299 versus $209). https://www.msp360.com/backup/windows.aspx

    If you're talking about Managed Backup and just posted in the wrong section, then there are no storage limitations on any of the licenses for that subscription product.

    Can you clarify what you are using?
  • 1TB limit
    Unlimited storage is available in the Ultimate Edition. How much storage do you anticipate needing for your SQL Server database backups and other backups on that server?
  • Recommendation for backup retrieval from AWS
    What product was used to back up the files in the first place? I assumed it was our Backup product. If that's the case, then you can either restore with Backup, and you'll be prompted for the decryption password during restore plan creation, or you may be able to restore the files using Explorer. If you use Explorer, and the files were encrypted, which it sounds like they were, you'll have to enter the proper encryption password in the options (Compression and Encryption section), and you'll need the paid version or to be in trial mode to do this. As you've read earlier in some of the other posts, Explorer will automatically detect that the files were backed up with our Backup product and then try to restore them for you automatically.

    But keep in mind that if Backup was the product originally used and options like block-level backup were selected, you will not be able to restore the files with Explorer. That's why I recommend that if Backup was used to create the backups, you use Backup to restore the files as well.
  • Recommendation for backup retrieval from AWS
    A good option might be to restore them, and then back them up to Azure once restored.
  • Recommendation for backup retrieval from AWS
    I think you need to explain in a little more detail what you're trying to do. Are you looking to restore these files, or are you looking to copy the backup files in Glacier to storage somewhere else? Extracting data from Glacier can be expensive, so before you undertake any data egress, you should check the Amazon Glacier calculator to understand what the cost might be.
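    To put rough numbers on that, here is a minimal cost sketch. The per-GB rates below are assumptions for illustration only, not current AWS pricing; plug in the real numbers from the Glacier calculator before deciding anything.

```python
# Rough Glacier retrieval cost estimator. The default rates are
# ASSUMPTIONS for illustration -- always check current AWS pricing.
def estimate_retrieval_cost(data_gb,
                            retrieval_per_gb=0.01,   # assumed bulk-retrieval rate, USD/GB
                            egress_per_gb=0.09):     # assumed internet egress rate, USD/GB
    """Return (retrieval, egress, total) cost in USD for data_gb gigabytes."""
    retrieval = data_gb * retrieval_per_gb
    egress = data_gb * egress_per_gb
    return retrieval, egress, retrieval + egress

# Example: pulling 2 TB (2048 GB) out of Glacier under the assumed rates
retrieval, egress, total = estimate_retrieval_cost(2048)
print(f"retrieval ${retrieval:.2f} + egress ${egress:.2f} = ${total:.2f}")
```

    Even at low per-GB rates, the egress charge dominates once you get into terabytes, which is why leaving the data in place is often the cheaper option.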

    If a customer is moving to a new cloud, it's often less expensive and less of a hassle to leave the data in the old cloud until it's no longer needed, bulk delete it at that time, and start new backups in the new cloud. Glacier storage is relatively inexpensive, and you even have the option of moving it to Glacier Deep Archive for even lower cost if you need to keep it long-term.

    But describe exactly what you're trying to do with the data that's stored in Glacier, and I'll try to assist.
  • decryption password is not correct
    I agree with you that Explorer should copy the files as is - or at least have an option to ignore that they were backed up with encryption using one of our other products. However, as it stands today, Explorer recognizes the backup files, and tries to restore them using the encryption password you can set up in Options - Compression & Encryption. But if you do that, the files are decrypted during the copy operation and they will no longer be backup files. I'm still checking at my end to see if there is a work-around for this other than what I'm going to suggest.

    You can use Explorer to copy the files in Freeware mode. Freeware has no encryption support, so it will simply copy the files. To do this, release your license from the Help menu, and when you restart, select Freeware mode. You will lose parallel uploads, which may slow down the process. It's not a great solution, but it will get the job done. The alternatives are to use a different product for this one-off copy or to simply start a new backup plan to Wasabi.

    I'm going to speak with the team next week to get some additional background on this feature. I'll report back if I find anything useful.
  • decryption password is not correct
    I was able to verify at my end as well. I am waiting on word from the Support team as to why this is happening. I'll reply when I hear back.
  • decryption password is not correct
    Can you verify if you have any Upload Rules created for that Wasabi account (Tools - Upload Rules)?

    What options are set in Options - Compression and Encryption tab?
  • decryption password is not correct
    Are you using CloudBerry Backup? You posted in the Backup forum. What are you using to copy the local files to the cloud?
  • decryption password is not correct
    You need the password that was used at backup time; there's no way around it. The files are all encrypted using AES, and a brute-force attack, barring inadequate password complexity, is computationally infeasible.
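    For a sense of scale, here's a back-of-the-envelope sketch. It assumes AES-256 and an imaginary rig testing a trillion keys per second (both figures are illustrative assumptions, and AES-128 is just as impractical):

```python
# Back-of-the-envelope: why brute-forcing AES-256 is infeasible.
keyspace = 2 ** 256                     # number of possible AES-256 keys
keys_per_second = 10 ** 12              # assumed (very generous) attack speed
seconds_per_year = 60 * 60 * 24 * 365
years = keyspace / (keys_per_second * seconds_per_year)
print(f"{years:.2e} years to exhaust the keyspace")
```

    The answer dwarfs the age of the universe by dozens of orders of magnitude, which is why recovering the original password is the only realistic path.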

    Have you lost the password?

    If I'm misunderstanding your question, please provide additional details.
  • Moved but un-modified files needing to be re-copied
    Ok. I did not realize you were on Linux. If you move a file locally, it will look like a file delete and new file create to the OS and to our software. What happens with the locally "deleted" files in backup storage depends on the retention settings for your backup plan. If you have questions, please post what you are using and I'll be happy to review.
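    A tiny sketch of why a move registers as delete + create when change detection is path-based (the paths and hash values here are made up for illustration):

```python
# File identity here is the path, not the content. Moving a file
# changes its path, so a path-keyed diff reports delete + create
# even though the bytes are identical.
before = {"/data/report.pdf": "abc123"}      # path -> content hash, before the move
after = {"/archive/report.pdf": "abc123"}    # same content, new location

deleted = set(before) - set(after)           # paths that vanished
created = set(after) - set(before)           # paths that appeared
print("deleted:", deleted, "created:", created)
```

    What then happens to the "deleted" copy in backup storage is governed by the plan's retention settings, as noted above.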