• Image Based Backup
    Hi!
    This is correct. If bad sectors are ignored, recovery will still be possible even if some sectors are damaged. Files located on those sectors, however, will be unreadable after the restore.
  • Explorer login to ACD fails
    Hi, have you tried adding the same entry to HKEY_CURRENT_USER\SOFTWARE\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION? It may work as well.

    Also, note that you will have to set a different value depending on your IE version:

    "10001 (0x2711)" for IE 11

    and

    "10000 (0x02710)" for IE 10
  • Cannot load Windows PowerShell snap-in CloudBerryLab.Explorer.PSSnapIn
    We managed to reproduce the problem. Your OS should meet the following requirements:

    .NET Framework 4.0 (full version)
    Windows Management Framework 3.0
    These requirements are listed here. Add those features and reinstall the product.
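
    If you want to verify those prerequisites before reinstalling, here is a rough Python sketch that checks the usual registry locations for the .NET Framework 4 "Full" profile and for PowerShell 3.0 (which ships with WMF 3.0). Treat it as an illustration rather than an official check.

    ```python
    import winreg

    def key_exists(root, path):
        """Return True if the given registry key can be opened for reading."""
        try:
            winreg.OpenKey(root, path, 0, winreg.KEY_READ).Close()
            return True
        except OSError:
            return False

    # .NET Framework 4.x "Full" profile registers this key.
    dotnet_full = key_exists(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full")

    # PowerShell 3.0+ (installed with WMF 3.0) registers this engine key.
    powershell_3 = key_exists(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\Microsoft\PowerShell\3\PowerShellEngine")

    print(".NET Framework 4 (full):", "present" if dotnet_full else "missing")
    print("Windows Management Framework 3.0:", "present" if powershell_3 else "missing")
    ```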
  • Questions before I ramp up usage of CloudBerry MBS
    SQL backups - should transaction logs be shrinking automatically after backup, or just marking space as free within the transaction logs? Today I checked a server and had 12 GB of transaction logs on a few-hundred-MB database. I had to manually back up and then shrink to get the space back. This was on SQL Express.

    CloudBerry uses SQL native tools, so all logs can be truncated automatically after the backup finishes.
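
    For the manual workaround you described (back up the log, then shrink it), here is a hedged Python sketch using pyodbc. The server, database, logical log name, and backup path are placeholders, and routine log shrinking is normally discouraged, so treat this as an illustration of the manual step rather than a recommended practice.

    ```python
    import pyodbc

    # Placeholders - adjust server, database, logical log name and backup path.
    SERVER = r".\SQLEXPRESS"
    DATABASE = "MyDatabase"
    LOG_LOGICAL_NAME = "MyDatabase_log"
    LOG_BACKUP_PATH = r"C:\Backups\MyDatabase_log.trn"

    # autocommit=True is required for BACKUP/DBCC statements.
    conn = pyodbc.connect(
        f"DRIVER={{ODBC Driver 17 for SQL Server}};SERVER={SERVER};"
        f"DATABASE={DATABASE};Trusted_Connection=yes;",
        autocommit=True)
    cur = conn.cursor()

    # 1) Back up (and thereby truncate) the transaction log.
    cur.execute(f"BACKUP LOG [{DATABASE}] TO DISK = N'{LOG_BACKUP_PATH}'")
    while cur.nextset():  # drain informational result sets so the backup completes
        pass

    # 2) Shrink the physical log file down to roughly 256 MB.
    cur.execute(f"DBCC SHRINKFILE (N'{LOG_LOGICAL_NAME}', 256)")
    while cur.nextset():
        pass

    conn.close()
    ```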

    If I’m doing a hybrid backup and the local network target is missing, does the cloud backup proceed or also fail? (I’m pretty sure it’s fail)

    Yes, it will fail. In this case the storage account won't be defined, so the plan can't proceed.

    On Desktops/laptops, If I’m doing a backup of either image or file/folder and the computer is shutdown/sleep/hibernate, what’s the impact on the backup and any files being backed up at that moment? Does it make a difference if this is real-time or scheduled?

    If a backup job is running, it's unlikely that the PC will switch to a sleep/hibernate state: the option "Prevent computer from sleep/shut down while the backup job is running" is enabled by default. For other cases there is an option to run missed backups once the PC is switched on. A file-level backup will continue from where it left off; an image-based backup will start from scratch.

    Can I do a hybrid image plan scheduled in addition to a file/folder real time plan without them interfering?

    Not a good idea.

    Will a file/folder real time plan generate a failure anytime it runs a backup and fails due to no internet? If so, can that be limited for that plan? What happens if a client takes their laptop off-site, boots it up, then connects to a guest network 15-30min later? Do we get a bunch of queued failure emails in that 15-30min period when it finally can hit the internet?

    It just fails and sends you the notification once.

    What phone support options are there? I’ve only ever done email and for me to ramp this up, I really need phone support.

    Phone support is available, or you can initiate a screen-sharing session. Just ask the support team.

    Should I be encrypting within CB on each backup?

    You should. There are a bunch of encryption options in CB.

    Is encryption performance impacting during backup/restore operations or pretty minimal on overhead?

    Not for the regular backup user, and not unless the volume of data is very large.

    Do I need to do more than AES-128 or is that secure enough for non-compliance data?

    The AES cipher is considered one of the strongest available today. You can use the maximum key length of 256 bits, but it's optional and up to you =)
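
    If you want a rough sense of the overhead, here is an illustrative micro-benchmark using the third-party `cryptography` package with AES-256-GCM. This is not CloudBerry's internal implementation - just a way to see that AES throughput on a modern CPU (with AES-NI) is typically far higher than upload speeds.

    ```python
    import os
    import time
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # 64 MB of random data stands in for a backup chunk.
    data = os.urandom(64 * 1024 * 1024)
    key = AESGCM.generate_key(bit_length=256)   # AES-256
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)

    start = time.perf_counter()
    ciphertext = aesgcm.encrypt(nonce, data, None)
    elapsed = time.perf_counter() - start

    mb = len(data) / (1024 * 1024)
    print(f"Encrypted {mb:.0f} MB in {elapsed:.2f}s ({mb / elapsed:.0f} MB/s)")
    ```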

    How good is the granular restore from images of VMs or bare metal?

    Please give it a try and let us know how it goes.

    Can I script the following (we have Continuum)? Install the client, create the file/folder-to-cloud backup plan, and have the backup plan consist of C:\Users*

    You can back up user profiles.

    Can users get a read only view of their backup plans?

    Each user can view only their own backups, not others'.

    I don't see any issues with your use case. If you run into any, let us know! We are always happy to help!
  • Managed Backup Services vs BackBlaze
    Desktop software costs me $49.99 a year with no storage.

    Our MBS volume discount starts from the 5th license bought in bulk. If you buy 10 licenses, the cost per license drops to $29.99.

    But BackBlaze offers a desktop plan at $50 per year with unlimited storage.

    However, BackBlaze doesn't encrypt on the user end; encryption only applies on the server end. So if you are doing a local backup as well, it won't be encrypted. Also, your data is deleted 30 days after you delete it from your machine, so retention is a concern as well (if the BackBlaze server cannot connect to your machine, your data is lost after 6 months).

    Carbonite is $59 for unlimited storage.

    Carbonite limits upload speed, and there is the issue of provider lock-in. Just look at CrashPlan users nowadays trying to migrate their data somehow; it's not easy.

    And what's preventing a customer from buying the desktop edition and actually using it on a server?

    Actually, nothing. It's a file-level license, not dedicated exclusively to workstations; it doesn't matter which machine the license is installed on.
  • Specify Multiple Databases when using cbb.exe addRestoreMsSqlPlan
    It is not implemented due to a possible source/target mismatch (e.g., another flag in the CLI, -dbnn, is the target DB name). The workaround is to create an individual restore plan for each DB in your instance.
  • Cloudberry File backup for SQL: Large amount to upload
    SQL Server backup is a bit different from file-level/image-based backups. CloudBerry uses native Windows tools, and the backup is differential; there is no deduplication of a full backup *.bak file.

    So, if you don't want to upload that amount of data every month, you should rely on the differential parts and schedule your full backup less frequently.
  • Cloudberry Linux / Backblaze B2 - file count issue
    Hi. CloudBerry Backup for Linux processes files for backup sequentially, in batches. So if you now see 2000 files, that means one thousand have already been backed up and the software is now processing the next thousand. In the end, you should have all 24k files processed. Let me know if you have any questions about the process.
  • CloudBerry Backup for macOS with Amazon Glacier?
    Hello, and thanks for choosing our product. Our software supports Glacier via a lifecycle policy transition from S3. This is the way AWS proposes to perform such operations, and it costs no more than a pure backup to Glacier. Please check out the step-by-step guide here: https://www.msp360.com/resources/blog/introducing-backup-for-mac-linux-2-8-1/
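
    For reference, the same S3-to-Glacier transition can also be configured directly as a bucket lifecycle rule. Here is a hedged boto3 sketch; the bucket name, key prefix, and 30-day delay are placeholders, and the linked guide shows how to do the equivalent from the product UI.

    ```python
    import boto3

    s3 = boto3.client("s3")

    # Placeholders: bucket name, key prefix and transition delay are examples only.
    s3.put_bucket_lifecycle_configuration(
        Bucket="my-backup-bucket",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-backups-to-glacier",
                    "Filter": {"Prefix": "backups/"},
                    "Status": "Enabled",
                    # Move objects to Glacier 30 days after they are created.
                    "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                }
            ]
        },
    )
    ```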
  • Alternate Way to Release License?
    Hello. We have a human-based alternative - please write at and your license will be released.
  • Can I use Cloudberry to backup to Backblaze Personal?
    Hello. BackBlaze Personal is not a storage service; it is a backup solution that actually backs everything up to B2. Thus, we can only access B2.
  • How to convert to hybrid backup?
    Hi! There is no way to convert one plan to another, as hybrid backup uses a slightly different architecture. However, we are planning a feature to change that. Please PM me your email address and you will be notified as soon as it is implemented.
  • No client-side encryption for OneDrive For Business?
    This feature will be implemented; the only question is when. The difficulty with implementing this option is that you have to know the final size of a file prior to uploading it to OneDrive. This is how OneDrive for Business works from the API perspective. Anyway, we are working on it.
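
    To illustrate why the total size must be known in advance: with a Microsoft Graph resumable upload session, every chunk is sent with a Content-Range header that includes the file's total length. This is a simplified sketch that assumes an upload session URL has already been created; it is not CloudBerry's code.

    ```python
    import requests

    def upload_chunk(upload_url, chunk, offset, total_size):
        """Send one chunk of a resumable OneDrive upload session."""
        headers = {
            "Content-Length": str(len(chunk)),
            # The header must state the file's total size, e.g. "bytes 0-65535/1048576",
            # so the final size has to be known before the upload starts.
            "Content-Range": f"bytes {offset}-{offset + len(chunk) - 1}/{total_size}",
        }
        return requests.put(upload_url, headers=headers, data=chunk)
    ```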
  • Retention policy understanding
    If I choose both "Delete versions older than" and "Keep number of versions", what takes precedence? Do both criteria have to be true before a file version is removed, or is it either one?

    CloudBerry checks both conditions, and a version is purged only when doing so wouldn't violate either one. Example: you want to delete versions older than 1 day and you want to keep 3 versions. You have 1 version and the file is older than 1 day. Will it be purged? No, because you still have only 1 copy of your file. Now imagine you have 4 versions of the file, all of them older than 1 day. In this case the oldest version will be purged. Keep your retention policy as simple as possible.
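
    Here is a small illustrative model of that behavior (not the actual product code): a version is purged only if it is older than the age limit and it is not among the newest versions you chose to keep.

    ```python
    from datetime import datetime, timedelta

    def versions_to_purge(version_times, keep_count, max_age_days, now=None):
        """Illustrative model of the retention described above (not product code).

        version_times: datetimes of each stored version of one file.
        A version is purged only if it is NOT among the newest `keep_count`
        versions AND it is older than `max_age_days`.
        """
        now = now or datetime.now()
        cutoff = now - timedelta(days=max_age_days)
        newest_first = sorted(version_times, reverse=True)
        candidates = newest_first[keep_count:]          # everything beyond the kept versions
        return [t for t in candidates if t < cutoff]    # purge only if also too old

    # Example from the answer: 4 versions, all older than 1 day, keep 3 versions.
    now = datetime(2018, 6, 1)
    versions = [now - timedelta(days=d) for d in (2, 3, 4, 5)]
    print(versions_to_purge(versions, keep_count=3, max_age_days=1, now=now))
    # -> only the oldest version (5 days old) is purged
    ```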

    How is option "Delete files that have been deleted locally" interpreted? If I deselect files from the backup source, is that interpreted as a deletion?

    No. It means that if you remove a file from the backup source and it isn't found for the number of days you specified in the agent, it will be deleted from the backup destination.
  • CloudBerry Backup Web Access Service on macOS
    Hi!

    That's for our web interface. Please check out this article to learn more about it: https://www.msp360.com/resources/blog/introducing-backup-2-1-for-macos-linux/
  • Attempting to Write Oversized MFT Record
    Hi! This error is caused by the destination not being big enough to restore the image. In order to fix it, you will need to:

    1) Perform a defragmentation on the initial server where you're running the image-based backup

    2) Rerun the image-based backup after the defrag

    3) Retry the restore; it should work properly.
  • Continue backup after new installation
    Would I have to give the server the exact same name for the Azure container to recognize it as the same resource?

    In this case, you only need to install the backup software and change the prefix in it to the one you used previously (it's the machine name, you are correct).

    If I want to restore, will the password still work?

    Yes, for sure. The password will still work.

    What is encrypting what, and with what keys?

    The key is the phrase you are using. It decrypts the files that we encrypt on the endpoint (your server).
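
    As a generic illustration of what a passphrase-derived key looks like (this is not necessarily CloudBerry's exact scheme), a common pattern is to run the passphrase through a key-derivation function such as PBKDF2 and use the resulting key for AES:

    ```python
    import os
    from cryptography.hazmat.primitives.hashes import SHA256
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    passphrase = b"correct horse battery staple"   # the phrase only you know
    salt = os.urandom(16)                          # stored alongside the backup data

    # Derive a 256-bit AES key from the passphrase.
    kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt, iterations=200_000)
    key = kdf.derive(passphrase)

    # Encrypt on the endpoint; only someone with the passphrase (plus the salt
    # and nonce) can derive the key again and decrypt.
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, b"file contents", None)
    plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)
    ```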
  • CloudBerry Backup questions
    Is it possible to encrypt the AWS keys, protecting them with a master password used, for instance, when opening the application? I might have missed that.

    I'm not sure I understand what exactly you want to protect. If I'm correct, you would like to encrypt the user-side password for AWS?

    Instead, selecting "Restore" triggers the Restore wizard, forcing you to cycle through all the screens and re-select the file. Did I miss something?

    Those are the very basic mechanics of our backup: files are restored only through the Restore wizard.

    I couldn't find an option to ignore symbolic links when creating a backup plan. For instance, when a 1GB folder contains a symlink to a 4GB folder, I need the option to ignore it.

    We ignore those by default.

    Oddly, the option to "Delete files that have been deleted locally" did not seem to work for folders. In the destination, syncing leaves empty folders that were removed from the source, and I could not manage to delete them through CloudBerry Backup.

    That might happen if you choose "Backup NTFS permissions".
  • Alternative to SFTP?
    Hi. I would recommend using Minio or setting up any other self-hosted storage (OpenStack Swift, for example). Also, check out our overview of open-source object storage services you can use instead of SFTP: https://www.cloudberrylab.com/blog/open-source-object-storage-vendors-comparison/
  • Backup to Minio (Choosing Minio vs S3 Compatitble?)
    None, except that Minio says its dedicated option is better and more compliant with their storage.
