Comments

  • File Delete Warnings on Legacy
    It's still an issue. I know it probably isn't high priority, but it would be nice to have that option working in case of accidental deletion.
  • File Delete Warnings on Legacy
    I sent a message using the diagnostics button in CB. I enabled high-detail logging and went through a scenario that shows the problem. I hope it gets fixed.
  • File Delete Warnings on Legacy
    Is this a bug? Has anyone else experienced this? Will it be fixed? I really like the warning option. It would be nice if it could email me about it too.
  • File Delete Warnings on Legacy
    Correct. I tested it in the Pictures directory too. It shows warnings for files deleted in folders and subfolders, but if I delete a folder itself, there are no warnings. I don't know if that's just me or not.
  • New backup format and incremental forever.......
    TY.

    But legacy is also incremental-forever? That isn't a problem? Is that because of the format?
  • New backup format and incremental forever.......
    The new backup format (for files) is not designed for incremental-forever backups like the legacy format. The new format requires periodic full backups, since it uses a generational approach (full backup sets) for data retention. These full backups are synthetic fulls (on most cloud storage providers) and will run quickly compared to the first full backup. For image and VM backups, the periodic need for a (synthetic) full backup is no different than the legacy format. — David Gugick

    TY for your responses.

    Could you elaborate on this a bit? I'm not sure I entirely understand the data retention point. What if I decide to keep it forever and never purge it? Why would that be a problem? Is it because the generation is treated as one unit of restoration, so if one part of it goes, you can't restore to it? In other words, the more incrementals you chain, the more points of failure?
  • New backup format and incremental forever.......
    But that's only really true for file backup. For image and VM backup, storage requirements would be about the same or less because of client-side deduplication. But it also depends on retention settings. — David Gugick

    A synthetic full is a server-side-created full backup with only the changed bits uploaded, right? So two full backups now reside in the cloud? How does that save space, unless there is block-level dedup on the server as well? Does the new synthetic full just link to the same blocks from the first full?
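    A rough mental model of how linking to existing blocks avoids doubling storage (my own sketch with made-up numbers, not MSP360's actual storage layout):

```python
# Toy model of a synthetic full backup that references blocks already in
# the cloud instead of re-uploading them. Block ids, sizes, and the storage
# layout are illustrative assumptions only.

BLOCK_SIZE_MB = 4

# Object store: block_id -> size in MB (each unique block stored once)
stored_blocks = {}

def store_backup(block_ids):
    """Record a backup as a list of block references; upload only new blocks."""
    uploaded = 0
    for b in block_ids:
        if b not in stored_blocks:
            stored_blocks[b] = BLOCK_SIZE_MB
            uploaded += BLOCK_SIZE_MB
    return uploaded

# First full backup: blocks 0..24 (100 MB at 4 MB per block)
full_1 = list(range(25))
uploaded_1 = store_backup(full_1)

# Synthetic full: same data except two blocks changed (new ids 100, 101).
# It is a complete, independently restorable backup set, but only the two
# changed blocks are uploaded and stored; the rest are shared references.
full_2 = [100 if b == 3 else 101 if b == 7 else b for b in full_1]
uploaded_2 = store_backup(full_2)

total_stored = sum(stored_blocks.values())
print(uploaded_1, uploaded_2, total_stored)  # 100 8 108
```

    Under this model, two "full" backups coexist in the cloud but consume 108 MB, not 200 MB, because the unchanged blocks are stored once and referenced by both.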
  • New backup format and incremental forever.......
    TY

    1. How often would a full (or synthetic full) backup need to be performed? Just to be clear: a synthetic full creates a new full backup from the blocks already in the cloud plus the new blocks created locally since. So if I don't delete any files, there would be 200 GB of data after the synthetic full, assuming the original full was 100 GB, which would increase my per-GB storage bill? So the new format wouldn't necessarily save the most online space?

    3. I don't mean deleting backup data in the backup itself. I mean the local files marked for backup. If I don't notice those files are gone before the 30-day retention auto-deletion runs, they're gone for good. I may just stick with legacy.
  • CloudBerry Backup Questions
    Also, could I just use the new backup format with incrementals forever? I get the feeling that isn't best practice, e.g., for protecting against ransomware attacks.
  • CloudBerry Backup Questions
    Another quick question:

    Was the archive option removed from the legacy format? I don't see it, even though it is in the documentation. Is that a paid feature?
  • CloudBerry Backup Questions
    Thank you! I'm learning a lot here. It can be overwhelming, but it's interesting.
  • CloudBerry Backup Questions
    The new format uses generations. For 300 GB (compressed), that would fill storage space fast with each new generation if you regularly do fulls, which is recommended. Synthetic fulls would be a must for me. I'm not paying for spikes in cloud space; I only want to pay for what I need, and that's it.

    BTW, thank you for the answers. I appreciate it. :smile:
  • CloudBerry Backup Questions
    Well, there is a definite difference in the number of files between the new format and the legacy format. From what I've seen, the latter doesn't do any archiving at all, by which I mean taking all the files and merging them into one. One would think there would be far fewer transactions. Google Cloud pricing is per 10,000 operations. Would each group of 10,000 files incur its own operation fee when using the legacy backup?
  • CloudBerry Backup Questions
    Question 1:
    That's what I was thinking. Could you give me a ballpark typical scenario? Say I have 500 GB and 100,000 files total, uploading to Google Cloud. How many transactions would CB do, roughly? Fill in whatever other variables you like. Maybe it's too complex, I don't know.

    Question 2:
    Yes, that's what I thought too. The new backup format is the one that supports client-side dedup? I definitely like that feature. Why doesn't the new format also support file version retention? Will it ever?
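    A rough, hypothetical estimate for the Question 1 scenario above (500 GB, 100,000 files). The per-request price, the chunk size, and the "one write per chunk" model are all assumptions for illustration; real request counts depend on the backup format, chunk size, retries, and listing/metadata calls:

```python
# Hypothetical upload-operation cost estimate for Google Cloud Storage.
# All pricing and chunking figures below are assumptions, not quoted rates.

import math

TOTAL_GB = 500
NUM_FILES = 100_000
CHUNK_MB = 10               # assumed upload chunk size
PRICE_PER_10K_OPS = 0.05    # assumed write-operation price in USD

total_mb = TOTAL_GB * 1024

# Per-file format (no archiving): at least one write per file, plus extra
# writes for files larger than one chunk (using the average file size here).
avg_file_mb = total_mb / NUM_FILES                 # 5.12 MB average
ops_per_file = max(1, math.ceil(avg_file_mb / CHUNK_MB))
per_file_ops = NUM_FILES * ops_per_file

# Archive-style format: files merged into large objects first, so the
# operation count tracks total size rather than file count.
archive_ops = math.ceil(total_mb / CHUNK_MB)

per_file_cost = per_file_ops / 10_000 * PRICE_PER_10K_OPS
archive_cost = archive_ops / 10_000 * PRICE_PER_10K_OPS
print(per_file_ops, round(per_file_cost, 2))   # 100000 0.5
print(archive_ops, round(archive_cost, 2))     # 51200 0.26
```

    The point of the sketch: with many small files, the per-file format's operation count is dominated by the file count, while an archiving format's count is dominated by total data size, which is why merging files into archives can cut the per-10,000-operations fee.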