Comments

  • Problem deleting data to reclaim Amazon S3 / Glacier space
    Thank you, Alex,

    The reason I didn't want to use the Backup Storage tab to delete data is that I thought it deleted all data in the selected backup plan, and I only needed to delete some (most) of the data so I could re-upload it all with a new folder structure (most folders have moved locations).
    I'll check that out next time; I used CloudBerry Explorer instead this time.
    I am re-syncing the repository now and will let you know how that goes for detecting the true usage; I'll update this thread in the next couple of days.
    (I started the re-sync while a backup was running. I'm not sure whether that will stop the backup in progress, but that is fine, as it runs daily.)

    On another note, it turns out that re-uploading was expensive, as over 2,000,000 requests were sent all over again to upload the new files to the S3 bucket.
  • Problem deleting data to reclaim Amazon S3 / Glacier space
    I have checked the S3 bucket from the AWS console, and only the folders that should be there (from the new upload) are present; none of the old folders I deleted are visible.
    However, I do have S3 set to transition data to Glacier after 24 hours in the CloudBerry Lab S3 bucket settings.

    The reason this is a problem is that I set a 6TB limit for this backup location in CloudBerry Lab, and it now says I have used all 6TB of that limit.
    I could increase the limit, but I am worried that I might actually be paying for 6TB of data.
    I called AWS support and they confirmed that under 3TB is currently stored.
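    If it helps anyone, the same total can be checked from the AWS CLI (my-backup-bucket stands in for the real bucket name):

        rem Sums every current object in the bucket and prints a Total Size
        rem line at the end (old object versions, if any, are not counted).
        aws s3 ls s3://my-backup-bucket --recursive --summarize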

    I originally posted in this topic (CloudBerry Explorer) because I thought it hadn't deleted the data properly. But perhaps it is a bug in how CloudBerry Lab detects used space?

    Regarding versioning on S3, I had not heard of this option and cannot see a reference to it. I am trying to find where it might be, but I can assure you that I have not set up anything out of the ordinary: a long time ago I set up an S3 bucket and let CloudBerry Lab do the rest.
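    If versioning does turn out to be relevant, it can be checked from the AWS CLI too; this is just a sketch, with my-backup-bucket again standing in for the real bucket name:

        rem An empty response means versioning has never been enabled.
        aws s3api get-bucket-versioning --bucket my-backup-bucket

        rem If it is enabled, leftover old versions and delete markers
        rem (which still cost money) would show up here.
        aws s3api list-object-versions --bucket my-backup-bucket --max-items 20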

    Thanks,
    Michael A.
  • Date range for files to backup
    Thank you, David, for your detailed reply.

    Just to clarify, we have the CloudBerry retention settings set to keep data for 60 days.
    We can't change this, as we need backed-up data kept in the cloud for 8 weeks (roughly 2 months).

    We have a software program which backs itself up each night and keeps the past 7 days in its backup folder.
    We are trying to get CloudBerry to run once per week (Sunday), pick up only the previous night's (Saturday) backup, and keep it for 2 months. We need to do it this way because each nightly backup is over 100GB, and we want to keep about 8 of these 100GB backups in cloud storage (8 weeks' retention of a single weekly backup).
    Currently it is taking all 7 nightly backups each week, which means it is storing about 800GB/week (about 7TB over the 8-week cloud retention period). A good solution would mean the 8 weekly backups only ever occupy about 800GB in the cloud instead of 7TB (a large cost difference).

    We want to avoid scripting a copy of the single latest weekly backup to another location, for reasons I won't go into, but that would be a work-around for most people in this situation.

    I think we will script the hidden or system attribute via a batch file which finds files older than 1 day and applies the attribute (the forfiles command to select, then attrib on the results), as sketched below. This requires no data copying and allows CloudBerry to back up directly from that folder.
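    Here is a minimal sketch of that batch file, assuming the backups live in D:\AppBackups (a placeholder path):

        @echo off
        rem Hide every file last modified on or before yesterday so that the
        rem "do not back up hidden files" option only sees the newest backup.
        rem /S recurses into subfolders; /D -1 matches files whose modified
        rem date is at least 1 day old; the @isdir check skips folders so a
        rem directory holding the current backup is never hidden itself.
        forfiles /P "D:\AppBackups" /S /D -1 /C "cmd /c if @isdir==FALSE attrib +h @path"

    Run as a pre-backup action on Sundays, this should leave only Saturday night's backup visible to the plan.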

    If your dev team is interested in adding a new feature, having a number-of-days option in the backup plan settings would be great (in addition to the date selector which is currently there and possibly used by many).
    This forum has helped me think of a work-around for now, until such a feature is added (if it ever is).

    Thanks again for your time.
  • Date range for files to backup
    Another quirky idea:
    There is an option not to back up hidden files.
    I could create a script that marks any file older than 1 day as hidden in Windows, tick the option in Managed Backup to ignore hidden files, and run the script as a pre-backup script.
    Actually, this is the best work-around I have found so far.
    Any better ideas?
  • Date range for files to backup
    "There are options in the Backup wizard to only back up files modified in the last X days"

    Hello David,
    You may be mistaken (although I kind of hope you are right and I am missing something).

    There is an option called:
    "Backup files modified -_- days ago"
    With this description:
    "Files will be backed up only if they have been modified more than specified days ago"
    And indeed that is what it does. If I put "1" day for that setting, it ignores files less than 1 day old and only backs up files older than that.
    I need the opposite: only files less than 1 day old backed up, and files older than that ignored.

    I can also see this option:
    "Backup files modified since: [date]"
    Now, if I could enter a number of days here, that would be perfect. But alas, it only lets me enter a fixed date, which makes the setting unsuitable for automated backups: I would have to manually set it to yesterday before each run, because the date, once set, never changes over time.

    Weird idea for a solution:
    Is there some way for Windows to hide certain files within a folder?
    I have looked on Google for ways to create a "search folder" like in Outlook, where I can enter criteria and only the matching results show. If I could point CloudBerry Lab Managed Backup at such a folder, set to show only files modified within the last day, that would be a workable work-around.
    Do you perhaps have a weird but functional work-around like that in mind?

DO IT Solutions
