• SamboNZ
    0
    I back up quite a large number of files and I've noticed that my repository database gets quite large (16GB at present).

    If I manually shrink the database it typically drops to about 50% of its original size.

    A couple of questions regarding this:
    1) Why am I seeing this kind of size bloat?
    2) Is there a way to automatically shrink the database?

    I see there is a possible way to do this via the 'cbb' command line tool:
    cbb repository -shrink

    - Could I create a scheduled task to run this automatically on a weekly basis at a time backups are unlikely to be running?
    - Are there any caveats to running this?
    - If this ran during a backup would bad things happen?
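As a minimal sketch of what such a scheduled task's wrapper script could do, the check below looks for the backup engine in the Windows `tasklist` output before shrinking. The process name `cbbackupplan.exe` is a hypothetical placeholder; verify the actual executable name on your machine first. Only `cbb repository -shrink` comes from this thread.

```python
import subprocess

# Hypothetical backup engine process name -- confirm the real name
# (e.g. via Task Manager) before relying on this check.
BACKUP_PROCESS = "cbbackupplan.exe"

def backup_running(tasklist_output: str, process_name: str = BACKUP_PROCESS) -> bool:
    """Return True if the backup process name appears in `tasklist` output."""
    return process_name.lower() in tasklist_output.lower()

def shrink_if_idle() -> None:
    """Shrink the repository only when no backup process is found."""
    out = subprocess.run(["tasklist"], capture_output=True, text=True).stdout
    if backup_running(out):
        print("Backup appears to be running; skipping shrink this week.")
        return
    subprocess.run(["cbb", "repository", "-shrink"], check=True)
```

A weekly scheduled task would then just invoke this script; per the accepted answer, test it in a non-production environment first.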

  • Alexander NegrashAccepted Answer
    26
    1) Why am I seeing this kind of size bloat? (quoting SamboNZ)
    A large number of files.

    2) Is there a way to automatically shrink the database? (quoting SamboNZ)
    Not out of the box.

    - Could I create a scheduled task to run this automatically on a weekly basis at a time backups are unlikely to be running? (quoting SamboNZ)
    Why not.

    - If this ran during a backup would bad things happen? (quoting SamboNZ)
    Most likely. I would advise carefully testing this in a non-production environment first.
  • SamboNZ
    0
    Thanks Alexander,

    I have a feature request then for the cbb command.
    It would be great if the "cbb status" command had an option to list the status of any currently running jobs; preferably also returning a specific command line 'errorlevel' if any jobs were currently running.

    This would provide an easy way to check for running backup jobs before executing a repository shrink via cbb.
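    If that feature existed, the pre-shrink check would reduce to reading an exit code. The sketch below assumes the *requested* behaviour (a `cbb status` option returning a non-zero errorlevel while any job runs); that option does not exist today, so this is purely illustrative.

    ```python
    import subprocess

    def safe_to_shrink(status_exit_code: int) -> bool:
        """Interpret the requested convention: exit code 0 means no jobs running."""
        return status_exit_code == 0

    def shrink_when_idle() -> None:
        # NOTE: relies on the hypothetical errorlevel behaviour of
        # "cbb status" requested above -- not an existing option.
        status = subprocess.run(["cbb", "status"])
        if not safe_to_shrink(status.returncode):
            print("Backup jobs are running; deferring shrink.")
            return
        subprocess.run(["cbb", "repository", "-shrink"], check=True)
    ```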
  • Alexander Negrash
    26
    Sure. I will add to the list. Thanks