• Allan Coganovitch
    0
One of my clients using File/Folder Legacy backup to Amazon AWS S3 has reached their preset limit of 50 GB of cloud storage.

In order to determine which files are taking the most space, I would like to sort all of their files in storage in reverse size order, regardless of folder.
    How can I obtain this information without connecting remotely to their computer?
    Thank you in advance.
  • David Gugick
    118
There's no catalog stored in the management console, so you'll have to connect remotely to see what's being backed up and delete any files you no longer need from the Storage tab. If you're not using filename encryption with S3, you may be able to review the bucket on the S3 side to get a sense of file sizes, but you should not delete anything from there: you'd have to resync the repository, and block-level backups can make that difficult.
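    If you do want to review sizes from the S3 side (read-only, without deleting anything), a minimal boto3 sketch along these lines could list the largest objects. The bucket name is a placeholder, and it assumes AWS credentials with read access are already configured:

    ```python
    # Sketch: list a bucket's largest objects, largest first,
    # ignoring folder (prefix) boundaries.
    # "client-backup-bucket" is a placeholder name.
    import boto3

    s3 = boto3.client("s3")
    objects = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket="client-backup-bucket"):
        for obj in page.get("Contents", []):
            objects.append((obj["Size"], obj["Key"]))

    # Reverse size order, regardless of folder.
    for size, key in sorted(objects, reverse=True)[:50]:
        print(f"{size / 1024 / 1024:10.1f} MB  {key}")
    ```

    Keep in mind the object keys will be unreadable if filename encryption is enabled, and a single source file may appear as several versioned objects.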
  • David Gugick
    118
Let me semi-correct myself. You could go to Reporting > Backup History in the management console and review the backup plans for that particular endpoint. You'll see the files that were backed up with each plan and their file sizes, though that might require you to review a few different backups to get a sense of what the big files are. You still need to remove them from the client itself via the agent.
  • Allan Coganovitch
    0
I see that you have a 'CloudBerry Explorer' that seems to have a reasonable user interface.
    But, as I recall, it cannot filter the way I need or sort across folders.
    Is there any way for me to use this tool with some parameters, options, or settings that I am not aware of?
  • David Gugick
    118
I'm wondering if it would just be easier to ask the end user to scan their hard drive for very large files from Windows Explorer. With retention settings and multiple file versions stored for each file, it's increasingly difficult to figure out which files are actually taking up the most space. There might be some single very large files, but depending on retention there might also be ten copies of a slightly smaller file that collectively store more data in the cloud. Can I ask why remoting into the system is not an option?
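    If Explorer's size search is too clumsy, a short script could do the same local scan. This is just a sketch; the starting path is a placeholder, and it only sees files currently on disk, not old versions already retained in the cloud:

    ```python
    # Sketch: find the largest files under a folder tree, largest first.
    # r"C:\Users" is a placeholder starting point.
    import os

    results = []
    for root, _dirs, files in os.walk(r"C:\Users"):
        for name in files:
            path = os.path.join(root, name)
            try:
                results.append((os.path.getsize(path), path))
            except OSError:
                pass  # skip files we cannot stat (locked, permissions)

    for size, path in sorted(results, reverse=True)[:25]:
        print(f"{size / 1024 / 1024:10.1f} MB  {path}")
    ```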
  • Allan Coganovitch
    0
    I have a remote connection (Screen Connect).
    Technically, the connection is not the issue.
    These computers are used to monitor physical alarms, elevators, etc.
I do not want to be intrusive to their day-to-day operation, and it is very difficult for the client to 'give up' their computer for any length of time.
    I am more concerned about large video files that are downloaded by their overnight staff than about ten 50-100 MB files.
  • David Gugick
    118
What about increasing their cloud storage? S3 Standard averages about $23/TB/month in the US, and they'd only pay for what they use; at current usage, 50 GB works out to roughly $1.15 per month.
  • Allan Coganovitch
    0
I think that I need to deal with the problem, which is *why* they have filled up 50 GB.
    Their management would be interested to know.
  • Steve Putnam
    35
From the agent console's Storage tab, I select the “Capacity” view, which sorts by folder size. Typically what chews up a lot of space are PST files, and if you keep a lot of full backups, the storage can fill up fast.
  • Allan Coganovitch
    0
    Thank you.
Amazon AWS S3 also has an 'Inventory' report, configured under the bucket's Management tab.
    This generates a compressed .CSV that you can download and sort.
    The report can be enabled or disabled.
    It *did* take about 24 hours for the first report to be generated, but it contained the data that I needed.
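    For anyone finding this later: once the gzipped CSV is downloaded, something like the sketch below sorts it by size. It assumes the inventory was configured as CSV output with the fields Bucket, Key, Size in that order (inventory files have no header row), and the filename is a placeholder:

    ```python
    # Sketch: sort an S3 Inventory report by object size, descending.
    # Assumes CSV fields Bucket, Key, Size in that order (no header row);
    # "inventory.csv.gz" is a placeholder filename.
    import csv
    import gzip

    rows = []
    with gzip.open("inventory.csv.gz", mode="rt", newline="") as f:
        for bucket, key, size in csv.reader(f):
            rows.append((int(size), key))

    for size, key in sorted(rows, reverse=True)[:50]:
        print(f"{size / 1024 / 1024:10.1f} MB  {key}")
    ```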
  • David Gugick
    118
Thanks for the update, Allan. I will add that to my list of knowledge in my brain.
  • Allan Coganovitch
    0
Document, document, and document! :roll: