• AlexS
    0
    Hi all,

    I have a question regarding setting default storage class when uploading files to S3.

    What rule needs to be added in the upload rules options to upload files directly to Deep Archive, bypassing S3 Standard?

    Thanks,

    Alex
  • Matt
    91
    Drive doesn't support storage classes yet; that functionality will be added later.

    Note that, due to how Deep Archive works, it's unlikely that we'll implement it in Drive. We recommend using our backup tool for this purpose.
  • David Gugick
    118
    As Matt said, Glacier is for archiving, not real-time access. Deep Archive requires up to 12 hours to make data ready for retrieval, and retrieval can be costly. It also has a 180-day minimum retention requirement. If you're looking for a lower-cost storage option with no retention requirements, consider Backblaze B2.
  • AlexS
    0
    Thanks for answering.
    The good thing is that I'm OK with the 12-hour object restore and the 180 days of retention.

    I thought that setting the x-amz-storage-class HTTP header to DEEP_ARCHIVE would do the trick, as described at https://www.msp360.com/resources/blog/how-to-archive-data-from-amazon-s3-to-glacier-with-explorer/
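    For reference, the header mentioned above corresponds to the StorageClass parameter of the S3 PutObject API. The sketch below is a minimal illustration of how a direct-to-Deep-Archive upload would be parameterized; the helper function and the bucket/key names are hypothetical, not part of any MSP360 product.

    ```python
    # Hypothetical sketch: building the arguments for an S3 PutObject call
    # that writes an object directly into a given storage class. The
    # StorageClass parameter is what the x-amz-storage-class header carries.
    def put_object_args(bucket, key, storage_class="DEEP_ARCHIVE"):
        """Return keyword arguments for an S3 put_object call that
        uploads straight into the requested storage class."""
        return {
            "Bucket": bucket,
            "Key": key,
            "StorageClass": storage_class,  # sent as x-amz-storage-class
        }

    # With boto3 this would be used roughly as:
    #   import boto3
    #   s3 = boto3.client("s3")
    #   s3.put_object(Body=data, **put_object_args("my-bucket", "backup.tar"))
    ```

    As the replies below note, this works for tools that expose the storage class on upload, but not for Drive.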
  • David Gugick
    118
    That won't work with Drive. As Matt said, it seems like you need a backup product, not a cloud storage extension through Drive. Consider downloading MSP360 Backup; then you can back up to S3 and use a Lifecycle Policy to move the data automatically to Glacier Deep Archive.
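    The lifecycle approach described above can be sketched as follows. This is a minimal, illustrative configuration, not a recommended setup: the rule ID, the `backups/` prefix, and the 30-day transition delay are all assumptions you would adjust for your own bucket.

    ```python
    # Hypothetical sketch: a bucket lifecycle rule that transitions objects
    # uploaded under a prefix from S3-Standard to Glacier Deep Archive
    # after a number of days.
    def deep_archive_lifecycle(prefix="backups/", days=30):
        """Return a lifecycle configuration dict of the shape accepted by
        boto3's put_bucket_lifecycle_configuration."""
        return {
            "Rules": [
                {
                    "ID": "move-backups-to-deep-archive",  # illustrative name
                    "Filter": {"Prefix": prefix},
                    "Status": "Enabled",
                    "Transitions": [
                        # After `days`, objects move to the archive tier.
                        {"Days": days, "StorageClass": "DEEP_ARCHIVE"}
                    ],
                }
            ]
        }

    # With boto3 this would be applied roughly as:
    #   import boto3
    #   s3 = boto3.client("s3")
    #   s3.put_bucket_lifecycle_configuration(
    #       Bucket="my-bucket",
    #       LifecycleConfiguration=deep_archive_lifecycle())
    ```

    With a rule like this, the backup tool writes to a hot storage class and S3 itself handles the move to Deep Archive, so no direct-to-Glacier upload support is needed in the client.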
  • AlexS
    0
    Do you have an ETA for the new version with support for Deep Archive?
  • David Gugick
    118
    We are never going to support Glacier with Drive. They are incompatible. Any storage class support added to Drive will likely only include hot S3 storage classes: S3-Standard, S3-Infrequent Access, and S3-Intelligent Tiering.
  • AlexS
    0
    David,

    Thank you for your answer