Compressed file corruption - Metadata "Content-Encoding" used
I regularly move files between different AWS S3 accounts. Among them are compressed tar files (.tar.gz). This wasn't a problem until we started setting the metadata tag "Content-Encoding" to make the files easier for a browser/user to handle.
Moving from S3 to S3 works fine, but if the files are copied/moved through a local computer they arrive corrupt or decompressed on the other end. The corruption appears when the file is large (>2 GB) and contains compiled/unreadable data.
Basic test: make two 100 MB text files, then tar and compress them. Upload to S3 with the tool and add the metadata tag "Content-Encoding" with a value of gzip. Download back to the computer and the file is now uncompressed. If I remove the tag, this doesn't occur.
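For context, this matches standard HTTP semantics: "Content-Encoding: gzip" tells a client that the body is a compressed representation it should decode before handing the data over, which is exactly what happens in the test above. A minimal stdlib-only Python simulation of a client that honors the header (the function name and flow are illustrative, not CloudBerry's actual code):

```python
import gzip

def download(body: bytes, headers: dict) -> bytes:
    """Simulate an HTTP client that honors Content-Encoding.

    If the metadata says the body is gzip-encoded, the client
    transparently decompresses it before writing to disk.
    """
    if headers.get("Content-Encoding") == "gzip":
        return gzip.decompress(body)
    return body

original = b"hello world" * 1000      # stand-in for the .tar archive
stored = gzip.compress(original)      # the .tar.gz actually uploaded

# With the metadata tag: the saved file is silently decompressed.
assert download(stored, {"Content-Encoding": "gzip"}) == original

# Without the tag: the .tar.gz arrives byte-for-byte intact.
assert download(stored, {}) == stored
```

So the tool isn't necessarily misbehaving: the tag declares the object's bytes to be an encoded transport form, and any header-aware client will undo the encoding on download.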
Is there any way to tell the tool not to modify the file because of this metadata?
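I don't know of a CloudBerry setting for this, but as an illustration of one generic workaround: download through a client that simply ignores "Content-Encoding". Python's urllib, for example, does not decode the body, so the .tar.gz arrives byte-for-byte intact even with the tag set. A self-contained sketch using a tiny local server as a stand-in for S3 (all names here are hypothetical):

```python
import gzip
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

PAYLOAD = gzip.compress(b"archive contents")  # stand-in for the .tar.gz

class S3Stub(BaseHTTPRequestHandler):
    # Minimal stand-in for an S3 GET on an object with the tag set.
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Encoding", "gzip")
        self.send_header("Content-Length", str(len(PAYLOAD)))
        self.end_headers()
        self.wfile.write(PAYLOAD)

    def log_message(self, *args):  # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), S3Stub)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/file.tar.gz"
data = urllib.request.urlopen(url).read()
server.shutdown()

# urllib does not decode Content-Encoding: the gzip magic bytes survive.
assert data[:2] == b"\x1f\x8b" and data == PAYLOAD
```

The same principle applies to any transfer path: whether the file survives intact depends on whether the client honors the header, not on the object's bytes in S3, which never change.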
Thanks in advance.
Cloudberry PRO w/maint Build
© 2024 MSP360 Forum