• StevenLewin
    0
    I need to copy about 8 Terabytes of data to Azure Blob. It will be roughly a few million files and folders of mixed data. Can you please recommend best-practice settings for CloudBerry Drive?

    Secondly, we are busy testing at the moment, and CloudberryDrivehost.exe is using 40% CPU, and then the service also just stops. Do I need any specific settings to handle huge amounts of data?
  • David Gugick
    118
    What is your goal here? Is it just to copy data to Azure or are you looking to use Drive to access the 8 TB of data from Drive after the sync is complete? If it's the latter, I would recommend you open a Support case and discuss your particular use case with the team to see if Drive is the best solution for your needs.

    If you are only looking to copy, I would recommend you look at CloudBerry Explorer Pro.
  • StevenLewin
    0
    Thanks for your response. We only need it as a one-off to move our data to Azure. Does CloudBerry Explorer Pro support drive mappings? We are using Robocopy to move the data.
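
    For context, a Robocopy run against a CloudBerry Drive mapped letter might look like the sketch below. The drive letter, source path, and log path are hypothetical; adjust them to your environment:

    ```shell
    :: Sketch: mirror a local folder to a mapped CloudBerry Drive letter (X:\ assumed).
    :: D:\Data and C:\Logs are placeholder paths.
    robocopy D:\Data X:\Data /E /Z /MT:8 /R:2 /W:5 /NP /LOG:C:\Logs\robocopy-azure.log
    :: /E      copy subfolders, including empty ones
    :: /Z      restartable mode, useful if the link drops mid-file
    :: /MT:8   8 copy threads; keep this modest so the Drive service isn't saturated
    :: /R:2 /W:5  retry failed copies twice, waiting 5 seconds between attempts
    :: /NP     suppress per-file progress to keep the log readable
    ```
    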
  • David Gugick
    118
    I need you to explain the Robocopy usage. Are you saying you were using Robocopy to copy the data to the Drive folder, with the intent of then copying that data to Azure? With Explorer Pro (not the freeware edition, given your number of files) you can simply copy the folders in question from their source location to Azure.

    You can use the trial version of Explorer, which has full Pro capabilities, but not the freeware version, as it has limitations on multi-threading: https://www.msp360.com/download.aspx?prod=cbazure