Azcopy to Azure blob

I need to copy data located on a Windows Server 2016 machine to our Azure Blob storage. The data contains folders and subfolders, some paths exceed 260 characters, and I need the file structure to be copied as is.
I tried AzCopy but got an exception on a long file name.
Is there any way to make AzCopy handle long names? What about something like Robocopy? I'm not sure that's possible.
What about the tools my colleague recommended, like GoodSync, Gs Richcopy 360, and SecureCopy? Does anyone have an idea?
Another question: how can I throttle the connection while copying to Azure Blob with AzCopy? Our network bandwidth is consumed by as much as 90%, which affects our datacenter.
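For the long-path part of the question, a rough command-line sketch; the source path, storage account, container, and SAS token below are placeholders, not values from this thread:

```shell
# AzCopy v10 handles long paths on Windows (it prefixes them with \\?\ internally),
# and --recursive preserves the folder/subfolder structure in the blob container.
azcopy copy "D:\Data" "https://mystorageaccount.blob.core.windows.net/mycontainer?<SAS-token>" --recursive

# Robocopy also copies paths longer than 260 characters by default
# (the /256 switch is what turns long-path support OFF).
# /E copies subfolders including empty ones; /MT:16 uses 16 threads.
robocopy "D:\Data" "E:\Staging\Data" /E /MT:16
```

Robocopy itself targets file-system paths, not blob endpoints, so it would only help as a staging step, not as a direct upload to Azure Blob.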

To be honest, I posted this question in many communities but never got a workable answer.
 
I have read all the provided info, but "Prevent AzCopy Uploads from maxing out Internet Connection Speed" does not cover my scenario. As you know, AzCopy is a powerful tool that can take up all your bandwidth; even after applying the method in your link, it still consumes about 70% to 80% for just one server. What happens when I copy from more than one server to Azure Storage?
My request is simple: is there a way to make AzCopy cap the consumed bandwidth at 50 Mbps per server (assume my actual speed is 1 Gbps)?
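For reference, current AzCopy v10 releases do expose a throughput cap; whether it was available in the version used in this thread, I can't say. A sketch, with the account, container, and SAS token as placeholders:

```shell
# --cap-mbps caps this AzCopy job's throughput at roughly 50 megabits per second.
azcopy copy "D:\Data" "https://mystorageaccount.blob.core.windows.net/backup?<SAS-token>" --recursive --cap-mbps 50

# Lowering concurrency also reduces peak bandwidth, though much less precisely
# (Windows cmd syntax; use "export" instead of "set" in a POSIX shell):
set AZCOPY_CONCURRENCY_VALUE=4
```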
 

noramore

Posts: 10   +3
Now I understand; you are right, this is not an option in AzCopy. I will check with the support team for the tool I use now (Gs Richcopy 360) and report back.
Note: we haven't used this tool with any cloud, because our work is with local NAS or remote servers, but I have good info about AzCopy.
 

noramore

Good news: I got a reply from gurusquad support. They informed me that Gs Richcopy 360 has an option to limit the bandwidth to whatever amount you want.
So you can throttle cloud jobs so that they cannot consume more than what you allocate to them.
(screenshot provided by support: 1613980892179.png)

I hope this helps
 
I think you got it; this is exactly the option I am looking for. My question is: can I set multiple jobs to run at the same time?

For example, I want to transfer from each server to Azure Storage at a max speed of 50 Mbps, with all servers transferring at the same time. Is that possible?
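As a quick sanity check on these numbers (assuming the 1 Gbps uplink and 50 Mbps per-server cap stated earlier in the thread):

```python
# Rough capacity check: how many capped servers can upload
# concurrently before the shared uplink is saturated?
link_mbps = 1000  # 1 Gbps uplink (assumption from the thread)
cap_mbps = 50     # per-server cap requested in the thread

max_servers = link_mbps // cap_mbps
print(max_servers)  # 20 servers at full cap fill the link

# Fraction of the link used by, say, 5 servers all at their cap:
print(5 * cap_mbps / link_mbps)  # 0.25, i.e. 25% of the uplink
```

So with the cap in place, even a handful of servers uploading at once leaves most of the link free for the datacenter's other traffic.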
 

noramore

Happy to hear that, and yes, you can copy from more than one server to Azure Blob at the same time using the connection-throttling option.
Note: Gs Richcopy 360 copies directly to Azure Blob from Windows machines or Azure VMs only; the tool does not work on Linux.
 
I just started today with the Enterprise version of Gs Richcopy 360. All jobs were configured successfully, and all servers now upload to Azure Storage at a max speed of 50 Mbps. Our users have no problems with speed, and the datacenter performs better now.
Thanks, noramore