How To Sync Between Azure Blob Storage Accounts and Between Azure File Shares



A while ago, I wrote about how to copy data from one Azure storage account in one subscription to another storage account in another Azure subscription.

In this blog post, I will show you how to sync between two Azure Blob Storage accounts and how to sync between Azure File Shares. This scenario is useful if you have an application that is designed to read/write to a single file share, and another application deployed in a different Azure subscription that requires access to the same files.

For this scenario, we will make use of the AzCopy tool, a command-line utility that you can use to copy or sync blobs and files to or from a storage account. If you are new to this tool, make sure to check the get started document from Microsoft here. The good news is that Microsoft added sync support to AzCopy starting with version 10.3.0.
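If you want to confirm that your local copy of AzCopy is recent enough to include the sync command, a quick check from the shell could look like this (just a convenience sketch; it assumes the azcopy binary is on your PATH):

```shell
# Print the installed AzCopy version so you can confirm it is 10.3.0 or later;
# fall back to a hint when the binary is not on the PATH.
if command -v azcopy >/dev/null 2>&1; then
  azcopy --version
else
  echo "azcopy not found - download it or use Azure Cloud Shell"
fi
```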


The prerequisites are simple:

  • Download AzCopy V10.3.2 from here, or jump into an Azure Cloud Shell session; AzCopy is included as part of Cloud Shell.
  • You need two different storage accounts, either in the same region and subscription or in different regions and subscriptions.
  • You also need to create at least one container in each blob storage account, OR one Azure file share in each of the two storage accounts.
  • Download Microsoft Azure Storage Explorer from here if you don’t have it yet; I will use it to create the Shared Access Signature (SAS) tokens. You can also generate SAS tokens using the Azure Portal or PowerShell, but personally, I prefer Azure Storage Explorer. As a side note, a SAS token is more secure than the storage account key: with SAS, you can restrict the IP addresses that can access the account, control permissions in a more granular fashion, set when the token expires, and choose which services it grants access to (Blobs, Files, Queues, Tables). Please note that SAS tokens are signed with your storage account key, so invalidating (regenerating) the key invalidates the SAS tokens as well; keep that in mind.
  • Last, you need to upload some files to the Blob container(s) or Azure file share(s).

Sync Between Azure File Shares

Assuming you have some files in each blob container or file share, you can take the following steps:

In this example, I have 10 files uploaded to the first Azure file share, named az-fileshare-01.

In the second Azure file share, named az-fileshare-02, I have similar files but without the copied versions (5 files).

To sync an entire Azure file share from one storage account to another, each authenticated with a SAS token, you can use the following syntax. This command goes through all the files in the source file share recursively and syncs the contents to the destination file share in the second storage account.

azcopy sync "https://[sourceaccount].file.core.windows.net/[share]?[SAS]" "https://[targetaccount].file.core.windows.net/[share]?[SAS]" --recursive=true
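As a concrete sketch, suppose the source share az-fileshare-01 lives in a hypothetical account named srcstorage01 and the destination share az-fileshare-02 in a second hypothetical account named dststorage02 (the account names and SAS values below are placeholders, not real credentials):

```shell
# Placeholder SAS tokens - paste in the real tokens generated for each account.
SRC_SAS='<source-sas-token>'
DST_SAS='<destination-sas-token>'

# Compose the full file-share URLs: https://<account>.file.core.windows.net/<share>?<SAS>
SRC_URL="https://srcstorage01.file.core.windows.net/az-fileshare-01?${SRC_SAS}"
DST_URL="https://dststorage02.file.core.windows.net/az-fileshare-02?${DST_SAS}"

# Recursively sync every file from the source share to the destination share.
# Guarded so the sketch degrades gracefully when azcopy is not installed.
if command -v azcopy >/dev/null 2>&1; then
  azcopy sync "$SRC_URL" "$DST_URL" --recursive=true
fi
```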

The output will show the total number of files at the source and destination and the total number of copy transfers (5 files), as well as the elapsed time in minutes.

Open the second Azure file share in the second storage account and verify that the files are synced over (10 files in total).

If I run the sync command again, you can see that the source and destination file shares are already in sync (10 files).

Sync Between Azure Blob Storage

You can use the same syntax to sync a blob container as well. To sync an entire blob container from one storage account to another, each authenticated with a SAS token, you can use the following syntax. This command goes through all the blobs in the source container recursively and syncs the contents to the destination container in the second storage account.

azcopy sync "https://[sourceaccount].blob.core.windows.net/[container]?[SAS]" "https://[targetaccount].blob.core.windows.net/[container]?[SAS]" --recursive=true
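The blob variant of the same sketch only changes the endpoint (blob.core.windows.net instead of file.core.windows.net). The account name srcstorage01, dststorage02, and the container name az-container-01 are again hypothetical placeholders:

```shell
# Placeholder SAS tokens and hypothetical account/container names.
SRC_SAS='<source-sas-token>'
DST_SAS='<destination-sas-token>'

# Blob URLs use the blob endpoint: https://<account>.blob.core.windows.net/<container>?<SAS>
SRC_URL="https://srcstorage01.blob.core.windows.net/az-container-01?${SRC_SAS}"
DST_URL="https://dststorage02.blob.core.windows.net/az-container-01?${DST_SAS}"

# Recursively sync the source container's blobs to the destination container.
if command -v azcopy >/dev/null 2>&1; then
  azcopy sync "$SRC_URL" "$DST_URL" --recursive=true
fi
```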

Last but certainly not least, you can automate and schedule the entire process with an Azure Automation runbook to run the sync every hour, for example.

How it works…

Please note that synchronization is one-way. In other words, you choose which of the two endpoints is the source and which one is the destination.

The sync command compares file names and last modified timestamps. You can set the optional --delete-destination flag to "true" or "prompt" to delete files in the destination directory if those files no longer exist in the source directory. If you set --delete-destination to true, AzCopy deletes files without prompting; if you want a prompt to appear before AzCopy deletes a file, set --delete-destination to "prompt" instead.
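For example, a sketch of a true mirror of the file shares from the earlier example, prompting before each deletion (URLs and tokens are placeholders, as before):

```shell
# Placeholder SAS tokens for the two hypothetical accounts.
SRC_SAS='<source-sas-token>'
DST_SAS='<destination-sas-token>'

# Sync, then prompt before deleting any destination file that no longer
# exists at the source. Use --delete-destination=true to skip the prompt.
if command -v azcopy >/dev/null 2>&1; then
  azcopy sync \
    "https://srcstorage01.file.core.windows.net/az-fileshare-01?${SRC_SAS}" \
    "https://dststorage02.file.core.windows.net/az-fileshare-02?${DST_SAS}" \
    --recursive=true --delete-destination=prompt
fi
```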

To prevent accidental deletions, please make sure to enable the soft delete feature on your Azure Storage blobs before you use the --delete-destination=prompt|true flag.

For more information about AzCopy Sync syntax with additional parameters, please check the following document from Microsoft.

That’s it, there you have it!

Thank you for reading my blog.

If you have any questions or feedback, please leave a comment.

-Charbel Nemnom-

