Copy From Azure Blob Storage to Azure File Share


Updated – 14/02/2022 – Starting with AzCopy version 10.13.0, Microsoft added sync support between Azure Blob <-> Azure Files in addition to copy. The automation tool was updated to support this new scenario.

Azure Files enables you to set up highly available network file shares that can be accessed by using the standard Server Message Block (SMB) protocol or the Network File System (NFS) protocol. That means that multiple VMs can share the same files with both read and write access. You can also read the files using the REST interface or the storage client libraries.

In this article, we will share with you how to automate and copy your data from Azure Blob storage to Azure file share.

Introduction

We decided to put this guide together to help the many readers who have reached out to us asking how to automate copying data from Azure blob storage to an Azure file share.

Say you are storing data in Azure blob storage, and you have a line-of-business (LOB) application that can only read from an SMB file share and not from a blob container. In another scenario, you want to provide users access through an SMB file share by enforcing NTFS ACLs on different data sets that you have in a blob container. You might also have other scenarios; please leave a comment below and share your use case.

For these kinds of scenarios, you have a couple of options. At the time of this writing, you could use Azure Data Box Gateway, which can sync with blobs. There are also other tools you could use, such as AzCopy, Azure Batch, and Azure Data Factory, that can help you move data back and forth. However, using these tools comes with some fidelity loss that you want to be aware of: permissions are not copied, and timestamps such as the last modified time will be lost or changed.

For the purpose of this article, I will use the AzCopy tool, a command-line utility that you can use to copy/sync blobs or files to/from a storage account. To simplify and automate AzCopy, a Runbook will create an Azure Container Instance that runs the copy job inside a container. In this way, we can run the container on a simple schedule to copy the data and only get billed for the time the container was used.

Please make sure to check my previous articles if you have a different use case:

> Sync between two Azure File Shares for Disaster Recovery.

> Copy between Azure File Share and Azure Blob Container.

> Sync between Azure Blob Storage and Azure File Shares.

> Copy files from one Azure storage account to another storage account.

Prerequisites

To follow this article, you need to have the following:

1) Azure subscription – If you don’t have an Azure subscription, you can create a free one here.

2) You need one or two storage accounts, either in the same region and subscription or in different regions and subscriptions.

3) You also need to create at least one container in the blob storage and one Azure file share, either in the same storage account or across two different storage accounts.

4) Lastly, you need to have some files in the container. If you still need to create these test resources, see the sketch right after this list.
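A minimal PowerShell sketch for preparing the test resources, assuming an existing storage account; the resource group, account, container, share, and file names are placeholders:

# Get the storage account context (names are placeholders)
$ctx = (Get-AzStorageAccount -ResourceGroupName "myRG" -Name "mystorageaccount").Context

# Create a blob container and a file share in the same storage account
New-AzStorageContainer -Name "demo-container" -Context $ctx
New-AzStorageShare -Name "demo-share" -Context $ctx

# Upload a sample file to the container so there is something to copy
Set-AzStorageBlobContent -File ".\sample.txt" -Container "demo-container" -Blob "sample.txt" -Context $ctx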

Assuming you have all the prerequisites in place, take the following steps:

Get started

First, we need to create an Azure Automation account, which will let you automate the synchronization and copy process without user interaction. This approach also respects the security of your storage account by not exposing access keys to users.

Create Automation Account

In this step, we will create an Azure Automation account. The account needs an Azure Active Directory (Azure AD) identity to provide authentication for managing resources in Azure with the Azure cmdlets. The updated script below authenticates with the account's system-assigned managed identity (a service principal in Azure AD) rather than a classic Run As account; make sure the identity is enabled and assigned the Contributor role at the subscription (or resource group) level.

Open the Azure Portal, click All services found in the upper left-hand corner. In the list of resources, type Automation. As you begin typing, the list filters based on your input. Select Automation Accounts.

Click +Add. Enter the automation account name, choose the right subscription, resource group, location, and then click Create.
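If you prefer scripting over the portal, here is a minimal sketch using the Az.Automation module; the names are placeholders, and the -AssignSystemIdentity switch assumes a recent version of the module:

# Create the Automation account with a system-assigned managed identity (names are placeholders)
$aa = New-AzAutomationAccount -ResourceGroupName "myRG" -Name "myAutomationAccount" `
    -Location "westeurope" -AssignSystemIdentity

# Grant the identity the Contributor role, scoped here to the resource group of the storage account
New-AzRoleAssignment -ObjectId $aa.Identity.PrincipalId -RoleDefinitionName "Contributor" `
    -Scope (Get-AzResourceGroup -Name "myRG").ResourceId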

Import modules from Gallery

In the next step, you need to import the required modules from the Modules gallery. In your list of Automation Accounts, select the account that you created in the previous step. Then from your automation account, select Modules under Shared Resources. Click the Browse Gallery button to open the Browse Gallery page. You need to import the following modules from the Modules gallery in the order given below:

  1. Az.Accounts
  2. Az.ContainerInstance
  3. Az.Storage
Automation Account – Import Modules
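The same imports can also be scripted; a minimal sketch, assuming placeholder account and resource group names, that pulls each module from the PowerShell Gallery in the required order:

# Import the three modules from the PowerShell Gallery in order (names are placeholders).
# Imports run asynchronously; wait for Az.Accounts to finish before the dependent modules import.
foreach ($module in "Az.Accounts", "Az.ContainerInstance", "Az.Storage") {
    New-AzAutomationModule -ResourceGroupName "myRG" -AutomationAccountName "myAutomationAccount" `
        -Name $module -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/$module"
}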

At the time of this writing, AzCopy is still not available inside Azure Automation Runbooks. For this reason, we will create an Azure Container Instance with AzCopy as part of the container image so we can automate the entire copy process.

Updated – 27/12/2021 – The script below has been updated and tested with the latest Az.ContainerInstance module version 2.1 and above, so there is no longer any need to delete the Az.ContainerInstance module version 2.0 and downgrade to version 1.0.3 as this article previously required.

Create PowerShell Runbook

In this step, you can create multiple Runbooks based on which set of Azure blob container(s) you want to sync/copy to the Azure file share(s). PowerShell Runbooks are based on Windows PowerShell. You directly edit the code of the Runbook using the text editor in the Azure portal. You can also use any offline text editor such as Visual Studio Code and import the Runbook into Azure Automation.

From your automation account, select Runbooks under Process Automation. Click the ‘+ Create a runbook‘ button to open the Create a runbook blade.

Create a runbook
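If you would rather script this step too, a minimal sketch that creates the same empty PowerShell runbook (names are placeholders):

# Create an empty PowerShell runbook to hold the script (names are placeholders)
New-AzAutomationRunbook -ResourceGroupName "myRG" -AutomationAccountName "myAutomationAccount" `
    -Name "Copy-BlobContainerToAzureFileShare" -Type PowerShell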

In this example, we will create a Runbook to copy all the file and directory changes from a specific Azure blob container to a specific file share. You can be as creative as you want and cover multiple Azure blob containers / file shares / directories, etc.

Edit the Runbook

Once you have the Runbook created, you need to edit it and add the script that selects which Azure blob container to copy and which Azure file share to copy the data to. Of course, you can create scripts that suit your environment.

As mentioned earlier, in this example, we will create a Runbook that reads and checks all the files and directories in a specific Azure blob container, and then copies the data over to a specific Azure file share. To maintain a high level of security, we won't pass the storage account keys to AzCopy; instead, we will create a time-limited SAS token URI for each service individually (blob container and file share). The SAS token expires automatically after 60 minutes, so if you regenerate your storage account keys in the future, the automation and copy process won't break.

Please note that you can also update the parameter section below and copy between storage accounts across different subscriptions.
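For example, here is a hedged sketch of how the context section could be extended for a cross-subscription copy; the destination parameters ($destSubscriptionId, $destStorageAccountRG, $destStorageAccountName) are hypothetical additions, not part of the script below:

# Hypothetical sketch: build one storage context per subscription
# Source subscription and storage account
Select-AzSubscription -SubscriptionId $AzureSubscriptionId
$sourceKey = (Get-AzStorageAccountKey -ResourceGroupName $storageAccountRG -AccountName $storageAccountName).Value[0]
$sourceContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $sourceKey

# Destination subscription and storage account ($dest* parameters are hypothetical)
Select-AzSubscription -SubscriptionId $destSubscriptionId
$destKey = (Get-AzStorageAccountKey -ResourceGroupName $destStorageAccountRG -AccountName $destStorageAccountName).Value[0]
$destContext = New-AzStorageContext -StorageAccountName $destStorageAccountName -StorageAccountKey $destKey

# Generate the blob container SAS from $sourceContext and the file share SAS from $destContext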


The automation script is as follows:

<#
.DESCRIPTION
A Runbook example that checks for file and directory changes in recursive mode
for a specific Blob container and then copies the data to an Azure file share by leveraging the AzCopy tool,
which runs in a container inside an Azure Container Instance using a Service Principal in Azure AD.

.NOTES
Filename : Copy-BlobContainerToAzureFileShare
Author   : Charbel Nemnom (Microsoft MVP/MCT) 
Version  : 2.1
Date     : 24-October-2021
Updated  : 25-March-2022
Tested   : Az.ContainerInstance PowerShell module version 2.1 and above

.LINK
To provide feedback or for further assistance please visit:
https://charbelnemnom.com
#>

Param (
[Parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]
[String] $AzureSubscriptionId,
[Parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]
[String] $storageAccountRG,
[Parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]
[String] $storageAccountName,
[Parameter(Mandatory = $true)][ValidateNotNullOrEmpty()]
[String] $storageContainerName,
[Parameter(Mandatory = $true)][ValidateNotNullOrEmpty()] 
[String] $storageFileShareName
)

# Ensures you do not inherit an AzContext in your runbook
Disable-AzContextAutosave -Scope Process

# Connect to Azure with system-assigned managed identity (automation account)
Connect-AzAccount -Identity

# SOURCE Azure Subscription
Select-AzSubscription -SubscriptionId $AzureSubscriptionId

# Get Storage Account Key
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $storageAccountRG -AccountName $storageAccountName).Value[0]

# Set AzStorageContext
$destinationContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey 

# Generate Container SAS URI Token which is valid for 60 minutes ONLY with read and list permission
$blobContainerSASURI = New-AzStorageContainerSASToken -Context $destinationContext `
 -ExpiryTime (Get-Date).AddSeconds(3600) -FullUri -Name $storageContainerName -Permission rl

# Generate File Share SAS URI Token which is valid for 60 minutes ONLY with read and write permission
$fileShareSASURI = New-AzStorageShareSASToken -Context $destinationContext `
 -ExpiryTime (Get-Date).AddSeconds(3600) -FullUri -ShareName $storageFileShareName -Permission rw

# Create azCopy syntax command
$command = "azcopy","copy",$blobContainerSASURI,$fileShareSASURI,"--recursive=true","--overwrite=ifSourceNewer"

# Choose the following syntax if you want to Sync instead of Copy
# $command = "azcopy","sync",$blobContainerSASURI,$fileShareSASURI,"--recursive=true","--delete-destination=true"

# Container Group Name
$jobName = $storageAccountName + "-" + $storageFileShareName + "-azcopy-job"

# Set AZCOPY_BUFFER_GB to 2 GB to prevent the container from running out of memory and crashing.
$envVars = New-AzContainerInstanceEnvironmentVariableObject -Name "AZCOPY_BUFFER_GB" -Value "2"

# Create Azure Container Instance Object and run the AzCopy job
# The container image (peterdavehello/azcopy:latest) is publicly available on Docker Hub and has the latest AzCopy version installed
# You could also create your own private container image and use it instead
# When you create a new container instance, the default compute resources are set to 1vCPU and 1.5GB RAM
# We recommend starting with 2 vCPU and 4 GB memory for large blob containers (e.g. 3 TB)
# You may need to adjust the CPU and memory based on the size and churn of your file share
$container = New-AzContainerInstanceObject -Name $jobName -Image "peterdavehello/azcopy:latest" `
-RequestCpu 2 -RequestMemoryInGb 4 -Command $command -EnvironmentVariable $envVars

# The container will be created in the same location as the resource group of the storage account. Adjust if needed.
$location = (Get-AzResourceGroup -Name $storageAccountRG).Location
$containerGroup = New-AzContainerGroup -ResourceGroupName $storageAccountRG -Name $jobName `
-Container $container -OsType Linux -Location $location -RestartPolicy never

Write-Output ("AzCopy job container group: $jobName, provisioning state: " + $containerGroup.ProvisioningState)

Please note that if you have soft delete enabled on blob storage (which is now the default), you must add the "--overwrite=ifSourceNewer" option to the "copy" command; otherwise, AzCopy would overwrite identical/unchanged files by default and rapidly balloon your storage costs. The script above was updated to take the container soft delete feature into consideration.

Once done, click "Save" in the CMDLETS pane as shown in the figure below.

Save runbook

Then test the script using the “Test pane” to verify it’s working as intended before you publish it.

Once the test is completed successfully, publish the Runbook by clicking Publish. This is a very important step.
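Publishing can also be done with a single cmdlet; a minimal sketch (names are placeholders):

# Publish the runbook so it can be scheduled and started (names are placeholders)
Publish-AzAutomationRunbook -ResourceGroupName "myRG" -AutomationAccountName "myAutomationAccount" `
    -Name "Copy-BlobContainerToAzureFileShare"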

Schedule the Runbook

In the final step, you need to schedule the Runbook to run based on your desired time to copy the changes from the blob container to Azure file share.

Within the same Runbook that you created in the previous step, select Schedules and then click + Add schedule.

So, if you need the Runbook to run every two hours, create a schedule with Recur every 2 Hours, set Set expiration to No, and then click "Create". You can also run it on-demand if you wish to do so.

Add a schedule

While scheduling the Runbook, you can configure and pass the required parameters for the PowerShell Script.

In this example, we need to specify the Azure Subscription ID, Resource Group Name, Storage Account Name, Azure Blob Container Name, and the Azure File Share Name that you want to copy over. The sample script takes those parameters as input.

Once done, click OK twice.
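The same schedule can be created and linked from PowerShell; a hedged sketch where all names and parameter values are placeholders:

# Create a schedule that recurs every 2 hours with no expiration (names are placeholders)
$schedule = New-AzAutomationSchedule -ResourceGroupName "myRG" -AutomationAccountName "myAutomationAccount" `
    -Name "Every2Hours" -StartTime (Get-Date).AddMinutes(10) -HourInterval 2

# The script parameters to pass on every scheduled run (values are placeholders)
$runbookParams = @{
    AzureSubscriptionId  = "<subscription-id>"
    storageAccountRG     = "myRG"
    storageAccountName   = "mystorageaccount"
    storageContainerName = "demo-container"
    storageFileShareName = "demo-share"
}

# Link the schedule to the runbook together with the parameters
Register-AzAutomationScheduledRunbook -ResourceGroupName "myRG" -AutomationAccountName "myAutomationAccount" `
    -RunbookName "Copy-BlobContainerToAzureFileShare" -ScheduleName "Every2Hours" -Parameters $runbookParams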

Test the Runbook

In this section, we will test the Runbook and request an on-demand run to copy the data from an Azure blob container to an Azure file share. This simulates an application or user adding or modifying files directly in Azure blob storage; the data is then copied to the Azure file share automatically.

Browse to the recently created Runbook, and on the overview page click the “Start” button. Enter the required parameters as input and then click “OK“.

The job will kick in, and after a short period of time, you will see the output and logs under the "Output" tab, verifying that the copy job finished successfully, as shown in the figure below.

Runbook job output
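The on-demand run can also be triggered from PowerShell; a minimal sketch, reusing the placeholder parameter values from the scheduling step:

# Start the runbook on demand and capture the job object (names are placeholders)
$job = Start-AzAutomationRunbook -ResourceGroupName "myRG" -AutomationAccountName "myAutomationAccount" `
    -Name "Copy-BlobContainerToAzureFileShare" -Parameters $runbookParams

# Once the job has had time to run, read all of its output streams
Get-AzAutomationJobOutput -ResourceGroupName "myRG" -AutomationAccountName "myAutomationAccount" `
    -Id $job.JobId -Stream Any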

You can also monitor the success or failure of these schedules using the “Jobs” page of Runbooks under Resources.

You can see the next scheduled run using the "Schedules" page; in my example, the Runbook will run every 2 hours, and so forth…

Monitor runbook jobs
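The job history can also be pulled with PowerShell for quick monitoring; a minimal sketch (names are placeholders):

# List recent jobs for the runbook with their status and timing (names are placeholders)
Get-AzAutomationJob -ResourceGroupName "myRG" -AutomationAccountName "myAutomationAccount" `
    -RunbookName "Copy-BlobContainerToAzureFileShare" |
    Select-Object Status, StartTime, EndTime | Format-Table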

That's it, there you have it!

If you have any feedback or changes that everyone should benefit from, please feel free to leave a comment below.

Summary

In this article, we showed you how to copy data from an Azure blob container to an Azure file share using the AzCopy tool running in a container. In this way, we can run the container with copy jobs on a simple schedule and only get billed for the time the container is used.

At the time of this writing, if you delete some files from the Azure blob container, they won't be deleted from the Azure file share automatically, because this is a copy job and not a synchronization solution. However, starting with AzCopy version 10.13.0, Microsoft added sync support between Azure Blob <-> Azure Files in addition to copy, and the automation tool described in this article was updated to support this new scenario.

Do you want to learn more about Azure Storage including Azure Blobs and Azure File Shares? Make sure to check my recently published online course here: Azure Storage Essential Training.

Thank you for reading my blog.

If you have any questions or feedback, please leave a comment.

-Charbel Nemnom-


6 thoughts on “Copy From Azure Blob Storage to Azure File Share”


  1. Thanks, Charbel! Excellent write-up.
    Another scenario is the opposite of this. Say you have AFS-based SMB file or departmental shares. You utilize FSRM to monitor old files for archiving. FSRM dumps expired files to a D:\expired folder, and we need to move them to Blob Storage on the Archive tier to save money on storage costs. How can we preserve NTFS permissions when copying over and copying back using the solution you described in this article?

  2. Hello IJ, thanks for the comment and feedback!
    For your particular case and scenario, I would recommend using Azure File Sync where you can sync your local D:\expired folder to Azure file share (Cool) tier to reduce costs.
    1) Check this post on how to get started with Azure File Sync.
    2) I would also recommend joining the Azure storage account to your local Active Directory to preserve NTFS permissions when copying over and copying back.
    3) Check this post to create an Azure file share and understand the different tiers. You need to select the Cool tier.
    4) You could also enable cloud tiering in Azure File Sync to save disk space on your local D:\expired folder.
    5) Optional, you could also enable Azure Backup for the Azure file share (E.g. expired) in Azure.
    Hope this helps!

  3. Is it possible to have this script target specific folders within the Blob to move to AFS?

    Is it possible to do a move rather than a copy? If not, what’s the best way to validate everything has been moved from the Blob to AFS prior to a cleanup/delete job scheduled on the Blob?

  4. Hello Daniel, thanks for the comment!
    Yes, you could target a specific folder within the Blob to move to AFS.
    You need to append the virtual directory name inside the container to the $blobContainerSASURI command. Here is an example of how to construct the entire URI:

    azcopy copy "https://[account].blob.core.windows.net/[container]/[path/to/virtual/dir]/?[SAS]" "https://[account].file.core.windows.net/[share]/[path/to/dir]/?[SAS]" --recursive=true --overwrite=ifSourceNewer

    There is no move operation in AzCopy by default; I would suggest copying instead of syncing, and then deleting.
    The best way to validate is to check the log files and see the copy status and failed files. Check here for more details.
    Hope it helps!

  5. Thank you for the reply Charbel.

    Would it be possible to feed multiple commands to the same ACI in this script? I’d like to take advantage of the fact that azcopy is thread blocking and throw in an azcopy rm targeted at the Blob to remove the files that were just copied.

  6. Hello Daniel, thanks for your comment!
    Unfortunately, it's not possible to run multiple commands at one time.
    Please check the restrictions of the exec command for ACI here.
    The only way is to create an interactive session with the container instance and execute commands continuously after you create the ACI.
    Hope it helps!
