The contents of this article are solely my own work. You are welcome to share, reuse and incorporate its contents in other publications so long as due credit is provided.
Overview
If your application is hosted on an IIS web server within a virtual machine, you might receive requests to either archive the logs externally or make them accessible for analysis as needed by external developers.
In this example we will be using PowerShell automation running on the virtual machine as a scheduled task to continuously export the log files to an Azure Blob Storage container.
A comparable approach can be used to transfer the logs to a different storage type. However, if there is infrequent access to the logs (which is the case here), utilizing cool or cold access tiers on Azure Storage Blobs can result in substantial cost savings.
Request Summary
Export all recent IIS log files created within the last 6 hours to an Azure storage blob. The job should run 4 times a day every 6 hours to cover 24 hours. In case of an error, an identifiable event should be triggered in event logs.
When exporting the IIS logs, we are likely to encounter errors on the most recent log files, because they are usually still locked by the IIS process. To overcome this, we designate a second folder for the script to copy the most recent log files into.
The script treats files modified within the last 30 minutes of each run as likely locked and copies them to that second folder before uploading. It then exports the rest of the logs that were modified within the previous 6 hours (but are older than 30 minutes) directly.
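To illustrate the idea, here is a minimal sketch of the two time-window filters (the folder path below is a placeholder; the full script later in this article uses the same logic):
# Files touched in the last 30 minutes are treated as likely locked by IIS
$likelyLocked = Get-ChildItem -Path 'C:\inetpub\logs\LogFiles' -Filter '*.log' -Recurse |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddMinutes(-30) }
# Files between 30 minutes and 6 hours old can be uploaded directly
$safeToUpload = Get-ChildItem -Path 'C:\inetpub\logs\LogFiles' -Filter '*.log' -Recurse |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddHours(-6) -and $_.LastWriteTime -lt (Get-Date).AddMinutes(-30) }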
The overall process can be simplified via the chart below:

Requirements
- A storage account and blob storage container (on cold tier)
- Local server account with privilege to run scheduled tasks and scripts
Create a storage account and set the default tier to Cold.

If you’re using an existing storage account, you can either change the above settings under Storage Account -> Settings -> Configuration, or specify the blob tier during the upload by adding -StandardBlobTier Cool to the Set-AzStorageBlobContent command in the script.
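If you prefer to create the storage account from PowerShell rather than the portal, a minimal sketch is below (the resource group, account name and location are placeholder values; the account-level default tier is set to Cool here, since the Cold default may require the portal or a recent Az.Storage version as described above):
# Create a general-purpose v2 storage account; all names below are placeholders
New-AzStorageAccount -ResourceGroupName 'rg-iis-logs' `
    -Name 'stiislogs001' `
    -Location 'westeurope' `
    -SkuName 'Standard_LRS' `
    -Kind 'StorageV2' `
    -AccessTier 'Cool'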
Create the blob container via PowerShell or in the portal and create a SAS token that will be used in the script. Make sure to restrict access to the server's public IP address.
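For reference, a rough PowerShell sketch for creating the container and generating the SAS token is below (the account name, key, container name, permissions, expiry and IP address are placeholders to adapt; -FullUri returns the complete URL that the script expects in $SASUri):
# Build a storage context from the account key (placeholder values)
$ctx = New-AzStorageContext -StorageAccountName 'stiislogs001' -StorageAccountKey '<storage account key>'
# Create the container that will hold the IIS logs
New-AzStorageContainer -Name 'iislogs' -Context $ctx
# Generate a container SAS restricted to the server's public IP, with read/write/list permissions
New-AzStorageContainerSASToken -Name 'iislogs' `
    -Permission 'rwl' `
    -ExpiryTime (Get-Date).AddYears(1) `
    -IPAddressOrRange '<server public IP>' `
    -Context $ctx -FullUri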

The script
The script should be self-explanatory and easy to troubleshoot if run manually in PowerShell. I have also tried to add as many comments as possible to make it easier to follow.
You will need the Az.Storage module installed on the server. This can be added to the script via a simple check as below:
$PsModuleInstalled = Get-InstalledModule -Name 'Az.Storage' -ErrorAction SilentlyContinue
if ($null -ne $PsModuleInstalled) {
    Write-Host 'Az.Storage PS module available' `n
}
else {
    Install-Module -Name Az.Storage -AllowClobber -Scope AllUsers -Force
}
The main script is available below and can also be downloaded from my Github here: https://github.com/amirarefi/AzureLogShipping
I chose eventID 61333 to log the error event to the event logs.
cls
##declare the SAS URI (shared access token from Azure) and parse out the SAS token
$SASUri = 'Your storage blob SAS token goes here'
$uri = [System.Uri] $SASUri
$sasToken = $uri.Query
#declare variables and storage account details
$folderPath = 'Your first absolute folder path to your IIS logs'
$folderPath2 = 'Your second absolute folder path to your copy folder'
$logfileExtension = '*.log' #you can change this extension based on your needs
$storageAccountName = $uri.DnsSafeHost.Split(".")[0]
$container = $uri.LocalPath.Substring(1)
$storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken
#Error definition, we are using unique event ID 61333 for easier identification in the event logs
$ErrorEvent = @{
    LogName   = 'Application'
    Source    = 'Application Error'
    EventID   = 61333
    EntryType = 'Error'
    Message   = "The Azure Log Shipping script for IIS encountered an error."
}
Try {
    ######identify the most recent files which are likely under lock and can't be moved###
    $AllRecentFiles = Get-ChildItem -Path $folderPath -Exclude *copy* |
        Get-ChildItem -Filter $logfileExtension -Recurse |
        Where-Object { $_.LastWriteTime -gt (Get-Date).AddMinutes(-30) } |
        ForEach-Object { $_.FullName }
    #copy each recent file into the copy folder, then upload it to Azure
    foreach ($LatestFile in $AllRecentFiles) {
        $filenameNoExtension = [io.path]::GetFileNameWithoutExtension($LatestFile)
        $filenameExtension = [io.path]::GetExtension($LatestFile)
        #prepare file name
        $MostRecentFile_Name = "$filenameNoExtension$filenameExtension"
        Write-Host `n$LatestFile 'is a very recent file and will be copied over then uploaded first'`n
        #construct the absolute path of the copy and copy the file there
        $copyPath = Join-Path -Path $folderPath2 -ChildPath $MostRecentFile_Name
        Copy-Item -Path $LatestFile -Destination $copyPath
        #now upload the copied file into the blob container
        Set-AzStorageBlobContent -File $copyPath -Container $container -Context $storageContext -Force -ErrorAction Stop
    }
    ######Section End###
}
Catch {
    Write-Error $_
    Write-Host `n'Error occurred, sending error to event log' `n
    Write-EventLog @ErrorEvent
}
Try {
    #get all the files within the specified path modified within the past 6 hours but older than 30 minutes
    $fileToUpload = Get-ChildItem -Path $folderPath -Exclude *copy* |
        Get-ChildItem -Filter $logfileExtension -Recurse |
        Where-Object { $_.LastWriteTime -gt (Get-Date).AddHours(-6) -and $_.LastWriteTime -lt (Get-Date).AddMinutes(-30) } |
        ForEach-Object { $_.FullName }
    Write-Host `n"The following list of files will now be uploaded:"`n
    $fileToUpload
    Write-Host `n'Uploading files:' `n
    ##upload each file to Azure blob storage using the storage context defined previously
    foreach ($file in $fileToUpload) {
        Set-AzStorageBlobContent -File $file -Container $container -Context $storageContext -Force -ErrorAction Stop
    }
}
Catch {
    Write-Error $_
    Write-Host `n'Error occurred, sending error to event log' `n
    Write-EventLog @ErrorEvent
}
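Once the script has run, you can confirm whether any failures were recorded by querying the Application log for event ID 61333, for example:
# Show the most recent errors raised by the log shipping script
Get-WinEvent -FilterHashtable @{ LogName = 'Application'; Id = 61333 } -MaxEvents 10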
After you have saved the script on your server and modified it as per your requirements, you can use the batch-file technique to avoid issues with running .ps1 scripts via scheduled tasks. To do this, create a batch file in the same location as the script with the following content:
@ECHO OFF
SET ThisScriptsDirectory=%~dp0
SET PowerShellScriptPath=%ThisScriptsDirectory%[nameofyourscript].ps1
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& {Start-Process PowerShell -ArgumentList '-NoProfile -ExecutionPolicy Bypass -File ""%PowerShellScriptPath%""' -Verb RunAs}"
Now go ahead and add the scheduled task with the parameters below (a PowerShell sketch for registering an equivalent task follows the list):
- Account: Local account created earlier
- Run whether user is logged on or not
- Action: start a program, simply point to the batch file created above
- Schedule: every 6 hours or as needed
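If you prefer to register the task from PowerShell instead of the Task Scheduler GUI, a rough equivalent is sketched below (the task name, batch file path and account are placeholders; on older Windows versions you may also need to supply -RepetitionDuration):
# Action: run the wrapper batch file created above (path is a placeholder)
$action = New-ScheduledTaskAction -Execute 'C:\Scripts\Run-AzureLogShipping.bat'
# Trigger: start at midnight and repeat every 6 hours
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date).Date -RepetitionInterval (New-TimeSpan -Hours 6)
# Register the task under the local account created earlier so it runs whether the user is logged on or not
Register-ScheduledTask -TaskName 'IIS Log Shipping' -Action $action -Trigger $trigger `
    -User 'YOURSERVER\logshipper' -Password '<account password>' -RunLevel Highest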
Thank you
Thank you for taking the time to read this article. I hope you found the information valuable and helpful. If you have any questions or feedback, feel free to reach out 🙂
