Authors: Luke Boening, GitHub Copilot (Claude Sonnet 4.5) • Last update: December 14, 2025

Azure Storage Sync Solution

Technical documentation for synchronizing files between Windows Server 2022 hosts using Azure Blob Storage.

Note: this document was authored by GitHub Copilot using Claude Sonnet 4.5. I have reviewed the content.

The unit, integration, and end-to-end tests have not yet been run.

The link to my GitHub repository with the code (private repo for now): Azure Storage Sync

Overview

This solution describes how to synchronize a subset of files and folders between two Windows Server 2022 hosts when direct SMB access is unavailable. The approach uses Azure Blob Storage as an intermediary transfer point.

The solution supports syncing multiple independent folder paths while preserving the complete folder structure.

Architecture

Oracle DB → Generate Paths → Source Server → Compress Multiple Paths → Upload to Azure Blob → Download on Destination → Extract with Full Structure

Prerequisites

  • Azure Storage Account
  • Azure Storage Account access key or SAS token
  • PowerShell 5.1 or later on both servers
  • Azure PowerShell module (Az.Storage)
  • Network connectivity to Azure from both servers
  • Oracle database access (for dynamic path selection)
  • Oracle Data Provider for .NET or Oracle Instant Client (optional)

Installation

Install Azure PowerShell Module

On both source and destination servers:

Install-Module -Name Az.Storage -Force -AllowClobber
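
If the module may already be present, a guarded install avoids re-downloading it on every run. A minimal sketch:

# Install Az.Storage only if it is not already available, then load it
if (-not (Get-Module -ListAvailable -Name Az.Storage)) {
    Install-Module -Name Az.Storage -Force -AllowClobber
}
Import-Module Az.Storage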

Install Oracle Data Access (Optional)

For direct Oracle integration from PowerShell:

# Install Oracle.ManagedDataAccess
Install-Package Oracle.ManagedDataAccess -Source nuget.org

Configuration

Azure Storage Setup

  1. Create a storage account in Azure Portal
  2. Create a blob container (e.g., file-sync)
  3. Generate a SAS token or use an access key for authentication (a scripted sketch follows)
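
The same setup can be scripted with the Az PowerShell modules. A minimal sketch, assuming the Az.Accounts and Az.Storage modules are installed and a resource group named rg-file-sync already exists (the account, resource group, and region names are illustrative):

# Create the storage account and the file-sync container (names are examples)
Connect-AzAccount

$account = New-AzStorageAccount -ResourceGroupName "rg-file-sync" `
    -Name "yourstorageaccount" `
    -Location "eastus" `
    -SkuName "Standard_LRS"

$context = $account.Context
New-AzStorageContainer -Name "file-sync" -Context $context -Permission Off | Out-Null

# Container-scoped SAS token valid for 30 days
$sasToken = New-AzStorageContainerSASToken -Name "file-sync" `
    -Context $context `
    -Permission "rwdl" `
    -ExpiryTime (Get-Date).AddDays(30)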

Store Credentials in Variables

# For testing only: keep credentials in script variables (see Security Considerations for secure storage)
$storageAccountName = "yourstorageaccount"
$storageAccountKey = "your-storage-key"
$secureKey = ConvertTo-SecureString $storageAccountKey -AsPlainText -Force

Dynamic Path Selection from Oracle Database

Oracle Database Schema

The solution assumes you have a table or view that contains path information. Example schema:

CREATE TABLE file_paths (
    path_id NUMBER PRIMARY KEY,
    base_path VARCHAR2(500),
    relative_path VARCHAR2(1000),
    path_type VARCHAR2(50),
    status VARCHAR2(20),
    last_modified DATE,
    created_date DATE
);

-- Example data: over 60,000 paths, e.g.
--   Remediation\V145\1
--   Production\V1620\0
--   Testing\V200\5

Oracle SQL Queries to Generate Sync Configuration

Basic Query with JSON Output

-- Generate JSON configuration for specific paths
SELECT JSON_OBJECT(
    'BasePath' VALUE 'C:\Data',
    'SyncPaths' VALUE JSON_ARRAYAGG(relative_path)
) AS sync_config
FROM (
    SELECT relative_path
    FROM file_paths
    WHERE status = 'ACTIVE'
      AND path_type = 'PRODUCTION'
      AND last_modified >= SYSDATE - 1  -- Modified in last 24 hours
    ORDER BY last_modified DESC
    FETCH FIRST 5 ROWS ONLY
);

Advanced Query with Filtering Criteria

-- More complex selection logic
WITH selected_paths AS (
    SELECT 
        relative_path,
        path_type,
        last_modified,
        ROW_NUMBER() OVER (PARTITION BY path_type ORDER BY last_modified DESC) as rn
    FROM file_paths
    WHERE status = 'ACTIVE'
      AND (
          -- Critical production paths
          (path_type = 'PRODUCTION' AND last_modified >= SYSDATE - 1)
          OR
          -- Recent remediation paths
          (path_type = 'REMEDIATION' AND last_modified >= SYSDATE - 7)
      )
)
SELECT JSON_OBJECT(
    'BasePath' VALUE 'C:\Data',
    'SyncPaths' VALUE (
        SELECT JSON_ARRAYAGG(relative_path ORDER BY path_type, last_modified DESC)
        FROM selected_paths
        WHERE rn <= 5
    ),
    'GeneratedAt' VALUE TO_CHAR(SYSDATE, 'YYYY-MM-DD HH24:MI:SS'),
    'SelectionCriteria' VALUE 'Active paths modified in last 7 days'
) AS sync_config
FROM DUAL;

Query with Path Metadata

-- Generate JSON with additional metadata
SELECT JSON_OBJECT(
    'BasePath' VALUE 'C:\Data',
    'GeneratedAt' VALUE TO_CHAR(SYSDATE, 'YYYY-MM-DD"T"HH24:MI:SS'),
    'TotalPathsInDB' VALUE (SELECT COUNT(*) FROM file_paths WHERE status = 'ACTIVE'),
    'SelectedCount' VALUE COUNT(*),
    'SyncPaths' VALUE JSON_ARRAYAGG(relative_path ORDER BY last_modified DESC),
    'PathDetails' VALUE JSON_ARRAYAGG(
        JSON_OBJECT(
            'Path' VALUE relative_path,
            'Type' VALUE path_type,
            'LastModified' VALUE TO_CHAR(last_modified, 'YYYY-MM-DD HH24:MI:SS')
        )
        ORDER BY last_modified DESC
    )
) AS sync_config
FROM (
    SELECT 
        relative_path,
        path_type,
        last_modified
    FROM file_paths
    WHERE status = 'ACTIVE'
      AND path_type IN ('PRODUCTION', 'REMEDIATION')
      AND last_modified >= SYSDATE - 7
    ORDER BY 
        CASE path_type 
            WHEN 'PRODUCTION' THEN 1 
            WHEN 'REMEDIATION' THEN 2 
            ELSE 3 
        END,
        last_modified DESC
    FETCH FIRST 5 ROWS ONLY
);

PowerShell Script to Query Oracle and Generate Config

# filepath: Generate-SyncConfig.ps1

param(
    [Parameter(Mandatory=$true)]
    [string]$OracleConnectionString,

    [Parameter(Mandatory=$false)]
    [string]$OutputPath = "sync-config.json",

    [Parameter(Mandatory=$false)]
    [int]$MaxPaths = 5,

    [Parameter(Mandatory=$false)]
    [int]$DaysBack = 7
)

# Load the Oracle managed data provider (adjust this path to match your installation)
Add-Type -Path "C:\oracle\odp.net\managed\common\Oracle.ManagedDataAccess.dll"

try {
    Write-Host "[$(Get-Date)] Connecting to Oracle database..."

    $connection = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($OracleConnectionString)
    $connection.Open()

    $query = @"
SELECT JSON_OBJECT(
    'BasePath' VALUE 'C:\Data',
    'GeneratedAt' VALUE TO_CHAR(SYSDATE, 'YYYY-MM-DD"T"HH24:MI:SS'),
    'SyncPaths' VALUE (
        SELECT JSON_ARRAYAGG(relative_path ORDER BY last_modified DESC)
        FROM (
            SELECT relative_path, last_modified
            FROM file_paths
            WHERE status = 'ACTIVE'
              AND last_modified >= SYSDATE - :days_back
            ORDER BY last_modified DESC
            FETCH FIRST :max_paths ROWS ONLY
        )
    )
) AS config
FROM DUAL
"@

    $command = $connection.CreateCommand()
    $command.CommandText = $query
    $command.BindByName = $true  # bind by name so parameter order does not matter
    $command.Parameters.Add("days_back", $DaysBack) | Out-Null
    $command.Parameters.Add("max_paths", $MaxPaths) | Out-Null

    Write-Host "[$(Get-Date)] Executing query to select top $MaxPaths paths..."

    $reader = $command.ExecuteReader()

    if ($reader.Read()) {
        $jsonConfig = $reader["config"]

        # Parse and reformat JSON
        $config = $jsonConfig | ConvertFrom-Json

        Write-Host "[$(Get-Date)] Selected $($config.SyncPaths.Count) path(s):"
        foreach ($path in $config.SyncPaths) {
            Write-Host "  - $path"
        }

        # Save to file
        $jsonConfig | Set-Content -Path $OutputPath -Encoding UTF8
        Write-Host "[$(Get-Date)] Configuration saved to: $OutputPath"
    } else {
        throw "No data returned from query"
    }

    $reader.Close()
    $connection.Close()

} catch {
    Write-Error "[$(Get-Date)] Error: $_"
    exit 1
}

Alternative: SQL*Plus Script for Generating JSON

For environments without the .NET Oracle provider:

-- filepath: generate-config.sql

SET PAGESIZE 0
SET LINESIZE 32767
SET LONG 32767
SET LONGCHUNKSIZE 32767
SET TRIMSPOOL ON
SET TRIM ON
SET FEEDBACK OFF
SET VERIFY OFF
SET HEADING OFF

SPOOL sync-config.json

SELECT JSON_OBJECT(
    'BasePath' VALUE 'C:\Data',
    'GeneratedAt' VALUE TO_CHAR(SYSDATE, 'YYYY-MM-DD"T"HH24:MI:SS'),
    'SyncPaths' VALUE (
        SELECT JSON_ARRAYAGG(relative_path ORDER BY last_modified DESC)
        FROM (
            SELECT relative_path, last_modified
            FROM file_paths
            WHERE status = 'ACTIVE'
              AND last_modified >= SYSDATE - 7
            ORDER BY last_modified DESC
            FETCH FIRST 5 ROWS ONLY
        )
    )
)
FROM DUAL;

SPOOL OFF
EXIT;

Execute with:

# Run SQL*Plus to generate config
sqlplus username/password@database @generate-config.sql

PowerShell Wrapper for SQL*Plus Approach

# filepath: Generate-SyncConfig-SQLPlus.ps1

param(
    [Parameter(Mandatory=$true)]
    [string]$OracleUser,

    [Parameter(Mandatory=$true)]
    [string]$OraclePassword,

    [Parameter(Mandatory=$true)]
    [string]$OracleDatabase,

    [Parameter(Mandatory=$false)]
    [string]$SQLScriptPath = "generate-config.sql",

    [Parameter(Mandatory=$false)]
    [string]$OutputPath = "sync-config.json"
)

try {
    Write-Host "[$(Get-Date)] Generating sync configuration from Oracle..."

    # Set Oracle environment
    $env:NLS_LANG = "AMERICAN_AMERICA.UTF8"

    # Execute SQL*Plus
    $connectionString = "$OracleUser/$OraclePassword@$OracleDatabase"

    $process = Start-Process -FilePath "sqlplus" `
        -ArgumentList "-S", $connectionString, "@$SQLScriptPath" `
        -NoNewWindow -Wait -PassThru

    if ($process.ExitCode -eq 0 -and (Test-Path $OutputPath)) {
        Write-Host "[$(Get-Date)] Configuration generated successfully"

        # Validate JSON
        $config = Get-Content $OutputPath | ConvertFrom-Json
        Write-Host "[$(Get-Date)] Selected $($config.SyncPaths.Count) path(s):"
        foreach ($path in $config.SyncPaths) {
            Write-Host "  - $path"
        }
    } else {
        throw "SQL*Plus execution failed with exit code: $($process.ExitCode)"
    }

} catch {
    Write-Error "[$(Get-Date)] Error: $_"
    exit 1
}

Example Oracle Views for Path Selection

-- View for production paths requiring sync
CREATE OR REPLACE VIEW v_sync_production_paths AS
SELECT 
    relative_path,
    'PRODUCTION' as sync_type,
    last_modified,
    CASE 
        WHEN last_modified >= SYSDATE - 1 THEN 1  -- Last 24 hours
        WHEN last_modified >= SYSDATE - 7 THEN 2  -- Last week
        ELSE 3
    END as priority
FROM file_paths
WHERE status = 'ACTIVE'
  AND path_type = 'PRODUCTION'
ORDER BY priority, last_modified DESC;

-- View for remediation paths requiring sync
CREATE OR REPLACE VIEW v_sync_remediation_paths AS
SELECT 
    relative_path,
    'REMEDIATION' as sync_type,
    last_modified,
    CASE 
        WHEN last_modified >= SYSDATE - 1 THEN 1
        WHEN last_modified >= SYSDATE - 7 THEN 2
        ELSE 3
    END as priority
FROM file_paths
WHERE status = 'ACTIVE'
  AND path_type = 'REMEDIATION'
ORDER BY priority, last_modified DESC;

-- Combined view for top sync candidates
CREATE OR REPLACE VIEW v_sync_candidates AS
SELECT * FROM (
    SELECT * FROM v_sync_production_paths
    UNION ALL
    SELECT * FROM v_sync_remediation_paths
)
ORDER BY priority, last_modified DESC
FETCH FIRST 5 ROWS ONLY;

Integrated Workflow Script

# filepath: Sync-WithOracleIntegration.ps1

param(
    [Parameter(Mandatory=$true)]
    [string]$OracleConnectionString,

    [Parameter(Mandatory=$true)]
    [string]$StorageAccountName,

    [Parameter(Mandatory=$true)]
    [string]$StorageAccountKey,

    [Parameter(Mandatory=$false)]
    [string]$ContainerName = "file-sync",

    [Parameter(Mandatory=$false)]
    [int]$MaxPaths = 5
)

try {
    # Step 1: Generate configuration from Oracle
    Write-Host "[$(Get-Date)] Step 1: Querying Oracle for sync paths..."

    $configPath = "$env:TEMP\sync-config-$(Get-Date -Format 'yyyyMMddHHmmss').json"

    & .\Generate-SyncConfig.ps1 `
        -OracleConnectionString $OracleConnectionString `
        -OutputPath $configPath `
        -MaxPaths $MaxPaths

    if (-not (Test-Path $configPath)) {
        throw "Failed to generate configuration file"
    }

    # Step 2: Upload using generated configuration
    Write-Host "[$(Get-Date)] Step 2: Uploading files to Azure..."

    & .\Upload-ToAzure.ps1 `
        -ConfigFile $configPath `
        -StorageAccountName $StorageAccountName `
        -StorageAccountKey $StorageAccountKey `
        -ContainerName $ContainerName

    Write-Host "[$(Get-Date)] Sync completed successfully"

} catch {
    Write-Error "[$(Get-Date)] Error: $_"
    exit 1
}

Define Sync Paths

Create a configuration file to specify paths to sync:

{
  "BasePath": "C:\\Data",
  "GeneratedAt": "2025-12-13T02:30:00",
  "SyncPaths": [
    "Remediation\\V145\\1",
    "Remediation\\V145\\2",
    "Production\\V1620\\0",
    "Production\\V1620\\1",
    "Testing\\V200\\5"
  ]
}
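
Before the first upload, it is worth confirming that every configured path actually exists under the base path. A short validation sketch against the example file above:

# Load sync-config.json and report any SyncPaths entries missing on disk
$config = Get-Content "sync-config.json" -Raw | ConvertFrom-Json
$missing = $config.SyncPaths | Where-Object { -not (Test-Path (Join-Path $config.BasePath $_)) }

if ($missing) {
    Write-Warning "Missing path(s): $($missing -join ', ')"
} else {
    Write-Host "All $($config.SyncPaths.Count) configured path(s) exist under $($config.BasePath)"
}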

Source Server Script

Compress Multiple Paths with Structure

# Configuration
$basePath = "C:\Data"
$syncPaths = @(
    "Remediation\V145\1",
    "Remediation\V145\2",
    "Production\V1620\0",
    "Production\V1620\1"
)

$storageAccountName = "yourstorageaccount"
$storageAccountKey = "your-storage-key"
$containerName = "file-sync"
$timestamp = Get-Date -Format "yyyyMMdd-HHmmss"
$zipFileName = "sync-$timestamp.zip"
$tempZipPath = "$env:TEMP\$zipFileName"
$tempStagingPath = "$env:TEMP\sync-staging-$timestamp"

# Create staging directory
New-Item -ItemType Directory -Path $tempStagingPath -Force | Out-Null

# Copy each path to staging while preserving structure
foreach ($syncPath in $syncPaths) {
    $sourcePath = Join-Path $basePath $syncPath
    $destPath = Join-Path $tempStagingPath $syncPath

    if (Test-Path $sourcePath) {
        Write-Host "Copying $sourcePath..."

        # Create destination directory structure
        $destParent = Split-Path $destPath -Parent
        New-Item -ItemType Directory -Path $destParent -Force | Out-Null

        # Copy files recursively
        Copy-Item -Path $sourcePath -Destination $destPath -Recurse -Force
    } else {
        Write-Warning "Path not found: $sourcePath"
    }
}

# Compress staging directory
Write-Host "Compressing files..."
Compress-Archive -Path "$tempStagingPath\*" -DestinationPath $tempZipPath -Force

# Connect to Azure Storage
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

# Upload to blob storage
Write-Host "Uploading $zipFileName to Azure Blob Storage..."
Set-AzStorageBlobContent -File $tempZipPath `
    -Container $containerName `
    -Blob $zipFileName `
    -Context $context `
    -Force

# Clean up
Remove-Item $tempStagingPath -Recurse -Force
Remove-Item $tempZipPath -Force

Write-Host "Upload complete: $zipFileName"

Destination Server Script

Download and Extract with Full Structure

# Configuration
$destinationBasePath = "C:\Data"
$storageAccountName = "yourstorageaccount"
$storageAccountKey = "your-storage-key"
$containerName = "file-sync"
$tempDownloadPath = "$env:TEMP"

# Connect to Azure Storage
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

# Get latest blob
Write-Host "Finding latest sync file..."
$latestBlob = Get-AzStorageBlob -Container $containerName -Context $context | 
    Where-Object { $_.Name -like "sync-*.zip" } | 
    Sort-Object LastModified -Descending | 
    Select-Object -First 1

if ($null -eq $latestBlob) {
    Write-Error "No sync files found in container"
    exit 1
}

$blobName = $latestBlob.Name
$downloadPath = Join-Path $tempDownloadPath $blobName
$extractPath = Join-Path $tempDownloadPath "sync-extract-$(Get-Date -Format 'yyyyMMddHHmmss')"

# Download blob
Write-Host "Downloading $blobName..."
Get-AzStorageBlobContent -Blob $blobName `
    -Container $containerName `
    -Destination $downloadPath `
    -Context $context `
    -Force

# Extract to temporary location
Write-Host "Extracting files..."
Expand-Archive -Path $downloadPath -DestinationPath $extractPath -Force

# Copy to final destination preserving structure
Write-Host "Copying to destination with structure preservation..."
Copy-Item -Path "$extractPath\*" -Destination $destinationBasePath -Recurse -Force

# Clean up
Remove-Item $downloadPath -Force
Remove-Item $extractPath -Recurse -Force

Write-Host "Sync complete - files extracted to $destinationBasePath"

Complete Automation Script

Source Server - Upload with Multiple Paths

# filepath: Upload-ToAzure.ps1

param(
    [Parameter(Mandatory=$true)]
    [string]$BasePath,

    [Parameter(Mandatory=$true)]
    [string[]]$SyncPaths,

    [Parameter(Mandatory=$true)]
    [string]$StorageAccountName,

    [Parameter(Mandatory=$true)]
    [string]$StorageAccountKey,

    [Parameter(Mandatory=$false)]
    [string]$ContainerName = "file-sync",

    [Parameter(Mandatory=$false)]
    [string]$ConfigFile
)

$ErrorActionPreference = "Stop"

try {
    # Load from config file if provided
    if ($ConfigFile -and (Test-Path $ConfigFile)) {
        Write-Host "[$(Get-Date)] Loading configuration from $ConfigFile..."
        $config = Get-Content $ConfigFile | ConvertFrom-Json
        $BasePath = $config.BasePath
        $SyncPaths = $config.SyncPaths
    }

    # Validate base path exists
    if (-not (Test-Path $BasePath)) {
        throw "Base path not found: $BasePath"
    }

    Write-Host "[$(Get-Date)] Starting sync from base path: $BasePath"
    Write-Host "[$(Get-Date)] Syncing $($SyncPaths.Count) path(s)"

    # Generate unique filename
    $timestamp = Get-Date -Format "yyyyMMdd-HHmmss"
    $zipFileName = "sync-$timestamp.zip"
    $tempZipPath = Join-Path $env:TEMP $zipFileName
    $tempStagingPath = Join-Path $env:TEMP "sync-staging-$timestamp"

    # Create staging directory
    New-Item -ItemType Directory -Path $tempStagingPath -Force | Out-Null

    # Copy each path to staging
    $copiedCount = 0
    foreach ($syncPath in $SyncPaths) {
        $sourcePath = Join-Path $BasePath $syncPath
        $destPath = Join-Path $tempStagingPath $syncPath

        if (Test-Path $sourcePath) {
            Write-Host "[$(Get-Date)] Copying: $syncPath"

            # Create destination directory structure
            $destParent = Split-Path $destPath -Parent
            if (-not (Test-Path $destParent)) {
                New-Item -ItemType Directory -Path $destParent -Force | Out-Null
            }

            # Copy files recursively
            Copy-Item -Path $sourcePath -Destination $destPath -Recurse -Force
            $copiedCount++

            # Get size info
            $size = (Get-ChildItem -Path $sourcePath -Recurse -File | Measure-Object -Property Length -Sum).Sum / 1MB
            Write-Host "[$(Get-Date)]   Size: $([math]::Round($size, 2)) MB"
        } else {
            Write-Warning "[$(Get-Date)] Path not found: $sourcePath"
        }
    }

    if ($copiedCount -eq 0) {
        throw "No valid paths found to sync"
    }

    Write-Host "[$(Get-Date)] Successfully staged $copiedCount path(s)"

    # Compress staging directory
    Write-Host "[$(Get-Date)] Compressing files..."
    Compress-Archive -Path "$tempStagingPath\*" -DestinationPath $tempZipPath -Force -CompressionLevel Optimal

    $archiveSize = (Get-Item $tempZipPath).Length / 1MB
    Write-Host "[$(Get-Date)] Archive size: $([math]::Round($archiveSize, 2)) MB"

    # Upload to Azure
    $context = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

    Write-Host "[$(Get-Date)] Uploading to Azure Blob Storage..."
    Set-AzStorageBlobContent -File $tempZipPath `
        -Container $ContainerName `
        -Blob $zipFileName `
        -Context $context `
        -Force | Out-Null

    Write-Host "[$(Get-Date)] Upload successful: $zipFileName"

    # Create manifest file
    $manifest = @{
        Timestamp = $timestamp
        BasePath = $BasePath
        SyncPaths = $SyncPaths
        ArchiveSize = $archiveSize
        FileCount = (Get-ChildItem -Path $tempStagingPath -Recurse -File | Measure-Object).Count
    }

    $manifestJson = $manifest | ConvertTo-Json
    $manifestFile = Join-Path $env:TEMP "manifest-$timestamp.json"
    $manifestJson | Set-Content $manifestFile

    # Upload manifest
    Set-AzStorageBlobContent -File $manifestFile `
        -Container $ContainerName `
        -Blob "manifest-$timestamp.json" `
        -Context $context `
        -Force | Out-Null

    Write-Host "[$(Get-Date)] Manifest uploaded"

    # Cleanup old blobs (keep last 5)
    $allBlobs = Get-AzStorageBlob -Container $ContainerName -Context $context | 
        Where-Object { $_.Name -like "sync-*.zip" } | 
        Sort-Object LastModified -Descending

    if ($allBlobs.Count -gt 5) {
        $blobsToDelete = $allBlobs | Select-Object -Skip 5
        foreach ($blob in $blobsToDelete) {
            Remove-AzStorageBlob -Blob $blob.Name -Container $ContainerName -Context $context -Force
            Write-Host "[$(Get-Date)] Deleted old blob: $($blob.Name)"

            # Delete corresponding manifest
            $manifestName = $blob.Name -replace '^sync-', 'manifest-' -replace '\.zip$', '.json'
            Remove-AzStorageBlob -Blob $manifestName -Container $ContainerName -Context $context -Force -ErrorAction SilentlyContinue
        }
    }

    # Cleanup local temp files
    Remove-Item $tempStagingPath -Recurse -Force
    Remove-Item $tempZipPath -Force
    Remove-Item $manifestFile -Force

    Write-Host "[$(Get-Date)] Sync completed successfully"

} catch {
    Write-Error "[$(Get-Date)] Error: $_"

    # Cleanup on error
    if ($tempStagingPath -and (Test-Path $tempStagingPath)) { Remove-Item $tempStagingPath -Recurse -Force -ErrorAction SilentlyContinue }
    if ($tempZipPath -and (Test-Path $tempZipPath)) { Remove-Item $tempZipPath -Force -ErrorAction SilentlyContinue }

    exit 1
}

Destination Server - Download with Structure Preservation

# filepath: Download-FromAzure.ps1

param(
    [Parameter(Mandatory=$true)]
    [string]$DestinationBasePath,

    [Parameter(Mandatory=$true)]
    [string]$StorageAccountName,

    [Parameter(Mandatory=$true)]
    [string]$StorageAccountKey,

    [Parameter(Mandatory=$false)]
    [string]$ContainerName = "file-sync",

    [Parameter(Mandatory=$false)]
    [switch]$BackupExisting
)

$ErrorActionPreference = "Stop"

try {
    # Ensure destination base path exists
    if (-not (Test-Path $DestinationBasePath)) {
        New-Item -ItemType Directory -Path $DestinationBasePath -Force | Out-Null
        Write-Host "[$(Get-Date)] Created destination base path: $DestinationBasePath"
    }

    # Connect to Azure Storage
    $context = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

    # Get latest blob
    Write-Host "[$(Get-Date)] Finding latest sync file..."
    $latestBlob = Get-AzStorageBlob -Container $ContainerName -Context $context | 
        Where-Object { $_.Name -like "sync-*.zip" } | 
        Sort-Object LastModified -Descending | 
        Select-Object -First 1

    if ($null -eq $latestBlob) {
        Write-Error "No sync files found in container"
        exit 1
    }

    $blobName = $latestBlob.Name
    $timestamp = $blobName -replace '^sync-', '' -replace '\.zip$', ''
    $downloadPath = Join-Path $env:TEMP $blobName
    $extractPath = Join-Path $env:TEMP "sync-extract-$timestamp"

    Write-Host "[$(Get-Date)] Latest sync file: $blobName"
    Write-Host "[$(Get-Date)] Last modified: $($latestBlob.LastModified)"

    # Download manifest if available
    $manifestName = "manifest-$timestamp.json"
    $manifestPath = Join-Path $env:TEMP $manifestName

    try {
        Get-AzStorageBlobContent -Blob $manifestName `
            -Container $ContainerName `
            -Destination $manifestPath `
            -Context $context `
            -Force -ErrorAction Stop | Out-Null

        $manifest = Get-Content $manifestPath | ConvertFrom-Json
        Write-Host "[$(Get-Date)] Manifest loaded:"
        Write-Host "[$(Get-Date)]   Synced paths: $($manifest.SyncPaths.Count)"
        Write-Host "[$(Get-Date)]   File count: $($manifest.FileCount)"
        Write-Host "[$(Get-Date)]   Archive size: $([math]::Round($manifest.ArchiveSize, 2)) MB"
    } catch {
        Write-Warning "[$(Get-Date)] Manifest not found, continuing without it"
    }

    # Backup existing files if requested
    if ($BackupExisting -and (Test-Path $DestinationBasePath)) {
        $backupPath = "$DestinationBasePath-backup-$timestamp"
        Write-Host "[$(Get-Date)] Creating backup: $backupPath"
        Copy-Item -Path $DestinationBasePath -Destination $backupPath -Recurse -Force
    }

    # Download blob
    Write-Host "[$(Get-Date)] Downloading archive..."
    Get-AzStorageBlobContent -Blob $blobName `
        -Container $ContainerName `
        -Destination $downloadPath `
        -Context $context `
        -Force | Out-Null

    $downloadSize = (Get-Item $downloadPath).Length / 1MB
    Write-Host "[$(Get-Date)] Downloaded: $([math]::Round($downloadSize, 2)) MB"

    # Extract to temporary location
    Write-Host "[$(Get-Date)] Extracting archive..."
    if (Test-Path $extractPath) {
        Remove-Item $extractPath -Recurse -Force
    }
    New-Item -ItemType Directory -Path $extractPath -Force | Out-Null

    Expand-Archive -Path $downloadPath -DestinationPath $extractPath -Force

    # Copy to final destination preserving complete structure
    Write-Host "[$(Get-Date)] Copying files to destination..."

    $items = Get-ChildItem -Path $extractPath -Recurse
    $filesCopied = 0

    foreach ($item in $items) {
        $relativePath = $item.FullName.Substring($extractPath.Length + 1)
        $destPath = Join-Path $DestinationBasePath $relativePath

        if ($item.PSIsContainer) {
            if (-not (Test-Path $destPath)) {
                New-Item -ItemType Directory -Path $destPath -Force | Out-Null
            }
        } else {
            $destDir = Split-Path $destPath -Parent
            if (-not (Test-Path $destDir)) {
                New-Item -ItemType Directory -Path $destDir -Force | Out-Null
            }

            Copy-Item -Path $item.FullName -Destination $destPath -Force
            $filesCopied++
        }
    }

    Write-Host "[$(Get-Date)] Copied $filesCopied file(s) to $DestinationBasePath"
    Write-Host "[$(Get-Date)] Folder structure preserved"

    # Cleanup temporary files
    Remove-Item $downloadPath -Force
    Remove-Item $extractPath -Recurse -Force
    if (Test-Path $manifestPath) {
        Remove-Item $manifestPath -Force
    }

    Write-Host "[$(Get-Date)] Sync completed successfully"

} catch {
    Write-Error "[$(Get-Date)] Error: $_"

    # Cleanup on error
    if ($downloadPath -and (Test-Path $downloadPath)) { Remove-Item $downloadPath -Force -ErrorAction SilentlyContinue }
    if ($extractPath -and (Test-Path $extractPath)) { Remove-Item $extractPath -Recurse -Force -ErrorAction SilentlyContinue }

    exit 1
}

Usage Examples

Using Configuration File

Create sync-config.json:

{
  "BasePath": "C:\\Data",
  "SyncPaths": [
    "Remediation\\V145\\1",
    "Remediation\\V145\\2",
    "Production\\V1620\\0",
    "Production\\V1620\\1",
    "Testing\\V200\\5"
  ]
}

Run the upload script:

.\Upload-ToAzure.ps1 `
    -ConfigFile "sync-config.json" `
    -StorageAccountName "youraccount" `
    -StorageAccountKey "yourkey"

Using Command Line Parameters

.\Upload-ToAzure.ps1 `
    -BasePath "C:\Data" `
    -SyncPaths @("Remediation\V145\1", "Production\V1620\0") `
    -StorageAccountName "youraccount" `
    -StorageAccountKey "yourkey"

Download with Backup

.\Download-FromAzure.ps1 `
    -DestinationBasePath "C:\Data" `
    -StorageAccountName "youraccount" `
    -StorageAccountKey "yourkey" `
    -BackupExisting

Scheduling with Task Scheduler

Source Server Task with Config File

$action = New-ScheduledTaskAction -Execute "PowerShell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Upload-ToAzure.ps1 -ConfigFile 'C:\Scripts\sync-config.json' -StorageAccountName 'youraccount' -StorageAccountKey 'yourkey'"

$trigger = New-ScheduledTaskTrigger -Daily -At 2:00AM

$principal = New-ScheduledTaskPrincipal -UserId "SYSTEM" -LogonType ServiceAccount -RunLevel Highest

Register-ScheduledTask -TaskName "Azure Multi-Path Sync Upload" `
    -Action $action `
    -Trigger $trigger `
    -Principal $principal `
    -Description "Upload multiple paths to Azure Blob Storage"

Destination Server Task with Backup

$action = New-ScheduledTaskAction -Execute "PowerShell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Download-FromAzure.ps1 -DestinationBasePath 'C:\Data' -StorageAccountName 'youraccount' -StorageAccountKey 'yourkey' -BackupExisting"

$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Hours 4) -RepetitionDuration (New-TimeSpan -Days 3650)

$principal = New-ScheduledTaskPrincipal -UserId "SYSTEM" -LogonType ServiceAccount -RunLevel Highest

Register-ScheduledTask -TaskName "Azure Multi-Path Sync Download" `
    -Action $action `
    -Trigger $trigger `
    -Principal $principal `
    -Description "Download and extract files from Azure Blob Storage with backup"

Security Considerations

Use SAS Tokens Instead of Storage Keys

# Generate SAS token with limited permissions
$startTime = Get-Date
$expiryTime = $startTime.AddDays(30)

$sasToken = New-AzStorageContainerSASToken -Name $containerName `
    -Context $context `
    -Permission "rwdl" `
    -StartTime $startTime `
    -ExpiryTime $expiryTime

# Use SAS token in scripts
$context = New-AzStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken

Encrypt and Store Credentials Securely

# Prompt once and store the key encrypted with DPAPI (enter the storage key as the password)
$credential = Get-Credential
$credential.Password | ConvertFrom-SecureString | Set-Content "C:\Secure\azure-key.txt"

# Retrieve in script
$securePassword = Get-Content "C:\Secure\azure-key.txt" | ConvertTo-SecureString
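
The upload and download scripts expect a plain-text key, so the SecureString must be converted back just before the storage context is built. A minimal sketch that continues from the snippet above:

# Convert the stored SecureString back to plain text only at the point of use
$plainKey = [System.Net.NetworkCredential]::new("", $securePassword).Password
$context = New-AzStorageContext -StorageAccountName "yourstorageaccount" -StorageAccountKey $plainKey

Note that ConvertFrom-SecureString without a -Key parameter uses DPAPI, so the stored file can only be decrypted by the same user account on the same server (for example, the account that runs the scheduled task).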

Monitoring and Logging

Enhanced Logging Function

function Write-SyncLog {
    param(
        [string]$Message,
        [ValidateSet('Info','Warning','Error')]
        [string]$Level = 'Info'
    )

    $timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    $logMessage = "$timestamp [$Level] $Message"
    $logPath = "C:\Logs\azure-sync.log"

    Add-Content -Path $logPath -Value $logMessage

    switch ($Level) {
        'Warning' { Write-Warning $Message }
        'Error' { Write-Error $Message }
        default { Write-Host $logMessage }
    }
}

Write-SyncLog "Starting sync process..." -Level Info
Write-SyncLog "Path not found" -Level Warning

Monitor Sync History

# Check recent syncs
$blobs = Get-AzStorageBlob -Container $containerName -Context $context | 
    Where-Object { $_.Name -like "sync-*.zip" } | 
    Sort-Object LastModified -Descending |
    Select-Object Name, @{N='Size(MB)';E={[math]::Round($_.Length/1MB,2)}}, LastModified

$blobs | Format-Table -AutoSize
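
The manifests uploaded alongside each archive provide a quick health check as well. A sketch that reads the most recent manifest (reusing $containerName and $context from above):

# Download and display the most recent manifest
$latestManifest = Get-AzStorageBlob -Container $containerName -Context $context |
    Where-Object { $_.Name -like "manifest-*.json" } |
    Sort-Object LastModified -Descending |
    Select-Object -First 1

if ($latestManifest) {
    $manifestTemp = Join-Path $env:TEMP $latestManifest.Name
    Get-AzStorageBlobContent -Blob $latestManifest.Name -Container $containerName `
        -Destination $manifestTemp -Context $context -Force | Out-Null
    Get-Content $manifestTemp | ConvertFrom-Json | Format-List Timestamp, FileCount, ArchiveSize, SyncPaths
}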

Troubleshooting

Common Issues

  1. Path structure not preserved
       • Verify the base path is correctly set
       • Check that relative paths are used in the SyncPaths array
       • Ensure there are no trailing slashes in paths

  2. Files missing after extraction
       • Check the manifest file for the expected file count
       • Verify sufficient disk space on the destination
       • Review extraction logs for errors

  3. Performance issues with many paths
       • Consider batching paths into multiple sync jobs
       • Use compression level "Fastest" for large datasets
       • Implement parallel processing for path copying (see Performance Optimization below)

Verify Folder Structure

# After extraction, verify structure
function Test-FolderStructure {
    param([string]$BasePath, [string[]]$ExpectedPaths)

    foreach ($path in $ExpectedPaths) {
        $fullPath = Join-Path $BasePath $path
        if (Test-Path $fullPath) {
            Write-Host "✓ Found: $path" -ForegroundColor Green
        } else {
            Write-Host "✗ Missing: $path" -ForegroundColor Red
        }
    }
}

Test-FolderStructure -BasePath "C:\Data" -ExpectedPaths @("Remediation\V145\1", "Production\V1620\0")

Performance Optimization

Parallel Path Processing

# Process multiple paths in parallel (source server)
$jobs = @()

foreach ($syncPath in $SyncPaths) {
    $jobs += Start-Job -ScriptBlock {
        param($base, $path, $staging)
        $source = Join-Path $base $path
        $dest = Join-Path $staging $path

        if (Test-Path $source) {
            $destParent = Split-Path $dest -Parent
            New-Item -ItemType Directory -Path $destParent -Force | Out-Null
            Copy-Item -Path $source -Destination $dest -Recurse -Force
        }
    } -ArgumentList $BasePath, $syncPath, $tempStagingPath
}

$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job

Cost Estimation

  • Storage costs: ~$0.018 per GB/month (LRS)
  • Transaction costs: ~$0.004 per 10,000 operations
  • Data transfer: Ingress free, egress varies by region
  • Estimated monthly cost (100 GB, daily sync): ~$2-5 (see the worked example below)
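
As a rough worked example using the rates listed above (actual Azure pricing varies by region and changes over time), the recurring monthly cost can be estimated like this:

# Illustration only: uses the example rates above, not live Azure pricing
$gbStored        = 100
$storageRate     = 0.018           # USD per GB per month (LRS example rate)
$dailyOperations = 200             # assumed uploads, downloads, and listings per day
$txnRate         = 0.004 / 10000   # USD per operation (example rate)

$storageCost = $gbStored * $storageRate          # 100 * 0.018 = 1.80 USD
$txnCost     = $dailyOperations * 30 * $txnRate  # fractions of a cent
Write-Host ("Estimated monthly cost: ~{0:N2} USD storage + {1:N4} USD transactions" -f $storageCost, $txnCost)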

Alternative Solutions

  • Azure File Sync: Native Azure solution for file synchronization
  • AzCopy: Command-line utility for optimized transfers (example after this list)
  • Azure Data Factory: Enterprise-grade data movement
  • Robocopy with Azure Files: Mount Azure Files as network drive
  • DFS Replication: If VPN connection becomes available
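
For comparison, a one-way copy of a single path with AzCopy looks like this. A minimal sketch using a container-level SAS URL (the account name and SAS token are placeholders):

# Recursive one-way copy of a single path to the file-sync container
azcopy copy "C:\Data\Production\V1620\0" `
    "https://yourstorageaccount.blob.core.windows.net/file-sync?<sas-token>" `
    --recursive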

Conclusion

This solution provides a reliable method for synchronizing multiple independent folder paths between Windows servers using Azure Blob Storage as an intermediary. The approach preserves complete folder structure, supports multiple paths, includes manifest tracking, and can be fully automated with Task Scheduler.

The solution is flexible, secure, and scalable for enterprise use cases where direct file sharing is not available.