This is the second part in my series of posts describing how I automated our Production –> Staging database restores. This part is still fairly early in the process and was largely an exercise in honing my PowerShell skills. Anyway, on to the script:
Function GetBackupFromFilesystem {
    param (
        [string]$Source,
        [string]$Destination
    )

    # Walk the directory listing; after the loop, $i holds the last file
    # listed, which under the standard SQL Server backup naming convention
    # (date appended to the filename) is the most recent backup.
    foreach ($i in (ls $Source)) { }

    # If the file is already at the destination, report it and bail out,
    # still returning the filename to the caller.
    if ((Test-Path -Path ($Destination + "\" + $i.Name)) -eq $True) {
        Write-Host '**********************************************************************************'
        Write-Host ("File: " + $Destination + "\" + $i.Name + " already exists")
        Write-Host '**********************************************************************************'
        return @($i.Name)
    }

    # Clear out the old backups before copying over the new one.
    ls $Destination | where { $_.Name -like "*.bak" } | rm -Force

    Write-Host "Destination: $Destination"
    Copy-Item ($Source + '\' + $i.Name) ($Destination + '\' + $i.Name)
    return @($i.Name)
}
The script is fairly simple. It takes a source directory and a destination directory, grabs the most recent file from the source, and copies it to the destination. However, it only copies the file if it doesn't already exist, and at this time it uses only the filename to make that determination. Eventually I'll add some additional checking, but for now this satisfies our requirement. Also of note: it identifies the most recent file by assuming you are using the standard backup naming convention for SQL Server 2008 R2. That convention appends the date to the backup filename, so more recent backups sort to the bottom of the directory listing and the most recent backup is the last file. If you do something different, you will have to change the logic of the “foreach( $i in (ls $source )){ }” line (see the sketch below) to ensure your directory listing shows the most recent file last. The function then returns the filename to the calling procedure, which lets you perform further operations on that file without having to look up the name again.
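For example, if your backups don't sort chronologically by name, something like the following sketch (my own, untested, not part of the original script; the server paths are hypothetical placeholders) picks the newest file by timestamp instead, and shows how the returned filename can feed a later step:

# Pick the most recent .bak by last-write time rather than by listing order.
# This would replace the empty foreach trick inside the function above.
$i = ls $Source -Filter *.bak | Sort-Object LastWriteTime | Select-Object -Last 1

# Calling the function and capturing the returned filename for later use,
# e.g. when building the RESTORE DATABASE command.
$backupName = GetBackupFromFilesystem -Source '\\ProdServer\SQLBackups' -Destination 'E:\StagingBackups'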
There is a downside to this script, however: Copy-Item produces no status output. You get no feedback about the progress of the file copy, and with large backups you can end up staring at a console that could be doing nothing or could be copying the file; you simply don't know. The following function gives you status output on how the file copy is going, continually reporting the percentage of the copy that has completed. I believe I lifted a good portion of this logic from somewhere, though the source now escapes me.
function Copy-File {
    param (
        [string]$Source,
        [string]$Destination
    )

    $Sourcefile      = [io.file]::OpenRead($Source)
    $Destinationfile = [io.file]::OpenWrite($Destination)

    Write-Progress -Activity "Copying file" -Status "$Source -> $Destination" -PercentComplete 0

    try {
        [byte[]]$buffer = New-Object byte[] 4096
        [long]$total = [long]$count = 0

        do {
            # Read a chunk from the source and write it straight to the destination.
            $count = $Sourcefile.Read($buffer, 0, $buffer.Length)
            $Destinationfile.Write($buffer, 0, $count)
            $total += $count

            # Only redraw the progress bar on each whole megabyte copied.
            if ($total % 1mb -eq 0) {
                Write-Progress -Activity "Copying file" -Status "$Source -> $Destination" `
                    -PercentComplete ([int]($total / $Sourcefile.Length * 100))
            }
        } while ($count -gt 0)
    }
    finally {
        $Sourcefile.Close()
        $Destinationfile.Close()
    }
}
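Calling it looks much like Copy-Item with explicit parameters; the paths below are hypothetical placeholders:

# Copies the backup while Write-Progress draws a percentage bar in the console.
Copy-File -Source '\\ProdServer\SQLBackups\MyDb.bak' -Destination 'E:\StagingBackups\MyDb.bak'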
There's a problem with this method, however. If you are moving files locally, it's just as fast as the built-in Copy-Item. But if you're copying from one network source to another network source, as a build server might, you incur a heavy IO cost: the buffer must be filled locally and then written out to the destination network file, which creates a network bottleneck. As such, I don't actually use this function in my scripts, but I have included it because it's a handy piece of scripting and might have a use in your application.
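If you want progress reporting without hand-rolling the buffered copy, one alternative worth testing (a different tool, not something my scripts use) is the BITS cmdlets that ship with Windows, which report progress on their own when run synchronously:

# Start-BitsTransfer shows its own progress bar in the console.
# The paths are hypothetical placeholders.
Import-Module BitsTransfer
Start-BitsTransfer -Source '\\ProdServer\SQLBackups\MyDb.bak' -Destination 'E:\StagingBackups\MyDb.bak'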