Async Copy Splat

It’s been quite some time since I last posted here. A few things derailed my Powershell activity, but with a new job I am back to learning Powershell.

So there was an ongoing, company-wide project to migrate from Windows XP to Windows 7. As I understand it, the project was behind schedule and the push to get it finished was pressing. I was approached to write a script to address an issue where USMT just wasn’t working or wasn’t adequate. The request was easy enough and I drafted the code. Before it was even tested there was one of those “could you add” requests; after a few more of those I really wanted to re-work the whole thing, but it was time to perform pilot testing, so a re-work really wasn’t going to be well received.

Part of the migration was also to change the location and naming of each user’s “HomeDirectory”, so data would have to be moved, or at least copied. I chose a copy just to be safe: the original contents would still be in place until the user was able to log in and verify, so fewer headaches.

Now the issue was how to facilitate up to 50 user updates and the corresponding data copies. Certainly Start-Job could be used, so that I could process a user, start the file copy, and go on to the next user while, not after, the previous file copy was being processed. So I put the code together, but noticed that the background jobs would be destroyed as soon as the script completed. At first I thought maybe it was an issue with running it as a scheduled task, or that maybe I needed to start Powershell with –STA.

So using a simple local test on my home network I started to experiment.

function Copy-UserData {
    Param(
        [string]$srcpath,
        [string]$destpath
    )
    "Starting backup of $srcpath to $destpath"
    # Splat the whole Start-Job call, ScriptBlock included
    $jobdat = @{
        'Name'         = 'test'
        'ArgumentList' = $srcpath, $destpath
        'ScriptBlock'  = {
            Copy-Item -Path $args[0] -Destination $args[1] `
                -Recurse -Force -Container -ErrorAction SilentlyContinue
        }
    }
    Start-Job @jobdat
}
# Just cleaning up jobs left over from previous tests
Get-Job | Remove-Job
Copy-UserData "c:\Users\jkavanagh58" "\\jkav-homerig\public\Backup\test"
# Wait for all jobs to complete
Get-Job | Wait-Job


Again, the above code is just my test copy, but it became the working model. The realization was that instead of worrying about making Get-Job specific to each job, I could just call it and pipe the result to Wait-Job (get-job | wait-job). So in the working copy, the ForEach loop performs the set-qaduser functions, starts the file copy job in the background, and moves on to the next user; when the ForEach loop is complete, get-job | wait-job holds the script until the last job is completed.
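The working model described above can be sketched roughly like this. The Set-QADUser call and its parameters are illustrative only (the real script sets whatever attributes the migration required), and $users is a hypothetical collection of user records:

```
# Hypothetical working model: update each user, kick off the copy, move on.
foreach ($user in $users) {
    # Update the account's HomeDirectory to the new location
    # (Set-QADUser is a Quest AD cmdlet; parameters shown are illustrative)
    Set-QADUser -Identity $user.SamAccountName -HomeDirectory $user.NewHome

    # Start the file copy as a background job and continue with the next user
    Copy-UserData $user.OldHome $user.NewHome
}
# Only now block, until every background copy job has finished
Get-Job | Wait-Job
```

The point of the pattern is that the loop never waits on an individual copy; all the waiting happens once, at the end.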

Now, I have to admit that in my first attempts to use start-job I was calling another script, but I thought it would make things easier when (if) someone else looks at the code if it were a function in the same script. Then I thought: how about splatting the start-job? That should be easy. It was not as easy as I thought, at least not for the ScriptBlock parameter. I got it to work with start-job @jobdat –ScriptBlock { <code> }, but there had to be a way. I found this post, and now the whole start-job call is splatted.

Nothing advanced here; it was just fun to get back into some practical Powershell.


One thought on “Async Copy Splat”

  1. I have to work on an update to this one. While the syntax seems to work, it quietly isn’t working. The issue is related to variable scope: when start-job starts to process the copy-item, the variables used to provide the source and destination folders/files do not get carried into the scriptblock. The answer is somewhere in the ArgumentList parameter.
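A minimal sketch of that scope problem and the ArgumentList fix (the paths here are illustrative):

```
$src  = 'C:\Users\jkavanagh58'
$dest = '\\jkav-homerig\public\Backup\test'

# Broken: $src and $dest are not visible inside the job's scriptblock,
# because the job runs in a separate runspace. Copy-Item receives empty
# paths, and the job "succeeds" while doing nothing.
Start-Job -ScriptBlock { Copy-Item -Path $src -Destination $dest -Recurse }

# Fixed: pass the values in via -ArgumentList and read them from a
# param() block (or $args) inside the scriptblock.
Start-Job -ArgumentList $src, $dest -ScriptBlock {
    param($srcpath, $destpath)
    Copy-Item -Path $srcpath -Destination $destpath -Recurse -Force
}
Get-Job | Wait-Job | Receive-Job
```

This is exactly why the test code earlier in the post uses $args[0] and $args[1] inside the scriptblock rather than the function’s own variable names.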
