A few days back a buddy from our Sitecore team alerted me to an interesting question on StackOverflow which asks for automation of content promotion from one Sitecore instance to another. He suggested – and rightly so – that the PowerShell Console could be used in that scenario. While this was always possible by simply writing it as PowerShell code, the latest version of the console adds a few commandlets that make building packages much easier.
The easiest approach is to build the package visually in the Package Designer, save it, and then simply use the console to read it and generate the installation zip like this:
get-package "powershell console.xml" `
| Export-Package -FileName "PowerShell Console.zip" -Zip
That's fine in most cases, but if you have some more complex scenarios or want to generate custom packages, you might want to generate packages directly in PowerShell.
To create a package you simply use:
$package = New-Package "Test Package";
Now that you have that package you might want to add some files and items to it.
Let's add, for example, our item templates by querying the master database and creating a dynamic item source:
$TemplatesSource = Get-ChildItem "master:/templates/Cognifide" `
| New-ItemSource "Cognifide Templates";
And subsequently add it to our new package:
$package.Sources.Add($TemplatesSource);
While that by itself is fairly useful, the really cool part is that you have the full flexibility of PowerShell at your disposal when you create a source with static items. Let's say you want to add all items of template "Article Template" that reside anywhere under your "home" node. That would require quite a bit of clicking in the Package Designer, but it is trivial with the PowerShell Console:
$ArticlesSource = Get-ChildItem master:/content/home/* -recurse `
| Where-Object { $_.TemplateName -match "Article Template" } `
| New-ExplicitItemSource "Cognifide Articles";
$package.Sources.Add($ArticlesSource);
You can specify any automation or filter you can think of in your Get-ChildItem call, and you really don't have to skimp on the number of data sources; after all, you can re-generate your package at any time!
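For example, here is a minimal sketch of another filter (the "Product*" name pattern and the source name are made up for illustration) that packages only the items whose names start with "Product":
# Hypothetical filter: only items whose name starts with "Product"
$ProductsSource = Get-ChildItem master:/content/home/* -recurse `
| Where-Object { $_.Name -like "Product*" } `
| New-ExplicitItemSource "Product Items";
$package.Sources.Add($ProductsSource);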
Similarly, you can do this with files on disk. Let's say you want to add all .aspx, .ascx and .ashx files, just to make sure your deployment features all the latest code. For the sake of this example, let's assume your UI elements are located in the Layouts folder under your web application:
$LayoutsPath = $AppPath + "layouts\*"
$Layouts = Get-ChildItem $LayoutsPath -include "*.as?x" -recurse -force `
| New-ExplicitFileSource "My Layouts";
$package.Sources.Add($Layouts);
Easy enough. Now let's add everything that is within the bin folder as a dynamic file source:
$BinFolder = New-FileSource "Bin Folder" -Root "/bin"
$package.Sources.Add($BinFolder);
That is it, really. You may want to specify your package metadata, which you would do like this:
$package.Metadata.Author = "Auto generated " + `
[DateTime]::Now.ToShortDateString();
$package.Metadata.Comment = "Isn't it cool?!";
$package.Metadata.Publisher = "Cognifide";
and then save it so you can open it later in the Package Designer:
$package | Export-Package -FileName "test package.xml"
Alternatively, you can open such a package as shown earlier:
get-package "test package.xml"
in case you ever want to add more sources to it, or export it as a zip file to be imported with the assets in your target environment:
$package | Export-Package -FileName "test package.zip" -Zip
Now, on your target machine, you need to upload your package to the Data\Packages folder. But then, to install it, all it takes is:
Import-Package "test package.zip"
Obviously, all of it can be hooked to the ribbon or context menus, or be scheduled, but I'm getting ahead of myself…
So how does it all relate to continuous deployment?
All of this can be completely automated. All you need to do is create a Script item as described in one of my previous posts and call the PowerShell execution URL referencing your script from your CruiseControl server, or whichever continuous integration product you use, in a fashion similar to:
http://myhost/Console/Layouts/PowerShellResults.aspx?scriptId={1680E211-BD28-49BE-82FB-DA7232814C62}&scriptDb=web
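As a minimal sketch, assuming a build agent with PowerShell 3.0 or later (for Invoke-WebRequest) and the placeholder host and script ID from the example above, the CI step could be as simple as:
# Hypothetical CI step: request the execution URL to run the script remotely.
$url = "http://myhost/Console/Layouts/PowerShellResults.aspx" +
"?scriptId={1680E211-BD28-49BE-82FB-DA7232814C62}&scriptDb=web"
Invoke-WebRequest -Uri $url -UseBasicParsing | Out-Null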
You need to deal with the fact that your continuous delivery environment is most probably not logged in. In this case, the best approach is probably to use the web database; otherwise the script item may turn out to be unavailable to you and the script will not execute.
Now, in your source environment, your script will create the package and upload it to an FTP server (there are plenty of ways to do this from PowerShell; you can find a couple of samples on Stack Overflow) and subsequently call the second part of the script on the target server.
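One possible upload sketch uses the .NET WebClient class; the FTP address, credentials and local path below are placeholders:
# Hypothetical FTP upload of the freshly generated package.
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("user", "password")
$client.UploadFile("ftp://ftp.example.com/test package.zip", `
"C:\inetpub\wwwroot\Data\Packages\test package.zip")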
On the target server, a complementary script will be executed in a similar fashion by the originating server. If you don't have direct access to the file you've just uploaded to the FTP server, you can download it and import the package.
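A complementary sketch for the target side, again with placeholder addresses and paths, could boil down to:
# Hypothetical target-side script: fetch the uploaded package and install it.
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("user", "password")
$client.DownloadFile("ftp://ftp.example.com/test package.zip", `
"C:\inetpub\wwwroot\Data\Packages\test package.zip")
Import-Package "test package.zip"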
Now, if you integrate the script with a ribbon button in the Content Editor on the source server (as described in the previous post), you can have a one-click-deployment solution on your dev machine. But then the REALLY cool part would be to integrate it with the context menu (as described in this post) and be able to push parts of the site to production with a single click! Not to mention your nightlies can really be nightlies if you do it using the scheduled tasks integration.