It's been a while since my last summary in April 2012, but I am finally getting around to updating the Reference page with the latest community summary for Sitecore PowerShell Extensions.


Following are the updates I will be pushing onto the reference page, which I'm trying to keep up to date and failing miserably most of the time. Read the rest of this article »

Sitecore PowerShell Extensions Remoting

I've been meaning to write this article for quite a while, since the functionality to remote into the Sitecore environment has existed in the module for at least a couple of versions now, and a recent email from one of the Sitecore PowerShell Extensions users convinced me this cannot wait any longer.

When would I remote into my Sitecore instance?

You would probably need this as part of your Continuous Integration or installation scripts. If you need to manipulate Sitecore data from your deployment script, remoting is the right solution for you.

How is that special?

We have had a number of web services that could somewhat achieve this functionality for even longer, but I didn't consider those sufficient: real remoting functionality cannot be limited to just passing back text results from the scripts, but rather should enable script writers to achieve true interaction between scripts running locally and the scripts being executed on the server.

To enable remoting on your Sitecore instance you don't really have to do anything; the web services are already deployed when you install Sitecore PowerShell Extensions. On your local machine all you need to do is include the commandlets defined in the script that you can find at the following path in your Sitecore instance, if you're using SPE versions older than 2.8:

master:/system/Modules/PowerShell/Script Library/Functions/Remoting

or here in 2.8 and newer:

master:/system/Modules/PowerShell/Script Library/Platform/Functions/Remoting
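
One way to pull those functions into your local session is to copy the script body from that library item into a local file and dot-source it; a minimal sketch (the local file name is purely illustrative):

# Hypothetical local copy of the Remoting library script saved from the item above;
# dot-sourcing it loads the remoting functions into the current session.
. "C:\Scripts\SpeRemoting.ps1"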

When you execute the script you will get 4 new commandlets at your disposal:

Read the rest of this article »

Continuous deployment in Sitecore with PowerShell

A few days back a buddy from our Sitecore team alerted me to this interesting question on StackOverflow, which asks for automation of content promotion from one Sitecore instance to another. He suggested – and rightly so – that the PowerShell Console could be used in that scenario. While this was always possible by simply writing it as PowerShell code, the latest version of the console added a few commandlets that make building packages much easier.

The easiest approach is to build the package visually in the Package Designer, save it and then simply use the console to read it and generate the installation zip like this:

get-package "powershell console.xml" `
  | Export-Package -FileName "PowerShell Console.zip" -Zip

That's fine in most cases, but if you have some more complex scenarios or want to generate some custom packages, you might want to generate packages directly in PowerShell.

To create a package you simply use:

$package = new-package "Test Package";

Now that you have that package you might want to add some files and items to it.

Let's add, for example, our item templates by querying the master database and creating a dynamic item source:

$TemplatesSource = get-childitem "master:/templates/Cognifide" `
  | New-ItemSource "Cognifide Templates";

And subsequently add it to our new package:

$package.Sources.Add($TemplatesSource);

While that by itself is fairly useful, the really cool part is that you have the full flexibility of PowerShell at your disposal when you create a source with static items. Let's say you want to add all items of template "Article Template" that reside anywhere under your "home" node – now that would require quite a bit of clicking in the Package Designer, but is trivial with the PowerShell Console:

$ArticlesSource = get-childitem master:/content/home/about-us/* -recurse `
  | where-object { $_.TemplateName -match "ArticleTemplate" } `
  | New-ExplicitItemSource "Cognifide Articles";

$package.Sources.Add($ArticlesSource);

You can specify any automation or filter you can think of in your Get-ChildItem, and you really don't have to skimp on the number of data sources – after all, you can re-generate your package at any time!

Similarly, you can do this for files on disk. Let's say you want to add all .aspx, .ascx and .ashx files, just to make sure your deployment features all the latest code, and for the sake of this example let's assume your UI elements are located in the Layouts folder under your web application:

$LayoutsPath = $AppPath+"layouts\*"
$Layouts = get-childitem $LayoutsPath -include "*.as?x" -recurse -force `
  | New-ExplicitFileSource "My Layouts";
$package.Sources.Add($Layouts);

Easy enough… now let's add everything that is within the bin folder as a dynamic file source:

$BinFolder = New-FileSource "Bin Folder" -Root "/bin"
$package.Sources.Add($BinFolder);

That is it really… you may want to specify your package metadata, which you would do like this:

$package.Metadata.Author = "Auto generated " + `
  [DateTime]::Now.ToShortDateString();
$package.Metadata.Comment = "Isn't it cool?!";
$package.Metadata.Publisher = "Cognifide";

and then save it for later opening in the Package Designer:

$package | Export-Package -FileName "test package.xml"

Alternatively, you can open such a package as shown earlier:

get-package "test package.xml"

if you ever want to add more sources to it, or export it as a zip file to be imported with the assets in your target environment:

$package | Export-Package -FileName "test package.zip" -Zip

Now on your target machine you need to upload your package to the Data\Packages folder. But then, to install it, all it takes is:

Import-Package "test package.zip"

Obviously all of it can be hooked to a ribbon or context menu items, or be scheduled… but I am getting ahead of myself.

So how does it all relate to continuous deployment?

All of this can be completely automated; all you need to do is create a Script item as described in one of my previous posts and call the PowerShell execution URL referencing your script from your CruiseControl server, or whichever continuous integration product you use, in a fashion similar to:

http://myhost/Console/Layouts/PowerShellResults.aspx?scriptId={1680E211-BD28-49BE-82FB-DA7232814C62}&scriptDb=web
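
From the CI side this boils down to a plain HTTP GET against that URL; a minimal sketch in PowerShell (the host name, script id and database below are the same placeholders as in the URL above):

# Trigger the server-side script from the build server;
# swap in your own host, script id and database.
$url = "http://myhost/Console/Layouts/PowerShellResults.aspx" +
       "?scriptId={1680E211-BD28-49BE-82FB-DA7232814C62}&scriptDb=web"
(New-Object System.Net.WebClient).DownloadString($url) | Out-Null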

You need to deal with the fact that you are most probably not logged in from your continuous delivery environment. In this case the best approach is probably to use the web database; otherwise the script item may turn out to be unavailable to you and the script will not execute.

Now, in your source environment, your script will create the package and upload it to an FTP server (there are plenty of ways to do this from PowerShell – you can find a couple of samples on Stack Overflow) and subsequently call the second part of the script on the target server.
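
As a rough sketch of that upload step (the FTP host, credentials and paths are purely illustrative placeholders, and I assume the generated zip sits in the instance's Data\Packages folder), the .NET WebClient is enough:

# Upload the freshly generated package to an FTP drop;
# server, credentials and paths are placeholders only.
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("deployUser", "deployPass")
$client.UploadFile("ftp://deploy.example.com/packages/test-package.zip", "C:\inetpub\wwwroot\MySite\Data\packages\test package.zip")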

On the target server a complementary script will be executed in a similar fashion by the originating server, and if you don't have direct access to the file you've just uploaded to the FTP server, you can download it and then import the package.
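
The complementary script on the target could then look roughly like this (same placeholder server, credentials and file names as above):

# Download the package into the target instance's Data\Packages folder and install it;
# all names are illustrative placeholders.
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("deployUser", "deployPass")
$client.DownloadFile("ftp://deploy.example.com/packages/test-package.zip", "C:\inetpub\wwwroot\MyTargetSite\Data\packages\test package.zip")
Import-Package "test package.zip"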

Now if you integrate the script with a ribbon in the Content Editor on the source server (as described in the previous post) you can have a one-click deployment solution on your dev machine, but the REALLY cool part would be to integrate it with the context menu (as described in this post) and be able to push parts of the site to production with a single click! Not to mention your nightlies can really be nightlies if you do it using the scheduled tasks integration.