CMS UX – give the content some thought!

One of the many things we debate constantly at Cognifide is how to improve the user experience. How to make editors' lives easier, how to simplify the common everyday tasks, what can be automated, and simply how to make our customers smile a little when they use our projects.

For that to work, apart from having the overall big blocks in place and working seamlessly (which is the absolute minimum required), you need to be VERY attentive to details.

What happens when a user enters a place in the system where they are not usually required to work – are they properly guided? What do they see if they click a little link somewhere in the corner? Does every image have alt text attached to it? Do your buttons have tooltips? Do the users have alternate views of all content? Does the system communicate abnormal states in a descriptive way and guide the user towards a solution? Is the UI logically laid out? Did you REALLY think about which property should go on which tab? Have you set up the property names so that they make sense to a non-programmer? Do they have descriptions?

Part of the job we do is helping sometimes-troubled EPiServer customers get solutions built elsewhere or in-house to work, and I keep noticing patterns that we have flagged at Cognifide as bad practices. Many of them stem from a lack of research into the content being served during the discovery phase.

Discovery phase? What's that?

A great many projects do not seem to be well thought out. When you look at the solution, it feels like a developer just got some templates and ran with them. Once the front end matches what the HTML templates outline, the solution is pushed to production and forgotten by the design/development agency, and the poor customer struggles with it for years, trying to improve the ever-degrading performance and fighting a CMS UI that was thrown together in a rush – possibly getting aggravated in the process and rebuilding it again and again.

You need to realize that once your site goes to production, the trouble for your customer begins – it doesn't end.

Understand your content, please

A very common tendency in these projects is to store all content of one type under a single tree node, or in a very basic hierarchy. But:

  • What is the volume of content at the start?
  • Have you talked to the client about the maintenance of the content?
  • How do they plan to store older content?
    • Are they archiving it?
    • Do they plan to serve it to the general public while archived?
    • Is the old content actively browsed on the CMS side?
  • What is the volume increase over time?
  • What is the profile of the content? Is it a catalogue? Chronological news library?
  • Is there a taxonomy in place?
  • How often and which content is being modified?

EPiServer shows pages in a tree, and while we have observed the CMS performance improving over time, there are some basic scenarios that the hierarchy structure will never deal with efficiently if it is not well thought out.

So your potential edge-case scenarios might be that the customer:

  1. has 10,000 articles that need to be migrated for the site to go live, but plans to add only 2–3 a month; or
  2. is starting fresh, but plans to add 20 to 30 articles a day!

How do you deal with those?

Obviously the worst thing you can do is put them all under a single "Articles" node. The customer will be frustrated to no end! Both your reputation and the CMS's get damaged while they try to do even the most basic things in the CMS.

In the first case you need to work with the client to dissolve the articles into a hierarchy granular enough to leave you with 20–50 articles per node, tops. Dividing them into 10 categories of roughly 1,000 items each won't do! If the page names are meaningful, you can create a structure based on the first and second letters of the article names. This works best for directories of people or places.
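The letter-based bucketing can be sketched as follows (a minimal illustration only; the helper name and the two-letter depth are my assumptions, not a prescribed EPiServer API):

```csharp
using System;
using System.Linq;

static class LetterBuckets
{
    // Derive a two-level bucket path from a page name, so that
    // "Armstrong, Neil" files under "A/AR". Assumes meaningful page names.
    public static string BucketPath(string pageName)
    {
        string letters = new string(pageName
            .Where(char.IsLetter)
            .Take(2)
            .ToArray())
            .ToUpperInvariant();

        if (letters.Length < 2)
            return letters; // very short names get a shallow bucket

        return letters[0] + "/" + letters; // e.g. "A/AR"
    }
}
```

With a roughly even alphabetical spread, two letters of depth keeps each node's child count well within the 20–50 range mentioned above for surprisingly large directories.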

The second case will probably happen to you when you work with any kind of news site – an intranet of sorts, a news agency, a TV portal or an information broker. In that case it makes the most sense to put the content into a chronological structure: have a node for each year, month (and day, if needed) in which there is an article.

AUTOMATE!

When a user writes an article that is supposed to fit into your content plan, move it into the proper node automatically. In the first case, move it to the proper category, or use the page title to move it to the proper place in the catalogue. In the chronological plan, move it to the day or month node upon creation. If a new node needs to be created for it – that's your responsibility too. Organize the content for the user or you will be in a world of pain sooner than you expect!

Those tasks are easily automated by hooking into the page save and create events. Your customer will love you for it.
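As a rough illustration of hooking the create event for the chronological plan (the event API is from EPiServer CMS 5; the "Article" page-type name and the ArticlesRoot/FindOrCreateNode helpers are assumptions you would adapt to your own project):

```csharp
using System;
using EPiServer;
using EPiServer.Core;

public static class ArticleOrganizer
{
    // Call once at application start-up, e.g. from Global.asax.
    public static void Attach()
    {
        DataFactory.Instance.CreatedPage += OnCreatedPage;
    }

    private static void OnCreatedPage(object sender, PageEventArgs e)
    {
        PageData page = e.Page;
        if (page == null || page.PageTypeName != "Article")
            return;

        // File the article under Articles/<year>/<month>, creating the
        // nodes on demand - that's our responsibility, not the editor's.
        DateTime published = page.StartPublish;
        PageReference yearNode = FindOrCreateNode(ArticlesRoot, published.Year.ToString());
        PageReference monthNode = FindOrCreateNode(yearNode, published.Month.ToString("00"));

        if (!page.ParentLink.CompareToIgnoreWorkID(monthNode))
            DataFactory.Instance.Move(page.PageLink, monthNode);
    }

    // Assumed helpers: the root of the article branch, and a method that
    // returns an existing child container node or creates it if missing.
    private static PageReference ArticlesRoot
    {
        get { /* look up your Articles node here */ return PageReference.EmptyReference; }
    }

    private static PageReference FindOrCreateNode(PageReference parent, string name)
    {
        /* find or create the child container here */ return parent;
    }
}
```

The directory-like plan works the same way; only the destination-picking logic (category or letter bucket instead of year/month) changes.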

Naturally, you don't have to have a single plan for a given site. A site can have both branches with a news-like plan and directory-like subtrees.

The bottom line is: you need to plan ahead, and you need to learn about the content profile.

Distinction with a difference

You need to realize the distinction between the content and its presentation. Don't try to cram the content into the browsing structure. Separation of concerns should not be limited to code organization; separate concerns in the content structure as well.

Design the user journey around SEO best practices. The hub pages should make sense from the visitor's perspective. DON'T put your articles under a hub page just because the hub page lists them. This may work for tiny sites, but then again, those are not the sites you will run into trouble with, because it basically means your customer barely ever touches them.

Store your content in a separate branch and reach for it with a tag set related to the hub page.

Build a logical taxonomy and stick to it!

That basically means: treat your content equally no matter where it's stored – preferably tagging it with a high-performance framework like the Faceted Navigation Framework that Cognifide published some time ago, or making it Lucene.Net based. You can try to use EPiServer's built-in categories. We have had limited success with them and with the performance of FindPagesWithCriteria (which we have effectively banned from our arsenal), but I am told performance on that front has improved greatly in the latest EPiServer CMS release.

Regardless – don't rely on the page hierarchy to select content. It doesn't matter where the content lives; the metadata is what should drive the content stream. You can hand-pick your navigation links, fine, but treat article lists, news, announcements and events as a single content stream. Use a taxonomy to divide it into categories and you will be much happier in the long run, as you gain far more flexibility to reorganize the structure and move the content around when its profile changes later.

Having used EPiServer CMS for nearly four years now, these are the basic principles we have come to establish.

I wonder what your practices are. Where do you think our practices do or do not fit your design philosophy? Is there anything we could do better? Do you have practices regarding content planning? How do you analyze the content to make the best plan for it?

Easy Enum property for EPiServer

One of the most frequently and eagerly used programming constructs in the Microsoft .NET Framework is the Enum. Several interesting features make it very compelling for all kinds of dropdowns and checklists:

  • The bounds factor – proper use of an Enum type guarantees that the selected value falls within the constraints of the allowed value set.
  • The ability to treat Enums as flags (and compound them into flag sets) as well as a one-of selector.
  • The ease of use, and the potentially complete separation of the Enum value from the underlying machine-type representation, which ensures the most efficient memory usage.

Surprisingly enough, EPiServer as it stands right now does not have an easy facility for turning Enums into properties. To give credit where credit is due, the EPiServer framework provides a nice surrogate that mimics that behaviour to a degree. The relevant property types are:

  • PropertyAppSettingsMultiple – which "creates check boxes with options that are defined in the AppSettings section in web.config. The name of the property should match the key for the app setting."
  • PropertyAppSettings – which "creates a drop down list with options that are defined in the AppSettings section in web.config. The name of the property should match the key for the app setting."

You quickly realize, though, that these properties have some limitations that make their use a bit less compelling:

  • The properties are not strongly typed.
  • The property entry in the AppSettings section has to have a name that matches the property name on the page.
  • They are rather poorly documented. Other than this blog entry or Erik's post documenting it, I could not find any other examples of how to use them (but then again, who really needs docs when we have Reflector).
  • You cannot have the very same property duplicated on a page, since you can only have a single property of a given name per page. So you need multiple entries in AppSettings matching the name of each of those properties on your pages. I know – semantics, but still…
  • You are working with strings rather than enums (did I mention it's not type safe?).
  • The values in AppSettings are stored in a somewhat DSL-y manner (consecutive options are separated from each other with the "|" character; the name and the value are separated with a ";" – for example: <add key="RegionId" value="First Option;Option1|Default Option;Option2|Disabled Option;Option3" />), and on one occasion I entered a string there that caused the server to crash.
  • The values are not translatable – or at least I could not find out how to do it, and Reflector digging rendered no results either.
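To illustrate that value format (a throwaway snippet for clarity, not part of the EPiServer API):

```csharp
using System;
using System.Collections.Generic;

static class AppSettingOptions
{
    // Options are separated by '|', caption and stored value by ';'.
    public static Dictionary<string, string> Parse(string setting)
    {
        var options = new Dictionary<string, string>();
        foreach (string entry in setting.Split('|'))
        {
            string[] parts = entry.Split(';');
            if (parts.Length == 2)
                options[parts[0]] = parts[1]; // caption -> stored value
        }
        return options;
    }
}

// Parse("First Option;Option1|Default Option;Option2") yields
// { "First Option" -> "Option1", "Default Option" -> "Option2" }
```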

Read the rest of this article »

Apparently I have written something along those lines before, for CMS 4, and it looks like someone still needs it, as I got a request for an updated version a couple of days ago. So here we go:

For the most part the syntax of the call is equivalent to what it was before, so check out the old article for details. What I've added this time around is:

  • @PropertyName can be set to "%" if you want to look in all property names.
  • @PropertyType can be -1 if you want to look in all property types; otherwise you need to specify the type id (this has changed from the type name due to database schema changes).
  • Additionally, this version of the stored procedure will only look in the master language branch, so it works for single-language pages, and on multi-language sites for language-agnostic properties (should you require the language to be variable, the change is pretty simple – I can send you the updated version by email).
/****** Object:  StoredProcedure [dbo].[PagedSearch]    Script Date: 07/07/2009 12:18:10 ******/
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[PagedSearch]') AND type in (N'P', N'PC'))
DROP PROCEDURE [dbo].[PagedSearch]
GO

CREATE PROCEDURE [dbo].[PagedSearch]
    @Condition varchar(1024),
    @PropertyName varchar(1024),
    @PropertyType int,
    @PageSize int,
    @PageNumber int,
    @Offset int
AS
BEGIN

    DECLARE @RowStart int
    DECLARE @RowEnd int

    SET @RowStart = @PageSize * @PageNumber + @Offset;
    SET @RowEnd = @RowStart + @PageSize + @Offset;

    WITH PageRefs AS
        (SELECT page.pkID as PageId,
            ROW_NUMBER() OVER (ORDER BY pageLang.StartPublish DESC) as RowNumber
            FROM tblPage page, tblProperty propValue, tblPageDefinition propDef, tblPageLanguage pageLang
            WHERE page.pkID = propValue.fkPageID
                AND page.fkMasterLanguageBranchID = pageLang.fkLanguageBranchID
                AND page.pkID = pageLang.fkPageID
                AND propValue.fkPageDefinitionID = propDef.pkID
                AND (@propertyType = -1 or propDef.fkPageDefinitionTypeID = @propertyType) -- is proper type
                AND propDef.Searchable = 1 -- the property is searchable
                AND propValue.String like @Condition -- contains facets
                AND propDef.[Name] like @PropertyName) -- property of proper name
    SELECT PageId
        FROM PageRefs
        WHERE (RowNumber Between @RowStart and @RowEnd) or (@PageSize = 0);
END
GO

However, looking at how the schema has changed over time, I am not convinced this approach is really the best one for someone who is not prepared to deal with the changes (i.e. you had better be able to update the stored procedure when the schema changes – or bribe me with pizza and beers for updates :) ).

Additionally, this procedure only searches properties that store their value in the short string field. To make it look at the long string, change the line containing "propValue.String like @Condition" to:

AND (propValue.LongString like @Condition)

or, alternatively, to look in both, change it to:

AND ((propValue.String like @Condition) or (propValue.LongString like @Condition))
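For completeness, calling the procedure from .NET might look like this (a sketch: the connection string and the LIKE pattern are placeholders for your own setup):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class PagedSearchCaller
{
    public static List<int> Search(string connectionString)
    {
        var pageIds = new List<int>();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("PagedSearch", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@Condition", "%travel%"); // LIKE pattern
            command.Parameters.AddWithValue("@PropertyName", "%");     // all property names
            command.Parameters.AddWithValue("@PropertyType", -1);      // all property types
            command.Parameters.AddWithValue("@PageSize", 20);
            command.Parameters.AddWithValue("@PageNumber", 0);
            command.Parameters.AddWithValue("@Offset", 1);

            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
                while (reader.Read())
                    pageIds.Add(reader.GetInt32(0)); // PageId column
        }
        return pageIds;
    }
}
```

Passing @PageSize = 0 returns all matches in one go, per the `(@PageSize = 0)` branch in the procedure's WHERE clause.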

Enjoy!

SoakIE – a Web Server Stress Tool with a twist

A week or so ago a couple of friends on another project at Cognifide ran into a wall while trying to load test their website. The problem was as follows: the website is heavily AJAX based – the page merely loads a stub in the initial request and then loads the rest of its data dynamically, so traditional web testing tools are fairly useless against it. What they tried was to set up a number of Selenium clients to pound the server, but that turned out to be fairly taxing on the machine doing the testing. It was not possible to set up more than 10 clients on a fairly strong machine.

There are also other limitations, like the time to wait for the server to time out and the time between clicks, which I am not sure the tool allowed them to adjust. Talking to them, I recalled a tool for grabbing website thumbnails from a long time ago. One option would be to make a batch file with it: the tool would grab the site's thumbnail and stress the server in the process, but they would still have to set up a number of clients. It also creates and tears down an instance of IE every time, which makes it suboptimal for the task.
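The core idea – keeping a pool of long-lived IE instances and re-navigating them, instead of paying the cost of creating and tearing down a browser per request – can be sketched roughly like this (an illustration only; SoakIE's actual implementation may differ, and the URL and throttle values are placeholders):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using SHDocVw; // COM reference: "Microsoft Internet Controls"

class IeSoaker
{
    static void Main()
    {
        const int clientCount = 10;
        const string url = "http://localhost/"; // site under test

        // Create the browser pool once, up front.
        var clients = new List<InternetExplorer>();
        for (int i = 0; i < clientCount; i++)
            clients.Add(new InternetExplorer());

        while (true)
        {
            // Fire off the next request on each long-lived client;
            // the AJAX calls run inside the real browser, unlike with
            // a plain HTTP stress tool.
            foreach (InternetExplorer ie in clients)
                ie.Navigate(url);

            Thread.Sleep(1000); // crude "time between clicks" throttle
        }
    }
}
```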

So a couple of evenings later (and a few backs and forths during the testing sessions) out comes SoakIE:

[Screenshot: SoakIE test run]

Read the rest of this article »

Advanced Language Manipulation Tool for EPiServer

Have you (or your customers) ever created and edited a page in one language only to realize that the selected locale was wrong? Have you ever wished you could delete the master language branch of a page after creating its localized counterpart, but you could only delete the newly created slave language instead? Has a customer ever requested that they copy a whole branch and have you convert it to another language so that they could then translate it in place?

Well, I have… and I'm sure I will. And so did Fredrikj on our #epicode IRC channel ;).

Basically, I had a tool that would convert from one language to another, but Fredrikj requested something that would switch the master language of a page from one to another. Since I already had some of the work done, I updated the stored procedure I had written some time ago and slapped a nice GUI on it. Here's the result:

 

[Screenshot: Advanced Language Tool]

The tool allows you to perform either language conversion or master branch switching on a selected page and, if you so choose, all of its children.

The stored procedure has been updated to work on CMS 5 R2 (it will no longer work on R1 – but if you need that functionality, comment here or give me a shout and I'll create a compatible version for you).

A word of caution though – I make no guarantees whatsoever about its operation, especially if you wreck your client's database with it. I did what I could to prevent some of the obvious problems (like switching to a non-existent master or converting to an existing one), but I will not be responsible if it doesn't work for you. Make a database backup and experiment there before you touch any real data. That said – it works for me, so I think it should also work for you.

You can download the archive containing the tool here. Unzip it into your EPiServer web application folder, keeping the folder structure, or the plugin reference will be wrong. Include the *.aspx and *.cs files in your project and apply the SQL file to your database (the manipulation is performed by a stored procedure located in that file).

Also, if you're performing the change in a load-balanced environment, you may need to restart the other servers once you have made the changes. I reset the DataFactory cache, but I am not sure it propagates to the other servers.

Immediately after you implement the VirtualPathProvider proxy from my previous post, you will notice one fairly serious shortcoming: all of the files within that provider will be hiding behind the registration form. That is not cool, for a couple of reasons:

  • You may want to keep all of the files in one store – being forced to put them into a designated folder is not desirable.
  • You may want to make some files freely available for a time and lock them after a while, or the other way around (e.g. to allow robots to crawl them initially). Having to move the files around is just silly and defeats the purpose.

So how do you discriminate the files that you want locked from those that should be publicly available, and potentially from those that only logged-in users should be able to get?

Specifying the EPiServer File Metadata sweetness

One potential solution would be to define a special rights group and check for that group for people who have your "registered" magic cookie. That, however, introduces a bogus group, which I would rather avoid. But if you look into the FileSummary.config file located in your web application folder, you will find slightly mysterious content. A bit of hacking reveals that you can actually add your own metadata to a file. For example, adding the access rights based on what I established above would look as follows (the content you can already find in the file that comes with the public templates that-we-all-oh-so-love is skipped):

Read the rest of this article »

Simple registration for files served by EPiServer

With the culture of knowledge sharing and open source spreading, everyone races to show they have something valuable that you may want. And while you may not ask for money for your content, you may still want something in return – say, a contact, or an email address that's verified (or not) – to keep in touch with the consumers of your content.

Yet a full-fledged registration doesn't seem like the proper thing to do – cluttering your EPiServer user repository with (let's face it) largely fake or temporary email addresses that users create only to get your content.

While there may be many ways to handle that (streaming the file through a page with Response.WriteFile might seem one of the more obvious ones), I would like to show you a cleaner, simpler and more elegant way that I've come up with.

We really don't want people to deep-link to our files without knowing the files come from our site – that's just rude – so hiding them behind an obscure URL won't work (thus we cannot use the regular file providers). We've already established that we don't want to log users in, so setting file rights is useless. But I still want all the benefits, including client-side caching.

Basically, the solution boils down to creating a thin layer over the file provider of our choice – in my case the versioning file provider. The only method we need to override is GetFile. I want to allow downloads for logged-in users, and for all users that have a "magic cookie" set. If either condition is met, the file simply downloads using the underlying provider's routines, including all the logic EPiServer has put in place for caching and rights. If neither requirement is met, the user is directed to a page of our choice.
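The shape of that thin layer, in plain ASP.NET terms (a sketch only: the class name, cookie name and redirect URL are mine, and in a real EPiServer solution you would derive from the versioning provider rather than the base VirtualPathProvider):

```csharp
using System.Web;
using System.Web.Hosting;

public class GatedFileProvider : VirtualPathProvider
{
    public override VirtualFile GetFile(string virtualPath)
    {
        HttpContext context = HttpContext.Current;
        bool loggedIn = context.User != null && context.User.Identity.IsAuthenticated;
        bool hasCookie = context.Request.Cookies["magic-cookie"] != null;

        if (loggedIn || hasCookie)
        {
            // Defer to the underlying provider - its caching and
            // rights logic stays fully intact.
            return Previous.GetFile(virtualPath);
        }

        // Neither condition met: send the visitor to the registration page.
        context.Response.Redirect(
            "/Registration.aspx?file=" + HttpUtility.UrlEncode(virtualPath), true);
        return null; // not reached: Redirect(..., true) ends the response
    }
}
```

Because the gated path still ends in the underlying provider's GetFile, client-side caching headers are served exactly as they would be without the proxy.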

Read the rest of this article »