A/B-Testing

e-Spirit AG

2020-10-09

1. Introduction

FirstSpirit can be used to create complex websites. These websites are usually shaped by a range of subjective decisions, and the same applies to their design.

The design of a website aims to stimulate interaction on the part of visitors to the site by the specific placing of calls-to-action. These calls-to-action might include a request to subscribe to a newsletter, to download a document, or to purchase a product, for example. This approach is designed to maximize the conversion rate. The conversion rate provides information about how many visitors respond to a call-to-action and in doing so become potential customers.

To achieve this aim, developers must constantly assess how and where to place calls-to-action on a website in order to have the most resonance with visitors. This can seldom be decided on the basis of pre-existing information; developers must work it out for themselves. The more information developers have about visitors and the better they know them through this information, the more accurately they will be able to predict their behavior.

This is exactly where the A/B-Testing module comes in. The module can be used to show different visitors several variants of one website in the context of an experiment. The variants are distributed among visitors either uniformly or according to a predefined rate, and a visitor who calls up the page again subsequently is shown the same variant. Individual visitors are therefore unaware that they are taking part in the experiment.

When the module is used in conjunction with an analytics tool, important statistical information can be gathered about visitors' behavior. This approach is also referred to as data-driven marketing. It provides developers with feedback about the success of each variant of the website. They can see which change to the design achieves the best results in terms of maximizing the conversion rate. At the end of an experiment, developers can apply this change directly by selecting the winning variant as the new original site.

When making changes in the future, developers will no longer have to rely on their gut instincts. They can start other experiments or use information already gathered in order to continuously optimize their websites.

The A/B-Testing module has the following technical requirements:

  • FirstSpirit (isolated or legacy mode), version 2018-11 or later
  • Java 8 or higher

2. Range of functions

With the A/B-Testing module, editors can:

  • Create and carry out experiments
  • Add any number of variants to experiments
  • Define a distribution rate for each individual variant
  • Specify a project-specific participation rate

The corresponding functions are made available in both ContentCreator and the SiteArchitect preview when the module is installed and configured.

The A/B-Testing module functions cannot be used in conjunction with the external FirstSpirit preview.

Familiar FirstSpirit tools are used to set up and maintain an experiment. Editors therefore require no knowledge beyond their existing familiarity with FirstSpirit.

An experiment is carried out with a release workflow. The workflow must include all of the pages and page references involved in an experiment. A workflow of this type is supplied with the A/B-Testing module. Based on the BasicWorkflows, it meets the specified requirements. If a project-specific workflow is to be used instead, it must be adapted accordingly with the API provided.

Tracking during the runtime of an experiment is essential in order to analyze the success of each individual variant. This is not part of the module and must be carried out on a project-by-project basis. An analytics tool (freely chosen by the user) is required for this purpose. In this regard, the A/B-Testing module has a completely open architecture and is not limited to specific services.

A tracking format template plug-in regulates the connection between the analytics tool and the FirstSpirit server. A plug-in of this type is supplied with the module as standard. It is set up for Google Analytics and takes little time and effort to configure for immediate use. Developers who wish to use a different analytics tool should implement a corresponding tracking plug-in.

When a page which is part of an experiment is called up, the A/B-Testing module creates cookies. We therefore recommend drawing visitors' attention to the use of cookies (in accordance with the EU's ePrivacy Directive and, specifically in Germany, Section 15 Paragraph 3 of the Telemedia Act), if you do not already do this.

3. Quick start guide

The quick start guide outlines the possible statuses of an experiment. It is designed for use by editors and provides them with an initial overview. Therefore, the content of this chapter has been kept brief deliberately; only the basic functions of the A/B-Testing module are described. For a detailed description of the module and additional information, see chapters Life cycle of an experiment and Use in FirstSpirit.

During its runtime, each experiment completes a specific cycle. A simplified map of this cycle is shown below (see figure Simple life cycle of an experiment).

Simple life cycle of an experiment
Figure 1. Simple life cycle of an experiment


The cycle begins with the creation and start of an experiment. This triggers a workflow which must release all of the pages and page references involved in the experiment.

On successful completion of the start, the experiment changes status to running. The cycle stops when an experiment is finished and a new original page is selected. A new experiment can potentially be created for this page.

3.1. Creating an experiment

An experiment must be created before it can be carried out. Experiments can be created in various ways in the SiteArchitect and the ContentCreator.

ContentCreator
To create an experiment for the page displayed, select Actions → Create experiment from the menu in ContentCreator. If the experiment is to be created for a different page, you must call that page up first.
SiteArchitect
To create an experiment in SiteArchitect, click the Create experiment button (see figure Create experiment button in the SiteArchitect). This button can only be used on page references. Therefore, in the SiteStore, you must first select the required page reference for which an experiment is to be carried out.
Create experiment button in the SiteArchitect
Figure 2. Create experiment button in the SiteArchitect


If the selected page reference is already involved in an experiment, or if creating an experiment is not permitted for it, the Create experiment menu item or button is hidden.

If the menu item or button is visible but deactivated, a configuration error has occurred or the permissions for carrying out an experiment are lacking. In this case you must contact the administrator.

The A/B-Testing module functions cannot be used in conjunction with the external FirstSpirit preview.

In both clients, the A/B-Testing bar appears when the experiment is created. It contains two tabs: one for the original page and another for the variant created automatically. The variant is selected for immediate editing (see figure A/B-Testing bar in ContentCreator and in the SiteArchitect preview).

A/B-Testing bar in ContentCreator and in the SiteArchitect preview
Figure 3. A/B-Testing bar in ContentCreator and in the SiteArchitect preview


3.2. Editing an experiment

Any number of new variants can be added to a current experiment, and existing variants can be deleted from it. Moreover, a distribution rate can be defined for each variant, along with a participation rate for the entire experiment.

Add variant
New variants can be added by selecting the Add variant button (displayed as a plus sign) or the duplicate option from the drop-down menu of each tab.
Delete variant
Existing variants can be removed from the experiment by selecting Delete variant from the drop-down menu of the corresponding tab. A confirmation prompt will be displayed if you attempt to delete the original page.

Only variants that are not currently in focus can be deleted in SiteArchitect.

Defining the participation rate and distribution rates
The Edit settings button (displayed as three intermeshing cogwheels) opens a dialog in which the participation rate and the distribution rates for the variants can be defined (see figure Edit settings). The settings dialog is described in detail in chapter Configuring an experiment.
Edit settings
Figure 4. Edit settings


3.3. Starting an experiment

Once an experiment has been created and edited, it can be started with the Start button. If the experiment starts successfully, the button turns green and its label changes to Running.

Under certain circumstances, it is possible to create and edit an experiment but not to start it afterwards. In this case, a message dialog box appears and the experiment remains in its initial status.

If the dialog box indicates that the workflow cannot be determined, the selection made when the A/B-Testing module was configured is incorrect. When an experiment is created, the configured workflow is only checked for existence, not for potential errors. The full check is only run when an attempt is made to start the experiment.

In such cases, please contact your administrator.

3.4. Finishing an experiment

A current experiment can be stopped with the Finish experiment button (displayed as a flag). The selected variant is set as the new original page and the A/B-Testing bar disappears from view.

If an element of the experiment is still in the workflow, the experiment cannot be finished. The editor is notified accordingly and the experiment is retained.

4. Installation & configuration

Various components must be installed and configured in order to use the functions supported by the A/B-Testing module. The steps to be completed are described in the following sub-chapters.

4.1. Module installation

Use the abtesting-<version number>.fsm file supplied to add the module on the FirstSpirit Server. To install the module, open the ServerManager and select Server properties → Modules.

Module management in the server properties
Figure 5. Module management in the server properties


The main panel contains a list of modules installed on the FirstSpirit Server. After clicking Install, select the abtesting-<version number>.fsm file supplied with the module and click Open to confirm your selection. After successful installation, an A/B-Testing folder is added to the list and must be given All permissions (see figure Module management in the server properties).

After any module installation or update, the FirstSpirit Server needs to be restarted.

4.2. Configuring the project component

A number of templates and project-specific settings are necessary in order to use the A/B-Testing module. Templates can be imported and the remaining configuration settings can be made via the project component, which must be added to the project you are using. To add the project component, open the ServerManager and select Project properties → Project components.

Project components in the project properties
Figure 6. Project components in the project properties


A list of all existing project components is displayed in the main panel. After clicking Add, select the A/B-Testing ProjectApp and click OK to confirm your selection. The project component is then added to the list in the main panel and will need to be configured (see figure Project components in the project properties). To configure the project component, select the entry in the list and click Configure to open the associated dialog (see figure Configuration dialog for the project component).

Configuration dialog for the project component
Figure 7. Configuration dialog for the project component


Workflow to start the experiment

First, use the combo box in the configuration dialog to specify a workflow for starting an experiment. This must be a release workflow, created on a project-specific basis, which releases all pages and page references involved in an experiment.

The combo box label is displayed in red until a workflow is selected. If a workflow is not specified and/or if any other configuration errors occur, it will not be possible to create an experiment. The corresponding button or menu item will be visible in both clients but not activated.

Only a null check is performed when creating an experiment. Potentially erroneous information is not relevant to this check and is not taken into account. If the workflow selected in the combo box is deleted within the project, it will still be possible to create an experiment.

The correctness of the data is not checked until an attempt is made to start, continue, or update an experiment. In such instances, a message dialog box is displayed for the editor and the experiment is paused in its current status.

When the experiment is started via the A/B-Testing bar, the workflow is only executed on the dispatcher page. All variants of the experiment must also be included in the workflow automatically (see also chapter API). This is absolutely essential in particular if the workflow contains a deployment.

Name of the prioritized transition

A workflow often contains several outgoing transitions from a single status. In such cases, the user normally has to decide which of these transitions is to be followed, and the workflow pauses in its current status until it is continued. To automate this decision, the reference name of a transition to be prioritized can be specified for the workflow selected in the project component. This transition is then executed by the workflow and the other transitions are ignored.

If the transition to be prioritized cannot be identified or has not been defined, the transition with the reference name ab_testing_prioritized is executed. This is a default value. If this transition does not exist, the editor will see a corresponding message.

If the selected workflow contains just one outgoing transition for each status, a decision does not have to be made. This transition is always executed and there is no need to specify a prioritized transition in the configuration dialog.

Selecting permitted page templates

The configuration dialog also displays a list of all page templates that are available to editors. This means that technical or similar templates that are hidden in the selection lists of the two clients are not displayed here. Select the permissible templates for carrying out an experiment from the list in the configuration dialog. The button or menu item for creating an experiment will be hidden in both clients for page references which have an underlying page based on a template that has not been selected. Use the key combination CTRL + A to select all of the templates in the list. In this case there is no restriction at all and an experiment can be created on every page reference.

Initially, when the project component is configured, no template is selected. Due to this, the name of the list is displayed in red and it is not permissible to carry out experiments. The Create experiment button or menu item is thus hidden in both clients until at least one template is selected.

Deleting pages

Next, the user must decide what happens to the pages of the individual variants when variants are deleted from a current experiment or when an experiment is finished. The corresponding checkbox Delete pages is deactivated by default. In this case, the pages of all variants are retained following the removal of a variant from a current experiment or after an experiment finishes. If the checkbox is activated, the pages of the variants that are not being used are deleted.

If the initial original page is referenced by further page references, it is always retained. This applies even if the Delete pages checkbox is activated and the page is not selected as the winning variant. Otherwise, the other page references would no longer have an associated page, and calling them up would lead to exceptions.

The page references of the variants that are not being used and those from the dispatcher are always deleted, regardless of the status of the checkbox.

Importing templates

Finally, the various templates can be transferred to the project with the Import templates button. They provide the functions of the A/B-Testing module to the editor.

4.3. Activating the web component

A web component must be added to the project. To add a web component, open the ServerManager and select Project properties → Web components.

Web components in the project properties
Figure 8. Web components in the project properties


Inside the main panel, various tabs are visible, which contain a list of the existing web components. Select all the tabs in succession and click Add. Next, select the A/B-Testing WebApp and click OK to add it. The web component is added to the list in the main panel (see figure Web components in the project properties).

The web component must be installed on an active web server and then activated. The server can be selected using the selection box.

The Active web server must be a web server with a servlet engine (version 3.0 or higher) and JSTL (version 1.0 or higher).

More detailed information on how to add web components is available in the FirstSpirit Documentation for Administrators.

4.4. Specifying the metadata template

When different variants are created for an experiment, the associated dispatcher page reference is saved with them. This assignment is made using the metadata.

For this purpose, a metadata template must exist in the project and be specified in the project properties. If the template does not exist yet, start by creating it. Once the template exists, open the ServerManager and select Project properties → Options. Then click the corresponding button to select the metadata template (see figure Options in the project properties) and click OK to save the changes you have made.

Options in the project properties
Figure 9. Options in the project properties


4.5. Analytics tool

Tracking during the runtime of an experiment is essential in order to analyze the success of each individual variant. Tracking is not part of the A/B-Testing module and must be carried out on a project-by-project basis. An analytics tool is required for this purpose (users are essentially free to choose which one). In this regard, the A/B-Testing module has a completely open architecture and is not limited to specific services.

The Google Analytics plug-in was added to the project with the import completed during the configuration of the project component. As its name suggests, this tracking plug-in is designed for Google Analytics and, therefore, requires a corresponding account.

If you are already working with another analytics tool, you should consider implementing your own plug-in for tracking purposes. Using the Google Analytics function is only recommended if you also use Google Analytics for other purposes. Otherwise, you should use a different analytics tool.

Please refer to the supplier's documentation when configuring the analytics tool you have chosen to use.

5. Adaptations in the FirstSpirit project

Once the various components have been installed and configured and an analytics tool has been registered, a number of adaptations must be made in the project you are using. The steps to be completed are described in the following sub-chapters.

5.1. Expansion of the metadata template

An experiment consists of a dispatcher (created automatically) and any number of variants. Each variant must be assigned to the dispatcher page of the corresponding experiment. This assignment is made with the following input component:

Metadata input component. 

<CMS_INPUT_TEXT name="md_experiment_uid" hFill="yes" singleLine="no" useLanguages="yes" hidden="yes">
	<LANGINFOS>
		<LANGINFO lang="*" label="Dispatcher" description="Dispatcher"/>
	</LANGINFOS>
</CMS_INPUT_TEXT>

The input component is hidden. It must be added to the metadata template of the project you are using. The UID of the dispatcher page reference of the experiment is written to it automatically when a new variant is created.

If a metadata template does not already exist in the project, one must be created.

Select the metadata template in the project properties.

More information about metadata is available in the FirstSpirit Documentation for Administrators.

5.2. Expansion of the page template

The functions of the A/B-Testing module are provided via four format templates. These templates must be referenced in the HTML code of all permissible page templates. The A/B-Testing Head, A/B-Testing Body, and Traffic Allocation Plug-in templates were added to the FirstSpirit project during the import completed when the project component was configured. The fourth template corresponds to the tracking plug-in to be integrated.

A/B-Testing Head
This format template registers the integrated plug-ins and incorporates the CSS for the A/B-Testing bar in the preview.
A/B-Testing Body
This format template takes the tracking code from the integrated tracking plug-in and makes it available by referencing the A/B-Testing Tracking format template. It is also needed to show and hide the A/B-Testing bar.
Traffic allocation plug-in
The traffic allocation plug-in makes the settings dialog of the A/B-Testing bar available. The participation rate for the entire experiment and the distributions of the individual variants are defined and saved in this dialog.
Tracking plug-in
If you do not wish to use the Google Analytics plug-in supplied with the module, the tracking plug-in must be implemented on a project-by-project basis. Among other things, the plug-in contains the necessary tracking code which enables the success of the variants involved in an experiment to be determined. It is also used to query other specific data that might be available.

All four format templates are essential to the use of the A/B-Testing functions. They must be referenced in the page template you are using with CMS_RENDER calls:

Referencing format templates in the page template. 

<head>
   [...]
   $CMS_RENDER(script:"has_experiment", pageref: #global.node)$
   $CMS_IF(hasExperiment)$
      $CMS_RENDER(template:"abtesting_head")$
      $CMS_RENDER(template:"traffic_allocation_plugin")$
      $CMS_RENDER(template:"TRACKING PLUGIN REFERENCENAME")$ ❶
   $CMS_END_IF$
</head>
<body>
   $CMS_RENDER(template:"abtesting_body")$
   [...]
</body>

❶ The reference name of the tracking plug-in to be integrated must be entered here.
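By way of illustration, a project-specific tracking plug-in is essentially a format template that emits a script reporting the rendered variant to the chosen analytics tool. The sketch below is only an illustration: the endpoint URL and the reported fields are assumptions, and the identifiers actually available to a tracking plug-in must be taken from the module's plug-in interface (the supplied Google Analytics plug-in can serve as a reference implementation).

Hypothetical tracking format template (sketch). 

<%-- Sketch of a tracking format template; endpoint and payload are assumptions. --%>
<script>
  // Report the page (and thus the variant rendered for this visitor)
  // to a custom analytics endpoint.
  fetch("https://analytics.example.com/collect", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      page: "$CMS_VALUE(#global.node.uid)$",  // UID of the rendered page reference
      path: window.location.pathname
    })
  });
</script>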

The JSTL tag library below and the JSP page directive must be integrated in addition to the format templates. However, unlike the format templates, both calls should be placed at the start of the HTML code.

If this is a JSP project, the lines may already be there.

JSTL tag library and JSP page directive. 

<%@ page language="java" contentType="text/html; charset=$CMS_VALUE(#global.encoding)$" %>
<%@ taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c" %>

It must also be ensured that all variants involved in an experiment are JSP pages. To make sure of this, the file extension jsp must be defined in the Properties of the corresponding page template (see figure Defining a file extension).

Defining a file extension
Figure 10. Defining a file extension


5.3. Permission definition

It is essential for editorial permissions to be in place in order to execute the various functions supported by the A/B-Testing module. The overview below explains which permissions are needed during the runtime of an experiment and which function belongs to each permission.

The permissions listed below apply to elements of the PageStore and the SiteStore.

Visible/Read
The ability to view objects and read their content is required as standard.
Change
Various steps in an experiment require the ability to change objects. While an experiment is being created and while it is being finished, the URL switches between the original page and the dispatcher; this changes both pages. What's more, a two-way link is created between every new variant and the dispatcher, which requires a change to the dispatcher too. The need for the Change permission is self-explanatory when it comes to editing existing variants.
Create object
The dispatcher and the first variant are created automatically when an experiment is created. Each additional variant must be added manually. In both cases, the permission to create objects is required.
Create folder
This permission is not needed to carry out experiments.
Remove object
When an experiment finishes, the project is purged of the page references and, depending on the configuration, of the pages of the discarded variants and the dispatcher. The same applies to removing variants from an existing experiment. The permission to delete objects is indispensable for these actions.
Remove folder
No folders are deleted in connection with an experiment.
Release
Various objects are released when an experiment is started, finished, and continued.
Show metadata
The visibility of metadata is fundamental when carrying out an experiment. If metadata is not visible, an experiment cannot be started, continued, updated, or finished. The view permission is also needed to edit settings.
Change metadata
When an experiment is created, and whenever a new variant is added, a two-way link is generated between the variants and the associated dispatcher. Since this link is stored in the metadata of the respective page references, it must be possible to change that data.
Change permissions
This permission is not needed to carry out experiments.

Select the root node of the PageStore or the SiteStore to distribute the editorial permissions. Then, right-click to open the context menu, select Extras → Change permissions, and open Permission assignment. Assign the relevant permissions to the editors of your project in the dialog that appears.

When an experiment finishes, the page references and, depending on the configuration, the pages of the discarded variants and the dispatcher are removed from the project. The URL of the dispatcher is also transferred to the new original page.

These changes are persisted in the PageStore and the SiteStore by releasing the page and the page reference, as well as the respective parent folder. For this reason, the release permission is also explicitly indispensable for finishing an experiment.

5.4. Release workflow

The workflow selected in the project component is executed on the dispatcher when carrying out an experiment via the A/B-Testing bar. However, it can also be carried out manually on the individual variants using the general FirstSpirit functions. In both cases, it is essential to release all pages and page references involved in the experiment in order to avoid errors during generation and when live.

The module is supplied with an example workflow which meets this requirement. Alternatively, an existing project-specific workflow can be adapted. Both options are equally valid and are described in the following sub-chapters.

Only one workflow at a time can be executed per experiment. This means:

  • The workflow selected in the project component cannot be executed via the A/B-Testing bar if one of the variants is locked or already involved in a workflow.
  • Manual execution of a workflow on a variant is not possible if the page or page reference of the associated dispatcher is already involved in a workflow.

In both cases, the editor is notified accordingly and the corresponding workflow cannot be started.

5.4.1. API

When an experiment is carried out, all of the pages and page references involved must always be released. If a project-specific workflow was selected during the configuration of the project component, its function must be extended accordingly. The A/B-Testing API provides a number of methods for doing this; these are described briefly below. The abtesting-api folder containing the Javadoc documentation is also included in the scope of supply.

isDispatcher
This method determines whether the passed page reference is the dispatcher of an experiment. It returns true or false.
hasExperiment
This method can be used to check whether a page reference is involved in an experiment. If the method returns true, the page reference is a variant.
getVariants
This method returns all variants of an experiment when the associated dispatcher is passed to it. If a variant or a page reference that cannot be assigned to an experiment is passed instead, the returned list is empty.
findDispatcherPageRef / findDispatcherPageRefForVariant
These two methods support the same function and differ only with regard to the parameter that is passed. In the case of the former method, the Uid serves as the basis for finding the corresponding dispatcher. In the case of the latter method, the page reference of a variant serves as the basis for finding the corresponding dispatcher.
loadPageRefByUid
This method uses a passed Uid to return the associated page reference.
isDispatcherInWorkflow
This method checks, for a passed variant, whether the page or page reference of the associated dispatcher is already involved in a workflow. If it is, the method returns true and the execution of the workflow started manually on the variant must be canceled. If it is not canceled, status displays might be incorrect and inconsistencies may arise after going live.
isRemainingVariant
At the end of an experiment, the workflow selected in the project component is executed on the winning variant. In this case only, it must be ensured that the workflow runs automatically and that the editor has no means of canceling it, since a cancellation could lead to inconsistencies. This method can therefore be used to check whether a passed page reference is the winning variant of an experiment that is to be finished. If it is, the method returns true.
finalizeExperiment
The metadata field added when the project was adapted is used when finishing an experiment. Once the winning variant has been selected and the experiment has been finished, the content of this field must be deleted. This method performs the deletion.

The methods can be used in a script that is to be added to an activity of the release workflow being used. It makes sense to select the first activity in this case.
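As a sketch of such a script (FirstSpirit workflow scripts are typically written in BeanShell), the logic of the first activity might look as follows. The entry point abTestingApi and the cancel transition name are assumptions made purely for illustration; the actual class names and method signatures must be taken from the Javadoc in the abtesting-api folder.

Release workflow script (sketch). 

// Hypothetical BeanShell script for the first activity of the release workflow.
// "abTestingApi" stands for the module's API entry point; see the supplied Javadoc.
element = context.getElement();              // element the workflow was started on
toRelease = new java.util.ArrayList();       // everything that must be released

if (abTestingApi.isDispatcher(element)) {
    // Workflow started via the A/B-Testing bar on the dispatcher:
    // release the dispatcher together with all of its variants.
    toRelease.add(element);
    toRelease.addAll(abTestingApi.getVariants(element));
} else if (abTestingApi.hasExperiment(element)) {
    if (abTestingApi.isDispatcherInWorkflow(element)) {
        // The dispatcher is already in a workflow: cancel to avoid
        // incorrect status displays and inconsistencies after going live.
        context.doTransition("cancel");      // transition name is an assumption
    } else {
        // Workflow started manually on a variant: include the dispatcher
        // and all other variants of the experiment.
        dispatcher = abTestingApi.findDispatcherPageRefForVariant(element);
        toRelease.add(dispatcher);
        toRelease.addAll(abTestingApi.getVariants(dispatcher));
    }
}

// Release every collected page and page reference, e.g. by means of the
// release logic of the BasicWorkflows scripts used in the project.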

5.4.2. Workflow supplied

The A/B-Testing module is supplied with a workflow which meets the requirements described. To use it, select it in the configuration dialog for the project component. Since this workflow is based on the BasicWorkflows, these are a prerequisite of the project.

Installing the BasicWorkflows module

If the BasicWorkflows are not already part of the project, before using the workflow supplied, you must install the BasicWorkflows module on the FirstSpirit Server and activate the web component. Proceed as described for installing the A/B-Testing module and activating the associated web component. However, the web component for the BasicWorkflows is only needed on the ContentCreator tab.

In addition, the workflow must be activated for ContentCreator by selecting the Element Status Provider supplied. To do this, open the ServerManager and select Project properties → ContentCreator. Then change the Element Status Providers entry to the BasicWorkflows Status Provider entry and confirm the change by clicking OK.

Selecting the Element Status Provider
Figure 11. Selecting the Element Status Provider


Importing scripts and the workflow

Next, the BasicWorkflows scripts used for the release workflow must be added to the project. To do this, go to the Template Store in SiteArchitect and select Import from the context menu. In the import dialog box that opens, select the export_basic_release import file from the local file system and click Open to confirm. A second dialog is then displayed, listing all of the elements contained in the file. Since only the scripts are needed, the import of the release workflow can be deactivated (see figure Importing scripts). Finally, repeat the same process to import the workflow supplied with the A/B-Testing module.

Importing scripts
Figure 12. Importing scripts


The imported workflow must now be authorized in the individual stores so that it can be executed on FirstSpirit elements. To do this, select Change permissions from the context menu of the stores' root nodes to call up the permission assignment. Next, on the Workflow permissions tab, activate the Authorized and Use release permission checkboxes for the imported workflow. To finish, click the OK button to close the dialog.

Functionality

Once installation and import processes are complete, the workflow can be used in the project. The workflow can essentially be executed on various elements:

Dispatcher
When an experiment is started, updated, and continued via the A/B-Testing bar, the workflow is executed on the associated dispatcher page. In this case, both the dispatcher and all of the variants associated with the element are released automatically.
Variant
The general FirstSpirit function can also be used to execute the workflow on a variant. In this case, the editor must confirm a security prompt in order to release the associated experiment. If the editor confirms this prompt, all variants and the dispatcher are released. If the editor does not confirm this prompt, the workflow is canceled and the variant remains in its previous status.

Only one workflow can ever be executed per experiment. This means:

  • The workflow selected in the project component cannot be executed via the A/B-Testing bar if one of the variants is locked or already involved in a workflow.
  • Manual execution of a workflow on a variant is not possible if the page or page reference of the associated dispatcher is already involved in a workflow.

In both cases, the editor is notified accordingly and the corresponding workflow cannot be started.

If the workflow is executed manually using the general FirstSpirit function, it is necessary to also manually reload the preview in order to update the status of the A/B-Testing bar.

Element without reference to an experiment
If the workflow is executed on a FirstSpirit element that is not involved in an experiment, it will behave as per its standard functionality.

You can find more detailed information on the BasicWorkflows in the associated documentation.

6. Tracking

An experiment is only worthwhile if it returns a meaningful result. This depends in turn on the success of the variants. Tracking during the runtime of the experiment is essential in order to analyze the success of each individual variant.

Tracking is not part of the A/B-Testing module and must be carried out on a project-by-project basis. In addition to an analytics tool that can be freely selected by the user, a tracking plug-in is required in the project for this purpose. The plug-in corresponds to a format template and must contain the tracking code, among other things.

The Google Analytics plug-in was added to the project with the import completed during the configuration of the project component. In order to use it, you must complete just a few configuration steps which are described in the next sub-chapter. You should only consider using the plug-in if you are also using Google Analytics for other purposes.

Otherwise, we recommend choosing a different analytics tool. In this case, a project-specific tracking plug-in is required. Developers can implement this plug-in easily. The requirements to be met are also described below.

6.1. Google Analytics plug-in

The Google Analytics plug-in is supplied with the module. Designed for Google Analytics, as its name suggests, the plug-in is added to the project with the import completed during the configuration of the project component.

In addition to having a Google Analytics account, you simply need to add some information to the project settings template and select it in the project properties. It needs to be referenced in the page template too.

The plug-in also requires a Google Analytics Id and a Dimension Id.

All of the necessary steps are described below.

6.1.1. Registration and configuration of a Google Analytics account

A Google Analytics account is required in order to use the Google Analytics plug-in that is supplied with the module. If you do not have an account, you can register here: http://www.google.com/analytics/

A wizard guides you through the steps of the registration process that must be completed in order to create an account.

Registration of a Google Analytics account
Figure 13. Registration of a Google Analytics account


Confirm the Google Analytics terms of use to complete the registration process.

More information about configuration is available in the Google Analytics help.

6.1.2. Plug-in integration

The Google Analytics plug-in is imported into the project during the configuration of the project component. It is made available as a format template and contains the tracking code for capturing the success of the variant. It also serves to query the Dimension Id, which must be entered when configuring an experiment.

In order to use the Google Analytics plug-in, it must be integrated in all permissible page templates by means of a $CMS_RENDER$ call. It is important to note that the call must be positioned in the page header after the reference to the A/B-Testing Head.

Referencing the Google Analytics plug-in in the page template. 

<head>
   [...]
   $CMS_RENDER(script:"has_experiment", pageref: #global.node)$
   $CMS_IF(hasExperiment)$
      $CMS_RENDER(template:"abtesting_head")$
      $CMS_RENDER(template:"traffic_allocation_plugin")$
      $CMS_RENDER(template:"google_analytics_plugin")$
   $CMS_END_IF$
</head>
<body>
   $CMS_RENDER(template:"abtesting_body")$
   [...]
</body>

6.1.3. Expansion of the project settings template

The Google Analytics plug-in supplied with the module uses the following input component, which must be added to the project settings template. If a project settings template does not already exist in the project, one must be created. The template must also be selected in the project properties.

Project settings input component. 

<CMS_GROUP>
   <LANGINFOS>
      <LANGINFO lang="*" label="A/B-Testing"/>
   </LANGINFOS>

   <CMS_INPUT_TEXT name="ab_googleid" hFill="yes" singleLine="no" useLanguages="no">
      <LANGINFOS>
         <LANGINFO lang="*" label="Google Analytics Id"/>
      </LANGINFOS>
   </CMS_INPUT_TEXT>
</CMS_GROUP>

Within the project, the Google Analytics Id must be added to the project settings by entering it in this field. The Id is used by the plug-in. It enables the variants involved in an experiment to be tracked by Google Analytics.

You will find the Google Analytics Id in your Google Analytics account under the following path: Admin → Account → Property → Tracking Info → Tracking Code.

Path to find the Google Analytics Id
Figure 14. Path to find the Google Analytics Id


6.1.4. Specification of the project settings template

If you are using the Google Analytics plug-in, in addition to the metadata template, the project settings template also has to be added to the project properties. The template is used for the provision of the Google Analytics Id; it must exist in the project and be expanded as described in the previous chapter. If the template does not exist, complete these steps first.

If the template already exists, open the ServerManager and select Project properties → Options. Then click the corresponding button to select the project settings (see figure Options in the project properties). Click OK to save the changes you have made.

Options in the project properties
Figure 15. Options in the project properties


6.1.5. Specification of the Dimension Id

The Google Analytics plug-in adds a text box to the settings dialog of an experiment. Enter the Id of the Custom Dimension created in Google Analytics in this box.

To do this, click the Edit settings button to open the corresponding dialog of an existing experiment (see figure Edit settings). Then enter the Dimension Id in the designated text box and click OK to close the dialog and save your entry.

More detailed information about the Custom Dimension is available in the Google Analytics help under Dimensions and measured values.

Edit settings
Figure 16. Edit settings


6.2. Separate tracking plug-in

If you are using a different analytics tool for tracking, this can be added as a tracking plug-in at any time on a project-by-project basis. In this regard, the A/B-Testing module has a completely open architecture and is not limited to specific services.

The following code corresponds to the basic implementation of this type of tracking plug-in and shows its general structure:

Base code. 

<script>
   var myPlugin=( function(){
      $-- init variables --$
      var myVar; ❶
      return {
         $-- add gui --$
         addGui: function (container) { ❷
            myVar=('myPlugin' in config && 'myVar' in config['myPlugin']) ? config['myPlugin'].myVar : "";
         },

         $-- store additional params --$
         storeParams: function () { ❸
            var addParams={
               $-- new parameter: store --$
               "myVar":myVar
            }
            return addParams;
         },

         $-- add analytics code --$
         addTrackingCode: function (variant) { ❹
            var trackingId='$CMS_RENDER(script:"getconfiguration", param:"myPlugin:myVar", srcUid:#global.node.uid)$';
         }
      };
   })();
   pluginRegistry.register('myPlugin', myPlugin); ❺
</script>

❶ Global variables (in this case myVar) are required to use the methods; they must first be initialized.

❷ These global variables are used by the addGui method, which serves to expand the configuration dialog and writes the values entered to the defined variables.

❸ This is followed by persistence of the values entered in the storeParams method.

❹ They can also be passed to the tracking code, which is added with the addTrackingCode method.

❺ Finally, registration of the tracking plug-in must be completed with the register method.

Every tracking plug-in must implement the addGui, storeParams, and addTrackingCode methods, as well as the register method.

6.2.1. Expansion of the configuration dialog

The addGui(container) method is used to expand the configuration dialog. Click the Edit settings button on the A/B-Testing bar to open this dialog box. It supports a number of configuration options as standard. These options are all specific to the associated experiment and, therefore, cannot be set globally (in the project properties, for example).

To expand the dialog, a DomElement (container) is passed to the method. Additional DomElements can be incorporated into this DomElement. If default values saved upstream are to be written to form fields, these values can be read from the config object.

Populating fields with default values. 

variable = ('pluginname' in config) ? config['pluginname'].variable : "";

It is essential that all variables used for saving are initialized at the start of the tracking plug-in code. Otherwise, persistence of the information entered in the configuration dialog by the editor will not be possible.

Initialization. 

var pluginname=( function(){
   $-- init variables --$
   var variable;

   return {
      $-- add gui --$
      addGui: function (container) {
         [...]
      },
      [...]
   };
})();
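
As a minimal, self-contained sketch of these two steps (the plug-in name and field are purely illustrative, and the config object and the container are stood in by mock objects here, since in production they are provided by the module and the browser):

```javascript
// Hedged sketch: 'config' and 'container' are mock stand-ins for the
// objects provided at runtime; the plug-in name and variable are
// illustrative only.
var config = { 'pluginname': { variable: 'saved value' } };

var pluginname = (function () {
    // init variables: must happen before addGui runs, otherwise the
    // entered values cannot be persisted later
    var variable;

    return {
        // add gui: read the saved default and add a form field to the
        // container passed in by the configuration dialog
        addGui: function (container) {
            variable = ('pluginname' in config) ? config['pluginname'].variable : "";
            container.appendChild({ type: 'text', value: variable });
        },
        // illustrative accessor so the sketch can be exercised
        currentValue: function () { return variable; }
    };
})();
```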

6.2.2. Persistence

The tracking plug-in must be told which values are to be persisted. This is done with a map. The values in the map must use the following syntax:

Syntax of map entries. 

"<NAME>":<VALUE>

The map is created in the storeParams method and returned by it so that the persisted values are subsequently available on the page. They are used by the dispatcher page and transferred in a hidden input component.
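
A minimal sketch of this method, assuming two hypothetical values (trackingId and sampleRate) that were previously captured via addGui:

```javascript
// Hedged sketch: the variable names are illustrative; in a real
// plug-in they are the globals written by addGui.
var trackingId = 'my-tracking-id';
var sampleRate = 50;

var myPlugin = {
    // store additional params: each map entry follows the
    // "<NAME>":<VALUE> syntax described above
    storeParams: function () {
        return {
            "trackingId": trackingId,
            "sampleRate": sampleRate
        };
    }
};
```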

6.2.3. Tracking code

Every analytics tool generally has a tracking code which is used to capture interactions on the website. The tracking code must be added to the tracking plug-in with the addTrackingCode(variant) method. The variant parameter is also provided for the purpose of identifying the individual variants involved in an experiment. It contains the Id of the variant displayed in each case.

If the addGui and storeParams methods have been used to capture and persist other information, this information can be retrieved at this point and also passed to the analytics tool. This requires a CMS_RENDER call containing the plug-in name as a parameter as well as the corresponding variable name and the Uid of the generated page, separated by a colon.

CMS_RENDER call. 

$CMS_RENDER(script:"getconfiguration", param:"pluginname:variable", srcUid:#global.node.uid)$
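
The tracking code itself depends entirely on the analytics tool used. As a hedged illustration only (the endpoint and parameter names are invented and not part of the module), the variant id could be attached to a tracking request like this:

```javascript
// Hedged sketch: builds the request URL for a hypothetical analytics
// endpoint; the 'variant' parameter carries the id of the variant
// displayed to the visitor.
function buildTrackingUrl(endpoint, experimentId, variant) {
    return endpoint
        + '?experiment=' + encodeURIComponent(experimentId)
        + '&variant=' + encodeURIComponent(variant);
}
```

An addTrackingCode(variant) implementation would then emit such a URL, for example as the source of a tracking pixel.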

6.2.4. Tracking plug-in registration

Finally, to link the selected analytics tool to the functions of the A/B-Testing module, the implemented tracking plug-in must be registered. The function required to do this, pluginRegistry, can be found in the A/B-Testing Head format template, which was imported with the configuration of the project component. It is incorporated into the page templates used upstream of the tracking plug-in and contains the register(name,plugin) method. The plug-in name and the tracking plug-in itself must be passed to the function.

Tracking plug-in registration. 

pluginRegistry.register('pluginname', pluginname);
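
The registry itself is provided by the A/B-Testing Head format template. A minimal stand-in illustrating the mechanism (the get method is an assumption added here for illustration and is not part of the documented interface) could look like this:

```javascript
// Hedged stand-in for the pluginRegistry provided by the A/B-Testing
// Head template: it stores plug-ins by name so that their addGui,
// storeParams, and addTrackingCode methods can be invoked later.
var pluginRegistry = (function () {
    var plugins = {};
    return {
        register: function (name, plugin) { plugins[name] = plugin; },
        get: function (name) { return plugins[name]; }   // illustrative only
    };
})();
```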

7. Life cycle of an experiment

During its runtime, each experiment completes a specific cycle. This cycle is shown below (see figure Life cycle of an experiment).

To keep the graphic simple, it was assumed that, once submitted, a release request will always be processed and never rejected.

Life cycle of an experiment
Figure 17. Life cycle of an experiment


The cycle begins with the creation and start of an experiment. This triggers a workflow which must release all of the pages and page references involved in the experiment.

In many cases, release workflows are processed in accordance with the principle of double-checking. This means that the release is initially just requested; it is not processed immediately. The experiment must then be continued before it progresses to the running stage. In the case of an immediate release, this step is omitted and the experiment progresses directly to running.

It is possible to modify an experiment during its runtime. This can be done by editing or deleting an existing variant, for example, or adding a variant. After this, the experiment must be updated (at this point, the release can only be requested again).

The cycle stops when an experiment finishes. This step can be taken from within a modified experiment or a running experiment. At the end of an experiment, a new original page is selected, for which a new experiment can potentially be created.

8. Use in FirstSpirit

Installing the A/B-Testing module makes various functions for carrying out experiments available in both FirstSpirit clients. These functions are equivalent in both clients. They are described below using a story. The example focuses on the ContentCreator; however, in principle, an experiment can be carried out in either FirstSpirit client.

Over and above the other permissions that are required, the most fundamental permission for the steps described below is the permission to view the metadata. If the metadata is not visible, an experiment cannot be started, continued, updated, or finished. The view permission is also needed to edit settings.

The story starts on the Services page of the Mithras Energy demo project, the content of which includes a teaser to prompt users to request a consultation. A closer look at this initial page shows that its design is not at all eye-catching. It does not contain a header and the teaser We visit you! is not in the direct line of sight of the viewer (see figure Original page).

This gives rise to the assumption that changing the design would encourage more visitors to request a consultation.

Original page
Figure 18. Original page


8.1. Creating an experiment

To check this initial assumption, a variant of the original page is created with the Actions → Create experiment menu item. This variant and the original page are displayed as tabs on the A/B-Testing bar which then appears, with the variant selected for immediate editing (see figure A/B-Testing bar in ContentCreator).

A/B-Testing bar in ContentCreator
Figure 19. A/B-Testing bar in ContentCreator


At the same time, alongside the variant, a corresponding dispatcher page is created automatically in both the PageStore and the SiteStore in SiteArchitect. The dispatcher page contains a list of all variants involved in the experiment; a reference is also created between it and the metadata of the variants. This generates a two-way assignment.

An experiment can only be created if experiments are allowed for the selected page and the page is not already involved in an experiment; otherwise, the menu item will not be visible. An error-free configuration and an adequate permission definition are also prerequisites. If these are missing, the corresponding menu item will be visible in the ContentCreator but not activated.

These rules also apply for the SiteArchitect, where the Create experiment button is used to create an experiment.

Creating an experiment in the SiteArchitect
Figure 20. Creating an experiment in the SiteArchitect


It can only be used on page references in the SiteStore. Otherwise, it too is hidden from view. The A/B-Testing bar is then displayed in the preview.

A/B-Testing bar in SiteArchitect
Figure 21. A/B-Testing bar in SiteArchitect


The A/B-Testing module functions cannot be used in conjunction with the external FirstSpirit preview.

FirstSpirit version 5.1 supports Internet Explorer versions 8 and 9. If a newer version is used, this will cause problems with the A/B-Testing bar in the SiteArchitect.

FirstSpirit 5.2 and higher will support Internet Explorer versions 10 and 11, so no problems will arise in this regard.

The URL of the original page is technically transferred to the dispatcher page. It is thus ensured that all existing references to the page will continue to function, and that the URL always looks the same to the outside world. This applies regardless of which variant is being displayed to the visitor.

The original page and the variants are also assigned a prefix. This prefix is added to the display name of the corresponding pages and page references. It is used to differentiate the elements that are involved in an experiment in SiteArchitect and to avoid problems when using a URL creator. Provision is made automatically to ensure that the assigned prefix is always unique. This also applies even if a display name has changed.

In the SiteArchitect, the default configuration of Content Highlighting creates a mutually reciprocal relationship between the workspace and the preview. This setting triggers automatic forwarding to one of the variants when the dispatcher page is selected. If this is not required, select Workspace → Preview on the main menu bar under View → Content highlighting control.

8.2. Add variant

Any number of variants can be added by selecting the Add variant button (displayed as a plus sign) or using the duplicate option listed in the drop-down menu of each tab. A tab is displayed for each variant added. Only one variant is required for the story. This variant was created automatically as part of the process to create the experiment.

The changes required according to the assumption formulated above are made to the variant: a banner is added containing an image and a header. The teaser We visit you! is repositioned so that it is in the direct line of sight, and a different image is used (see figure Variant).

Variant
Figure 22. Variant


8.3. Delete variant

Variants can be removed from an experiment by selecting Delete variant from the drop-down menu of each tab on the A/B-Testing bar. The corresponding tab then disappears from view on the bar and the page reference is deleted in the SiteArchitect.

The original page is a special case in this regard. It is the initial reference on which the experiment is based. Therefore, as a general rule, it should not be removed from the experiment. However, if you do wish to delete the original page, you must first confirm a prompt (see figure Deleting the original page in the ContentCreator).

The page of the deleted variant is retained in the SiteArchitect by default. However, if you wish to delete this too, simply activate the corresponding option in the configuration of the project component.

The original page represents a special case in this regard too: If it contains more references, it will be retained, even if the delete option has been activated. If it were not retained, the page associated with the other references would be removed and this would lead to exceptions.

Only variants that are not currently in focus can be deleted in SiteArchitect.

Deleting the original page in the ContentCreator
Figure 23. Deleting the original page in the ContentCreator


8.4. Configuring an experiment

Once all variants have been added with Add variant, click the Edit settings button (displayed as three intermeshing cogwheels) and use the slider in the next dialog that opens to set a percentage value defining how many visitors to the website are to take part in the experiment (see figure Edit settings). If a percentage value is not set, all visitors to the website will always take part in an experiment.

Edit settings
Figure 24. Edit settings


The distribution rate for each variant can also be defined at this point. By default, all variants are displayed with the same distribution rate (100/n %). However, this can be changed freely by the user. The dialog can thus contain both data that has been calculated and data that has been entered manually. To differentiate between these two types of data, calculated values are displayed against a gray background.

If a manual definition only exists for some variants, the difference is divided equally between the remaining variants. If distribution rates have been set for all variants, they are used. The total of all values entered must add up to 100%.

This is illustrated in the examples below.

Example 1. No user input

In the case shown in the figure below, all variants have a distribution rate of 25% each. As this distribution rate has been calculated for all variants, the form fields are displayed against a gray background.

Default uniform distribution
Figure 25. Default uniform distribution




Example 2. User input for just one variant

In the case shown in the figure below, Variant_1 has a user-defined distribution rate of 40% and the other variants have a calculated distribution rate of 20% each.

Distribution with user input
Figure 26. Distribution with user input




Example 3. User input for multiple variants

In the case shown in the figure below, the original page and Variant_1 have user-defined distribution rates of 20% and 30% respectively and variants 2 and 3 have a calculated distribution rate of 25% each.

Distribution with multiple user inputs
Figure 27. Distribution with multiple user inputs
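
The calculation illustrated by these examples can be sketched as follows. This is a hedged reconstruction of the documented behavior, not the module's actual code: user-defined rates are kept, and the remainder up to 100% is divided equally between the variants without a manual value.

```javascript
// Hedged sketch of the documented distribution logic: 'manual' maps
// variant names to user-defined rates; all other variants share the
// remainder equally.
function computeRates(variants, manual) {
    var fixed = 0, open = [];
    variants.forEach(function (name) {
        if (manual.hasOwnProperty(name)) {
            fixed += manual[name];   // user-defined rate
        } else {
            open.push(name);         // rate still to be calculated
        }
    });
    var share = open.length > 0 ? (100 - fixed) / open.length : 0;
    var rates = {};
    variants.forEach(function (name) {
        rates[name] = manual.hasOwnProperty(name) ? manual[name] : share;
    });
    return rates;   // values add up to 100
}
```

With the inputs from Example 3 (Original 20%, Variant_1 30%, four variants in total), the two remaining variants each receive (100 − 50) / 2 = 25%.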




In order to query further information, additional elements can be incorporated into the dialog with the addGui tracking plug-in method. The number of additional elements is determined by the tracking plug-in implemented. In figure Edit settings, for example, the Google Analytics plug-in, which queries a dimension, has been used.

8.5. Starting an experiment

Once all necessary variants have been created and the participation rates and all other settings for the experiment have been configured, click Start to start the experiment. The button triggers the workflow selected during the configuration of the project component. The workflow must release all pages and page references involved in the experiment so that they will go live with the next deployment.

An error-free configuration and permission definition are prerequisites for starting an experiment. If these do not exist, the editor will see a corresponding message.

When the experiment is started via the A/B-Testing bar, the workflow is only executed on the dispatcher page. As described above in chapter API, the workflow must automatically include all variants. This is absolutely essential in particular if the workflow contains a deployment.

Once the workflow has been completed successfully, the name of the button changes to Running and the release is visualized by the status indicators in the ContentCreator and on the A/B-Testing bar changing color in the familiar way (see figure Running experiment).

Running experiment
Figure 28. Running experiment


8.6. Continuing an experiment

In many cases, workflows contain manual steps too. An example of this is the principle of double-checking used in releases. With this approach, the release is initially just requested; it is not processed immediately. During such manual steps, the experiment remains at the workflow stage until it is advanced. The button on the A/B-Testing bar is labeled Continue (see figure Experiment in the workflow). Since the experiment counts as edited and not released in this status, the status is also visualized accordingly.

The experiment can only continue if an error-free configuration and an adequate permission definition have been created. If these do not exist, the editor will see a corresponding message.

Experiment in the workflow
Figure 29. Experiment in the workflow


8.7. Updating an experiment

It is possible to modify an experiment during its runtime. This can be done by editing or deleting an existing variant, for example, or by adding a variant. In this case, the name of the button on the A/B-Testing bar changes from Running to Update and the status is displayed in the familiar colors (see figure Experiment to be updated).

The experiment can only be updated if there is an error-free configuration and an adequate permission definition. If these do not exist, the editor will see a corresponding message.

Experiment to be updated
Figure 30. Experiment to be updated


Depending on the scope, making a change to an experiment during its runtime can affect its result, possibly distorting the data returned at the end of the experiment. Therefore, careful consideration should always be given before making changes and changes should only be made once an experiment is running if they are of high priority.

8.8. Analyzing an experiment

Tracking during the runtime of an experiment is essential in order to analyze the success of each individual variant. In this story, Google Analytics is used for tracking purposes, as it is required in order to use the tracking plug-in supplied with the module. However, in principle, any analytics tool can be used.

In Google Analytics, tracking takes the form of a custom report which is restricted to a segment created in advance. Restriction to the segment means that only visitors who are relevant to an experiment are captured. The report contains an overview of the individual variants restricted to a defined period of time in the form of a diagram and a table (see also figure Google Analytics - Custom report).

It is important to select a sensible runtime for an experiment. If the runtime of an experiment is too short, the information captured will not be statistically relevant. If the runtime of an experiment is too long, there is a risk of potential customers seeing a variant with poor quality content.

While the diagram shows visitor numbers during the defined period of time, the table contains information about the individual variants. It shows how often the form to request a consultation was completed and submitted. The values are indicated as both absolute and percentage values as well as being set against the total number.

For more information about custom reports and segments, refer to Reporting tools in the Google Analytics help.

Google Analytics - Custom report
Figure 31. Google Analytics - Custom report


8.9. Finishing an experiment

Once the variant which comes closest to achieving the predefined aim has been identified, the experiment can be stopped. For the story, figure Google Analytics - Custom report shows that the modified design encouraged more visitors to the website to request a consultation. The variant created has proved to be better than the original page. The analysis confirms the initial assumption made and Variant_1 should be used as the original page. The experiment must be opened again in the ContentCreator and the corresponding tab selected.

An A/B-Testing report containing an overview of all existing experiments has been added to both FirstSpirit clients.

To stop the experiment, click the Finish experiment button (displayed as a flag). A confirmation prompt is displayed (see figure Confirmation prompt). Confirm the prompt to apply the selected variant as the new original page.

The experiment can only finish if there is an error-free configuration and an adequate permission definition. Moreover, none of the elements involved in the experiment may be in a workflow.

The editor is notified accordingly if any of the above conditions is not met.

Confirmation prompt
Figure 32. Confirmation prompt


When the experiment finishes, the A/B-Testing bar disappears from view and the page references of the remaining variants, along with those of the dispatcher, are deleted. The URL of the dispatcher is also transferred technically to the new original page. This ensures that all existing references to the page will continue to function and that the URL always looks the same to the outside world, regardless of which variant was displayed to the visitor during the experiment. In addition to the transfer of the URL, the prefix added to the pages and page references involved when the experiment was created is removed again. As a result, the new original page carries the original display names again. These changes are persisted in the PageStore and the SiteStore by releasing the page and the page reference of the new original page, as well as the respective parent folder.

After an experiment finishes, the pages of the other variants are retained in the SiteArchitect by default. However, if you wish to delete these too, simply activate the corresponding option in the configuration of the project component.

The original page represents a special case in this regard: If it contains more references, it will be retained, even if the delete option has been activated. If it were not retained, the page associated with the other references would be removed and this would lead to exceptions.

9. Glossary


Custom Dimension

The Custom Dimension is used to uniquely identify an experiment, thus enabling a distinction to be made between different experiments. This means that a separate dimension must be created manually for each experiment.

More detailed information is available in the Google Analytics help under Dimensions and measured values.

Dimension Id

The Dimension Id is only needed if you are using the Google Analytics plug-in. It is used to capture statistics from an experiment and must be specified in the settings dialog. To avoid data overlaps, a separate Id is required for each experiment.

Dispatcher

A dispatcher page is created automatically in the PageStore and the SiteStore when an experiment is created. This page is a technical page. The dispatcher page contains a list of all variants involved in the experiment; a reference is also created between it and the metadata of the variants. This generates a two-way assignment.

The URL of the original page is also transferred to the dispatcher page. This means that the URL always looks the same to the outside world. This applies regardless of which variant is being displayed to the visitor. It is thus ensured that all existing references to the page will continue to function.

Google Analytics Id

The Google Analytics Id is only needed if you are using the Google Analytics plug-in. It identifies the associated Google Analytics account and must be entered in the project settings of the project you are using.

10. Legal notices

The A/B-Testing module is a product of e-Spirit AG, Dortmund, Germany.

Only a license agreed upon with e-Spirit AG is valid with respect to the user for using the module.

Details regarding any third-party software products in use but not created by e-Spirit AG, as well as the third-party licenses and, if applicable, update information can be found in the file THIRD-PARTY.txt included with the module.

11. Disclaimer

This document is provided for information purposes only. e-Spirit may change the contents hereof without notice. This document is not warranted to be error-free, nor subject to any other warranties or conditions, whether expressed orally or implied in law, including implied warranties and conditions of merchantability or fitness for a particular purpose. e-Spirit specifically disclaims any liability with respect to this document and no contractual obligations are formed either directly or indirectly by this document. The technologies, functionality, services, and processes described herein are subject to change without notice.