FirstSpirit can be used to create complex websites. These websites are usually created based on a range of subjective decisions, and subjective decisions also govern their design.
The design of a website aims to stimulate interaction on the part of visitors to the site by the specific placing of calls-to-action. These calls-to-action might include a request to subscribe to a newsletter, to download a document, or to purchase a product, for example. This approach is designed to maximize the conversion rate. The conversion rate indicates how many visitors respond to a call-to-action and in doing so become potential customers.
To achieve this aim, developers must constantly assess how and where to place calls-to-action on a website in order to have the most resonance with visitors. This can seldom be decided on the basis of pre-existing information; developers must work it out for themselves. The more information developers have about visitors, and the better they know them through this information, the more accurately they will be able to predict their behavior.
This is exactly where the A/B-Testing module comes in. The module can be used to show different visitors several variants of one website in the context of an experiment. The variants of the website are distributed uniformly or according to a predefined rate, and each visitor sees the same variant again when the page is called up subsequently. The individual visitors are therefore unaware that they are taking part in an experiment.
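The distribution described here can be sketched in plain JavaScript. This is a simplified illustration only; the function name and data shape are assumptions, not part of the module:

```javascript
// Sketch of weighted variant selection: distribution rates sum to 1, and a
// random number in [0, 1) is mapped to a variant. In practice, the chosen
// variant would be remembered (e.g. in a cookie) so that the same visitor
// sees the same variant on subsequent visits.
function pickVariant(variants, rnd) {
  let cumulative = 0;
  for (const v of variants) {
    cumulative += v.rate;
    if (rnd < cumulative) {
      return v.name;
    }
  }
  // Guard against floating-point rounding at the upper edge.
  return variants[variants.length - 1].name;
}

// Illustrative experiment: 50/30/20 split across three variants.
const variants = [
  { name: "original", rate: 0.5 },
  { name: "variantA", rate: 0.3 },
  { name: "variantB", rate: 0.2 },
];
```

A uniform distribution is simply the special case in which every variant carries the same rate.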
When the module is used in conjunction with an analytics tool, important statistical information can be gathered about visitors' behavior. This approach is also referred to as data-driven marketing. It provides developers with feedback about the success of each variant of the website. They can see which change to the design achieves the best results in terms of maximizing the conversion rate. At the end of an experiment, developers can apply this change directly by selecting the winning variant as the new original site.
When making changes in the future, developers will no longer have to rely on their gut instincts. They can start other experiments or use information already gathered in order to continuously optimize their websites.
The A/B-Testing module has the following technical requirements:
With the A/B-Testing module, editors can:
The corresponding functions are made available in both ContentCreator and the SiteArchitect preview when the module is installed and configured.
The A/B-Testing module functions cannot be used in conjunction with the external FirstSpirit preview. |
Familiar FirstSpirit tools are used to set up and maintain an experiment. Editors therefore need no knowledge beyond their existing familiarity with FirstSpirit.
An experiment is carried out with a release workflow. The workflow must include all of the pages and page references involved in an experiment. A workflow of this type is supplied with the A/B-Testing module. Based on the BasicWorkflows, it meets the specified requirements. If a project-specific workflow is to be used instead, it must be adapted accordingly with the API provided.
Tracking during the runtime of an experiment is essential in order to analyze the success of each individual variant. This is not part of the module and must be carried out on a project-by-project basis. An analytics tool (freely chosen by the user) is required for this purpose. In this regard, the A/B-Testing module has a completely open architecture and is not limited to specific services.
A tracking format template plug-in regulates the connection between the analytics tool and the FirstSpirit server. A plug-in of this type is supplied with the module as standard. It is set up for Google Analytics and takes little time and effort to configure for immediate use. Developers who wish to use a different analytics tool should implement a corresponding tracking plug-in.
When a page which is part of an experiment is called up, the A/B-Testing module creates cookies. Therefore, we recommend drawing the attention of users to the use of cookies (in accordance with the EU's ePrivacy Directive and - specifically in Germany - Section 15 Paragraph 3 of the Teleservices Act), if you do not already do this. |
The quick start guide outlines the possible statuses of an experiment. It is designed for use by editors and provides them with an initial overview. The content of this chapter has therefore been kept deliberately brief; only the basic functions of the A/B-Testing module are described. For a detailed description of the module and additional information, see the chapters Life cycle of an experiment and Use in FirstSpirit.
During its runtime, each experiment completes a specific cycle. A simplified map of this cycle is shown below (see figure Simple life cycle of an experiment).
The cycle begins with the creation and start of an experiment. This triggers a workflow which must release all of the pages and page references involved in the experiment. On successful completion of the start, the experiment changes status to running. The cycle stops when an experiment is finished and a new original page is selected. A new experiment can potentially be created for this page.
An experiment must be created before it can be carried out. Experiments can be created in various ways in the SiteArchitect and the ContentCreator.
→
from the menu in ContentCreator. If the experiment is to be created for a different page, you must call it up first.
If the selected page reference is already involved in an experiment or if it is not permissible for creating an experiment, the menu item or button will be hidden. If the menu item or button is visible but deactivated, a configuration error has occurred or the permissions for carrying out an experiment are lacking. In this case you must contact the administrator. |
The A/B-Testing module functions cannot be used in conjunction with the external FirstSpirit preview. |
In both clients, the A/B-Testing bar appears when the experiment is created. It contains two tabs: one for the original page and another for the variant created automatically. The variant is selected for immediate editing (see figure A/B-Testing bar in ContentCreator and in the SiteArchitect preview).
Any number of new variants can be added to or existing variants can be deleted from a current experiment. Moreover, a distribution rate can be defined for each variant, along with a participation rate for the entire experiment.
Only variants that are not currently in focus can be deleted in SiteArchitect. |
Once an experiment has been created and edited, it can be started with the button. If the experiment starts successfully, the button turns green and its label changes to .
Under certain circumstances, it is possible to create and edit an experiment without then being able to start it afterwards. A message dialog box appears instead and the experiment remains in its initial status. If the dialog box indicates that the workflow cannot be determined, the selection made when the A/B-Testing module was configured is incorrect. When an experiment is created, the information is simply checked to ensure that it exists, not for potential errors. This check is only run when an attempt is made to start an experiment. In such cases, please contact your administrator. |
A current experiment can be stopped with the button (displayed as a flag). The selected variant is set as the new original page and the A/B-Testing bar disappears from view.
If an element of the experiment is still in the workflow, the experiment cannot be finished. The editor is notified accordingly and the experiment is retained. |
Various components must be installed and configured in order to use the functions supported by the A/B-Testing module. The steps to be completed are described in the following sub-chapters.
Use the abtesting-<version number>.fsm file supplied to add the module on the FirstSpirit Server. To install the module, open the ServerManager and select → .
The main panel contains a list of modules installed on the FirstSpirit Server. Click the corresponding button, select the abtesting-<version number>.fsm file supplied with the module, and confirm your selection. After successful installation, an A/B-Testing folder is added to the list and must be given All permissions (see figure Module management in the server properties).
After any module installation or update, the FirstSpirit Server needs to be restarted. |
A number of templates and project-specific settings are necessary in order to use the A/B-Testing module. Templates can be imported and the remaining configuration settings can be made via the project component, which must be added to the project you are using. To add the project component, open the ServerManager and select → .
A list of all existing project components is displayed in the main panel. Click the corresponding button, select A/B-Testing ProjectApp, and confirm your selection. The project component is then added to the list in the main panel and will need to be configured (see figure Project components in the project properties). To configure the project component, select the entry in the list and open the associated dialog (see figure Configuration dialog for the project component).
Workflow to start the experiment
First, specify a workflow to start an experiment in a combo box in the configuration dialog. This must be a release workflow that is to be created on a project-specific basis and releases all pages and page references involved in an experiment.
The combo box label is displayed in red until a workflow is selected. If a workflow is not specified and/or if any other configuration errors occur, it will not be possible to create an experiment. The corresponding button or menu item will be visible in both clients but not activated. |
Only a null check is performed when creating an experiment. Potentially erroneous information is not relevant to this check and is not taken into account. If the workflow selected in the combo box is deleted within the project, it will still be possible to create an experiment. The correctness of the data is not checked until an attempt is made to start, continue, or update an experiment. In such instances, a message dialog box is displayed for the editor and the experiment is paused in its current status. |
When the experiment is started via the A/B-Testing bar, the workflow is only executed on the dispatcher page. All variants of the experiment must also be included in the workflow automatically (see also chapter API). This is absolutely essential in particular if the workflow contains a deployment. |
Name of the prioritized transition
A workflow often contains several outgoing transitions from a single status. In such cases, the user normally has to decide which of these transitions is to be followed, and the workflow pauses in its current status until this decision is made. Accordingly, the reference name of a transition to be prioritized can be specified for the workflow selected in the project component. This transition is then executed automatically by the workflow and the other transitions are ignored.
If the transition to be prioritized cannot be identified or has not been defined, the transition with the reference name … is executed. If the selected workflow contains just one outgoing transition for each status, no decision has to be made; this transition is always executed and there is no need to specify a prioritized transition in the configuration dialog. |
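The selection rule can be illustrated with a small sketch. This is plain JavaScript for illustration; the function and property names are assumptions, not part of the FirstSpirit API:

```javascript
// Sketch of the prioritized-transition rule: with a single outgoing
// transition no decision is needed; otherwise the transition whose
// reference name matches the configured priority is taken, and null
// signals that the workflow has to pause for a manual decision.
function pickTransition(outgoing, prioritizedName) {
  if (outgoing.length === 1) {
    return outgoing[0];
  }
  const match = outgoing.find(t => t.referenceName === prioritizedName);
  return match || null;
}
```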
Selecting permitted page templates
The configuration dialog also displays a list of all page templates that are available to editors. This means that technical or similar templates that are hidden in the selection lists of the two clients are not displayed here. Select the permissible templates for carrying out an experiment from the list in the configuration dialog. The button or menu item for creating an experiment will be hidden in both clients for page references whose underlying page is based on a template that has not been selected. Use the key combination CTRL + A to select all of the templates in the list. In this case there is no restriction at all and an experiment can be created on every page reference.
Initially, when the project component is configured, no template is selected. Due to this, the name of the list is displayed in red and it is not permissible to carry out experiments. The button or menu item is thus hidden in both clients until at least one template is selected. |
Deleting pages
Next, the user must decide what happens to the pages of the individual variants when variants are deleted from a current experiment or when an experiment is finished. The corresponding checkbox Delete pages is deactivated by default. In this case, the pages of all variants are retained following the removal of a variant from a current experiment or after an experiment finishes. If the checkbox is activated, the pages of the variants that are not being used are deleted.
If the initial original page contains more references to other page references, it will be retained regardless. This still applies even if the Delete pages checkbox is activated. |
The page references of the variants that are not being used and those from the dispatcher are always deleted, regardless of the status of the checkbox. |
Importing templates
Finally, the various templates can be transferred to the project with the button. They provide the functions of the A/B-Testing module to the editor.
A web component must be added to the project. To add a web component, open the ServerManager and select → .
Inside the main panel, various tabs are visible, each containing a list of the existing web components. Select the tabs in succession, click the corresponding button, and choose A/B-Testing WebApp to add it. The web component is added to the list in the main panel (see figure Web components in the project properties).
The web component must be installed on an active web server and then activated. The server can be selected using the selection box.
More detailed information on how to add web components is available in the FirstSpirit Documentation for Administrators.
When different variants are created for an experiment, the associated dispatcher page reference is saved with them. This assignment is made using the metadata.
The template must be specified in the project properties and has to exist in the project for this purpose. If the template does not exist, start by creating it. If the template already exists, open the ServerManager and select → . Then, click the corresponding button to select the metadata template (see figure Options in the project properties) and save the changes you have made.
Tracking during the runtime of an experiment is essential in order to analyze the success of each individual variant. Tracking is not part of the A/B-Testing module and must be carried out on a project-by-project basis. An analytics tool is required for this purpose (users are essentially free to choose which one). In this regard, the A/B-Testing module has a completely open architecture and is not limited to specific services.
The Google Analytics plug-in was added to the project with the import completed during the configuration of the project component. As its name suggests, this tracking plug-in is designed for Google Analytics and, therefore, requires a corresponding account.
If you are already working with another analytics tool, you should consider implementing your own plug-in for tracking purposes. Using the Google Analytics function is only recommended if you also use Google Analytics for other purposes. Otherwise, you should use a different analytics tool. |
Please refer to the supplier's documentation when configuring the analytics tool you have chosen to use.
Once the various components have been installed and configured and an analytics tool has been registered, a number of adaptations must be made in the project you are using. The steps to be completed are described in the following sub-chapters.
An experiment consists of a dispatcher (created automatically) and any number of variants. Each variant must be assigned to the dispatcher page of the corresponding experiment. This assignment is made with the following input component:
Metadata input component.
<CMS_INPUT_TEXT name="md_experiment_uid" hFill="yes" singleLine="no" useLanguages="yes" hidden="yes">
  <LANGINFOS>
    <LANGINFO lang="*" label="Dispatcher" description="Dispatcher"/>
  </LANGINFOS>
</CMS_INPUT_TEXT>
The input component is hidden. It must be added to the metadata template of the project you are using. The UID of the dispatcher page reference of the experiment is written to it automatically when a new variant is created.
If a metadata template does not already exist in the project, one must be created.
Select the metadata template in the project properties.
More information about metadata is available in the FirstSpirit Documentation for Administrators.
The functions of the A/B-Testing module are provided via four format templates. These templates must be referenced in the HTML code of all permissible page templates. The A/B-Testing Head, A/B-Testing Body, and Traffic Allocation Plug-in templates were added to the FirstSpirit project during the import completed when the project component was configured. The fourth template is the tracking plug-in to be integrated, made available as the A/B-Testing Tracking format template; it is also needed to show and hide the A/B-Testing bar. Unless the Google Analytics plug-in supplied with the module is used, the tracking plug-in must be implemented on a project-by-project basis. Among other things, the plug-in contains the necessary tracking code which enables the success of the variants involved in an experiment to be determined. It is also used to query other specific data that might be available.
All four format templates are essential to the use of the A/B-Testing functions. They must be referenced in the page template you are using with CMS_RENDER calls:
Referencing format templates in the page template.
<head>
  [...]
  $CMS_RENDER(script:"has_experiment", pageref: #global.node)$
  $CMS_IF(hasExperiment)$
    $CMS_RENDER(template:"abtesting_head")$
    $CMS_RENDER(template:"traffic_allocation_plugin")$
    $CMS_RENDER(template:"TRACKING PLUGIN REFERENCENAME")$ ❶
  $CMS_END_IF$
</head>
<body>
  $CMS_RENDER(template:"abtesting_body")$
  [...]
</body>
The reference name of the tracking plug-in to be integrated must be entered here. |
In addition to the format templates, the JSTL tag library and the JSP page directive below must be integrated. However, unlike the format templates, both calls should be placed at the start of the HTML code. If this is a JSP project, the lines may already be there.
JSTL tag library and JSP page directive.
<%@ page language="java" contentType="text/html; charset=$CMS_VALUE(#global.encoding)$" %>
<%@ taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c" %>
It must also be ensured that all variants involved in an experiment are JSP pages. To make sure of this, the file extension jsp must be defined in the Properties of the corresponding page template (see figure Defining a file extension).
It is essential for editorial permissions to be in place in order to execute the various functions supported by the A/B-Testing module. The table below explains which permissions are needed during the runtime of the experiment and which function belongs to each permission.
Permission | Content | Site | Explanation |
---|---|---|---|
Visible/Read | | | The ability to view objects and read their content is required as standard. |
Change | | | Various steps in an experiment need the ability to change objects. |
Create object | | | The dispatcher and the first variant are created automatically when an experiment is created. Each additional variant must be added manually. In both cases, the permission to create objects is required. |
Create folder | | | This permission is not needed to carry out experiments. |
Remove object | | | When an experiment finishes, the project is purged of the page references and, depending on the configuration, of the pages of the discarded variants and the dispatcher. |
Remove folder | | | No folders are deleted in connection with an experiment. |
Release | | | Various objects are released when an experiment starts, finishes, and is continued. |
Show metadata | | | The visibility of metadata is fundamental when carrying out an experiment. If it is not visible, an experiment cannot be started, continued, updated, or finished. The view permission is also needed to edit settings. |
Change metadata | | | When an experiment is created, as well as new variants, a two-way link is generated between the variants and the associated dispatcher. Since the link is stored in the metadata of the respective page references, it must be possible to change that data. |
Change permissions | | | This permission is not needed to carry out experiments. |
Select the root node of the PageStore or the SiteStore to distribute the editorial permissions. Then, right-click to open the context menu, select → , and open Permission assignment. Assign the relevant permissions to the editors of your project in the dialog that appears.
When an experiment finishes, the page references and, depending on the configuration, the pages of the discarded variants and the dispatcher are removed from the project. The URL of the dispatcher is also transferred to the new original page. |
The workflow selected in the project component is executed on the dispatcher when carrying out an experiment via the A/B-Testing bar. However, it can also be carried out manually on the individual variants using the general FirstSpirit functions. In both cases, it is essential to release all pages and page references involved in the experiment in order to avoid errors during generation and when live.
The module is supplied with an example workflow which meets this requirement. However, it is also possible to adapt an existing project-specific workflow. Both options are full-fledged alternatives. They are described in the following sub-chapters.
Only one workflow at a time can be executed per experiment. This means:
In both cases, the editor is notified accordingly and the corresponding workflow cannot be started. |
When an experiment is carried out, all of the pages and page references involved must always be released. If a project-specific workflow was selected during the configuration of the project component, its function must be extended accordingly. The A/B-Testing API provides a number of methods for doing this; these are described briefly below. The abtesting-api folder containing the Javadoc documentation is also included in the scope of supply.
- The … method returns true or false.
- If the … method returns true, the page reference is a variant.
- If the … method returns true, the execution of the workflow started manually on the variant must be canceled. If it was not canceled, status displays might be incorrect and there may be inconsistencies after going live.
- The … method likewise returns true in this case.
The methods can be used in a script that is to be added to an activity of the release workflow being used. It makes sense to select the first activity in this case.
The A/B-Testing module is supplied with a workflow which meets the requirements described. To use it, select it in the configuration dialog for the project component. Since this workflow is based on the BasicWorkflows, these are a prerequisite of the project.
Installing the BasicWorkflows module
If the BasicWorkflows are not already part of the project, before using the workflow supplied, you must install the BasicWorkflows module on the FirstSpirit Server and activate the web component. Proceed as described for installing the A/B-Testing module and activating the associated web component. However, the web component for the BasicWorkflows is only needed on the ContentCreator tab.
In addition, the workflow must be activated for ContentCreator by selecting the Element Status Provider supplied. To do this, open the ServerManager and select → . Then change the Element Status Providers entry to the BasicWorkflows Status Provider entry and confirm the change.
Importing scripts and the workflow
Next, the BasicWorkflows scripts used for the release workflow must be added to the project. To do this, go to the Template Store in SiteArchitect and select the import option from the context menu. This opens an import dialog box where you should select the export_basic_release import file from the workstation directories. Confirm your selection; a second dialog is displayed listing all of the elements contained in the file. Since only the scripts are needed, the import of the release workflow can be deactivated (see figure Importing scripts). Finally, repeat the same process to import the workflow supplied with the A/B-Testing module.
The imported workflow must now be authorized in the individual stores so that it can be executed on FirstSpirit elements. To do this, select Change permissions from the context menu of the stores' root nodes to call up the permission assignment. Next, on the Workflow permissions tab, activate the Authorized and Use release permission checkboxes for the imported workflow. To finish, close the dialog.
Functionality
Once installation and import processes are complete, the workflow can be used in the project. The workflow can essentially be executed on various elements:
Only one workflow can ever be executed per experiment. This means:
In both cases, the editor is notified accordingly and the corresponding workflow cannot be started. |
If the workflow is executed manually using the general FirstSpirit function, it is necessary to also manually reload the preview in order to update the status of the A/B-Testing bar. |
You can find more detailed information on the BasicWorkflows in the associated documentation.
An experiment is only worthwhile if it returns a meaningful result. This depends in turn on the success of the variants. Tracking during the runtime of the experiment is essential in order to analyze the success of each individual variant.
Tracking is not part of the A/B-Testing module and must be carried out on a project-by-project basis. In addition to an analytics tool that can be freely selected by the user, a tracking plug-in is required in the project for this purpose. The plug-in corresponds to a format template and must contain the tracking code, among other things.
The Google Analytics plug-in was added to the project with the import completed during the configuration of the project component. In order to use it, you must complete just a few configuration steps which are described in the next sub-chapter. You should only consider using the plug-in if you are also using Google Analytics for other purposes.
Otherwise, we recommend choosing a different analytics tool. In this case, a project-specific tracking plug-in is required. Developers can implement this plug-in easily. The requirements to be met are also described below.
The Google Analytics plug-in is supplied with the module. Designed for Google Analytics, as its name suggests, the plug-in is added to the project with the import completed during the configuration of the project component.
In addition to having a Google Analytics account, you simply need to add some information to the project settings template and select it in the project properties. It needs to be referenced in the page template too.
The plug-in also requires a Google Analytics Id and a Dimension Id:
- The Google Analytics Id must be saved in the project settings.
- The Dimension Id must be specified in the settings dialog when configuring an experiment.
All of the necessary steps are described below.
A Google Analytics account is required in order to use the Google Analytics plug-in that is supplied with the module. If you do not have an account, you can register here: http://www.google.com/analytics/
A wizard guides you through the steps of the registration process that must be completed in order to create an account.
Confirm the Google Analytics terms of use to complete the registration process.
More information about configuration is available in the Google Analytics help.
The Google Analytics plug-in is imported into the project during the configuration of the project component. It is made available as a format template and contains the tracking code for capturing the success of the variant. It also serves to query the Dimension Id, which must be entered when configuring an experiment.
In order to use the Google Analytics plug-in, it must be integrated in all permissible page templates by means of a CMS_RENDER call. It is important to note that the call must be positioned in the page header after the reference to the A/B-Testing Head template.
Referencing the Google Analytics plug-in in the page template.
<head>
  [...]
  $CMS_RENDER(script:"has_experiment", pageref: #global.node)$
  $CMS_IF(hasExperiment)$
    $CMS_RENDER(template:"abtesting_head")$
    $CMS_RENDER(template:"traffic_allocation_plugin")$
    $CMS_RENDER(template:"google_analytics_plugin")$
  $CMS_END_IF$
</head>
<body>
  $CMS_RENDER(template:"abtesting_body")$
  [...]
</body>
The Google Analytics plug-in supplied with the module uses the following input component, which must be added to the project settings template. If a project settings template does not already exist in the project, one must be created. The template must also be selected in the project properties.
Project settings input component.
<CMS_GROUP>
  <LANGINFOS>
    <LANGINFO lang="*" label="A/B-Testing"/>
  </LANGINFOS>
  <CMS_INPUT_TEXT name="ab_googleid" hFill="yes" singleLine="no" useLanguages="no">
    <LANGINFOS>
      <LANGINFO lang="*" label="Google Analytics Id"/>
    </LANGINFOS>
  </CMS_INPUT_TEXT>
</CMS_GROUP>
Within the project, the Google Analytics Id must be added to the project settings by entering it in this field. The Id is used by the plug-in; it enables the variants involved in an experiment to be tracked by Google Analytics.
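For illustration only, a tracking call in classic analytics.js syntax could pass the delivered variant to Google Analytics as a custom dimension. The helper below is a sketch, not the code of the supplied plug-in; the dimension index and the injected ga function are assumptions that keep the example self-contained:

```javascript
// Sketch: report the delivered variant through a Custom Dimension using
// classic analytics.js calls (create / set / send). The ga function is
// injected so the example can run without the real analytics.js loader.
function trackVariant(ga, trackingId, dimensionIndex, variantName) {
  ga('create', trackingId, 'auto');
  ga('set', 'dimension' + dimensionIndex, variantName);
  ga('send', 'pageview');
}
```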
You will find the Google Analytics Id in your Google Analytics account (see figure Path to find the Google Analytics Id). |
If you are using the Google Analytics plug-in, in addition to the metadata template, the project settings template also has to be added to the project properties. The template is used for the provision of the Google Analytics Id; it must exist in the project and be expanded as described in the previous chapter. If the template does not exist, complete these steps first.
If the template already exists, open the ServerManager and select → . Then click the corresponding button to select the project settings (see figure Options in the project properties) and save the changes you have made.
The Google Analytics plug-in adds a text box to the settings dialog of an experiment. Enter the Id of the Custom Dimension created in Google Analytics in this box.
To do this, open the settings dialog via the A/B-Testing bar (see figure Edit settings). Then enter the Dimension Id in the designated text box and close the dialog to save your entry.
More detailed information about Custom Dimensions is available in the Google Analytics help under Dimensions and measured values.
If you are using a different analytics tool for tracking, this can be added as a tracking plug-in at any time on a project-by-project basis. In this regard, the A/B-Testing module has a completely open architecture and is not limited to specific services.
The following code corresponds to the basic implementation of this type of tracking plug-in. The general structure of the code is shown:
Base code.
<script>
  var myPlugin=(function(){
    $-- init variables --$
    var myVar; ❶

    return {
      $-- add gui --$
      addGui: function (container) { ❷
        myVar=('myPlugin' in config && 'myVar' in config['myPlugin']) ? config['myPlugin'].myVar : "";
      },

      $-- store additional params --$
      storeParams: function () { ❸
        var addParams={
          $-- new parameter: store --$
          "myVar": myVar
        };
        return addParams;
      },

      $-- add analytics code --$
      addTrackingCode: function (variant) { ❹
        var trackingId='$CMS_RENDER(script:"getconfiguration", param:"myPlugin:myVar", srcUid:#global.node.uid)$';
      }
    };
  })();

  pluginRegistry.register('myPlugin', myPlugin); ❺
</script>
❶ Global variables (in this case myVar) are initialized at the start of the tracking plug-in code.
❷ These global variables are used by the addGui method, which expands the configuration dialog.
❸ This is followed by persistence, in the storeParams method, of the values entered in the configuration dialog.
❹ They can also be passed to the tracking code, which is added with the addTrackingCode method.
❺ Finally, registration of the tracking plug-in must be completed with the pluginRegistry.register function.
Every tracking plug-in must implement the addGui, storeParams, and addTrackingCode methods.
The addGui(container)
method is used to expand the configuration dialog. Click the button on the A/B-Testing bar to open this dialog box. It supports a number of configuration options as standard. These options are all specific to the associated experiment and, therefore, cannot be set globally (in the project properties, for example).
To expand the dialog, a DomElement (container
) is passed to the method. Additional DomElements can be incorporated into this DomElement. If default values saved upstream are to be written to form fields, these values can be read from the config
object.
Populating fields with default values.
variable = ('pluginname' in config) ? config['pluginname'].variable : "";
It is essential that all variables used for saving are initialized at the start of the tracking plug-in code. Otherwise, persistence of the information entered in the configuration dialog by the editor will not be possible.
Initialization.
var pluginname=( function(){
    $-- init variables --$
    var variable;
    return {
        $-- add gui --$
        addGui: function (container) {
            [...]
        },
        [...]
    };
})();
The tracking plug-in must be told which values are to be persisted. This is done with a map. The values in the map must use the following syntax:
Syntax of map entries.
"<NAME>":<VALUE>
The map is created in the storeParams
method and returned by it so that the persisted values are subsequently available on the page. They are used by the dispatcher page and transferred in a hidden input component.
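As an illustration, a storeParams implementation that persists two values might look like the following sketch. The variable names myGoal and myThreshold are hypothetical and serve only to show the "<NAME>":<VALUE> map contract described above; they are not part of the module API.

```javascript
// Hypothetical sketch: myGoal and myThreshold are illustrative names only;
// in a real plug-in they would be filled in by the addGui method.
var myGoal = "request-consultation";
var myThreshold = "10";

function storeParams() {
    // Each map entry follows the "<NAME>":<VALUE> syntax; the returned map
    // is later used by the dispatcher page and transferred in a hidden
    // input component.
    var addParams = {
        "myGoal": myGoal,
        "myThreshold": myThreshold
    };
    return addParams;
}
```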
Every analytics tool generally has a tracking code which is used to capture interactions on the website. The tracking code must be added to the tracking plug-in with the addTrackingCode(variant)
method. The variant
parameter is also provided for the purpose of identifying the individual variants involved in an experiment. It contains the Id of the variant displayed in each case.
If the addGui and storeParams methods have been used to capture and persist other information, this information can be retrieved at this point and also passed to the analytics tool. This requires a CMS_RENDER
call containing the plug-in name as a parameter as well as the corresponding variable name and the Uid of the generated page, separated by a colon.
CMS_RENDER call.
$CMS_RENDER(script:"getconfiguration", param:"pluginname:variable", srcUid:#global.node.uid)$
Finally, to link the selected analytics tool to the functions of the A/B-Testing module, the implemented tracking plug-in must be registered. The pluginRegistry object required to do this can be found in the A/B-Testing Head format template, which was imported with the configuration of the project component. It is incorporated into the page templates upstream of the tracking plug-in and provides the register(name,plugin) method. The plug-in name and the tracking plug-in itself must be passed to this method.
Tracking plug-in registration.
pluginRegistry.register('pluginname', pluginname);
During its runtime, each experiment completes a specific cycle. This cycle is shown schematically below (see figure Life cycle of an experiment).
To keep the graphic simple, it was assumed that once submitted, a release request will always be processed and never rejected. |
The cycle begins with the creation
and start
of an experiment. This triggers a workflow which must release all of the pages and page references involved in the experiment.
In many cases, release workflows are processed in accordance with the principle of double-checking. This means that the release is initially just requested; it is not processed immediately. The experiment must then continue
before progressing to the running
stage. In the case of immediate release this step is omitted, as the experiment progresses directly to running.
It is possible to modify an experiment during its runtime. This can be done by editing or deleting an existing variant, for example, or adding a variant. After this, the experiment must be updated
(at this point, the release can only be requested again).
The cycle stops when an experiment finishes. This step can be taken from within a modified experiment or a running experiment. At the end of an experiment, a new original page is selected, for which a new experiment can potentially be created.
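The cycle described above can be modeled as a small state machine. The following sketch is purely illustrative (the state and action names follow the wording of this chapter, not any actual module API) and, like the figure, it omits the intermediate workflow stage that occurs when a release is only requested:

```javascript
// Illustrative model of the experiment life cycle; not module code.
// For simplicity, the intermediate workflow stage (release requested but
// not yet processed) is omitted, mirroring the assumption in the figure.
function nextState(state, action) {
    var transitions = {
        created:  { start:  "running" },                     // triggers the release workflow
        running:  { modify: "modified", finish: "finished" },
        modified: { update: "running",  finish: "finished" } // release is requested again
    };
    var allowed = transitions[state] || {};
    if (!(action in allowed)) {
        throw new Error("action '" + action + "' not allowed in state '" + state + "'");
    }
    return allowed[action];
}
```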
Installing the A/B-Testing module makes various functions for carrying out experiments available in both FirstSpirit clients. These functions are equivalent in both clients. They are described below using a story. The example focuses on the ContentCreator. However, in principle, an experiment can be carried out in either FirstSpirit client.
The most fundamental permission for the steps described below, in addition to the other permissions required, is the permission to view the metadata. If the metadata is not visible, an experiment cannot be started, continued, updated, or finished. The view permission is also needed to edit settings. |
The story starts on the Services page of the Mithras Energy demo project, the content of which includes a teaser to prompt users to request a consultation. A closer look at this initial page shows that its design is not at all eye-catching. It does not contain a header and the teaser We visit you! is not in the direct line of sight of the viewer (see figure Original page).
This gives rise to the assumption that changing the design would encourage more visitors to request a consultation.
To check this initial assumption, a variant of the original page is created with the →
menu item. This variant and the original page are displayed as tabs on the A/B-Testing bar which then appears, with the variant selected for immediate editing (see figure A/B-Testing bar in ContentCreator).
At the same time, a corresponding dispatcher page is created automatically alongside the variant in both the PageStore and the SiteStore in the SiteArchitect. The dispatcher page contains a list of all variants involved in the experiment; a reference is also created between it and the metadata of the variants. This generates a two-way assignment.
An experiment can only be created if experiments are allowed to be carried out for the selected page and it is not already involved in an experiment. If experiments are not allowed or if the page is already involved in an experiment, the menu item will not be visible. An error-free configuration and an adequate permission definition are also prerequisites. In their absence, the corresponding menu item will be visible in the ContentCreator but not activated. These rules also apply for the SiteArchitect, where the button is used to create an experiment (see Figure 20, Creating an experiment in the SiteArchitect). It can only be used on page references in the SiteStore. Otherwise, it too is hidden from view. The A/B-Testing bar is then displayed in the preview. |
The A/B-Testing module functions cannot be used in conjunction with the external FirstSpirit preview. |
FirstSpirit version 5.1 supports Internet Explorer versions 8 and 9. If a newer version is used, this will cause problems with the A/B-Testing bar in the SiteArchitect. FirstSpirit 5.2 and higher will support Internet Explorer versions 10 and 11, so no problems will arise in this regard. |
The URL of the original page is technically transferred to the dispatcher page. It is thus ensured that all existing references to the page will continue to function, and that the URL always looks the same to the outside world. This applies regardless of which variant is being displayed to the visitor. The original page and the variants are also assigned a prefix. This prefix is added to the display name of the corresponding pages and page references. It is used to differentiate the elements that are involved in an experiment in SiteArchitect and to avoid problems when using a URL creator. Provision is made automatically to ensure that the assigned index is always unique. This also applies even if a display name has changed. |
In the SiteArchitect, the default configuration of Content Highlighting creates a mutually reciprocal relationship between the workspace and the preview. This setting triggers automatic forwarding to one of the variants when the dispatcher page is selected. If this is not required, select |
Any number of variants can be added by selecting the button (displayed as a plus sign) or using the duplicate option listed in the drop-down menu of each tab. A tab is displayed for each variant added. Only one variant is required for the story. This variant was created automatically as part of the process to create the experiment.
The changes required according to the assumption formulated above are made to the variant: a banner is added containing an image and a header. The teaser We visit you! is repositioned so that it is in the direct line of sight, and a different image is used (see figure Variant).
Variants can be removed from an experiment by selecting the corresponding option from the drop-down menu of each tab on the A/B-Testing bar. The corresponding tab then disappears from view on the bar and the page reference is deleted in the SiteArchitect.
The original page is a special case in this regard. It is the initial reference on which the experiment is based. Therefore, as a general rule, it should not be removed from the experiment. However, if you do wish to delete the original page, you must first confirm a prompt (see figure Deleting the original page in the ContentCreator).
The page of the deleted variant is retained in the SiteArchitect by default. However, if you wish to delete this too, simply activate the corresponding option in the configuration of the project component. The original page represents a special case in this regard too: If it contains more references, it will be retained, even if the delete option has been activated. If it were not retained, the page associated with the other references would be removed and this would lead to exceptions. |
Only variants that are not currently in focus can be deleted in SiteArchitect. |
Once all variants have been added with Add variant, click the button (displayed as three intermeshing cogwheels) and use the slider in the next dialog that opens to set a percentage value defining how many visitors to the website are to take part in the experiment (see figure Edit settings). If a percentage value is not set, all visitors to the website will always take part in an experiment.
The distribution rate for each variant can also be defined at this point. By default, all variants are displayed with the same distribution rate (100/n %
). However, this can be changed freely by the user. The dialog can thus contain both data that has been calculated and data that has been entered manually. To differentiate between these two types of data, calculated values are displayed against a gray background.
If a manual definition only exists for some variants, the difference is divided equally between the remaining variants. If distribution rates have been set for all variants, they are used. The total of all values entered must add up to 100%.
This is illustrated in the examples below.
In the case shown in the figure below, all variants have a distribution rate of 25% each. As this distribution rate has been calculated for all variants, the form fields are displayed against a gray background.
In the case shown in the figure below, Variant_1 has a user-defined distribution rate of 40% and the other variants have a calculated distribution rate of 20% each.
In the case shown in the figure below, the original page and Variant_1 have user-defined distribution rates of 20% and 30% respectively and variants 2 and 3 have a calculated distribution rate of 25% each.
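The distribution rule described above can be expressed as a short calculation. The following function is an illustrative model of that rule, not module code: variants with a user-defined rate keep it, and the remaining percentage is divided equally between the other variants.

```javascript
// Illustrative model of the distribution rule described above (not module code).
// manualRates maps variant names to user-defined percentages; variants without
// an entry receive an equal share of the remaining percentage.
function distributionRates(variants, manualRates) {
    var manualTotal = 0, autoCount = 0;
    variants.forEach(function (v) {
        if (v in manualRates) { manualTotal += manualRates[v]; }
        else { autoCount++; }
    });
    // Divide the difference to 100% equally between the remaining variants.
    var share = autoCount > 0 ? (100 - manualTotal) / autoCount : 0;
    var rates = {};
    variants.forEach(function (v) {
        rates[v] = (v in manualRates) ? manualRates[v] : share;
    });
    return rates;
}
```

For the third example above, distributionRates(["Original", "Variant_1", "Variant_2", "Variant_3"], {"Original": 20, "Variant_1": 30}) yields a calculated rate of 25% each for the two remaining variants.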
In order to query further information, additional elements can be incorporated into the dialog with the addGui tracking plug-in method. The number of additional elements is determined by the tracking plug-in implemented. In figure Edit settings, for example, the Google Analytics plug-in, which queries a dimension, has been used.
Once all necessary variants have been created and the participation rates and all other settings for the experiment have been configured, click
to start the experiment. The button triggers the workflow selected during the configuration of the project component. The workflow must release all pages and page references involved in the experiment so that they will go live with the next deployment.
An error-free configuration and permission definition are prerequisites for starting an experiment. If these do not exist, the editor will see a corresponding message. |
When the experiment is started via the A/B-Testing bar, the workflow is only executed on the dispatcher page. As described above in chapter API, the workflow must automatically include all variants. This is absolutely essential in particular if the workflow contains a deployment. |
Once the workflow has been completed successfully, the name of the button changes and the release is visualized by the status indicators in the ContentCreator and on the A/B-Testing bar, changing color in the familiar way (see figure Running experiment).
In many cases, workflows contain manual steps too. An example of this is the principle of double-checking used in releases. With this approach, the release is initially just requested; it is not processed immediately. During such manual steps, the experiment is at the workflow stage until it advances. The button on the A/B-Testing bar is relabeled (see figure Experiment in the workflow). Since the experiment counts as edited and not released in this status, the status is also visualized accordingly.
The experiment can only continue if an error-free configuration and an adequate permission definition have been created. If these do not exist, the editor will see a corresponding message. |
It is possible to modify an experiment during its runtime. This can be done by editing or deleting an existing variant, for example, or by adding a variant. In this case, the name of the button on the A/B-Testing bar changes and the status is displayed in the familiar colors (see figure Experiment to be updated).
The experiment can only be updated if there is an error-free configuration and an adequate permission definition. If these do not exist, the editor will see a corresponding message. |
Depending on the scope, making a change to an experiment during its runtime can affect its result, possibly distorting the data returned at the end of the experiment. Therefore, careful consideration should always be given before making changes and changes should only be made once an experiment is running if they are of high priority. |
Tracking during the runtime of an experiment is essential in order to analyze the success of each individual variant. In this story, Google Analytics is used for tracking purposes, since the tracking plug-in supplied with the module requires it. However, in principle, any analytics tool can be used.
In Google Analytics, tracking takes the form of a custom report
which is restricted to a segment
created in advance. Restriction to the segment means that only visitors who are relevant to an experiment are captured. The report contains an overview of the individual variants restricted to a defined period of time in the form of a diagram and a table (see also figure Google Analytics - Custom report).
It is important to select a sensible runtime for an experiment. If the runtime of an experiment is too short, the information captured will not be statistically relevant. If the runtime of an experiment is too long, there is a risk of potential customers seeing a variant with poor quality content. |
While the diagram shows visitor numbers during the defined period of time, the table contains information about the individual variants. It shows how often the form to request a consultation was completed and submitted. The values are indicated as both absolute and percentage values as well as being set against the total number.
For more information about custom reports
and segments
, refer to Reporting tools
in the Google Analytics help.
Once the variant which comes closest to achieving the predefined aim has been identified, the experiment can be stopped. For the story, figure Google Analytics - Custom report shows that the modified design encouraged more visitors to the website to request a consultation. The variant created has proved to be better than the original page. The analysis confirms the initial assumption made and Variant_1 should be used as the original page. The experiment must be opened again in the ContentCreator and the corresponding tab selected.
An A/B-Testing report containing an overview of all existing experiments has been added to both FirstSpirit clients. |
To stop the experiment, click the button (displayed as a flag). A confirmation prompt is displayed (see figure Confirmation prompt). Confirm the prompt to apply the selected variant as the new original page.
The experiment can only finish if there is an error-free configuration and an adequate permission definition. Moreover, none of the elements of the experiment must be in a workflow. The editor is notified accordingly if either of the above conditions are not met. |
When the experiment finishes, the A/B-Testing bar disappears from view and the page references of the remaining variants, along with those of the dispatcher, are deleted. The URL of the dispatcher is also transferred technically to the new original page. This ensures that all existing references to the page will continue to function and that the URL always looks the same to the outside world. This applies regardless of which variant was displayed to the visitor during the experiment. In addition to the transfer of the URL, the prefix added to the pages and page references involved when creating the experiment is removed again. As a result, the new original page carries the original display names. These changes are persisted in the PageStore and the SiteStore by releasing the page and the page reference of the new original page, as well as the respective parent folder.
After an experiment finishes, the pages of the other variants are retained in the SiteArchitect by default. However, if you wish to delete these too, simply activate the corresponding option in the configuration of the project component. The original page represents a special case in this regard: If it contains more references, it will be retained, even if the delete option has been activated. If it were not retained, the page associated with the other references would be removed and this would lead to exceptions. |
Term | Definition
---|---
Custom Dimension | A dimension created in Google Analytics. The Google Analytics plug-in adds a text box to the settings dialog of an experiment in which its Id must be entered.
Dimension Id | The Id of the Custom Dimension created in Google Analytics. It is entered in the designated text box in the settings dialog of an experiment.
Dispatcher | A dispatcher page is created automatically in the PageStore and the SiteStore when an experiment is created. This page is a technical page. The dispatcher page contains a list of all variants involved in the experiment; a reference is also created between it and the metadata of the variants. This generates a two-way assignment.
Google Analytics Id | The Id required for tracking with Google Analytics. It is provided via the project settings template that is added to the project properties.
The A/B-Testing module is a product of e-Spirit AG, Dortmund, Germany.
Only a license agreed upon with e-Spirit AG is valid with respect to the user for using the module.
Details regarding any third-party software products in use but not created by e-Spirit AG, as well as the third-party licenses and, if applicable, update information can be found in the file THIRD-PARTY.txt
included with the module.
This document is provided for information purposes only. e-Spirit may change the contents hereof without notice. This document is not warranted to be error-free, nor subject to any other warranties or conditions, whether expressed orally or implied in law, including implied warranties and conditions of merchantability or fitness for a particular purpose. e-Spirit specifically disclaims any liability with respect to this document and no contractual obligations are formed either directly or indirectly by this document. The technologies, functionality, services, and processes described herein are subject to change without notice.