
Radically Better Results

Euros Hogsden

More and more people are becoming aware of the key part that data can play in understanding the customer journey. As part of our ongoing work with clients, we are often asked by Product Owners to pass onsite data into reporting tools, such as Adobe Analytics.

This process varies from client to client. For example, with some clients, we might meet directly with Product Owners (POs) and take their requirements, then go on to enact the full journey right up until handing the insights back to the POs.

For other clients, it might be as little as activating appropriate variables in Analytics, or just advising on whether the internal strategy devised by the client is robust and will do what they want it to do.


Finding the right fit

For a full stack client, the PO might ask us to add an event on completion of an onsite survey. What we would typically do is go to the site, complete the survey, and observe the way it works. This is a personalised process; a single-page application (SPA) survey will need to be handled differently to a survey within an iFrame, which is different to an embedded survey, which is different again to a multi-page survey!

For a simple survey, where the completion loads a new ‘Thank You’ page for example, it’s a simple task to construct some JavaScript via a Tag Management System to pass an Adobe event on that page. For the other survey types, different coding must be used: an SPA for example might need the site developers to include page triggers to tell us when the SPA viewpoint has changed. Once we have that, we can fire our event on ‘hearing’ that trigger occur on-site.
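As a rough illustration of the SPA case, the sketch below pushes a survey-completion event into a data layer that a Tag Management System rule could then map to an Adobe Analytics event. The names `dataLayer`, `surveyComplete`, and `onSpaViewChange` are illustrative assumptions, not a specific client's implementation:

```javascript
// Hedged sketch: the site developers expose a view-change trigger for the SPA;
// on 'hearing' it, we push an event the TMS can map to an Adobe Analytics event.
// All identifiers here are placeholders for illustration only.

// Fall back to a plain array outside the browser so the sketch is self-contained.
const dataLayer =
  (typeof window !== 'undefined' && window.adobeDataLayer) || [];

function trackSurveyComplete(surveyId) {
  // A TMS rule listening for this push would fire the mapped Adobe event.
  dataLayer.push({
    event: 'surveyComplete',
    surveyId: surveyId,
  });
}

// Hypothetical trigger the site developers call whenever the SPA view changes.
function onSpaViewChange(viewName) {
  if (viewName === 'survey-thank-you') {
    trackSurveyComplete('onsite-survey');
  }
}
```

For the simple 'Thank You' page case, the same push would instead run on page load, with no SPA trigger needed.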

We’ll generally discuss this with the client before starting any work, ensuring they are happy with the approach we propose. Any additional work or stumbling blocks are often raised at the outset as well, such as adding triggers or the difficulty of tagging iFrame-based forms. 


Tagging, testing, tracking 

Once the tagging is complete, we test it with the events active (usually on a development site, so no ‘real’ data is generated) to make sure the basic result is sound. At this point, it’s sometimes passed back to the client for internal QA, although some clients rely on us to handle this step as well.
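One common way to keep test traffic out of ‘real’ data is to route hits to a development report suite based on the hostname. This is a minimal sketch of that idea; the suite IDs and hostname patterns are placeholder assumptions, not a specific client's setup:

```javascript
// Hedged sketch: choose a dev or production report suite by hostname,
// so QA on the development site never pollutes live reporting.
// Suite IDs and hostname patterns below are illustrative placeholders.
function selectReportSuite(hostname) {
  const isDev = /^(dev\.|staging\.|localhost)/.test(hostname);
  return isDev ? 'companydevsuite' : 'companyprodsuite';
}
```

A TMS data element could call this once at load time, so every subsequent hit lands in the right suite automatically.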

Once all parties agree on the results of what’s been implemented, a date and time is agreed for getting the changes live. This may involve a few steps, or sometimes just one:

  • If there are on-site changes supporting the release/what we are tracking, these go live first.

  • Then (or sometimes first if there are no on-site changes), we push our tracking changes live.

    • At this point, we would usually do a live check of the tagging before the variables are active in the analytics tool (where possible).

  • The last step is activating the variables in the tracking tool. This ensures, as much as possible, that the final data being delivered is correct. 

    • If the data is being pushed to an already active variable, this step is not needed.

Finally, the work is handed back to the client for their analysis, or we may analyse the data ourselves after some time (a week or more for low-traffic pages/actions, a few hours for higher-traffic items). We can supply the client with our thoughts on the added tracking, or we can allow them to derive their own insights.

Whichever way the process is completed, the insights gained allow the client to understand the onsite journey of their customers more clearly and make improvements in light of this. These insights often make a key difference in increasing engagement, sales and revenue.