CXL Growth Marketing Minidegree: Week 3


This is the third of 12 articles describing my journey through the Growth Marketing Minidegree at the CXL Institute.

This week, I continued with the Conversion Research course by Peep Laja, the founder of the CXL Institute. I have to say that his lessons are remarkable. He’s a great instructor, always including great examples and reflecting on his errors from the past, mentioning real-world problems he worked on for his clients.

The next thing to learn about was live chat transcripts, and how important it is to keep them in your database, as they can provide many valuable insights about your audience.

User testing

The next lesson covers user testing and how it differs from A/B testing. A/B testing is carried out on a live website with real users who are unaware that they are being tested. User testing, on the other hand, typically involves watching recruited testers perform a series of tasks on a website. It can be done in person or remotely over the internet, capturing the tester's screen and voice as they think aloud through their process.

When to conduct user testing?

  • Whenever you begin optimizing a new website
  • Whenever you alter a vital aspect of your website
  • When you’ve completed a design makeover, but before you go live

According to the course, there are three ways to run usability tests:

  1. Over-the-shoulder testing
  2. Unmoderated remote testing
  3. Moderated remote testing

Mouse tracking

Mouse tracking tools track mouse movements and clicks on a website. Most tools have three ways to display mouse data: attention heat maps (generated by an algorithm based on mouse cursor movements), click maps (which show the page’s aggregated click activity), and scroll maps (how far down people scroll).


Ring model

This model, created by Craig Sullivan, is a way of looking at the 'layers' or 'steps' of engagement a visitor has reached. It works for many websites (but not all) and depends on the depth of interaction rather than the number of pages visited.

Funnels and goal flows

You must set up funnels, no matter how difficult it is or how many regular expressions you have to write. If funnels were already in place when you arrived, don't take them for granted: double-check them.
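Since funnel steps are usually defined with regular expressions, here is a minimal sketch of the kind of pattern a checkout funnel step might use. The URLs and the pattern itself are made up for illustration, not taken from the course:

```python
import re

# Hypothetical funnel step for a checkout flow: match the checkout
# page itself, any sub-page, and optional query parameters, e.g.
#   /checkout, /checkout/payment, /checkout/payment?step=2
checkout_step = re.compile(r"^/checkout(/\w+)?(\?.*)?$")

print(bool(checkout_step.match("/checkout/payment?step=2")))  # True
print(bool(checkout_step.match("/cart")))                     # False
```

The point of anchoring with `^` and `$` is that the step matches only real checkout URLs instead of anything that merely contains the word "checkout".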

Goal flows are similar but not identical. In terms of data reporting, they have a few significant differences:

  • The ability to see the actual user flow
  • The option to segment by goal

Key audience insights

Here are some of the most important things to pay attention to:

  • Audience: country, region, city
  • Behavior: visitor type
  • Demographics: age, gender, interests

Content reports

This part is about identifying pages that perform better than others by creating reports in Google Analytics. With content reports, you mostly look at landing pages and analyze their performance. Top landing pages with a higher bounce rate than the site average can be added to a list of issue pages to investigate. The aim is to find landing pages with a lot of traffic but poor results.

When looking at content reports, pay attention to navigation summary and in-page analytics.
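To make the "high traffic, poor results" check concrete, here is a small sketch in Python. The page paths, session counts, bounce rates, and the 5,000-session traffic threshold are all made-up assumptions:

```python
# Hypothetical landing-page export: (page, sessions, bounce rate).
pages = [
    ("/home", 12000, 0.42),
    ("/pricing", 8000, 0.71),
    ("/blog/guide", 15000, 0.68),
    ("/about", 900, 0.80),
]

# Session-weighted site-wide average bounce rate.
total_sessions = sum(s for _, s, _ in pages)
site_avg_bounce = sum(s * b for _, s, b in pages) / total_sessions

# Flag high-traffic pages that bounce worse than the site average;
# the 5,000-session threshold is an arbitrary cut-off.
issue_pages = [page for page, sessions, bounce in pages
               if sessions >= 5000 and bounce > site_avg_bounce]
print(issue_pages)  # ['/pricing', '/blog/guide']
```

Weighting the average by sessions matters here: a tiny page with a terrible bounce rate (like /about) shouldn't drag the benchmark around.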

Custom reports

You must be able to generate any report that is needed. If you’re scared of them, then go in and play around until you get what you need. Here are some custom reports mentioned in the course:

  • Conversion rate per time of day
  • Conversion rate per day of the week
  • Conversions per traffic source
  • Conversions per keyword
  • Top-performing landing pages
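As a sketch of the first report in that list, conversion rate per hour of day, here is what the aggregation looks like on a hypothetical export of sessions with a timestamp and a converted flag (the data is invented):

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical session export: (timestamp, converted?).
sessions = [
    ("2021-05-03 09:12", True),
    ("2021-05-03 09:40", False),
    ("2021-05-03 14:05", False),
    ("2021-05-03 14:30", True),
    ("2021-05-03 14:55", True),
]

totals = defaultdict(int)
conversions = defaultdict(int)
for stamp, converted in sessions:
    hour = datetime.strptime(stamp, "%Y-%m-%d %H:%M").hour
    totals[hour] += 1
    if converted:
        conversions[hour] += 1

for hour in sorted(totals):
    print(f"{hour:02d}:00  {conversions[hour] / totals[hour]:.0%}")
```

Conversion rate per day of the week is the same idea with `.weekday()` instead of `.hour`.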

The course ends with an introduction to copytesting and a guide on how to use it. Personally, I find this blog by CXL very helpful on the copytesting subject.

A/B testing

The next course on the program is about A/B testing. The instructor in this course is Ton Wesseling, the founder of the conversion optimization agency Online Dialogue. He has over 15 years of experience in conversion optimization and experimentation, helping over 50 organizations with their conversion optimization challenges.

The course starts with the history of A/B testing and an explanation about how we got the A/B testing concept as we know it today. Apparently, it’s been going on since 1995 and before!

Then, we continue with the value of A/B testing. The big promise of A/B testing is that it prioritizes evidence over opinion: before you change something, you want to know that it will actually have an impact, and that you are making the decision based on more than a few sprints, assessments, and opinions.

The next module covers when to use A/B testing. We can use it for new deployments or for research, to find out whether a change has a negative or positive impact on user behavior.

Moving on to the data you need to conduct A/B tests, we learn that you can't start an A/B test without enough data. That's why you need to do thorough research before starting the experiment. And before that, you need to pick your KPIs.

According to the course, there are several KPIs you can monitor:

  • Clicks
  • Behavior
  • Transactions
  • Revenue per user
  • Potential lifetime value
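On the "enough data" point, the required sample size can actually be estimated up front. The course doesn't give a formula, so treat this as my own sketch using the standard two-proportion sample-size calculation, with illustrative numbers:

```python
from statistics import NormalDist

def sample_size_per_variant(baseline, relative_lift, alpha=0.05, power=0.8):
    """Visitors needed per variant to detect a given relative lift
    over a baseline conversion rate (two-sided, two-proportion test)."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # significance
    z_beta = NormalDist().inv_cdf(power)           # statistical power
    pooled = (p1 + p2) / 2
    n = ((z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p2 - p1) ** 2)
    return int(n) + 1

# 3% baseline conversion rate, looking for a 10% relative lift:
# roughly 53,000 visitors per variant.
print(sample_size_per_variant(0.03, 0.10))
```

The takeaway is how quickly the numbers grow: the smaller the lift you want to detect, the more traffic you need, which is why low-traffic sites often can't run meaningful A/B tests at all.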

After learning the KPIs, there’s a very big and valuable lesson about how to conduct research before doing A/B testing. This part involves the 6V model Ton Wesseling follows during his research:

  • Value of the company
  • Versus (the competition)
  • View of the data
  • Voice of the customer
  • Verified data
  • Validated data

This lesson is followed by an explanation on how to set a proper hypothesis.

During the following week, I will continue and (hopefully) finish the A/B testing course. In the end, there will be an exam that will check how well I followed the lessons, so wish me luck :)

Head of Marketing @ Lendary | https://www.linkedin.com/in/sara-miteva/