Many people question whether to focus on quantitative or qualitative data when analyzing their store. They’re hoping to find ways to make the site more intuitive and to increase the conversion rate. Experienced store owners and analysts don’t rely on just one data type; they work with both.
Quantitative data is hard data, meaning it is measurable and does not contain any opinions or emotional bias. In e-commerce, tools like Google Analytics (GA) gather this kind of data. GA focuses on various metrics, such as the following:
User metrics: Number of users who engaged with your site
Landing page metrics: Number of users who landed on a specific page
Bounce rate metrics: Number of users who left your store without triggering another request
Transaction metrics: Total number of orders that were placed in your store
New vs. Return User metrics: Number of first-time visitors to your store vs. number of visitors who returned to your store after a given period of time
Session metrics: Period of time the user spent on your store
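As a quick illustration of how two of these metrics are derived, here is a minimal sketch in Python. The session and transaction counts are hypothetical, not real GA figures:

```python
# Hypothetical weekly counts (illustrative only, not real Google Analytics data).
total_sessions = 4200
single_page_sessions = 2310   # sessions that left without triggering another request
transactions = 126

# Bounce rate: share of sessions that left without another request.
bounce_rate = single_page_sessions / total_sessions

# E-commerce conversion rate: transactions per session.
conversion_rate = transactions / total_sessions

print(f"Bounce rate: {bounce_rate:.1%}")        # 55.0%
print(f"Conversion rate: {conversion_rate:.1%}")  # 3.0%
```

GA computes these for you, of course; the point is only that each "hard" metric is a simple, opinion-free ratio of counted events.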
With quantitative data, you answer “How” questions, such as, “How many users are dropping off at a certain point in the funnel?”
Qualitative data is the opposite. It’s soft data that is subjective and cannot be measured. The main goal of qualitative data is to describe the qualities and characteristics of users and their behavior. With this information, you can interpret the numerical metrics to answer “Why” questions such as, “Why are users dropping off at this point in the funnel?”
Analysts agree that collecting both types of data provides a substantive basis for analysis.
Where Do I Start with Data Collection?
If you are running an e-commerce store and want to start gathering data, you first need to connect your store with a Google Analytics account and then set GA up correctly to report accurate data.
At Build Grow Scale, we focus on a process called “revenue optimization” for our e-commerce clients, and the foundation of this process is the correct setup of both Google Analytics and Google Tag Manager. We dig into the data because data doesn’t lie. We link your store with GA, and, if you already have an account set up, we check that everything was configured correctly.
Once this is done, we set up goal funnels, depending on the nature of your business. Different businesses have different goals, and it’s important to understand what your client needs to achieve in order to set up, collect, and interpret the data appropriately.
If your business is geared toward collecting leads, then we tailor the funnel steps toward this end goal. If your business is focused on selling products, then we design the funnel toward the purchase goal as well as ensuring that the data obtained will give us the metrics required to answer the hows. Later on, we interpret and visualize the data using Lucky Orange to help us determine the whys.
With the hard data collected, we’ll be able to read through and see how many users landed on the page, the location of these users, the bounce rate and how many users completed checkouts as well as the average order value.
As seen in the funnel below, we take a week’s or a month’s worth of data and see where we are losing users. It could be that a huge number of them land on the homepage and get to the product page, but then drop off without adding to cart. That’s a red flag. With such numbers, you can determine where to focus your attention: in this case, on the product page, to improve the add-to-cart action.
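The drop-off reading described above can be sketched in a few lines of Python. The step names and counts below are made up for illustration; they are not data from a real store:

```python
# Illustrative funnel step counts for one week (made-up numbers).
funnel = [
    ("Homepage", 10_000),
    ("Product page", 6_500),
    ("Add to cart", 1_300),
    ("Checkout", 900),
    ("Purchase", 700),
]

# Report the step-to-step continuation rate so the biggest leak stands out.
for (step, users), (next_step, next_users) in zip(funnel, funnel[1:]):
    rate = next_users / users
    print(f"{step} -> {next_step}: {rate:.0%} continue, {1 - rate:.0%} drop off")
```

With these numbers, the product-page-to-add-to-cart step loses 80% of users, which is exactly the kind of red flag that tells you where to focus.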
Here’s an example of a funnel that we set up for one of our stores showing two weeks of data before and after a successful A/B split test.
Where Does Lucky Orange Come In?
Since quantitative data only tells us what is happening, we can’t rely on it 100% and start making assumptions about what we need to fix on the product page. That’s when we use Lucky Orange to give us a visualization of the qualitative data: a clear picture of what is really happening when users get to the product page.
There are several ways that you can analyze data in Lucky Orange, and we like to use all the options as they all give us information that we couldn’t see or get with the hard data alone:
Scroll heatmaps: They show us how far down the page users scroll.
Click maps: These give us visual data on user engagement with the website. On desktop, they map actions like clicking buttons and links and zooming images; on mobile, tapping buttons and links and pinch-zooming. The elements with the most clicks are visually highlighted.
Move heatmaps: These show the areas where users move the mouse. Areas with the most movement are known as hot areas and are usually displayed in a rich orange color; cold areas that receive fewer moves appear bluish or have no color at all.
With this information, we can ask ourselves some questions to find out why users are dropping off the product page (or any other page shown in the GA data):
- Are our call-to-action buttons above the fold?
- Are the important parts of our product page above the fold?
- Are we overloading users with too much information?
- Are there elements on the product page that look like links but aren’t clickable?
- Is the information on the product page misleading?
- Is the user journey easy to follow, or did we complicate it with too many steps?
- Are users encountering bugs?
Here’s an example of Lucky Orange revealing where users were dropping off the product page. The drop-off happened because the first step in the product-selection process was autoselected for users when they arrived and, because this autoselection wasn’t clear to them, they were unsure what to do next.
We gathered this image from Lucky Orange recordings. Using it, we could see that users would skip step 1 without realizing it and head straight to step 2. Later, they’d realize they needed to go back to step 1 because they couldn’t proceed to step 2 without selecting the options available in step 1.
What Is the Next Step After Gathering All This Data?
After gathering the quantitative and qualitative data, it’s easy to feel overwhelmed. Most of the time, you won’t know what to do with this kind of information. We can take you through three steps that have always worked for us at Build Grow Scale:
1. We implement changes immediately
The easiest step is to implement the changes you have in mind based on the data gathered. This method has a limit, though: it applies to some changes, but not all. Changes that can be made immediately include fixing broken links on the store and changing the look of elements that cause friction (e.g., text that looks like a link but isn’t). These are basically bug fixes.
2. We test the data
Test, test, test is the song we sing when we get new findings but aren’t sure how they will perform once implemented in the store. We advise testing before implementing big changes. When you test the data, you can be sure of what your users want and how they will behave. As Carly Fiorina said:
“The goal is to turn data into information and information into insight.”
—Carly Fiorina, former chief executive officer, Hewlett Packard
In this case, you create an A/B split test: a variation of the same page, updated with the new changes. Send 50% of your traffic to the original page and 50% to the variation to determine which works best. With the data acquired from the A/B split test, you can then decide whether to remodel the page to match the variation or to keep it as is.
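One common way to decide whether the variation’s lift is real or just noise is a two-proportion z-test. The article itself doesn’t prescribe a specific statistical method, and the visitor and conversion counts below are hypothetical, so treat this as a sketch of one reasonable approach:

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) comparing conversion rates of pages A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Hypothetical 50/50 split: original (A) vs. variation (B).
z, p = two_proportion_z(conv_a=300, n_a=10_000, conv_b=370, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05: the variation's lift is unlikely to be chance
```

If the p-value is below your chosen threshold (0.05 is conventional), the variation’s win is statistically meaningful; otherwise, keep the test running or keep digging in the data.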
If, by this time, you haven’t determined what the problem is, keep digging into the data to find other ideas for the A/B split test. You can set those tests up now, or you can use polls.
Following is an image of the product page on an e-commerce store before and after the test we conducted.
As seen on Lucky Orange, in the winning variation we kept all of the steps on the product page unselected instead of autoselecting step 1.
3. We use polls
Lucky Orange polls have always been a source of gold nuggets for us because we get quality information from actual buyers. This is different from user testing, where users who are not your buyers test your website.
When using the polls, ask users tailor-made questions that they can relate to and ones that resonate with the products you are selling. Some of the questions to ask them include:
- Was there something that almost stopped you from purchasing?
- Was the website easy to navigate?
- Is there something you would like us to improve on the website?
- Would you recommend us to a friend?
Getting responses to these questions will open your eyes to a different view of things, one you hadn’t thought about, because you now have the buyer’s perspective instead of the seller’s. And that’s GOLD, right?
You can then choose to implement the findings right away if you get similar responses to the questions, or you can create an A/B split test to test the data.
How Do I Know My Findings and Implementations Will Yield Positive Results?
Whether your findings and implementations produce positive results is largely determined by the data you gathered, how you analyzed it, and how you implemented it.
If you choose to implement major changes on your e-commerce store without testing, you’ll be relying on your opinions (which are biased; no data is certain without running it through a test). You might end up fixing some element or problem that wasn’t broken, which ends up hurting your sales or creating other problems.
However, if you run the data through an A/B split test and find a winner, or if the information you acquire from poll answers leads to a positive test outcome, you can proceed with more confidence. Considering that you sent only 50% of your store’s traffic to the “winning” option, you can expect that sending 100% of your traffic there will improve the store’s conversion rate, increase user engagement (as the store becomes more intuitive to use), and lower your bounce rate.
What Are the Results of Using the Data Effectively?
Understanding and implementing the data and findings correctly results in improved quantitative reports: better user metrics, improved landing page metrics, lower bounce rates, higher conversion rates, and increased transaction metrics.
Your store’s funnel goals, sales and conversion reports will now reflect improvements like the following:
- The percentage of users who were dropping off will now show greater engagement, which is likely to result in sales.
- The percentage of users who were just browsing will now be more engaged, making them potential buyers.
- You may see an improved customer retention rate and an increased percentage of return buyers.
- You may also see an increase in organic traffic to your store through word-of-mouth recommendations as well as referral-driven sales.
To continue with the cycle, you can ask the same questions on the polls to find out if your users are now satisfied or if they still feel there’s room for improvement.
For example, when you ask if there’s anything that almost stopped the user from buying and you get a lot of responses saying, “None,” this is a positive sign. But if you get new suggestions, don’t worry and don’t let the data overwhelm you. Create new A/B tests to observe whether each buyer’s suggestion is relevant. If it is, make the changes to your site, run new A/B tests, and keep collecting both the hard and soft data. The cycle continues in this way for as long as it is useful to you.
As Aaron Koblin, an entrepreneur in data and digital technologies, said when asked whether you can have too much data:
“I think you can have a ridiculously enormous and complex data set, but if you have the right tools and methodology, then it’s not a problem.”
If you are working on a client’s e-commerce store and you show the improved conversion reports obtained because of the changes you made, your client will be very happy. Buyers will also be happily enjoying their shopping experience. It’s a win-win situation for both parties.
Here are the improved results we got after running the A/B split test. Crazy how we overlook some things, right?
From the A/B split tests, we were able to find remarkable results: a 10% increase in the add-to-cart (ATC) ratio on desktop and an 18% increase in the ATC ratio on mobile. Here are the before and after stats from Google Analytics showing data for both desktop and mobile.
Lesson learned: Simplicity and consistency in how the website behaves are very important. Keeping each step expanding only as a result of a customer’s action preserved that consistency, resulting in better user engagement and an improved ATC ratio.
Conclusion: Rinse and Repeat
In conclusion, we would like to note that the process of data gathering does not end at that point. Keep on gathering both the quantitative and qualitative data for the various stages of your funnel, analyze the data, conduct A/B split tests, and make changes to the store every now and then when you find a winning test.
At Build Grow Scale, we advise testing and implementation as the best way to identify usability problems on your e-commerce store and a great way to find “low-hanging fruit”: those simple changes that could turn the tables and take your store from three figures to six figures practically overnight.
Remember that you need to embrace the power of data and implementation. Don’t wait to collect data or put off making changes to your store because you are waiting for one “grand” rebuild. One big overhaul might result in more confusion for your buyers and could even negatively affect the rate of return buyers.
Instead, make small changes. And always keep in mind that data is the oil for the engine (your e-commerce store). As Jim Barksdale famously said: “If we have data, let’s look at data. If all we have are opinions, let’s go with mine.”
And my opinion is to follow the method laid out above! Cheers 🙂
Block, G. and T. Hilhorst. (2018). Why you need quantitative AND qualitative data. Mind the Product. https://www.mindtheproduct.com/need-quantitative-qualitative-data/
Froehlich, A. (2020). How to collect customer data to improve overall retail CX. Tech Target. https://searchcustomerexperience.techtarget.com/tip/How-to-collect-customer-data-to-improve-overall-retail-CX
The Hartford. (n.d.). Quantitative vs. qualitative business research. https://www.thehartford.com/business-insurance/strategy/market-research/quantitative-qualitative
Owings, J. (2020). Your quantitative+qualitative digital experience flywheel. FullStory. https://blog.fullstory.com/quantitative-data-vs-qualitative-research-analytics-work-better-together/
Rowe, W. (2011). Aaron Koblin’s data art (Interview). Dazed Digital. https://www.dazeddigital.com/artsandculture/article/10222/1/aaron-koblins-data-art
Tabrizi, B. (2015). Carly Fiorina’s legacy as CEO of Hewlett Packard. Harvard Business Review. https://hbr.org/2015/09/carly-fiorinas-legacy-as-ceo-of-hewlett-packard
Theodore, N. (2017). We have data, let’s look at data. Virtual Store Trials.