How are your conversion rates? If you’re dissatisfied with them, then you’re not alone.
A common problem that almost every business faces is a low conversion rate: the percentage of visitors who take a desired action on your web page (complete a purchase, click a link or button, subscribe to a newsletter) rather than simply visit and leave. A conversion rate is calculated by dividing the number of visitors who convert by the total number of visitors to your website, then multiplying by 100 to express it as a percent.
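The formula above is a one-liner in practice. The sketch below is a minimal illustration; the visitor and conversion counts are made-up numbers:

```python
def conversion_rate(conversions, visitors):
    """Conversion rate as a percentage: conversions divided by total visitors."""
    return conversions / visitors * 100

# Example: 1,000 visitors, 13 completed purchases
rate = conversion_rate(13, 1000)  # 1.3 (percent)
```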
In practice, the average ecommerce business has a conversion rate of 1.33%, meaning that for every 1,000 people who visit their website, only about 13 go on to make a purchase. A recent study showed that only 22% of businesses are satisfied with their current conversion rates, which means 78% are looking for solutions.
A case study from Flyte New Media sums up the six main reasons conversion rates are so low for online businesses: an ill-defined product or service, a target audience that is too broad, a website that isn’t search engine optimized (SEO), a product that offers too much and ends up overwhelming, a lack of persuasive language, and no clear call-to-action. If any of these issues apply to your business, they may explain low conversion rates.
One easy, cost-effective solution to all of these problems, one that boosts conversion rates in the process, is A/B Testing.
What is A/B Testing?
A/B Testing is used to experiment with multiple versions of a developing product in order to identify which delivers the optimal conversion rate. Determining which version of your web page “performs better” will then result in increased lead generation, completed purchases, or link/button clicks. In the online world, A/B Testing can be used for any component associated with your business: a landing page, web page, or email campaign.
The process of A/B Testing works like this: a company or individual creates two versions of a product, Version A (the control, or existing element) and Version B (the new element). The versions can differ in almost anything: wording, color, font, size, shape, or placement on the page. Once both versions are created, the company decides on a duration for the test. During that time, both Version A and Version B are “live.” As visitors arrive at the company’s page, 50% of them are directed to Version A and the other 50% to Version B. After the test ends, the company can analyze performance and determine which version produced the higher conversion rate.
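One common way to implement the 50/50 split is to hash a visitor identifier, so the same visitor always lands in the same bucket across visits. This is a minimal sketch, assuming each visitor carries a stable ID (for example, a cookie value):

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into Version A or B.

    Hashing the ID (rather than picking randomly per page load)
    guarantees a returning visitor always sees the same version.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Over many visitors the hash distributes roughly evenly, which approximates the 50/50 split described above.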
A/B Testing has some amazing benefits including improved content, greater profit margins, and lower decision-making risks, which you can read all about in one of our earlier blog posts.
Perfecting Your Loyalty Program
Now that you understand A/B Testing and how it works, let’s discuss some ways you can easily integrate the tool into your own loyalty programs. In the following section, we use six Swell merchants to illustrate how A/B Testing can help you optimize conversions, reduce costs, and increase sales in your business. We also discuss possible outcomes of each potential A/B Test.
1. Points for Purchases.
Kopari Beauty, which sells beauty products made from 100% organic coconut oil, currently uses a rewards program that awards 1 point for every $1 a customer spends. These points can be redeemed for discounts on future purchases. To test how customers perceive their loyalty program, Kopari could run an A/B Test that compares Version A, the existing promotion (1 point per $1, 100 points for a $5 discount), and Version B, a new promotion (10 points per $1, 1,000 points for a $5 discount). Although Version B awards more points per $1 spent, the total spend required to redeem a discount is the same. From there, Kopari could evaluate how customers respond to earning more points per dollar while needing more points per discount, and decide on the optimal long-term structure.
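The claim that the two promotions cost the customer the same can be checked with a bit of arithmetic: the dollar spend needed to earn one discount is the redemption threshold divided by the points earned per dollar. A quick sketch:

```python
def spend_to_redeem(points_per_dollar, points_needed):
    """Dollars a customer must spend to earn one discount."""
    return points_needed / points_per_dollar

# Version A: 1 point per $1, 100 points per $5 discount
version_a = spend_to_redeem(1, 100)     # $100 of spending
# Version B: 10 points per $1, 1,000 points per $5 discount
version_b = spend_to_redeem(10, 1000)   # also $100 of spending
```

So the test isolates perception (bigger point balances) from economics (identical cost per discount).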
What could happen?
- Version A wins: Engagement and coupon redemptions under Version B decrease, perhaps due to the higher “perceived cost” of the coupons. If customers believe they’ve already accumulated a lot of value in points, they may hold onto them until they can redeem a larger discount; because fewer customers ever reach those larger thresholds, overall redemptions drop.
- Version B wins: Customer engagement and coupon redemption rates increase because the points carry a higher “perceived value.” Customers see thousands of points accumulating in their accounts, feel they are earning more per dollar spent, and continue to spend, earn, and redeem.
2. Product Reviews.
Virgin Vapor, the leading provider of gourmet, organically flavored electronic cigarette liquid and refills, hosts a flashy rewards program on their website. They offer 15 reward points for writing a review of their products. Virgin Vapor could run a test that compares Version A (the existing 15 points for a public review) and Version B (15 points for emailing feedback directly to the company). Instead of public reviews, the company may gain more candid insight from individual emails. They can then compare which version generates more product feedback.
What could happen?
- Version A wins: Version B generates less feedback, perhaps because users would rather post their opinions publicly to the online community.
- Version B wins: Version B generates more feedback, perhaps because users are more comfortable sending private comments directly to the company.
3. Newsletter Signup.
The Elephant Pants sells elephant-printed pants to raise money for the protection of wild African and Asian elephants. When you visit their website, a newsletter signup page pops up that rewards users with a 10% discount if they input their email. The company could run an A/B Test that compares the existing 10% discount (Version A) with a new 15% discount (Version B) to see which version results in more conversions.
What could happen?
- Version A wins: There would be no change in subscription rates.
- Version B wins: More users would subscribe because a 15% discount is enough for them to input their email.
The real question that could be answered here is: “What is the threshold of a discount that makes consumers subscribe to your website?”
4. Refer Your Friend.
Steiner Sports is the official memorabilia provider of the New York Yankees, New York Knicks, and New York Rangers, and of athletes such as Derek Jeter and Odell Beckham Jr. They run a rewards program that offers $25 in credit to anyone who refers a friend who then makes a purchase on their website. Steiner Sports could run an A/B Test that rewards referrers with Version A (the existing $25 in credit) or Version B (an actual product they sell, such as a baseball cap valued at $25), then compare referral engagement. Would people rather have $25 in store credit or an actual product?
What could happen?
- Version A wins: There would be no significant change in referral engagement.
- Version B wins: More users would refer if they received a baseball cap. One reason could be that they would rather have a physical product than a $25 discount at the store.
5. Social Media Engagement.
Vanity Planet offers quality health, beauty, skin care, personal care, and lifestyle products that enrich the lives of their customers. Currently, they offer a rewards program that gives 10 points for a “like” or a “share” on Facebook. Vanity Planet could run a test that compares Version A (the existing 10 points per action) and Version B (25 points for both “liking” and “sharing” the page during the same web session). Basically, Version B offers an increased reward for completing both actions at the same time. Then, the company could compare social media engagement.
What could happen?
- Version A wins: There would be no significant change in social media engagement.
- Version B wins: More users would “like” and “share” on Facebook at the same time because of the increased incentive.
6. Timing of Discount Pop Ups.
Flex Company sells new menstrual products offering 12 hours of period protection. When visitors browse the website, a 20% discount coupon pops up after 1 minute of browsing. Flex Company could devise a test that compares Version A (the pop-up appearing after 1 minute) and Version B (the pop-up appearing after 30 seconds) to analyze conversion rates.
- Version A wins: There would be no significant change in conversion rates.
- Version B wins: More users (those who would otherwise bounce before the 1-minute mark) would see and redeem the coupon.
So How Do You Know if Your Results are Significant?
Once you have finished an A/B Test, it’s important to determine whether the results are statistically significant. To do that, you need each version’s conversion rate, found by dividing the number of conversions by the number of visitors to that version of the page and multiplying by 100 to convert the decimal into a percent. You also need the sample sizes: a difference between two conversion rates only becomes meaningful once enough visitors have seen each version. Compare the rates using a significance calculator such as the one from Kissmetrics to determine whether the difference is real or just noise.
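Under the hood, calculators like these typically run a two-proportion z-test. This is a minimal sketch of that standard test (the visitor and conversion counts are illustrative), using only the standard library:

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B result.

    Returns (z, p_value) for the null hypothesis that both versions
    have the same true conversion rate. A small p-value (e.g. < 0.05)
    suggests the observed difference is unlikely to be chance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Example: Version A converts 100/1000 visitors, Version B 150/1000
z, p = ab_significance(100, 1000, 150, 1000)
```

With these example numbers the p-value comes out well below 0.05, so Version B’s lift would count as significant; with identical rates on both sides, the p-value is 1.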
Overall, A/B Testing is incredibly useful for understanding how potential customers react to your website, emails, or other online components in different ways, all with the focus on improving conversions.
So, if A/B Testing is right for your business goals, start testing!