
A/B Testing Framework for User Behavior Analysis with Casey Hill

Implement effective user behavior analysis through A/B testing, covering proven strategies for digital experience platforms and website optimization.
Margarita Arsova
9 min read
The Ultimate A/B Testing Framework for Marketing Teams with Casey Hill

In a recent webinar hosted by Iliya Valchanov (Co-founder and CEO at Team-GPT), Casey Hill (Chief Marketing Officer at DoWhatWorks) shared insights on implementing effective A/B testing strategies for user behavior analysis. The session revealed how companies can optimize their digital experiences based on actual user interactions rather than assumptions.

Top 5 Takeaways

Evidence-Based Testing Beats Guesswork

Successful user behavior analysis requires focused intent rather than random assumptions. Create tests that address specific issues identified through heat mapping and analytics.

Only 10% of Split Tests Beat Controls

According to Optimizely research referenced by Casey, just 10% of split tests outperform control versions. Effective testing requires understanding underlying user behaviors rather than copying competitors.

Clarity Outperforms Simplicity

While simplicity is often emphasized in design, clarity and meeting user expectations prove more important in creating effective digital experiences that convert.

Visual Elements Impact Conversion

Tests consistently show that strategic visual elements like blurred product backgrounds can increase conversion rates by 25-95%, challenging the belief that minimal designs always perform better.

Test Features Within Your Platform

Beyond website testing, analyzing how users interact with features inside your digital experience platform provides critical insights for product development and marketing strategy.

Effective Frameworks for User Behavior Analysis

Implement a Unified Variable Testing Approach

When conducting user behavior analysis, many B2B SaaS marketing teams take a scattered approach based on personal preferences or competitor observations. Casey emphasized the importance of having a “unified variable set” – ensuring all test changes address a specific user behavior problem.

“One of the first things that’s really important when thinking about split testing is to have a very focused intent, very focused goal,” Casey explained. This approach transforms random guesswork into strategic decision-making.

For example, if heat mapping shows users quickly scrolling past your pricing page, this indicates they’re searching for information they can’t immediately find. Your tests should focus specifically on addressing this issue, such as adding reassurance text about trial terms or plan comparisons.

Establish Traffic Baselines Before Testing

A common question for marketing teams is when to start implementing user behavior analysis through A/B testing. While some resources suggest waiting until you reach 5,000 monthly visitors, Casey recommends focusing on traffic consistency:

“What’s more important is having a steady baseline from a reliable source. If your SEO is providing 2,500 hits per month, and it’s consistent, I’m actually okay to run an A/B test because I feel like I’m going to be able to get meaningful signal from that.”

For companies with irregular traffic patterns, it’s essential to account for varying user intent across different traffic sources. A visitor from a targeted webinar will behave differently than someone arriving from a social post, potentially skewing test results.
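The webinar doesn't prescribe a formula for what counts as a "meaningful signal," but the question maps onto a standard sample-size estimate for a two-proportion test. As a rough sketch (the 4% baseline conversion rate and 2-point lift below are illustrative, not figures from the session), you can check how long a steady 2,500 visits per month would take to power a test:

```python
import math

def sample_size_per_variant(p_base, mde):
    """Approximate visitors needed per variant for a two-proportion
    test at alpha = 0.05 (two-sided) and 80% power.

    p_base: baseline conversion rate (e.g. 0.04 for 4%)
    mde:    minimum detectable effect, absolute (e.g. 0.02 for +2pp)
    """
    z_alpha = 1.96  # critical z for alpha = 0.05, two-sided
    z_beta = 0.84   # z for 80% power
    p2 = p_base + mde
    p_bar = (p_base + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p_base * (1 - p_base) + p2 * (1 - p2))) ** 2
         ) / mde ** 2
    return math.ceil(n)

# Steady baseline of 2,500 visits/month, split 50/50 across two variants:
n = sample_size_per_variant(0.04, 0.02)
monthly_per_variant = 2500 / 2
print(n, "visitors per variant,", round(n / monthly_per_variant, 1), "months to reach it")
```

The takeaway matches Casey's point: a consistent 2,500 visits per month is enough to detect a reasonably large lift within a month or two, but chasing a very small effect (say, half a percentage point) would require far more traffic than most teams realize.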

Prioritize High-Impact Page Optimization

When resources are limited, focus your user behavior analysis and optimization efforts on pages directly connected to revenue. Casey highlighted the importance of optimizing:

  1. Homepage
  2. Pricing pages
  3. Trial/signup flows

These high-intent touchpoints have the most direct impact on conversion rates and revenue. One example shared was how a simple payment option expansion led to a $200,000 revenue increase for an e-commerce business by addressing international market barriers.

Avoid Blind Competitor Copying

One key takeaway from the webinar was the risk of blindly copying competitors’ website features. Casey warned that because companies are frequently mid-test, you could unknowingly adopt a failing variant rather than the eventual winner. This highlights the importance of conducting your own testing rather than assuming what works for others will work for you.

“If you arrive on a Wednesday and they’re running a 5% test… you hop on their website and you see version B, and you’re like ‘wow, this is a big company, they must be doing things right,’ and you copy a losing variant.”

DoWhatWorks uses technology to track thousands of A/B tests across the web, identifying which variants companies ultimately implement. This provides evidence-based insights into what actually works rather than what simply appears on competitors’ sites.

Real-World A/B Testing Examples

Case Study 1: Canva’s Background Test

Casey shared a test from Canva comparing plain backgrounds versus stylized visual backgrounds during the signup process. Contrary to what many participants expected, the stylized illustrated background was the clear winner over the plain version.

“This is where it’s so interesting when you study not only individual tests but when you study aggregate trends. Oftentimes it challenges assumptions that we have,” Casey explained. This result challenges the common belief that simplicity always wins in design.

Casey suggested this might be because the illustrated background better represented Canva’s brand identity as a design platform. For a company focused on creativity and design, the more visually engaging background created a stronger connection with users during the signup process.

Case Study 2: Blurred Product Backgrounds in B2B SaaS

Tom Orbach from Wiz (recently acquired by Google) conducted an extensive study comparing plain backgrounds versus blurred product backgrounds across MyCase, MineOS, and Grin.

The results showed blurred product backgrounds consistently outperformed plain backgrounds, delivering an impressive 25-95% conversion lift. This highlights the importance of giving users a preview of what they’re signing up for.

Iliya confirmed this finding from Team-GPT’s own testing:

“I have run a similar test twice before and blurred is always the best. My interpretation is that it leads to higher activation because it’s already in their mind how the platform could look.”

Case Study 3: Square’s Competitor Comparison Test

The third test examined Square’s approach to competitor comparisons on their pricing page. The test compared:

  • Version A: Standard pricing tables alone
  • Version B: Pricing tables with an expandable grid comparing Square to competitors

Despite the common practice of highlighting competitive advantages, the version without competitor mentions (Version A) performed better. Casey explained:

“When it comes to a pricing page, you want to be very careful about anything that distracts from the core thing that you want people to do. Pricing page is high intent, bottom of funnel… I would be very careful about mentioning competitors on your pricing page.”

He added that competitor grids often lack credibility with users:

“People are very distrustful of your graph that says we do everything and our competitors are missing all these key things and they suck. That’s what most competitor grids look like.”

Internal Feature Testing at Team-GPT

Iliya shared Team-GPT’s approach to testing features within the platform using “experimental tools” – a collection of features available for users to discover and try. These included:

  1. Prompt Builder
  2. Image to Text tool
  3. YouTube Video Navigator
  4. Speech to Prompt

By measuring usage patterns, the team discovered the Prompt Builder dramatically outperformed all other experimental features, providing clear direction for product development. This approach to user behavior analysis within the platform itself proved more valuable than website testing alone.

“We started measuring these things, and the Prompt Builder was the only one which picked up,” Iliya explained. “We have this super strong signal that this is something that needs to go into the main part of the software.”
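The measurement approach Iliya describes can be sketched as a simple tally over an in-product event log. The event data and feature names below are hypothetical, not Team-GPT's actual telemetry; the point is counting distinct adopters per feature rather than raw clicks:

```python
from collections import defaultdict

# Hypothetical in-product event log: (user_id, feature) pairs.
events = [
    ("u1", "prompt_builder"), ("u2", "prompt_builder"),
    ("u3", "prompt_builder"), ("u1", "image_to_text"),
    ("u4", "prompt_builder"), ("u2", "speech_to_prompt"),
]

# Count distinct users per feature rather than raw event volume,
# so a single power user can't inflate a feature's adoption signal.
users_by_feature = defaultdict(set)
for user, feature in events:
    users_by_feature[feature].add(user)

ranking = sorted(users_by_feature.items(),
                 key=lambda kv: len(kv[1]), reverse=True)
for feature, users in ranking:
    print(feature, len(users))
```

Ranking features by unique adopters surfaces the kind of "super strong signal" Iliya mentions: one feature clearly pulling ahead of the rest of the experimental set.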

Optimizing Digital Experience Platforms for Better Conversion

Strategic Pricing Page Elements

Analyzing successful digital experience platforms like Zendesk and Shopify, Casey highlighted three critical elements for effective pricing pages:

  1. Strategic Anchoring: Position your preferred plan between lower and higher-priced options to make it appear more attractive.
  2. Audience Indicators: Add clear labels like “Great for small businesses” or “Perfect for solo entrepreneurs” to help users quickly identify the right plan for their needs.
  3. Specific AI Value Statements: Replace generic “AI-powered” messaging with specific benefits, such as “AI agents available 24/7 to solve customer issues.”

“The top performing teams we see when it comes to AI descriptions are tying the AI to the specific output that they want,” Casey noted. This approach to presenting marketing AI tools creates clearer value propositions for potential customers.

Optimize Trial Signup Messaging

For website conversion optimization, Casey recommended including reassurance text on trial signup buttons. The phrase “No credit card required” consistently improves conversion rates when it accurately reflects your trial process.

However, Iliya added an important qualification about customer journey mapping:

“These people are not always the ones that activate the most, and they don’t drive the most revenue or customer lifetime value. So it’s very important for you to know what you are optimizing for.”

This highlights the importance of aligning your optimization goals with your broader business objectives – sometimes fewer, higher-quality conversions are more valuable than maximizing total signups.

Consider Time-to-Value in Pricing Structure

One of the most insightful discussions centered around matching pricing structures to your product’s time-to-value. Iliya shared how introducing quarterly plans increased revenue by 30% for an online course platform when they discovered many users needed the product for 3-4 months rather than a full year.

For B2B SaaS marketing, understanding your specific time-to-value is crucial for optimizing both trial length and pricing structure. Products requiring extensive integration or team setup may benefit from longer trials and more flexible pricing options.

Putting User Behavior Analysis into Practice

Start with Heat Mapping

Before implementing complex A/B tests, begin with heat mapping to understand how users actually interact with your pages. Look for patterns like quick scrolling, which indicates users aren’t finding what they need in the initial view.

Test with Evidence, Not Opinion

Whether you’re running formal A/B tests or simply implementing changes, always base decisions on observed user behavior rather than personal preferences. Casey emphasized: “You’re trying to get away from it just being arbitrary and based on opinions and sometimes even based on outdated experiences.”

Measure In-Product Engagement

Track how users engage with features inside your platform. Team-GPT’s approach of creating experimental features and measuring their adoption provides clear signals for product development priorities.

Understand Industry-Specific Patterns

What works for e-commerce may not work for B2B SaaS. Casey and Iliya both emphasized the importance of understanding your specific context rather than applying generic practices.

“Industry didn’t matter for our tool. It was about the function in the company,” Iliya noted when discussing Team-GPT’s customer discovery process. This insight helped them focus their messaging on roles rather than industries.

Take the Next Step with Your Customer Journey Optimization

As digital experience platforms continue to evolve, marketers must adapt their strategies to encompass both traditional website optimization and user behavior analysis within their products. By focusing on evidence-based decision making and understanding your specific customer journey, you can create more effective digital experiences that drive conversions and revenue.

Team-GPT empowers your team with collaborative AI that scales your Marketing!

Start for free today and discover how Team-GPT can transform your team’s collaboration.

Margarita Arsova
Product marketer at Team-GPT

Margarita combines marketing expertise with product knowledge to help teams use AI effectively. She focuses on practical applications of AI in marketing, showing companies how to boost productivity while addressing common implementation challenges.