VWO Humans VS AI Challenge

Craig Smith  |  Founder & CEO

VWO (Visual Website Optimizer), a Trinity partner and beloved testing tool, recently launched a friendly competition to promote its product’s latest feature. The ‘Humans vs. AI’ challenge uses OpenAI’s GPT-3 to spin up AI-generated ideas for site copy and pit their performance against hand-written headlines, buttons, and product descriptions.

Out of more than 350 submissions, Trinity was shortlisted and given special access to VWO’s visual editor, which now includes the GPT-3 autoregressive language model. We are participating alongside 16 other competitors, only two of whom are still awaiting their results!

The Setup

When VWO reached out to us to participate, we immediately started brainstorming ideas. In our initial assessment, we found opportunities for copy experimentation on: 

  • CTA buttons
  • Promotional banners
  • Email sign-up boxes
  • Product titles and descriptions

There was a bit of trial and error while we were getting our feet wet, as many of the AI-generated ideas didn’t make grammatical sense, especially for call-to-action buttons, where little copy is present. Fortunately, users can remove AI recommendations that aren’t appropriate in order to further train the AI.

Once we narrowed down our list of potential experiments, we decided our best use case for the Humans vs. AI contest was a global promotional banner on Schneiders Saddlery, one of our longtime partners and friends. This global banner is used to announce Schneiders’ currently active sales and promotions. We used a promotion for Turnout and Stable Blankets as the subject of our VWO experiment, which we began setting up on August 10, 2020.

Initially, the banner text read “Up to 40% off Turnout & Stable Blankets – Shop Now”. The AI within VWO’s visual editor generated many recommendations; some made sense, and some didn’t. We eventually arrived at a viable option for an experiment. The copy generated by the AI and tested against the control reads: “Shop now and save up to 40% on Turnout & Stable Blankets”.

[Screenshot: copy elements available for editing in VWO’s visual editor, including CTA button text, promotional banner text, and Header 1 text]
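
To give a sense of what the feature is doing under the hood, here is a minimal Python sketch of generating candidate banner copy with OpenAI’s API. This is only an illustration, not VWO’s implementation: VWO’s GPT-3 integration lives inside its visual editor, and the model name and prompt below are assumptions chosen just to show the idea.

```python
# Illustrative sketch only -- not VWO's integration.
# Assumes the openai package is installed and OPENAI_API_KEY is set;
# the model name is a stand-in, not the GPT-3 engine VWO uses.
from openai import OpenAI

client = OpenAI()

control_copy = "Up to 40% off Turnout & Stable Blankets - Shop Now"

response = client.chat.completions.create(
    model="gpt-4o-mini",   # stand-in model name
    n=5,                   # ask for several candidates to review
    temperature=0.9,       # higher temperature yields more varied copy
    messages=[{
        "role": "user",
        "content": (
            "Rewrite this e-commerce promotional banner as one short sentence, "
            f"keeping the discount and the call to action: \"{control_copy}\""
        ),
    }],
)

# A human still reviews the candidates and discards any that don't make
# grammatical sense, much like removing recommendations in VWO's editor.
for i, choice in enumerate(response.choices, start=1):
    print(f"{i}. {choice.message.content.strip()}")
```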

The Experiment

Since the promotional banner is an internal link, we tracked clicks on the banner as our main goal for measuring statistical significance.

Our experiment ran from Aug 19, 2020 – Sep 2, 2020. During that time, we recorded a 1.08% click-thru rate on the Control and a 1.16% click-thru rate for the Variation (a 7.06% improvement). At 91% probability to beat the baseline, we didn’t quite reach statistical significance—but the AI copy did perform better than the human copy for the life of the test. We believe the Variation performed better because it leads with the actionable text “Shop now and save,” prompting users to engage.

Results:

  • Uplift: 7.06%
  • Statistical Significance: 91%
  • Winner: AI Copy
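
For anyone who wants to sanity-check figures like these, below is a minimal Python sketch of how relative uplift and a confidence number can be computed from raw click data. The visitor and click counts are hypothetical stand-ins (this post doesn’t publish Schneiders’ traffic volumes), and VWO’s reporting is based on a Bayesian probability to beat baseline rather than the simple two-proportion z-test shown here, so the output only approximates the figures above.

```python
# Hypothetical counts -- the post does not publish visitor totals, so these
# are stand-ins chosen only to land near the reported click-thru rates.
from math import sqrt
from statistics import NormalDist

control_visitors, control_clicks = 50_000, 540       # ~1.08% CTR
variation_visitors, variation_clicks = 50_000, 580   # ~1.16% CTR

ctr_control = control_clicks / control_visitors
ctr_variation = variation_clicks / variation_visitors

# Relative uplift of the variation over the control
uplift = (ctr_variation - ctr_control) / ctr_control

# Two-proportion z-test with a pooled standard error; VWO itself reports a
# Bayesian "probability to beat baseline" instead of this frequentist check.
pooled = (control_clicks + variation_clicks) / (control_visitors + variation_visitors)
std_err = sqrt(pooled * (1 - pooled) * (1 / control_visitors + 1 / variation_visitors))
z_score = (ctr_variation - ctr_control) / std_err
confidence = NormalDist().cdf(z_score)   # one-sided confidence level

print(f"Control CTR:     {ctr_control:.2%}")
print(f"Variation CTR:   {ctr_variation:.2%}")
print(f"Relative uplift: {uplift:.2%}")
print(f"Confidence:      {confidence:.1%}")
```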

The Takeaway

This was a unique and educational exercise for Trinity to participate in. We know there are countless items that can be A/B tested on a website, but sometimes coming up with ideas for Variations is the hardest part.

Testing website copy has recently become a more popular topic. If you’re looking for low-risk optimization efforts during the upcoming holiday buying season, copy testing is definitely an option we recommend.
