Case Study: How campaign re-structuring improved AdWords performance by 104%

“Relevance” has been my mantra for AdWords success for the past 10 years. If you are at all familiar with running AdWords, you’ll know that keyword relevance, as measured by quality score, is the key to strong performance. 

Last year I read an intriguing article on the benefits of Single Keyword Ad Groups (S.K.A.G.), a popular, even controversial, topic among AdWords specialists. Experts argue both for and against it. And Google is very clear that it neither endorses nor discourages the strategy.

The hypothesis behind Single Keyword Ad Groups is simple: by separating each keyword into its own ad group and writing ad copy specifically for that keyword, you drastically improve your relevance and quality score.

In spite of its apparent simplicity and good sense, I was unconvinced. While I’m curious and open to new strategies, I’m also innately skeptical. And the article explaining the hypothesis offered no data to validate it, which was a red flag. It was time for a test.

Big Tree took one of our client’s most successful ad campaigns and created a campaign experiment to test S.K.A.G. The campaign we chose had produced the most historical revenue, and had the highest CTR and best ROI outside of our brand campaign.

I’m not going to go into the strategy of implementing S.K.A.G. or how to do it; there are plenty of articles that cover that. The purpose here is to provide a case study of the results.

 

Quick tip:

Converting the campaign to S.K.A.G. was time-consuming. If you do this, expect it to be tedious, and make sure you are using AdWords Editor. You can get creative with find and replace and save yourself hours when building out your new ads.
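To give a flavor of that scripting angle, here is a minimal Python sketch of the build step: one ad group per keyword, with keyword-specific ad copy. The keyword list, campaign name, column headers, and character limits are all hypothetical, and a real AdWords Editor bulk import separates keywords and ads into different row types, so treat this as a template generator rather than a ready-to-import file.

    import csv

    # Hypothetical keyword list -- stand-ins for a real campaign's keywords.
    keywords = ["trail running shoes", "waterproof running shoes", "running shoes for women"]

    rows = []
    for kw in keywords:
        rows.append({
            "Campaign": "Shoes - SKAG",    # hypothetical campaign name
            "Ad Group": kw.title(),        # one ad group per keyword
            "Keyword": f"[{kw}]",          # exact match; add phrase/broad rows as needed
            "Headline": kw.title()[:30],   # keyword in the headline drives relevance (30-char limit assumed)
            "Description": f"Shop {kw} today."[:80],
        })

    # Write a flat CSV to massage into AdWords Editor's import format.
    with open("skag_build.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)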

 

S.K.A.G. Campaign Experiment Results

  • We ran a 50/50 split campaign experiment that lasted three weeks.
  • Click-through rate (CTR) for the S.K.A.G. variation improved 47%
  • Impressions decreased 33%
  • Cost per click (CPC) increased 12%
  • Total clicks were flat
  • Total cost increased 10%
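Before interpreting these numbers, it is worth checking that they hang together: clicks are impressions times CTR, and cost is clicks times CPC. Plugging in the reported deltas (treated here as exact) reproduces the “flat clicks” and “+10% cost” bullets:

    # Identities: clicks = impressions * CTR, cost = clicks * CPC.
    # Multipliers come from the reported deltas above.
    impressions = 1 - 0.33   # impressions down 33%
    ctr         = 1 + 0.47   # CTR up 47%
    cpc         = 1 + 0.12   # CPC up 12%

    clicks = impressions * ctr   # 0.67 * 1.47 = ~0.985 -> effectively flat
    cost   = clicks * cpc        # ~1.103               -> roughly +10%

    print(f"implied clicks change: {clicks - 1:+.1%}")   # -1.5%
    print(f"implied cost change:   {cost - 1:+.1%}")     # +10.3%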

 

A 47% lift in CTR, for our best campaign, is no small improvement! However, the decrease in impressions and increase in cost appeared to negate the value. Without looking deeper, a manager would likely conclude that S.K.A.G. raised costs for similar traffic, and that it was not worth pursuing.

But I was fixated on the 47% improvement in CTR. There had to be value here. Could the decrease in impressions come from shedding irrelevant traffic? Could the increase in cost be related both to traffic quality and to the fact that our S.K.A.G. ads were new and still establishing quality score?

The logical next step was to look at revenue to establish whether the quality of traffic had improved for the S.K.A.G. experiment. Unfortunately, transactions and revenue were about even between the original and S.K.A.G.

This put our agency in a challenging place. Remember, we were modifying our client’s most successful campaign, and our changes were not improving revenue, were increasing cost, and were reducing brand exposure.

Our agency went out on a limb and used the CTR improvement to justify a longer test. After running the test for several months, we saw the following results:

  • CTR improved 104%
  • Impressions decreased 61%
  • Traffic decreased 29%
  • CPC increased 10%
  • Total cost decreased 21%
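The cost line again reconciles with the rest: traffic (clicks) down 29%, multiplied by CPC up 10%, implies total cost down about 22%, matching the reported 21% once rounding is allowed for. For the record:

    # cost = clicks * CPC, using the reported deltas as exact.
    clicks = 1 - 0.29   # traffic down 29%
    cpc    = 1 + 0.10   # CPC up 10%

    cost = clicks * cpc   # 0.71 * 1.10 = ~0.781 -> about -22% vs the reported -21%
    print(f"implied total cost change: {cost - 1:+.1%}")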

This looks similar to our previous experiment: higher CTR was again being negated by reduced exposure and higher CPC. At this stage, you might be wondering how we kept this client.

Fortunately, with the longer experiment we were able to account for a longer buying cycle. The S.K.A.G. experiment showed a dramatic increase in revenue, which supported the hypothesis that S.K.A.G. was improving the quality of the traffic.

  • S.K.A.G. revenue was up 49% over the original.
  • Revenue per click (dollars per visitor) for S.K.A.G. was $9.36, vs. only $4.81 for the original.
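Treating those figures as exact, the improvement in return on ad spend works out to nearly double, which is where the “almost doubled the ROI” conclusion below comes from:

    # ROAS ratio = revenue change / cost change, per the reported figures.
    revenue = 1 + 0.49   # S.K.A.G. revenue up 49%
    cost    = 1 - 0.21   # total cost down 21%

    roas_ratio = revenue / cost   # ~1.89x the original ROAS
    per_click  = 9.36 / 4.81      # ~1.95x revenue per click -- same story

    print(f"ROAS vs. original:              {roas_ratio:.2f}x")
    print(f"revenue per click vs. original: {per_click:.2f}x")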

 

The net result was that we almost doubled the ROI on our client’s most profitable campaign. Our agency is now using and testing the S.K.A.G. approach across all our clients’ campaigns.

 

Comments

Lori Weiman, CEO at The Search Monitor:

David - what were the results for sales and conversion rates on the landing pages?

David Sinton, Co-Founder @ Quiet Owl Marketing | Performance Marketing, Digital Growth:

Great question Lori Weiman. Conversion rates were up 20%, transactions up 10%, and total revenue was up over 40%. Landing pages were the same for both the control and the SKAG test.
