Monday, April 28, 2014

A/B Testing for the iOS App Store

Update May 1, 2014: Manton Reece had a great question about my results. Scroll to the end for details.

As someone with a career attached to the iPhone App Store, I sometimes feel jealous of the folks who sell things on the web. Websites can use analytics and split testing to learn lots of things about how to make their product more profitable.

In the App Store, you feel lucky to see your sales numbers a day after they happen. If Apple tracks how customers found my app’s page, the number of visitors, or how much time they spent, they don't share it with me.

This shortage of information has interesting consequences. First, there are lots of tools out there, like SensorTower, App Annie, Flurry, and AppCase, that try to help developers figure out what's going on in the App Store.

Second, you'll find lots of lore on how to boost rankings, rank higher in keyword searches, and get featured by Apple.

Finally, there are tricks we use to get better reviews in the store.

These techniques are nice, but they aren’t proactive or customer-focused. Even the best of these tools tell you almost nothing about the organic traffic coming to your App Store listing. It isn’t even clear what impact an improved ranking in the App Store has on conversions.

Am I supposed to take it on faith that efforts spent on improving my rankings will be repaid by increased sales? I want tools that help me build a product that customers want to pay for. I also want tools to make it easy for customers to find my product.

Rob Walling's book Start Small Stay Small advocates that a developer-entrepreneur worry about developing the product last. Finding a market for a product, and figuring out how to reach it are the first priorities. Once the entrepreneur has found a market, she can tailor the product to fit.

Can mobile app developers do something similar? Can we learn about the market for our apps in spite of the opaque App Store, or are we doomed to just make apps and fight our way up the charts?

I don't think we're doomed. I've decided to stop treating the App Store like the mouth of Moving Average's customer acquisition funnel. Instead, I've been experimenting with Facebook's Mobile App Ads for Installs.

These ads are cool because you can target an audience based on the things users have expressed an interest in on Facebook. Not only that, but you can tap into some interesting data about who is clicking on your ads. Note that I’m not saying Facebook is the only solution. I hear that Twitter has some similar tools. I just haven’t had a chance to play with them yet.

My Facebook ads are now the mouth of the customer acquisition funnel. Each ad has a small amount of copy and a 1200 x 627 image. To make my first ad, Facebook requested two different images. Out of the gate, I would have an immediate split test! Cool. And now my customer acquisition funnel looks like the image below.

Note that I still have organic traffic that might throw off my understanding of who is getting through the checkout stage. Since this is an app that is relatively new to the App Store, and it doesn’t rank very high in any search or rankings, I can make some assumptions. The real benefit here is trying to understand why folks are clicking through the ads. I’ll be able to get a better handle on this when I install the Facebook library in the app — it is supposed to let you attribute installs to specific ad campaigns.

To make my A/B test, I decided that one image would feature an iPad showing my app and one image would feature an iPhone.

I opened one of my existing iPad screenshots and realized that Facebook was asking for a strange image resolution. I spent some time experimenting with how to resize the asset from the App Store to fit the Facebook ad. By the time I got the iPad mockup looking OK and then uploaded it, I was feeling impatient. Below is the image for the first ad.
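If you want to skip the trial and error, fitting a screenshot into Facebook's 1200 x 627 frame is just an aspect-ratio crop followed by a scale. Here's a sketch of the math; the 2048 x 1536 iPad screenshot size is just an example, and the crop box it returns can be fed to any image editor or library.

```python
def center_crop_box(width, height, target_w=1200, target_h=627):
    """Return (left, top, right, bottom) for a center crop that matches
    the target aspect ratio; the cropped region can then be scaled
    down to exactly target_w x target_h without distortion."""
    target_ratio = target_w / target_h
    if width / height > target_ratio:
        # Source is wider than the target: trim the left and right edges.
        new_w = round(height * target_ratio)
        left = (width - new_w) // 2
        return (left, 0, left + new_w, height)
    # Source is taller than the target: trim the top and bottom.
    new_h = round(width / target_ratio)
    top = (height - new_h) // 2
    return (0, top, width, top + new_h)

# A 2048 x 1536 iPad screenshot needs its top and bottom trimmed
# before scaling down to 1200 x 627.
print(center_crop_box(2048, 1536))
```

The same function explains why my iPhone image lost its top and bottom: a tall portrait screenshot has to give up most of its height to fit such a wide frame.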

Like I said, I was feeling impatient. I opened my iPhone mockup and haphazardly cropped off parts of the top and bottom of the phone so the image fit the required dimensions. Not my best work, but I figured I could always replace it. I uploaded the image and launched my ad campaign. You can see the haphazard image for the second ad below.

When I looked at my campaign the next day, which ad do you think was doing better? The second ad with my haphazard, off-the-screen iPhone! After seven days, the ad with the iPad image had a 1.6% click-through rate while the iPhone ad had a 3.7% CTR. That’s a pretty big difference that held fairly constant.
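If you want to check whether a gap like 1.6% versus 3.7% is real or just noise, a two-proportion z-test is the standard back-of-envelope tool. The impression and click counts below are made up for illustration — Facebook reports the real numbers per ad, so plug in your own:

```python
import math

def two_proportion_ztest(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for the difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    # Pool the two ads to estimate the overall CTR under the null
    # hypothesis that both ads convert at the same rate.
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided
    return z, p_value

# Hypothetical counts: 5,000 impressions per ad, matching the
# observed 1.6% (iPad) and 3.7% (iPhone) click-through rates.
z, p = two_proportion_ztest(clicks_a=80, n_a=5000,
                            clicks_b=185, n_b=5000)
print(f"z = {z:.2f}, p = {p:.2g}")
```

At anything like those volumes the p-value is far below 0.05, so the difference is very unlikely to be chance. With only a few hundred impressions per ad, the same CTRs would be much less conclusive.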

With my next app update, I'll replace my first App Store screenshot image with something more like the winning Facebook ad. My hypothesis is that the continuity from the ad to the store listing will help sales. I'm also hoping that the organic App Store traffic will feel attracted to the image as well.

Let me know if you find this sort of post interesting and I'll try to write more about the business of selling a product in the App Store.

Update: Manton Reece of Core Intuition fame had a great question after reading my article. Basically, he asks if the difference in ad performance could have resulted from the differences in iPad versus iPhone impressions. In my words, the hypothesis is: "Users are more likely to click on an ad that features the same device they view the ad on." One of the assumptions behind that hypothesis is that more of my ads are getting viewed on iPhones rather than iPads.

Unfortunately, I was unable to find a way to report the iPad vs iPhone split from the past impressions. Fortunately, the ads do allow targeting along that split. To test the hypothesis, I'm going to change the iPad ad to only target iPads. If the hypothesis is correct, I would expect the CTR to increase for that ad.

If that seems to work, I will also test targeting the iPhone ad at only iPhone users. The hypothesis would predict that ad would also get a higher CTR when targeting only iPhones. The effect might not be as strong because, again, I assume more Facebook users view Facebook on the iPhone. Still, wouldn't it be wild if I could target each specific color of the iPhone 5c?

If Manton's hypothesis is correct, I've still made the correct decision for the app. I've already submitted an update where I replaced the first screenshot with an image that looks like the second ad, but matches the target device. iPad users on the App Store will see an iPad with the top and bottom cut off. iPhone users will see the same iPhone as in the second ad.

Thanks Manton, I'll have a good laugh at myself if the image composition ultimately has nothing to do with the CTR! Even if the composition does have an effect, it's a great experiment. And I'm reminded again that getting third-party opinions and doing things in public is a good idea.