Showing posts with label Apple. Show all posts

Tuesday, May 27, 2014

More App Store Ad Experiments: Platform Targeting

Note: This essay continues from questions raised by the previous essay on A/B testing in the App Store

In my last essay, I explored some ideas on how to improve results in the App Store by experimenting with ads. I ran Facebook ads to A/B test the click-through rates (CTRs**) of two different images.

Based on the differing CTRs, I jumped to the conclusion that the composition of the images made the difference. In one ad, an entire iPad was shown. In the other ad, most of an iPhone was cut off, except for the screen. In other words, I believed that the way I depicted the iPad or iPhone resulted in more clicks. I didn't doubt that I was one step closer to buying a tropical island.

Manton Reece read my article and immediately wondered if maybe the iPad vs. iPhone split could account for the difference. In other words, maybe folks were more likely to click on an ad that depicted the device they were holding in their hands. I immediately slapped my forehead. My app island slipped below the waves. I vowed to figure out what really happened with my last experiment.

TL;DR: Manton was correct. Also, implementing the changes suggested by Manton’s hypothesis improved my CTRs quite a bit.

A New Experiment

To start examining the Manton Hypothesis, I first tried sifting through the ad data Facebook provides. Perhaps I could see which clicks belonged to iPads and which belonged to iPhones or iPod Touches*. Unfortunately I couldn't find that information. It didn't seem like I could examine Manton’s theory with the ad data I already collected.

No worries, I can find out with another experiment! Facebook's Mobile App Ads for Installs (!) allow each ad to target a different device. Keeping the other parameters the same, I set the underperforming ad (the one with a photo of an iPad) to only target iPads. Even one day into the experiment, it seemed like Manton was correct. The CTR for the iPad photo jumped up nicely.

After a week and a half of targeting the iPad ad at iPads only, the CTR rose from 1.5% to 3.08%. More than double!

I was so impressed, I decided to try the same thing for the ad with the cropped iPhone. I targeted it to only display on iPhones or iPod Touches.

This time I didn't expect to see a huge jump in performance. Why? Remember, in the previous article, the iPhone ad was already outperforming the iPad ad. If the Manton hypothesis was the only explanation for the differences in CTR, that implies that a lot more iPhone or iPod Touch users were getting my ads. The diagram below shows how the ad with the iPhone photo gets a higher CTR when both ads get the same mix of iPhone and iPad users.


As the diagram shows, the iPad photo (right side) gets a lower CTR because the viewers as a whole are dominated by iPhone users. According to the Manton Hypothesis, iPhone users aren't as likely to click on a photo of an iPad. That’s why the pie is smaller on the right, and why that pie is mostly iPad clicks.

But the iPhone photo has the reverse situation: the iPhone users still dominate, but this time they are seeing an iPhone photo. So in this case the majority of the users see a photo which matches the device in their hands -- a favorable situation. This leads to the bigger pie on the left. This time the less interested party is the iPad users. They are the smaller percentage of the folks seeing the ad. Their diminished CTR has a smaller impact on the aggregate CTR.

So, what happens when you target the iPhone ad at only the iPhone users? Just as the Manton Hypothesis predicts, the CTR improved, and the impact of targeting the iPhone ad to only iPhone and iPod Touch users wasn’t as large. The CTR for the iPhone photo the week before the change was 2.686%, and was 3.128% after the change.

Ad Description   Combined CTR   CTR targeting only the pictured device
iPad Photo       1.5%           3.08%
iPhone Photo     2.686%         3.128%

Noise

But here is an important point: the lifetime combined CTR when I was indiscriminately showing an iPhone photo to both iPhones and iPads was 2.931%. The baseline I picked for the numbers above was the week before my change. The improvement looks pretty small when compared to the larger baseline. 2.931% vs. 3.128% doesn't seem as exciting as 2.686% vs. 3.128%.

Why was the week-prior CTR lower than the all-time CTR? I don't know for sure. The numbers I’m dealing with here are small. I’m not spending hundreds of dollars a day on ads, and I’m not getting a huge number of installs. I don't have a $50,000 advertising budget. These tests are being done for $5 a day.

My experiments here come from thousands of clicks, not tens of thousands or millions. The sample size may not be large enough. What looks like an insight might just be noise. So take these results with a grain of salt. Or, even better, run your own experiments using your own money.

If you have a small budget like this project, the best cure for uncertainty is to hedge your bets and keep re-checking your assumptions. If you have real money riding on an outcome, it makes sense to double check your work. At the very least, be prepared to revert your changes!

Making a Model

I did have another idea for examining our results. What if we could mathematically model our hypothesis and see how it fits the data? I would feel more confident in the hypothesis if I could make a model that makes a decent prediction about a different data set.

One cool thing about these ads is that we can see the size of the potential audience for each one. The audience for the iPad ad is 182,000 people. The audience for the iPhone ad is 620,000 people. The only difference between these audiences is that one targets the iPad and the other the iPhone / iPod Touch.

So, let's make lots of assumptions about the size of the audience and the probability of getting an iPad versus an iPhone user. Let's assume that if we don't target the iPad or iPhone specifically, the probability the ad will be shown on either device is proportional to the size of the audience. For instance:

Probability(iPad) = 182,000 / (182,000 + 620,000) = 23%
Probability(iPhone) = 620,000 / (182,000 + 620,000) = 77%
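As a sanity check, these proportions can be computed directly from the audience sizes Facebook reports (a minimal sketch; the exact figures will drift as Facebook's audiences change):

```python
# Audience sizes reported by Facebook's ad targeting tool (May 2014).
ipad_audience = 182_000
iphone_audience = 620_000
total = ipad_audience + iphone_audience

# Assume impressions split in proportion to audience size.
p_ipad = ipad_audience / total
p_iphone = iphone_audience / total

print(f"P(iPad)   = {p_ipad:.0%}")    # -> P(iPad)   = 23%
print(f"P(iPhone) = {p_iphone:.0%}")  # -> P(iPhone) = 77%
```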

So, now we can make some assumptions and create a model for the iPhone ad targeting both iPads and iPhones:

77% * sameDeviceCTR + 23% * differentDeviceCTR = combinedCTR

Or visually:


Now we can do algebra:

23% * differentDeviceCTR = combinedCTR - (77% * sameDeviceCTR)

differentDeviceCTR = (combinedCTR - (77% * sameDeviceCTR)) / 23%

Now assume combinedCTR = 2.686% (the iPhone image CTR when targeting both devices) and sameDeviceCTR = 3.128% (the iPhone image CTR after targeting only iPhones).

Then differentDeviceCTR ≈ 1.2%.
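The algebra above is easy to check in code. A minimal sketch, using the rounded 77/23 split from the text:

```python
# Measured CTRs for the iPhone-photo ad (as fractions, not percents).
combined_ctr = 0.02686     # targeting both iPads and iPhones
same_device_ctr = 0.03128  # targeting only iPhones / iPod Touches

# Mixture model: 77% * same + 23% * different = combined.
# Solve for the unknown different-device CTR.
p_same, p_diff = 0.77, 0.23
different_device_ctr = (combined_ctr - p_same * same_device_ctr) / p_diff

print(f"differentDeviceCTR ≈ {different_device_ctr:.1%}")  # -> 1.2%
```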

Now let's take the differentDeviceCTR we just calculated from the iPhone ad and see if it predicts the outcome for the iPad photo ad in the same situation: targeting both iPhones and iPads.

In this case, the equation looks a little different because we're flipping sameDevice (now iPad, because we're considering the iPad image) and differentDevice (now iPhone):


77% * differentDeviceCTR + 23% * sameDeviceCTR = combinedCTR

Now we plug in the same numbers from before:

77% * 1.2% + 23% * 3.128% = combinedCTR

combinedCTR = 1.64%

This model doesn't seem too horrible! It predicted a 1.64% CTR for the iPad photo ad when targeted at both iPads and iPhones. The reality was 1.5%. I'm pleasantly surprised. Again, the Manton theory seems quite reasonable. I'll leave it to you to see what happens if we use the other iPhone photo baseline, the all-time combinedCTR of 2.931% -- the model doesn't agree as well.
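The whole prediction fits in a few lines (same rounded 77/23 split, and the same assumption from the text that sameDeviceCTR carries over from the iPhone measurement):

```python
# differentDeviceCTR derived from the iPhone ad, rounded as in the text.
different_device_ctr = 0.012
# sameDeviceCTR borrowed from the iPhone-only targeted measurement.
same_device_ctr = 0.03128

# For the iPad photo the roles flip: ~77% of viewers see the "wrong" device.
predicted_combined = 0.77 * different_device_ctr + 0.23 * same_device_ctr
observed_combined = 0.015

print(f"predicted {predicted_combined:.2%} vs observed {observed_combined:.1%}")
# -> predicted 1.64% vs observed 1.5%
```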

Conclusions

So what do we conclude from this exercise? For one, targeting your ads specifically to the iPad or iPhone user could be worth your time. I doubled the CTR of my under-performing ad with two clicks.

Even more importantly, running experiments on your ads can really pay off. And the steps aren't difficult: form a hypothesis, run an experiment using ads, collect the data, and make a simple model. With a model, you can try to predict the impact of your change.

With this first bit of new knowledge, maybe I'll save tons of money on ad spend. And then maybe I can apply what I learned to other areas of the sales funnel. And then I can try another experiment, learn, and implement. Maybe my tropical app island isn't entirely out of reach.

I’m really glad that Manton commented on my last article. Thanks! An extra set of eyes is invaluable!
*No, it’s not called the iTouch! Also, be aware that Facebook lumps together the iPhone with the iPod Touch. For certain questions that could be important.

** Yes, technically it should be TTR, not CTR, since you tap on an iOS device.

Monday, April 28, 2014

A/B Testing for the iOS App Store

Update May 1, 2014: Manton Reece had a great question about my results. Scroll to the end for details.

As someone with a career attached to the iPhone App Store, I sometimes feel jealous of the folks who sell things on the web. Websites can use analytics and split testing to learn lots of things about how to make their product more profitable.

In the App Store, you feel lucky to see your sales numbers a day after they happen. If Apple tracks how customers found my app’s page, the number of visitors, or how much time they spent, they don't share it with me.


This shortage of information has interesting consequences. First, there are lots of tools out there that try to help developers figure out what's going on in the app store. Tools like SensorTower, App Annie, Flurry, and AppCase.

Second, you'll find lots of lore on how to boost rankings, get listed higher in searches for keywords, and how to get featured by Apple.

Finally, there are tricks we use to get better reviews in the store.

These techniques are nice, but they aren’t proactive or customer focused. Even the best of these tools tell you almost nothing about the organic traffic coming to your App Store listing. It isn’t even clear what impact an improved ranking in the app store has on conversions.

Am I supposed to take it on faith that efforts spent on improving my rankings will be repaid by increased sales? I want tools that help me build a product that customers want to pay for. I also want tools to make it easy for customers to find my product.

Rob Walling's book Start Small, Stay Small advocates that a developer-entrepreneur worry about developing the product last. Finding a market for a product, and figuring out how to reach it, are the first priorities. Once the entrepreneur has found a market, she can tailor the product to fit.

Can mobile app developers do something similar? Can we learn about the market for our apps in spite of the opaque App Store, or are we doomed to just make apps and fight our way up the charts?

I don't think we're doomed. I've decided to stop treating the App Store like the mouth of Moving Average's customer acquisition funnel. Instead, I've been experimenting with Facebook's Mobile App Ads for Installs.

These ads are cool because you can target an audience based on the things users have expressed an interest in on Facebook. Not only that, but you can tap into some interesting data about who is clicking on your ads. Note that I’m not saying Facebook is the only solution. I hear that Twitter has some similar tools. I just haven’t had a chance to play with them yet.

My Facebook ads are now the mouth of the customer acquisition funnel. Each ad has a small amount of copy and a 1200 x 627 image. To make my first ad, Facebook requested two different images. Out of the gate, I would have an immediate split test! Cool. And now my customer acquisition funnel looks like the image below.


Note that I still have organic traffic that might throw off my understanding of who is getting through the checkout stage. Since this app is relatively new to the App Store and doesn’t rank very high in any search or chart, I can make some assumptions. The real benefit here is trying to understand why folks are clicking through the ads. I’ll be able to get a better handle on this when I install the Facebook library in the app — it is supposed to let you attribute installs to their ad campaigns.

To make my A/B test, I decided that one image would feature an iPad showing my app and one image would feature an iPhone.

I opened one of my existing iPad screenshots and realized that Facebook was asking for a strange image resolution. I spent some time experimenting on how to resize the asset from the app store to fit the Facebook ad. By the time I got the iPad mockup looking OK and then uploaded it, I was feeling impatient. Below is the image for the first ad.


Like I said, I was feeling impatient. I opened my iPhone mockup and haphazardly cropped off parts of the top and bottom of the phone so the image fit the required dimensions. Not my best work, but I figured I could always replace it. I uploaded the image and launched my ad campaign. You can see the haphazard image for the second ad below.


When I looked at my campaign the next day, which ad do you think was doing better? The second ad with my haphazard, off-the-screen iPhone! After seven days, the ad with the iPad image had a 1.6% click-through rate while the iPhone ad had a 3.7% CTR. That’s a pretty big difference that held fairly constant.

With my next app update, I'll replace my first App Store screenshot image with something more like the winning Facebook ad. My hypothesis is that the continuity from the ad to the store listing will help sales. I'm also hoping that the organic App Store traffic will feel attracted to the image as well.

Let me know if you find this sort of post interesting and I'll try to write more about the business of selling a product in the App Store.

Update: Manton Reece of Core Intuition fame had a great question after reading my article, which you can read on App.net. Basically, he asks if the difference in ad performance could have resulted from the differences in iPad versus iPhone impressions. In my words, the hypothesis is: "Users are more likely to click on an ad that features the same device they view the ad on." One of the assumptions behind that hypothesis is that more of my ads are getting viewed on iPhones rather than iPads.

Unfortunately, I was unable to find a way to report the iPad vs iPhone split from the past impressions. Fortunately, the ads do allow targeting along that split. To test the hypothesis, I'm going to change the iPad ad to only target iPads. If the hypothesis is correct, I would expect the CTR to increase for that ad.

If that seems to work, I will also test targeting the ad with the iPhone against only iPhone users. The hypothesis would predict that ad would also get a higher CTR targeting only iPhones. The effect might not be as strong because, again, I assume more Facebook users view Facebook on the iPhone. Still, wouldn't it be wild if I could target each specific color of the iPhone 5c?

If Manton's hypothesis is correct, I've still made the correct decision for the app. I've already submitted an update where I replaced the first screenshot with an image that looks like the second ad, but matches the target device. iPad users on the app store will see an iPad with the top and bottom cut off. iPhone users will see the same iPhone as in the second ad.

Thanks Manton, I'll have a good laugh at myself if the image composition ultimately has nothing to do with the CTR! Even if the composition does have an effect, it's a great experiment. And I'm reminded again that getting third-party opinions and doing things in public is a good idea.
*Moving Average Inc. is a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for sites to earn advertising fees by advertising and linking to amazon.com. Buying items through this link helps sustain my outrageous camera addiction and is much appreciated!

Tuesday, November 1, 2011

Why Not to Make an iPhone App

When I tell business owners that I make iPhone, iPad, and Android apps, one of the first questions I hear is:

Is it worth it to make an iPhone app for my business?

I've been thinking about it a lot recently, and I think that the answer is usually NO. Your business probably doesn't need an iPhone app.

This sometimes feels a bit awkward. I think many people want an excuse to make an iPhone app, because apps seem clever or high-tech. Let's look at some of the difficulties in creating an app.

Ideas

You might have a clear vision of what your app would look like, but you also need a clear idea of what your app should do. If you can't explain exactly how the app functions, you probably shouldn't try making one.

All sorts of concepts may or may not work as an app. One way to find out is to write a detailed story about what the app would do from a user's perspective.

A sword fighter might write about his app idea:
As the user, I would tap the iFence app and first see a menu titled "I want to..." followed by a list of choices: 
defend myself from bullies, 
impress women, 
attack ships and steal gold, 
extinguish multiple candles impractically but dramatically, 
have a good costume for Halloween 
When I tap a choice, a screen will appear which explains whether a sword fighter could help with the situation. In some cases the app will tell me "No, we can't help you with that, that's a bad idea. Please don't try." In other cases, the app will respond "Oh yeah, we can teach you to do that" and then explain how, and provide a way to contact a sword fighter.
After writing down the concept from the user's perspective, it's possible to ask questions like "Would it be easy for the user to do this without an app?", or "How likely is it that somebody would look in the app store for a solution to the problem this app solves?"

If you can't do that, or if you can't explain how your app would be different from those apps already in the store, you may not be ready to get an iPhone app developed.

Cost

Developing an iPhone app is expensive. Unless you regularly exercise in a swimming pool full of cash, you might find the market rate for contract iOS app development a tad high. I don't have a statistically significant data set, but the rates I've heard are often numbers like $120, $136, $150, or $175 per hour. These are rates for one or two developers working from home or a small office.

If you contract with a team that has an office filling an entire floor of a high-rise and buys lots of ads online and in magazines, expect to pay more. You can find some data about how much it costs to develop an iPhone app here.

More complex iPhone apps typically involve a team of experienced designers and developers working a significant number of hours. Experienced mobile developers are not cheap, and they are often not easy to find or hire. I've personally interviewed several alleged iPhone developers who seemed unfamiliar with basic development practices. Buyer beware.

Also consider design. Although sometimes a developer is a decent designer, it isn't uncommon to have a dedicated designer working on the graphic elements and overall look of an app. Just because your website has graphics doesn't mean they will be suitable for the iPhone; graphics for the iPhone have specific technical requirements and different expected appearances. Mobile designers cost money too.

There are cheaper mobile developers on the low end of the market, but be sure to talk to their clients and see what kind of apps they have in the store. I have heard in one case of a "cheap" shop who failed to deliver a useful product after six months of development and many thousands of dollars of client money spent. The client had to switch developers to get the job done; who knows if they will ever recover money from the developer who failed.*

Focus

Expense isn't the only reason an iPhone app might not make business sense. If you have limited resources, you may wish to focus your efforts on a product that will work on PCs, Android devices, Chromebooks, MacBooks, and so on -- probably a web app. The market for iPhone apps is large, but web apps have an even larger audience.

Are you really sure that your customers will want to use your service on the go? Will they benefit from the extra features possible in a mobile app? If you already have a web version of your software, what do your analytics tell you about mobile browser usage? Are many of your customers trying to use your app from an iPhone? Are they spending much time on it?

If you already have a customer base and they don't seem to need an iPhone app, be sure that you couldn't better spend your time working on a different project.

Support

iPhone apps have support costs. You may not like getting one-star reviews on the App Store, or emails from unhappy customers, but it will happen if your app is popular enough.

iPhone apps will require updates. You will almost certainly change your feature set, and Apple will change their devices. The risk of Apple breaking your app is low, but the risk of you adding a feature or needing an app update is high.

The chance that your users will find a bug in the app is also high. Even fancy, completely competent software developers create apps with flaws. It happens to everybody -- even Apple releases software updates to fix bugs.

Expect to spend some time and money keeping your app up to date. Version one is unlikely to be your last version -- unless your app is a flop.

Market Research

iPhone app market demand is opaque. You might make an app that there is very little demand for. Unfortunately, I know of no way to see what customers are searching for in the App Store. BatTracker Pro might be awesome, but unless someone is searching for bat-tracking software, or your app gets featured, you won't make much money.

Online, at least, you can research what folks are searching for on Google, and you can try AdWords placements without actually having invested much. With the App Store, there is some expectation of a nice-looking, well-featured app even for version one. An ugly "coming soon" app will almost certainly be rejected by Apple, and will get low scores in the App Store.

I have not seen evidence backing this up, but iOS developers seem to believe that releasing a low-rated app initially will hurt your success even after the app has been cleaned up and polished to look and perform like a Porsche. I'm not sure if it's true, but it certainly discourages me from releasing an unattractive product for the purpose of market research.

Rejection Risk

Your app might get rejected or banned. Check out the review guidelines (someone re-published them here, but they may not be up to date). Although the App Store rules are far more clear than they were in the beginning, there is always a chance that your app might get rejected from the store. You might even be lucky enough to add to the list of banned categories (e.g. fart apps).

Unless your app is cutting-edge from the rules perspective, this is unlikely to keep you out of the app store. Still, it's worth considering the possibility of rejection. Finding apps with similar features to your own is at least some evidence that Apple is likely to accept yours.

Conclusions

I've given you a few reasons why developing an iPhone app might not be as romantic as it initially seems. It mostly boils down to time and money. Or you might call it opportunity cost and expense. Could you spend your limited resources on a project with a higher rate of return?

iPhone apps are no guarantee of financial success; I've had a hand in at least two apps that only bring in tens of dollars per month in revenue. I suspect that better marketing and market research would have helped.

To make money in the App Store, good execution and good marketing are requirements. Just being there won't cut it. In many cases, a "boring" old web app works just as well as, if not better than, an iPhone app at generating revenue. This is especially true if you research the demand in advance of building a product.

Despite these warnings, there are many good reasons you should consider making an iPhone app. As an iPhone, iPad, and Android developer, I'm passionate about mobile apps. I make a living by developing iPhone apps for myself and my clients. There are opportunities to be had in the app stores, but they do require up-front market research and planning to make them a hit. 


*Note that I'm not saying problems can't happen at an expensive contract house. I've heard whisperings of work on fixed-price jobs slowly trickling out to too many clients. Think about those construction projects on highways that never seem to have anybody working on them. Software is difficult, and software planning and management is even more difficult. Track records matter.

Saturday, July 30, 2011

The Faces of WWDC

People make conferences worthwhile. Lots of value comes from the technical content, the parties, the swag. But years after the event, you remember the wonderful conversations and adventures with people.

At Apple's World Wide Developers Conference, there are more than 5000 people. Sadly, I wasn't able to meet them all. Instead, I took some photos of the more interesting characters around Moscone center during WWDC 2011. Some of them are developers, but some of them are marketers, and even a few are Apple protesters who were working hard to keep things interesting.

See all of the Faces of WWDC Photos.
