Greg Richardson

MOUNTAIN VIEW, Calif. – One of the key components in any technology business these days is the use of analytics to better understand and serve users. But with the power of analytics comes great risk, as well.

Speaking today at VentureBeat’s GamesBeat conference, Greg Richardson, the CEO of game development house Rumble, said one of the challenges is getting analytics to keep up with the pace of development.

“There’s a real challenge for analytics to move at the same speed as the business,” Richardson said.

But a focus solely on speed can prove problematic, according to Richardson.

Rumble received funding from Google Ventures, and one of the things Google did was set up a meeting between Rumble and Deepak Tiwari, Google’s Head of Analytics and Strategic Insights. According to Richardson, Tiwari told him about a pivotal moment in Google’s history: the company was considering adding another sponsored link to its search results, and it planned a 30-day A/B test to measure the impact of the change.

As it turns out, the change brought massive returns: advertising revenue from the users who saw more ads doubled in the first 30 days. Google was on the brink of officially adding another ad to its results when Tiwari “laid across the tracks” to get the company to continue the experiment instead.

So Google waited another 30 days. By the end of the second month, 80 percent of the users in the cohort being served the extra ad had made another search engine their primary one. If the company had rolled the change out broadly, it’s entirely possible users worldwide would have fled.

The moral of the story, Richardson said, is that developers need to be cautious about how quickly they act on insights from their testing, waiting until at least some of the long-term effects have surfaced. While it’s tempting to take the insights you gather about a new feature and apply them immediately, that path can be a road to ruin.
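The trap Richardson describes, a treatment that looks like a win over a short horizon and a disaster over a longer one, is easy to make concrete. Below is a minimal Python sketch of a rollout check that scores an experiment at two horizons before recommending a ship decision; the cohort numbers, field names, and the 5-point retention tolerance are all invented for illustration, shaped loosely like the story above.

```python
# Hypothetical sketch: evaluate an A/B test at two horizons before deciding
# on a rollout. All data, names, and thresholds here are invented.
from dataclasses import dataclass

@dataclass
class CohortStats:
    revenue_per_user: float   # mean ad revenue per user over the window
    retained_fraction: float  # fraction of the cohort still active at window end

def rollout_decision(control_30: CohortStats, treatment_30: CohortStats,
                     control_60: CohortStats, treatment_60: CohortStats) -> str:
    """Recommend a rollout only if the treatment wins on short-term revenue
    AND does not bleed users over the longer horizon."""
    revenue_lift = treatment_30.revenue_per_user / control_30.revenue_per_user - 1.0
    retention_drop = control_60.retained_fraction - treatment_60.retained_fraction

    if revenue_lift <= 0:
        return "no rollout: no short-term revenue win"
    if retention_drop > 0.05:  # invented tolerance: >5 points of lost retention
        return "no rollout: revenue win masks long-term user loss"
    return "roll out"

# Numbers shaped like the story: revenue doubles in the first 30 days,
# but by day 60 most treated users have drifted away.
print(rollout_decision(
    control_30=CohortStats(revenue_per_user=1.00, retained_fraction=0.95),
    treatment_30=CohortStats(revenue_per_user=2.00, retained_fraction=0.93),
    control_60=CohortStats(revenue_per_user=1.00, retained_fraction=0.90),
    treatment_60=CohortStats(revenue_per_user=1.40, retained_fraction=0.20),
))  # -> no rollout: revenue win masks long-term user loss
```

The point of the second check is exactly Richardson’s: the first 30 days alone would have recommended shipping.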

Comments

  • http://frugalmechanic.com/ Eric Peters

    Funny, we did a very similar experiment @ MSN Search when we were still serving up Overture ads – we saw results that were not quite as dramatic, but directionally similar.

  • Ronny Kohavi

    It’s a story that may have been motivated by a real example, but it is unlikely to be true as told. Determining your OEC, the Overall Evaluation Criterion, is critical, and it can’t be just revenue.

    Adding more ads, showing bigger ads, and degrading algorithmic results all boost short-term revenue, but all are likely bad in the long term if not controlled carefully (e.g., use bigger ads only when you have high confidence). In our puzzling results paper (http://bit.ly/expPuzzling) we showed that degrading search results increases revenue, and that sessions/user should be a key component of the OEC (see the sketch after these comments).

    Adding more ads will increase short-term revenue, but will degrade “user metrics” like sessions/user. You don’t need to run a controlled experiment for two months to see that effect, though; we have regularly run such experiments at Bing, and we’ve shown over and over that 1-2 weeks is sufficient to get a strong negative signal from any experiment that naively increases the vertical real estate of ads.

    Add this to the bucket of myths: http://www.nytimes.com/1998/12/06/weekinreview/scientific-myths-that-are-too-good-to-die.html

    — Ronny Kohavi

  • zacag

    Lesson here is that data and experimentation are great, but should always be considered in the context of your business’ softer factors (in Google’s case: people may tolerate extra ads for the first month, but get frustrated after the second month).

    It’s nice to see A/B testing start to be applied to the world of games. Here’s an example of how just one test boosted IAP conversion rates by 125%: http://blog.splitforce.com/post/66332510495/driving-in-app-revenue-through-a-b-testing-reigndesign.
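Kohavi’s OEC point above lends itself to a concrete illustration. Below is a minimal Python sketch of an OEC that scores an experiment on a weighted combination of revenue per user and sessions per user rather than on revenue alone; the weights and numbers are invented placeholders, not any company’s actual formula.

```python
# Hypothetical sketch of an OEC (Overall Evaluation Criterion) combining
# revenue with a user-value metric. Weights and numbers are invented.

def oec(revenue_ratio: float, sessions_ratio: float,
        w_revenue: float = 0.3, w_sessions: float = 0.7) -> float:
    """Score an experiment arm on metrics expressed as ratios to control,
    so the control arm scores exactly 1.0. Weighting sessions/user heavily
    penalizes changes that drive users away."""
    return w_revenue * revenue_ratio + w_sessions * sessions_ratio

control_score = oec(revenue_ratio=1.0, sessions_ratio=1.0)    # 1.00 by construction
treatment_score = oec(revenue_ratio=2.0, sessions_ratio=0.5)  # revenue doubled,
                                                              # sessions/user halved
print(round(treatment_score, 2))        # 0.95
print(treatment_score > control_score)  # False: the OEC says don't ship,
                                        # even though revenue alone doubled
```

The key design choice, per Kohavi’s comment, is that the criterion is fixed in advance and includes user-value metrics, so a change that boosts revenue while bleeding sessions cannot be made to look like a win.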
