Intelligence Authentic + XGBoost

Extreme Gradient Boosting to Lift Click-Through Rates by 15%

 

Challenge

Improving click-through rates is a daunting task for most marketing organizations: ad budgets are consumed by endless experimentation that often produces inconsistent results, even after qualitative optimization. In this case study, we show that machine learning not only reduces ad spend on low-click-probability impressions, but also helps marketers target higher-value customers.

Analytics work at the forefront of this space is twofold: engineering features that can differentiate thousands of click events, and applying algorithms powerful enough to uncover insights from those features. An excellent starting point (and yes, these results were achieved with a starting point) is XGBoost (eXtreme Gradient Boosting). XGBoost combines decision trees with boosting: it builds trees in sequence, with each new tree learning from the mistakes of the ones before it. The final ensemble captures both general trends in the data and insightful, predictable corner cases.
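The boosting idea above can be sketched in a few lines. This is a toy squared-error version with one-split "stumps" standing in for trees, and invented click data; XGBoost itself adds regularization, second-order gradients, and much deeper trees, but the core loop is the same: each new tree is fit to the errors the ensemble has made so far.

```python
# Minimal gradient-boosting sketch: each new "tree" (here, a one-split stump)
# is fit to the residual errors of the ensemble built so far.
# Data, feature names, and hyperparameters are illustrative, not from the case study.

def fit_stump(X, residuals):
    """Find the single feature/threshold split that best reduces squared error."""
    best = None
    for j in range(len(X[0])):
        for t in sorted(set(row[j] for row in X)):
            left = [r for row, r in zip(X, residuals) if row[j] <= t]
            right = [r for row, r in zip(X, residuals) if row[j] > t]
            if not left or not right:
                continue
            lmean = sum(left) / len(left)
            rmean = sum(right) / len(right)
            err = (sum((r - lmean) ** 2 for r in left)
                   + sum((r - rmean) ** 2 for r in right))
            if best is None or err < best[0]:
                best = (err, j, t, lmean, rmean)
    _, j, t, lmean, rmean = best
    return lambda row: lmean if row[j] <= t else rmean

def boost(X, y, n_rounds=20, lr=0.5):
    """Additive ensemble: successive stumps correct the previous rounds' mistakes."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(X, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(row) for pi, row in zip(pred, X)]
    return lambda row: sum(lr * s(row) for s in stumps)

# Toy click data: [hour_of_day, past_clicks] -> clicked (1) or not (0)
X = [[9, 0], [10, 1], [22, 5], [23, 4], [11, 0], [21, 6]]
y = [0, 0, 1, 1, 0, 1]
model = boost(X, y)
```

On this tiny dataset the ensemble converges toward the observed labels; the point is the mechanism, not the fit.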

Solution

For this experiment, thousands of randomized click events were collected to test XGBoost's lift on click-through rate out of sample. The model was trained on customized training data, with sample selection, feature normalization, and cleaning steps too detailed to describe here.
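To make the out-of-sample framing concrete, here is a hedged sketch of one standard precaution such a pipeline takes: normalization statistics are computed on the training split only, then applied to the held-out split, so the evaluation data never leaks into preprocessing. The helper names and toy rows are illustrative, not the case study's actual pipeline.

```python
import random

# Hypothetical preprocessing sketch: hold out data, then z-score-normalize
# features using statistics from the training split only.

def train_test_split(rows, test_frac=0.2, seed=42):
    rows = rows[:]
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * (1 - test_frac))
    return rows[:cut], rows[cut:]

def fit_normalizer(train_rows):
    """Compute per-feature mean/std on training rows; return a transform."""
    n = len(train_rows)
    means = [sum(col) / n for col in zip(*train_rows)]
    stds = []
    for j, m in enumerate(means):
        var = sum((row[j] - m) ** 2 for row in train_rows) / n
        stds.append(var ** 0.5 or 1.0)  # guard against zero-variance features
    return lambda row: [(x - m) / s for x, m, s in zip(row, means, stds)]

# Toy feature rows (e.g. past clicks, ad spend); values are invented.
rows = [[1.0, 200.0], [2.0, 220.0], [3.0, 180.0], [4.0, 260.0], [5.0, 240.0]]
train, test = train_test_split(rows)
normalize = fit_normalizer(train)
train_norm = [normalize(r) for r in train]
test_norm = [normalize(r) for r in test]
```

After this transform the training features are centered at zero, while the test rows are scaled by the same (training-derived) statistics.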

Results

Our XGBoost model achieved a 10-15% out-of-sample improvement over baseline click-through rates, while cutting the number of bids placed by over 50% in the best cases. In essence, the algorithm delivers savings in two ways: it cuts the low-probability bids that waste ad budget, and it concentrates the remaining budget on the customers most likely to click.
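The two savings channels can be illustrated with a small sketch: thresholding the model's predicted click probability drops the weakest bids (cutting spend) and raises the realized click-through rate on the bids that remain. The probabilities, outcomes, and threshold below are toy values chosen for illustration, not the case-study data.

```python
# Hypothetical bid-filtering sketch: bid only when predicted click
# probability clears a threshold, then compare realized CTRs.

bids = [
    # (predicted click probability, actually clicked)
    (0.02, 0), (0.04, 0), (0.05, 0), (0.03, 0), (0.60, 1),
    (0.55, 0), (0.70, 1), (0.08, 0), (0.65, 1), (0.50, 1),
]

THRESHOLD = 0.10  # illustrative cutoff: skip bids below a 10% predicted chance

kept = [(p, c) for p, c in bids if p >= THRESHOLD]

baseline_ctr = sum(c for _, c in bids) / len(bids)  # CTR if we bid on everything
filtered_ctr = sum(c for _, c in kept) / len(kept)  # CTR on the kept bids only
bids_cut = 1 - len(kept) / len(bids)                # fraction of bids saved
```

In this toy example, half the bids are dropped while the realized CTR on the remaining bids doubles; the real threshold would be tuned against budget and volume goals.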