Overview

When looking for ways to improve your website, how do you decide which changes to make, and which changes to keep? This concise book shows you how to use multiarmed bandit algorithms to measure the real-world value of any modifications you make to your site. Author John Myles White shows you how this powerful class of algorithms can help you boost website traffic, convert visitors to customers, and increase many other measures of success.

This is the first developer-focused book on bandit algorithms, which were previously described only in research papers. You'll quickly learn the benefits of several simple algorithms, including epsilon-Greedy, Softmax, and Upper Confidence Bound (UCB), by working through code examples written in Python, which you can easily adapt for deployment on your own website.

- Learn the basics of A/B testing and recognize when it's better to use bandit algorithms
- Develop a unit testing framework for debugging bandit algorithms
- Get additional code examples written in Julia, Ruby, and JavaScript with supplemental online materials

Full Product Details

Author: John Myles White
Publisher: O'Reilly Media
Imprint: O'Reilly Media
ISBN: 9781306811170
ISBN 10: 1306811171
Pages: 60
Publication Date: 01 January 2012
Audience: General/trade
Format: Electronic book text
Publisher's Status: Active
Availability: Available to order. This item is in stock with the supplier; it will be ordered in for you and dispatched immediately.
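The overview names epsilon-Greedy as the simplest of the bandit algorithms the book covers: with probability epsilon you explore a random arm (a site variant), and otherwise you exploit the arm with the best average reward so far. As a rough illustrative sketch in Python (not the book's own code; the class and method names here are my own), the idea looks like this:

```python
import random


class EpsilonGreedy:
    """Minimal epsilon-Greedy bandit sketch.

    With probability epsilon, pick a random arm (explore);
    otherwise pick the arm with the highest average reward (exploit).
    """

    def __init__(self, epsilon, n_arms):
        self.epsilon = epsilon
        self.counts = [0] * n_arms    # number of pulls per arm
        self.values = [0.0] * n_arms  # running mean reward per arm

    def select_arm(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))  # explore
        return self.values.index(max(self.values))     # exploit

    def update(self, arm, reward):
        # Incrementally update the running mean for this arm.
        self.counts[arm] += 1
        n = self.counts[arm]
        self.values[arm] += (reward - self.values[arm]) / n


if __name__ == "__main__":
    # Toy simulation: two variants with different conversion rates.
    random.seed(42)
    true_rates = [0.05, 0.15]
    algo = EpsilonGreedy(epsilon=0.1, n_arms=2)
    for _ in range(10_000):
        arm = algo.select_arm()
        reward = 1.0 if random.random() < true_rates[arm] else 0.0
        algo.update(arm, reward)
    print(algo.counts, [round(v, 3) for v in algo.values])
```

Over many trials the algorithm concentrates pulls on the better-converting variant while still spending roughly an epsilon fraction of traffic exploring, which is the trade-off against classic A/B testing that the book examines in depth.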