Fresh Relevance launches Multi-Armed Bandit for intelligent and automatic campaign optimization
The new functionality for web, email and cross-channel experiences takes advantage of the latest innovations in machine learning to remove the limitations of A/B testing. Traditional A/B tests split traffic evenly between two or more variations and often require manual intervention once a statistically significant winner is found. In contrast, the Multi-Armed Bandit dynamically allocates more traffic to variations that are performing well while steadily reducing the traffic sent to underperforming variations. This optimization approach produces faster results, ensuring the organization automatically serves the variations that best meet its pre-defined goal, such as increasing average order value, conversions or the identification rate of site visitors.
One of the most compelling benefits of the Multi-Armed Bandit is its speed: the system quickly identifies and capitalizes on user behavior, removing any delay in deploying the best variation for maximum ROI. Because the Multi-Armed Bandit continues to recheck the performance of initially less successful variations, it can detect changes in performance, e.g. due to weather, season or external events, and respond by serving those variations more frequently to ensure a highly relevant experience.
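Fresh Relevance does not disclose which bandit algorithm powers the feature, but the behavior described above, favoring the current best variation while continuing to test the others, can be illustrated with a minimal epsilon-greedy sketch (variation names and conversion rates below are hypothetical):

```python
import random

class EpsilonGreedyBandit:
    """Minimal epsilon-greedy multi-armed bandit sketch.

    Most traffic goes to the best-performing variation (exploitation),
    while a small fraction keeps testing the others (exploration), so a
    variation whose performance improves later can be rediscovered.
    """

    def __init__(self, variations, epsilon=0.1):
        self.variations = list(variations)
        self.epsilon = epsilon  # fraction of traffic reserved for exploration
        self.shows = {v: 0 for v in self.variations}
        self.successes = {v: 0 for v in self.variations}

    def conversion_rate(self, v):
        return self.successes[v] / self.shows[v] if self.shows[v] else 0.0

    def choose(self):
        # Occasionally explore a random variation; otherwise exploit
        # the variation with the best observed conversion rate.
        if random.random() < self.epsilon:
            return random.choice(self.variations)
        return max(self.variations, key=self.conversion_rate)

    def record(self, variation, converted):
        # Update the running statistics after each impression.
        self.shows[variation] += 1
        if converted:
            self.successes[variation] += 1

# Simulated traffic: "banner_b" truly converts better, so over time
# the bandit routes most impressions to it without manual intervention.
random.seed(0)
bandit = EpsilonGreedyBandit(["banner_a", "banner_b"], epsilon=0.1)
true_rates = {"banner_a": 0.05, "banner_b": 0.20}
for _ in range(5000):
    v = bandit.choose()
    bandit.record(v, random.random() < true_rates[v])
```

Because epsilon never drops to zero, the underperforming banner still receives a trickle of traffic, which is what lets the system notice if its performance changes with season or external events.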
Optimization use cases include:
- Ideal send time of triggered emails, e.g. cart abandonment messages
- Best-converting titles, copy and calls-to-action, e.g. of web banners
- Most engaging experience, e.g. among different ‘new visitor’ experiences
David Henderson, Chief Technology Officer at Fresh Relevance, states:
“For any online retailer that is looking to test and optimize their content, the Multi-Armed Bandit will be a very welcome addition to their testing and optimization toolbox. It provides organizations with peace of mind that the assets visitors are exposed to are the assets most likely to resonate with them and reward the business.”
The Multi-Armed Bandit is the result of Fresh Relevance’s successful ongoing partnership with the University of Portsmouth in the UK. It is available to users of Fresh Relevance’s Advanced Testing & Optimization Module now.