
Are Multi-Armed Bandits the Next Thing to "Kill CRO"?

From A to B



The next competitor stepping up on the stage to try and kill experimentation is multi-armed bandits. This is something that I've been seeing a lot of on LinkedIn. It's effectively using machine learning to auto-allocate traffic toward whichever variation is winning the most. If it's 50/50 — like, I don't think I've ever actually seen a multi-armed bandit that runs at 50/50.
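The auto-allocation the speaker describes is often done with Thompson sampling: each variation's conversion rate gets a Beta posterior, and each visitor is routed to the arm with the highest sampled rate, so traffic drifts toward the apparent winner instead of staying at a fixed 50/50 split. A minimal sketch (the function name and arm counts below are illustrative, not from the episode):

```python
import random

def thompson_allocate(successes, failures, n_visitors, seed=0):
    """Route visitors across variations via Thompson sampling.

    successes/failures: per-arm conversion counts observed so far.
    For each visitor, sample a conversion-rate belief from every
    arm's Beta(successes+1, failures+1) posterior and send the
    visitor to the arm with the highest sample.
    Returns how many visitors each arm received.
    """
    rng = random.Random(seed)
    counts = [0] * len(successes)
    for _ in range(n_visitors):
        samples = [rng.betavariate(s + 1, f + 1)
                   for s, f in zip(successes, failures)]
        counts[samples.index(max(samples))] += 1
    return counts

# Hypothetical data: arm B converts at ~6% vs arm A's ~3%,
# so most new traffic flows to B rather than splitting evenly.
print(thompson_allocate([30, 60], [970, 940], n_visitors=1000))
```

Because allocation follows the posterior, an arm that looks better quickly absorbs most of the traffic — which is exactly why a bandit almost never sits at 50/50.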
