Implementation of famous bandit algorithms in Python: Explore-Then-Commit, UCB, and Thompson Sampling.
niravnb / multi-armed-bandit-algortihms
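As a flavor of what such a repository contains, here is a minimal sketch of the UCB1 algorithm on a Bernoulli bandit. This is an illustrative example, not the repository's actual code; the function names, the arm means, and the horizon are all hypothetical choices made for this sketch.

```python
import math
import random

def ucb1(pull_arm, n_arms, horizon, seed=0):
    """UCB1 sketch: pull each arm once, then always pick the arm that
    maximizes (empirical mean + sqrt(2 ln t / pulls))."""
    rng = random.Random(seed)
    counts = [0] * n_arms    # number of pulls per arm
    sums = [0.0] * n_arms    # total reward per arm
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1  # initialization phase: try every arm once
        else:
            # exploitation + exploration bonus (UCB1 index)
            arm = max(
                range(n_arms),
                key=lambda i: sums[i] / counts[i]
                + math.sqrt(2 * math.log(t) / counts[i]),
            )
        reward = pull_arm(arm, rng)
        counts[arm] += 1
        sums[arm] += reward
    return counts, sums

# Hypothetical Bernoulli bandit with arm means 0.2, 0.5, 0.8.
means = [0.2, 0.5, 0.8]
counts, sums = ucb1(
    lambda i, rng: 1.0 if rng.random() < means[i] else 0.0,
    n_arms=3,
    horizon=2000,
)
```

With a long enough horizon, the pull counts concentrate on the best arm (index 2 here) while the exploration bonus guarantees every arm is still sampled occasionally.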