amuni3 / smpybandits
This project is a fork of smpybandits/smpybandits.
Research framework for single-player and multi-player Multi-Armed Bandits (MAB) algorithms, implementing state-of-the-art algorithms for the single-player setting (UCB, KL-UCB, Thompson Sampling, ...) and the multi-player setting (MusicalChair, MEGA, rhoRand, MCTopM/RandTopM, etc.). Available on PyPI: https://pypi.org/project/SMPyBandits/, with documentation at https://SMPyBandits.github.io/.
Home Page: https://SMPyBandits.github.io/
License: MIT License
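To illustrate the kind of single-player algorithm the framework covers, here is a minimal self-contained sketch of the classic UCB1 index policy on Bernoulli arms. This is an illustrative re-implementation for intuition, not SMPyBandits' own code or API (the library ships its own `Policies` classes):

```python
import math
import random


def ucb1(means, horizon, seed=0):
    """Run the UCB1 policy on Bernoulli arms with the given success
    probabilities, and return the number of pulls of each arm."""
    rng = random.Random(seed)
    n_arms = len(means)
    counts = [0] * n_arms   # number of pulls per arm
    sums = [0.0] * n_arms   # cumulative reward per arm

    for t in range(1, horizon + 1):
        if t <= n_arms:
            # Initialization: pull each arm once.
            arm = t - 1
        else:
            # UCB1 index: empirical mean + exploration bonus.
            arm = max(
                range(n_arms),
                key=lambda k: sums[k] / counts[k]
                + math.sqrt(2.0 * math.log(t) / counts[k]),
            )
        reward = 1.0 if rng.random() < means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
    return counts


# With a clearly best arm (mean 0.9), UCB1 concentrates its pulls on it.
counts = ucb1([0.1, 0.5, 0.9], horizon=2000)
```

The square-root bonus shrinks as an arm is pulled more often, so under-explored arms keep getting revisited while the empirically best arm dominates in the long run.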