Multi-Armed Bandits

Multi-armed bandit problems pertain to optimal sequential decision making and learning in unknown environments.

Since the first bandit problem, posed by Thompson in 1933 in the context of clinical trials, bandit problems have enjoyed lasting attention from multiple research communities and have found a wide range of applications across diverse domains. This book covers classic results and recen...
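As a taste of the subject matter (this sketch is not taken from the book), Thompson's 1933 idea survives today as Thompson sampling. Below is a minimal Python illustration for a Bernoulli bandit; all names, priors, and parameters are chosen purely for illustration.

```python
import random

def thompson_sampling(true_probs, rounds=1000):
    """Illustrative Thompson sampling on a Bernoulli bandit.

    true_probs: unknown success probability of each arm (used here only to
    simulate rewards). Each arm keeps a Beta(successes, failures) posterior;
    every round we draw one sample per arm, pull the arm with the largest
    draw, and update its posterior with the observed 0/1 reward.
    """
    successes = [1] * len(true_probs)  # Beta(1, 1) uniform priors
    failures = [1] * len(true_probs)
    total_reward = 0
    for _ in range(rounds):
        draws = [random.betavariate(s, f) for s, f in zip(successes, failures)]
        arm = max(range(len(true_probs)), key=draws.__getitem__)
        reward = 1 if random.random() < true_probs[arm] else 0
        successes[arm] += reward
        failures[arm] += 1 - reward
        total_reward += reward
    return total_reward

if __name__ == "__main__":
    random.seed(0)
    print(thompson_sampling([0.2, 0.5, 0.7]))  # pulls concentrate on the 0.7 arm
```

As the posteriors of weaker arms sharpen around low values, their draws rarely win, so exploration fades naturally in favour of the best arm.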

E-book (PDF)
Price: £41.95

Product details

  • Format: PDF
  • ISBN: 9781627058711
  • Publication date: 21 Nov 2019
  • Publisher: Morgan & Claypool Publishers
  • Language: English
  • DRM: Yes