1. Amazon SageMaker Developer Guide: "The Hyperband tuning strategy treats the automatic model tuning process as a search for the best configuration in an infinite number of configurations... Hyperband uses a successive halving algorithm to choose the configurations to advance to the next rung. It stops underperforming configurations after one rung."
Source: AWS Documentation, Amazon SageMaker Developer Guide, "How Hyperband Works".
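The successive halving mechanism described above can be sketched in a few lines. This is a minimal illustrative sketch, not SageMaker's implementation: `configs` and `evaluate` are hypothetical placeholders for a set of hyperparameter configurations and a scoring function that trains with a given resource budget.

```python
def successive_halving(configs, evaluate, min_resource=1, eta=3):
    """Run successive halving: evaluate every surviving configuration,
    advance the top 1/eta to the next rung, and multiply the training
    resource (e.g. epochs) by eta for the survivors."""
    resource = min_resource
    rung = list(configs)
    while len(rung) > 1:
        # Score each configuration at the current resource budget.
        scores = {c: evaluate(c, resource) for c in rung}
        # Keep only the best 1/eta fraction; the rest stop early.
        keep = max(1, len(rung) // eta)
        rung = sorted(rung, key=scores.get, reverse=True)[:keep]
        resource *= eta
    return rung[0]
```

With the default `eta=3`, each rung discards roughly two-thirds of the configurations, which is the "stops underperforming configurations" behavior the SageMaker documentation describes.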
2. Academic Publication (Original Hyperband Paper): "We introduce HYPERBAND, a novel algorithm for hyperparameter optimization... We show that HYPERBAND can provide a principled approach to early-stopping which is simple, flexible, and effective."
Source: Li, L., Jamieson, K., DeSalvo, G., Rostamizadeh, A., & Talwalkar, A. (2018). Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization. Journal of Machine Learning Research, 18(185), 1-52. (Section 1: Introduction).
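The paper's "principled" early stopping comes from running several successive-halving brackets that trade off the number of configurations `n` against the initial resource `r` per configuration. The bracket schedule below follows the formulas in Section 2 of Li et al. (2018), with `R` the maximum resource and `eta` the halving rate; it enumerates brackets only and omits the actual training loop.

```python
import math

def hyperband_brackets(R=81, eta=3):
    """List Hyperband's brackets (s, n, r): bracket s starts successive
    halving with n configurations, each given initial resource r."""
    # s_max = floor(log_eta(R)); the epsilon guards float log rounding.
    s_max = math.floor(math.log(R, eta) + 1e-9)
    brackets = []
    for s in range(s_max, -1, -1):
        # n = ceil((s_max + 1) * eta^s / (s + 1)),  r = R / eta^s.
        n = math.ceil((s_max + 1) * eta ** s / (s + 1))
        r = R / eta ** s
        brackets.append((s, n, r))
    return brackets
```

For `R = 81` and `eta = 3` this reproduces the example bracket table from the paper: n = 81, 34, 15, 8, 5 starting configurations at r = 1, 3, 9, 27, 81 units of resource, so the most aggressive bracket explores many configurations cheaply while the last bracket trains a few configurations to full budget.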
3. AWS Machine Learning Blog: "Hyperband is a simple yet powerful algorithm for hyperparameter tuning that maximizes the number of configurations that can be evaluated by using a principled early stopping mechanism."
Source: AWS Machine Learning Blog, "Amazon SageMaker Automatic Model Tuning now supports Hyperband" (Nov 26, 2019).
4. University Courseware: Lecture materials on AutoML often describe Hyperband as a key algorithm for efficient hyperparameter optimization through early stopping.
Source: Carnegie Mellon University, 10-708 Probabilistic Graphical Models, lecture on "Bayesian Optimization and Hyperband", which discusses Hyperband as a state-of-the-art early-stopping method.