
Hyperparameter Optimization in Machine Learning

- Make Your Machine Learning and Deep Learning Models More Efficient

About Hyperparameter Optimization in Machine Learning

  • Language: English
  • ISBN: 9781484265789
  • Format: Paperback
  • Pages: 166
  • Published: November 29, 2020
  • Edition: 1
  • Dimensions: 155x235x0 mm
  • Weight: 454 g

Description of Hyperparameter Optimization in Machine Learning

Chapter 1: Hyperparameters
Chapter Goal: To introduce what hyperparameters are and how they can affect model training. It also gives an intuition of how hyperparameters affect general machine learning algorithms and what values to choose for a given training dataset. A short illustrative sketch follows the sub-topic list below.
Sub-Topics:
1. Introduction to hyperparameters
2. Why we need to tune hyperparameters
3. Specific algorithms and their hyperparameters
4. Cheat sheet for deciding hyperparameters of some specific algorithms
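
As a brief illustration of the distinction this chapter draws (not code from the book), the sketch below assumes scikit-learn and its bundled iris dataset: max_depth and min_samples_leaf are hyperparameters fixed before training, while the tree's splits and thresholds are parameters learned from the data.

    # Hyperparameters are chosen before training; parameters are learned during it.
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)

    # max_depth and min_samples_leaf are hyperparameters: fixed up front, not learned.
    model = DecisionTreeClassifier(max_depth=3, min_samples_leaf=5, random_state=0)
    model.fit(X, y)

    # The fitted tree structure (splits, thresholds, leaves) is what the data determines.
    print(model.get_depth(), model.get_n_leaves())

Changing max_depth or min_samples_leaf changes what the training procedure is allowed to learn, which is why the choice matters before any data is seen.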

Chapter 2: Brute Force Hyperparameter Tuning
Chapter Goal: To understand the commonly used classical hyperparameter tuning methods and implement them from scratch, as well as use the Scikit-Learn library to do so. A minimal code sketch follows the list below.
Sub-Topics:
1. Hyperparameter tuning
2. Exhaustive hyperparameter tuning methods
3. Grid search
4. Random search
5. Evaluation of models while tuning hyperparameters
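
A minimal, hedged sketch of the two exhaustive methods named above, using scikit-learn's GridSearchCV and RandomizedSearchCV on a toy dataset; the estimator and parameter ranges are illustrative choices, not taken from the book.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    param_grid = {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}

    # Grid search: evaluate every combination in the grid with 5-fold cross-validation.
    grid = GridSearchCV(SVC(), param_grid, cv=5).fit(X, y)

    # Random search: sample a fixed number of combinations from the same space.
    rand = RandomizedSearchCV(SVC(), param_grid, n_iter=5, cv=5, random_state=0).fit(X, y)

    print(grid.best_params_, grid.best_score_)
    print(rand.best_params_, rand.best_score_)

Grid search cost grows multiplicatively with every added hyperparameter, which is what motivates random search and the distributed and Bayesian methods of the later chapters.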

Chapter 3: Distributed Hyperparameter Optimization
Chapter Goal: To handle bigger datasets and a large number of hyperparameters with continuous search spaces, using distributed algorithms and distributed hyperparameter optimization methods with the Dask library. A rough usage sketch follows the list below.
Sub-Topics:
1. Why we need distributed tuning
2. Dask DataFrames
3. IncrementalSearchCV
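
A rough sketch of what incremental, distributed tuning with dask-ml's IncrementalSearchCV can look like, assuming dask and dask-ml are installed; the dataset size, chunking, and parameter ranges here are illustrative, not from the book.

    import numpy as np
    from dask.distributed import Client
    from dask_ml.datasets import make_classification
    from dask_ml.model_selection import IncrementalSearchCV
    from sklearn.linear_model import SGDClassifier

    client = Client(processes=False)  # local scheduler; a real cluster would go here

    # Data is held as chunked Dask arrays rather than one in-memory array.
    X, y = make_classification(n_samples=50000, n_features=20, chunks=10000, random_state=0)

    params = {"alpha": np.logspace(-5, -1, 10), "penalty": ["l1", "l2"]}

    # Candidates are trained incrementally via partial_fit; weak ones are dropped early.
    search = IncrementalSearchCV(SGDClassifier(), params, n_initial_parameters=10, random_state=0)
    search.fit(X, y, classes=[0, 1])

    print(search.best_params_, search.best_score_)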

Chapter 4: Sequential Model-Based Global Optimization and Its Hierarchical Methods
Chapter Goal: A detailed theoretical chapter about SMBO methods, which use Bayesian techniques to optimize hyperparameters. Unlike grid search or random search, they learn from their previous iterations. A compact statement of the TPE criterion follows the list below.
Sub-Topics:
1. Sequential Model-Based Global Optimization
2. Gaussian process approach
3. Tree-structured Parzen Estimator (TPE)
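
As a hedged pointer to the theory this chapter covers (following the standard TPE formulation in the literature, not text from the book): TPE splits the observed losses y at a threshold y*, chosen so that a fraction γ of observations fall below it, and models the configurations on each side with two separate densities; maximizing expected improvement then reduces to maximizing the ratio l(x)/g(x).

    l(x) = p(x \mid y < y^{*}), \qquad g(x) = p(x \mid y \ge y^{*})
    \mathrm{EI}_{y^{*}}(x) \;\propto\; \Bigl(\gamma + (1-\gamma)\,\tfrac{g(x)}{l(x)}\Bigr)^{-1}

So the next candidate is a configuration that looks likely under the "good" density l and unlikely under the "bad" density g.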

Chapter 5: Using HyperOpt
Chapter Goal: A chapter focusing on Hyperopt, a library that implements the TPE algorithm discussed in the previous chapter. The goal is to use TPE to optimize hyperparameters and show the reader how it improves on the earlier methods. MongoDB will be used to parallelize the evaluations. Hyperopt-Sklearn and Hyperas are discussed with examples. A minimal workflow sketch follows the list below.
Sub-Topics:
1. Defining an objective function
2. Creating a search space
3. Running HyperOpt
4. Using MongoDB Trials to make parallel evaluations
5. HyperOpt Sklearn
6. Hyperas
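
A minimal sketch of the workflow the sub-topics outline, assuming the hyperopt package; the quadratic objective is purely illustrative, standing in for a function that would train and validate a model.

    from hyperopt import fmin, tpe, hp, Trials, STATUS_OK

    def objective(params):
        x = params["x"]
        # Hyperopt minimizes the returned loss; a real objective would return a validation loss.
        return {"loss": (x - 2) ** 2, "status": STATUS_OK}

    space = {"x": hp.uniform("x", -10, 10)}  # the search space

    trials = Trials()
    best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=100, trials=trials)
    print(best)

Swapping Trials for MongoTrials (from hyperopt.mongoexp) is the mechanism hyperopt provides for running evaluations in parallel against a MongoDB instance, which is the setup this chapter describes.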

Chapter 6: Hyperparameter Generating Conditional Generative Adversarial Neural Networks (HG-cGANs) and So Forth
Chapter Goal: This chapter is based on a hypothesis: given certain properties of a dataset, one can train neural networks on metadata and generate hyperparameters for new datasets. It also summarizes how these newer methods of hyperparameter tuning can help AI develop further.
Sub-Topics:
1. Generating metadata
2. Training HG-cGANs
3. AI and hyperparameter tuning
