Hyperparameter Optimization in Machine Learning: Make Your Machine Learning and Deep Learning Models More Efficient


Full Product Details

Author: Tanay Agrawal
Publisher: Apress
Imprint: Apress
Edition: 1st ed.
Weight: 0.454kg
ISBN 13: 9781484265789
ISBN 10: 1484265785
Pages: 166
Publication Date: 29 November 2020
Audience: Professional and Scholarly, Professional & Vocational
Format: Paperback
Publisher's Status: Active
Availability: Manufactured on demand (we will order this item for you from a manufacture-on-demand supplier)

Table of Contents

Chapter 1: Hyperparameters
Chapter Goal: Introduce what hyperparameters are and how they affect model training; build intuition for how hyperparameters influence common machine learning algorithms and which values to choose given the training dataset.
Sub-topics:
1. Introduction to hyperparameters
2. Why we need to tune hyperparameters
3. Specific algorithms and their hyperparameters
4. Cheat sheet for choosing hyperparameters of specific algorithms

Chapter 2: Brute-Force Hyperparameter Tuning
Chapter Goal: Understand the commonly used classical hyperparameter tuning methods and implement them both from scratch and with the scikit-learn library (a minimal scikit-learn sketch follows this table of contents).
Sub-topics:
1. Hyperparameter tuning
2. Exhaustive hyperparameter tuning methods
3. Grid search
4. Random search
5. Evaluating models while tuning hyperparameters

Chapter 3: Distributed Hyperparameter Optimization
Chapter Goal: Handle bigger datasets and large numbers of hyperparameters with continuous search spaces, using distributed algorithms and distributed hyperparameter optimization methods built on the Dask library (see the Dask-ML sketch below).
Sub-topics:
1. Why we need distributed tuning
2. Dask DataFrames
3. IncrementalSearchCV

Chapter 4: Sequential Model-Based Global Optimization and Its Hierarchical Methods
Chapter Goal: A detailed theoretical chapter on SMBO methods, which use Bayesian techniques to optimize hyperparameters; unlike grid search or random search, they learn from their previous iterations.
Sub-topics:
1. Sequential model-based global optimization
2. Gaussian process approach
3. Tree-structured Parzen Estimator (TPE)

Chapter 5: Using HyperOpt
Chapter Goal: Focus on the Hyperopt library, which implements the TPE algorithm discussed in the previous chapter; use TPE to optimize hyperparameters and show how it improves on other methods (see the Hyperopt sketch below). MongoDB is used to parallelize the evaluations, and Hyperopt-Sklearn and Hyperas are discussed with examples.
Sub-topics:
1. Defining an objective function
2. Creating a search space
3. Running HyperOpt
4. Using MongoDB Trials to make parallel evaluations
5. HyperOpt-Sklearn
6. Hyperas

Chapter 6: Hyperparameter-Generating Conditional Generative Adversarial Neural Networks (HG-cGANs) and So Forth
Chapter Goal: Based on a hypothesis that, given certain properties of a dataset, one can train neural networks on metadata and generate hyperparameters for new datasets. Also summarizes how these newer methods of hyperparameter tuning can help AI develop further.
Sub-topics:
1. Generating metadata
2. Training HG-cGANs
3. AI and hyperparameter tuning
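To make Chapter 2's exhaustive methods concrete, here is a minimal sketch of grid search and random search using scikit-learn's GridSearchCV and RandomizedSearchCV; the dataset, estimator, and parameter values are illustrative assumptions, not taken from the book.

```python
# Minimal sketch: exhaustive tuning with scikit-learn (illustrative values).
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Grid search: evaluate every combination in the grid with 5-fold cross-validation.
grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}, cv=5)
grid.fit(X, y)
print("grid search best:", grid.best_params_, grid.best_score_)

# Random search: sample a fixed number of combinations from the same kind of space.
rand = RandomizedSearchCV(
    SVC(),
    {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]},
    n_iter=8,
    cv=5,
    random_state=0,
)
rand.fit(X, y)
print("random search best:", rand.best_params_, rand.best_score_)
```

The trade-off the chapter describes is visible here: grid search cost grows with the product of the list lengths, while random search caps the number of fits at n_iter no matter how large the space is.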
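Chapter 3's IncrementalSearchCV lives in dask_ml.model_selection. The sketch below is an assumed setup, not the book's own example: a local Dask cluster, a synthetic chunked dataset, and an SGDClassifier chosen because it supports partial_fit, which incremental search requires.

```python
# Minimal sketch: distributed incremental tuning with Dask-ML
# (requires dask, distributed, dask-ml, scikit-learn; all values illustrative).
from dask.distributed import Client
from dask_ml.datasets import make_classification
from dask_ml.model_selection import IncrementalSearchCV
from sklearn.linear_model import SGDClassifier

if __name__ == "__main__":
    client = Client()  # local cluster; point at a scheduler address for a real one

    # A chunked Dask array stands in for data too large to fit on one machine.
    X, y = make_classification(n_samples=50_000, chunks=10_000, random_state=0)

    # Candidates are trained chunk by chunk via partial_fit,
    # and poorly performing ones are dropped early.
    params = {"alpha": [1e-5, 1e-4, 1e-3, 1e-2, 1e-1]}
    search = IncrementalSearchCV(SGDClassifier(), params, n_initial_parameters=5)
    search.fit(X, y, classes=[0, 1])
    print(search.best_params_, search.best_score_)
```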
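Chapter 5's workflow (define an objective function, define a search space, run the optimizer) maps directly onto Hyperopt's fmin with tpe.suggest. The quadratic objective below is a toy stand-in for a real model's validation loss, assumed here purely for illustration.

```python
# Minimal sketch: TPE optimization with Hyperopt (toy objective, illustrative bounds).
from hyperopt import Trials, fmin, hp, tpe

def objective(x):
    # Stand-in for "train a model with hyperparameter x and return validation loss".
    return (x - 3) ** 2

space = hp.uniform("x", -10, 10)  # continuous search space for x

trials = Trials()  # records every evaluation
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=100, trials=trials)
print(best)  # a dict such as {'x': 2.99...}, close to the true minimum at x = 3
```

For the MongoDB-backed parallelism the chapter covers, hyperopt.mongoexp.MongoTrials takes the place of the in-memory Trials object, letting multiple worker processes pull evaluations from a shared MongoDB queue.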

Reviews

The author keeps a firm grasp on the subject, going from a detailed description of what hyperparameter tuning is to the effective ways to use it. ... this book would be most useful to scholars and professionals working on machine learning models. Readers looking for implementational assistance with the performance of their models will be the best fit ... . (Niraj Singh, Computing Reviews, December 2, 2022)


Author Information

Tanay Agrawal is a deep learning engineer and researcher who graduated with a Bachelor of Technology from SMVDU, J&K, in 2019. He currently works at Curl Hg on SARA, an OCR platform, and is an advisor to Witooth Dental Services and Technologies. He started his career at MateLabs, working on Mateverse, an AutoML platform, and has worked extensively on hyperparameter optimization. He has delivered talks on hyperparameter optimization at conferences including PyData Delhi and PyCon India.



Countries Available

All regions