Hyperparameter Tuning in Machine Learning
Introduction
After building and evaluating a model in Machine Learning, the next step is to improve its performance. One of the most effective ways to do this is hyperparameter tuning.
In this lesson, you will learn what hyperparameters are, why tuning is important, and how to use techniques like Grid Search and Random Search.
What are Hyperparameters?
Hyperparameters are configuration settings that control how a Machine Learning algorithm works.
They are set before training the model.
Examples
- Number of neighbors in KNN
- Depth of a Decision Tree
- Learning rate in gradient-based models (examples of setting these in code follow below)
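In code, hyperparameters like these are passed to the model before training starts. A minimal sketch using scikit-learn (the values are only illustrative):
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

knn = KNeighborsClassifier(n_neighbors=5)   # number of neighbors in KNN
tree = DecisionTreeClassifier(max_depth=3)  # depth of a Decision Tree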
Parameters vs Hyperparameters
Parameters
- Learned during training
- Example: weights in a model
Hyperparameters
- Set manually before training
- Control model behavior (see the short example below)
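To make the distinction concrete, here is a tiny sketch with logistic regression: C is a hyperparameter you choose before training, while coef_ holds the weight parameters learned from the data.
from sklearn.linear_model import LogisticRegression

X = [[1], [2], [3], [4]]
y = [0, 0, 1, 1]

model = LogisticRegression(C=1.0)  # hyperparameter: chosen before training
model.fit(X, y)
print(model.coef_)                 # parameters: weights learned during training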
Why Hyperparameter Tuning is Important
- Improves model accuracy
- Helps find optimal settings
- Prevents overfitting or underfitting
- Enhances model performance
Grid Search
Grid Search tries all possible combinations of hyperparameters.
How it Works
- Define parameter values
- Train model for each combination
- Select best performing combination
Advantage
Finds the best combination within the grid you define
Limitation
Computationally expensive when there are many hyperparameters or many candidate values
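This is because every extra hyperparameter multiplies the number of combinations. A quick illustration (ParameterGrid simply enumerates them; the values here are arbitrary):
from sklearn.model_selection import ParameterGrid

param_grid = {'n_neighbors': [1, 3, 5, 7], 'weights': ['uniform', 'distance']}
print(len(ParameterGrid(param_grid)))      # 4 x 2 = 8 combinations
print(len(ParameterGrid(param_grid)) * 3)  # with 3-fold cross validation: 24 model fits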
Random Search
Random Search selects random combinations of hyperparameters.
How it Works
- Define range of values
- Randomly sample combinations
- Evaluate performance
Advantage
Faster than Grid Search
Limitation
May not find the absolute best combination
Grid Search Formula Concept
\text{Best Parameters} = \arg\max_{\text{params}} \text{Score}(\text{params})
The goal is to find parameters that maximize model performance.
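Written out by hand, the idea looks like the sketch below: score every candidate value with cross validation and keep the one with the highest score. GridSearchCV, shown in the next section, automates exactly this loop.
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X = [[i] for i in range(10)]
y = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

scores = {}
for k in [1, 3, 5]:
    model = KNeighborsClassifier(n_neighbors=k)
    scores[k] = cross_val_score(model, X, y, cv=3).mean()

best_k = max(scores, key=scores.get)  # the argmax over the candidate values
print(best_k, scores[best_k])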
Implementation in Python
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

# A small toy dataset, large enough for 3-fold cross validation
X = [[i] for i in range(10)]
y = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

# Candidate values for the n_neighbors hyperparameter
param_grid = {'n_neighbors': [1, 3, 5]}

model = KNeighborsClassifier()
grid = GridSearchCV(model, param_grid, cv=3)  # 3 candidates x 3 folds = 9 fits
grid.fit(X, y)
print(grid.best_params_)
Random Search Example
from sklearn.model_selection import RandomizedSearchCV

# Candidate values to sample from (reuses model, X and y from above)
param_dist = {'n_neighbors': [1, 2, 3, 4, 5]}

# n_iter=3 evaluates only 3 of the 5 candidates; random_state makes the sampling reproducible
random_search = RandomizedSearchCV(model, param_dist, cv=3, n_iter=3, random_state=0)
random_search.fit(X, y)
print(random_search.best_params_)
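Both search objects also store the best cross validated score and, because refit is enabled by default, a copy of the best model retrained on the full dataset:
print(grid.best_score_)               # best mean cross validated score
print(random_search.best_estimator_)  # best model, refitted on all the data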
Grid Search vs Random Search
Grid Search
- Tries all combinations
- Exhaustive, so it is guaranteed to find the best combination in the grid
- Slower
Random Search
- Tries random combinations
- Faster
- Less exhaustive
When to Use Each
Use Grid Search
- When the dataset and the number of combinations are small
- When you want the most thorough search over a known set of values
Use Random Search
- When the dataset or the search space is large
- When time is limited
Practical Tips
- Always combine tuning with cross validation
- Start with a small range of values
- Increase the search space gradually, refining around the best values found so far (see the sketch below)
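One way to apply the last two tips is a coarse-to-fine search: try a few widely spaced values first, then zoom in around the best one. A rough sketch using the same toy data as above (the specific ranges are only illustrative):
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X = [[i] for i in range(10)]
y = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

# Coarse pass over widely spaced values
coarse = GridSearchCV(KNeighborsClassifier(), {'n_neighbors': [1, 3, 5]}, cv=3)
coarse.fit(X, y)
best_k = coarse.best_params_['n_neighbors']

# Fine pass around the best coarse value, kept between 1 and 5 for this tiny dataset
fine_values = [k for k in range(best_k - 1, best_k + 2) if 1 <= k <= 5]
fine = GridSearchCV(KNeighborsClassifier(), {'n_neighbors': fine_values}, cv=3)
fine.fit(X, y)
print(fine.best_params_)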
Conclusion
Hyperparameter tuning is essential for optimizing Machine Learning models. Techniques like Grid Search and Random Search help you achieve better accuracy and performance.
In the next module, you will learn about Deep Learning basics and neural networks.
FAQs
What are hyperparameters?
They are settings that control how a model learns.
What is Grid Search?
It tests all possible parameter combinations.
What is Random Search?
It tests random combinations of parameters.
Which is better, Grid Search or Random Search?
Grid is more thorough, Random is faster.
Why is tuning important?
It improves model performance and accuracy.