Conclusion
Let’s revisit the core topics and the learning outcomes of this course.
In this course, we learned about optimization, one of the fundamental tools of mathematics and machine learning. Machine learning depends heavily on optimization because it gives models the ability to learn from data and make accurate predictions.
Search-based methods
We started the course by highlighting the importance of optimization in machine learning and real-world scenarios. Then, we discussed how to formulate an optimization problem and what is meant by local and global optimal solutions. We classified optimization problems into several categories based on the nature of their objectives and constraints.
Here, we learned our first search-based optimization algorithms: random search, grid search, and Nelder-Mead. We saw how these derivative-free methods iterate through the search space in a random or heuristic fashion to find the optimum of the objective function while satisfying all the constraints. However, these algorithms become inefficient when the search space is very large, and they are sensitive to the choice of search parameters. We also observed that algorithms like Nelder-Mead can get stuck at a local optimum and return suboptimal results.
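As a reminder of how simple these search-based methods are, here is a minimal sketch of random search on a toy quadratic objective. The function names (`objective`, `random_search`), the bounds, and the iteration budget are illustrative choices, not part of the course material:

```python
import random

def objective(x, y):
    # Toy bowl-shaped objective with its global minimum at (3, -1)
    return (x - 3) ** 2 + (y + 1) ** 2

def random_search(n_iters=10_000, bounds=(-10.0, 10.0), seed=0):
    """Sample points uniformly at random and keep the best one seen."""
    rng = random.Random(seed)
    best_point, best_value = None, float("inf")
    for _ in range(n_iters):
        x = rng.uniform(*bounds)
        y = rng.uniform(*bounds)
        value = objective(x, y)
        if value < best_value:
            best_point, best_value = (x, y), value
    return best_point, best_value

point, value = random_search()
print(point, value)  # a point near (3, -1), value near 0
```

Note how the sample count needed to land near the optimum grows quickly with the size (and dimensionality) of the search box, which is exactly the inefficiency discussed above.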
Vector calculus
Next, we brushed up on vector calculus by understanding and implementing gradients. We learned about derivatives and partial ...