Multiple Variables Using SciPy
Learn to apply gradient descent and Newton's method to problems with several variables.
Until now, we’ve built a broad theoretical foundation in optimization. We’ve studied several algorithms and the importance of derivatives, gradients, and Hessians. To consolidate this knowledge, we’ve implemented every algorithm we’ve seen so far. Implementing our own versions is a great exercise that develops a deeper understanding of the different elements involved in solving a problem.
As we saw when covering gradient descent and Newton’s method, there are advanced versions of these algorithms that handle saddle points and convergence more robustly. Explaining the details of those advanced algorithms is beyond the scope of this course, but that doesn’t mean we can’t use them. Once again, a Python library makes this possible: SciPy.
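As a minimal sketch of what this looks like, SciPy exposes these methods through `scipy.optimize.minimize`. Here we ask for BFGS, a quasi-Newton method that approximates the Hessian instead of computing it exactly; the quadratic test function and starting point below are illustrative choices, not part of the lesson:

```python
from scipy.optimize import minimize

# f(x) = (x - 3)^2, a simple convex function with its minimum at x = 3.
# minimize always passes the variables as an array, even with one variable.
def f(x):
    return (x[0] - 3) ** 2

# BFGS is a quasi-Newton method: it builds an approximation of the
# Hessian from successive gradients rather than computing it exactly.
result = minimize(f, x0=[0.0], method="BFGS")

print(result.x)    # approximately [3.]
print(result.fun)  # approximately 0.0
```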
Our versions handle problems with just one variable, but real-life problems, like training machine learning models, can involve hundreds or thousands of variables. The implementations provided by SciPy handle multiple variables for us. Nevertheless, to use SciPy effectively, we need to know the best ways to operate with so many variables. So let’s dive into SciPy and its advanced algorithms.
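To sketch the multivariable case, SciPy ships the Rosenbrock function (`rosen`), a classic test problem, together with its gradient (`rosen_der`) and Hessian (`rosen_hess`). The variables are packed into a single NumPy array, and a Newton-type method consumes the gradient and Hessian just as our one-variable implementation did; the five-variable setup below is an illustrative choice:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

# Five variables packed into one NumPy array: this is how SciPy
# represents a point in a multidimensional search space.
x0 = np.zeros(5)

# Newton-CG is a Newton-type method that uses the gradient (jac)
# and Hessian (hess), scaled up to many variables.
result = minimize(rosen, x0, method="Newton-CG",
                  jac=rosen_der, hess=rosen_hess)

print(result.x)  # close to the known minimum at [1, 1, 1, 1, 1]
```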