Exercises
Explore how to apply optimization algorithms through hands-on exercises. Learn to extend binary search to multi-variable problems, adapt gradient descent for maximization, implement linear regression with SciPy, and perform ternary search on unimodal functions. Gain practical skills to solve a variety of optimization challenges using Python.
Exercise 1: Extending binary search
We learned how to generalize gradient descent and Newton's method to handle more variables, but what about binary search? Let's discover that ourselves. Adapt the binary search method from the first lesson of this section so it can solve a problem with two variables. Then, solve the following problem:
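As a starting point, here is a minimal sketch of the one-variable binary search (bisection) from the first lesson; the function name and the sample equation are our own illustration, and the method assumes the target function changes sign over the interval:

```python
def bisect(f, lo, hi, tol=1e-8):
    """Find a root of f in [lo, hi], assuming f(lo) and f(hi) have opposite signs."""
    if f(lo) * f(hi) > 0:
        raise ValueError("f must change sign over [lo, hi]")
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # Keep the half-interval where the sign change (and thus a root) lies
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Example: solve x**2 - 2 = 0 on [0, 2], i.e. approximate sqrt(2)
root = bisect(lambda x: x**2 - 2, 0, 2)
```

One way to extend this to two variables is to nest the search: for each candidate value of the first variable, run an inner binary search over the second.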
Exercise 2: Maximizing
We stated that gradient descent and Newton's method can easily be adapted to solve maximization problems. But what conditions must the problem satisfy for these algorithms to find a maximum? Modify both methods and solve the following problem with each of them:
Exercise 3: Implementing linear regression
Linear regression is one of the most popular machine learning algorithms. Given a function and a set of its inputs and outputs, we want to approximate that function with a linear function as accurately as we can.
We call the original function f and the linear approximation we're looking for f̂. The input is a vector ...
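One hedged sketch of what such a fit can look like in the one-variable case, using SciPy's `scipy.stats.linregress` (the sample function, noise level, and variable names here are our own illustration, not the exercise's data):

```python
import numpy as np
from scipy import stats

# Sample the (assumed) original function f(x) = 2x + 1 with a little noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(0, 0.1, size=x.size)

# Fit a linear approximation y ≈ slope * x + intercept by least squares
result = stats.linregress(x, y)
```

The recovered `result.slope` and `result.intercept` should be close to the true coefficients 2 and 1; for vector-valued inputs, a least-squares solver such as `numpy.linalg.lstsq` plays the same role.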