Exercises

Test what we’ve learned about optimization algorithms, SciPy, vectors, and matrices.

Exercise 1: Extending binary search

We learned how to generalize gradient descent and Newton’s method to deal with more variables, but what about binary search? Let’s work it out ourselves. Adapt the binary search method we saw in the first lesson of this section so that it can handle a problem with two variables. Then, solve the following problem:

\min_{x, y} \; x + y \quad \text{s.t.: } x + y > 1

def f(x, y):
    return x + y

def constraint(x, y):
    return x + y > 1

def binary_search(a1, b1, a2, b2, f, cons, tol):
    '''
    Now we need two intervals, one for each variable.
    '''
    # Remove the following line and complete the code.
    pass
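One possible way to complete the template (a sketch, not the course’s official solution) is coordinate-wise bisection: since f(x, y) = x + y increases in both variables, the constrained minimum lies on the constraint boundary, so for each variable we can halve its interval toward the boundary, keeping the upper half when the midpoint violates the constraint and the lower half when it satisfies it. The assumption that f is monotonically increasing in each variable is mine, made so the bisection rule is well-defined.

```python
def f(x, y):
    return x + y

def constraint(x, y):
    return x + y > 1

def binary_search(a1, b1, a2, b2, f, cons, tol):
    '''
    Now we need two intervals, one for each variable.
    Sketch: assumes f is increasing in each variable, so the
    constrained minimum sits on the constraint boundary.
    '''
    while (b1 - a1) > tol or (b2 - a2) > tol:
        mx, my = (a1 + b1) / 2, (a2 + b2) / 2
        # Bisect the x-interval at the current midpoint of y.
        if cons(mx, my):
            b1 = mx   # midpoint feasible: a smaller x may still be feasible
        else:
            a1 = mx   # midpoint infeasible: x must grow
        # Bisect the y-interval at the updated midpoint of x.
        mx, my = (a1 + b1) / 2, (a2 + b2) / 2
        if cons(mx, my):
            b2 = my
        else:
            a2 = my
    return (a1 + b1) / 2, (a2 + b2) / 2

x, y = binary_search(0, 2, 0, 2, f, constraint, 1e-6)
```

Starting from the box [0, 2] × [0, 2], the iterates approach the boundary x + y = 1, so f(x, y) approaches the infimum 1 (the strict inequality means the minimum is never attained exactly).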

Exercise 2: Maximizing

We stated that gradient descent and Newton’s method can be easily adapted to solve maximization problems. But what conditions must the problem satisfy so that we can maximize it with these algorithms? Change the methods and solve the following problem with both of them:

\max_x \; -(x + 3)^2 + 5 ...
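A sketch of both adaptations (the function names gradient_ascent and newton_max are my own, not from the lesson): gradient descent becomes gradient ascent by stepping along the positive gradient, while Newton’s update is unchanged because it seeks stationary points; for a maximum we additionally need the objective to be concave near the solution, i.e., f''(x) < 0.

```python
def f(x):
    return -(x + 3)**2 + 5

def df(x):
    return -2 * (x + 3)      # first derivative

def d2f(x):
    return -2.0              # second derivative (constant, negative: concave)

def gradient_ascent(x0, lr=0.1, tol=1e-8, max_iter=10_000):
    # Gradient *ascent*: step along +gradient instead of -gradient.
    x = x0
    for _ in range(max_iter):
        step = lr * df(x)
        x += step
        if abs(step) < tol:
            break
    return x

def newton_max(x0, tol=1e-8, max_iter=100):
    # Newton's update is the same as for minimization; it converges to a
    # stationary point, which is a maximum when f''(x) < 0 there.
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

x_ga = gradient_ascent(0.0)
x_n = newton_max(0.0)
```

Both should converge to x = -3 with maximum value 5; since the objective is quadratic, Newton’s method lands on the exact optimum in a single step.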