I was trying out the `minimize` function from SciPy:
```python
import numpy as np
import scipy.optimize

fn = lambda x: x[0] + x[1]
ineq = lambda x: 4 - x[0] ** 2 - x[1] ** 2
# ineq = lambda x: x[0] ** 2 + x[1] ** 2 - 2
cons = [{"type": "ineq", "fun": ineq}]
x0 = np.array([0.0, 0.0])
output = scipy.optimize.minimize(fun=fn, x0=x0, method='SLSQP', constraints=cons)
```

This gives me the correct answer $(-\sqrt{2}, -\sqrt{2})$ for the constraint $g(x) = 4 - x_0^2 - x_1^2 \ge 0$. But with the constraint $g(x) = x_0^2 + x_1^2 - 2 \ge 0$ (the commented-out line) it gives me the solution $(0, 0)$ with minimum $0$, whereas for that constraint I would have expected the problem to be unbounded, i.e. the infimum to be $-\infty$. I think I made a mistake somewhere, but I am not sure where.
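For reference, here is a minimal sketch of how I ran the second case and looked at the solver's own report; `ineq2` and `cons2` are just my renamed copies of the commented-out constraint above:

```python
import numpy as np
import scipy.optimize

fn = lambda x: x[0] + x[1]
ineq2 = lambda x: x[0] ** 2 + x[1] ** 2 - 2  # the second constraint
cons2 = [{"type": "ineq", "fun": ineq2}]
x0 = np.array([0.0, 0.0])

output = scipy.optimize.minimize(fun=fn, x0=x0, method='SLSQP', constraints=cons2)
print(output.x, output.fun)           # reported solution and objective value
print(output.status, output.message)  # SLSQP's exit status and message
```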
Also, I was wondering whether the SLSQP method can go outside the constraint, i.e. whether during the optimization it can evaluate the objective function $f$ at a point $x$ for which the constraint is not satisfied (i.e. $g(x) < 0$).
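To investigate that empirically, I tried wrapping the objective so that every point the solver evaluates is recorded, and then checking which of those points violate the constraint. This is just my own diagnostic sketch (the names `fn_logged`, `evaluated_points`, `violations` are mine), not something from the SciPy API:

```python
import numpy as np
import scipy.optimize

ineq = lambda x: 4 - x[0] ** 2 - x[1] ** 2

evaluated_points = []

def fn_logged(x):
    # record every point at which SLSQP evaluates the objective
    evaluated_points.append(np.array(x, copy=True))
    return x[0] + x[1]

cons = [{"type": "ineq", "fun": ineq}]
x0 = np.array([0.0, 0.0])
output = scipy.optimize.minimize(fun=fn_logged, x0=x0, method='SLSQP', constraints=cons)

# points where the inequality g(x) >= 0 was violated during the run
violations = [x for x in evaluated_points if ineq(x) < 0]
print(f"{len(violations)} of {len(evaluated_points)} evaluations were infeasible")
```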