How to enforce constraints in NMinimize

I am trying to minimize a function EtotalEM[R, Δ] that is defined only for Δ <= R/ArcSinh[1] (approximately: Δ <= 1.13 R). I therefore try to set constraints that enforce this condition. It shouldn't matter at this point how exactly EtotalEM is defined. Here is the syntax for the minimization, including the constraints:

NMinimize[{10^18 EtotalEM[10^-9 R, 10^-9 Δ], {Δ <= R/ArcSinh[1], Δ > 0, Δ < 100, R > 5, R < 10^4}}, {Δ, R}]

The factors of 10^18 and 10^-9 are there to bring the variables close to one. The puzzling thing for me is that I get an error:

NMinimize::nrnum: The function value 1084.22 -1.2459 I is not a real number at {R,Δ} = {5.0692,34.3423}.

I don't understand why this should be a problem for NMinimize, because I explicitly excluded the combination {R,Δ} = {5.0692,34.3423}. Apparently, my constraints are ignored. What is going wrong?

Edit: I found a simple example that demonstrates the issue:

NMinimize[{(1/Sqrt[x - y] Exp[x]) (1/y Exp[y]), {x > y, y > 0, x < 10000, x > 0}}, {x, y}]

fails with

NMinimize::nrnum: The function value 1.62712 -3.76001 I is not a real number at {x,y} = {0.069203,0.727035}.

If I omit the x > 0 constraint (which is actually redundant, since x > y and y > 0 already imply it), then the error message is

NMinimize::nrnum: The function value 312498. -4.15812*10^-244 I is not a real number at {x,y} = {-558.288,0.727042}.

In both cases, the actual result of the NMinimize command is some nonsense like x -> 860.974 instead of x -> 1. Also worth noting: without the last two constraints, the command completes without error messages and yields the correct result.

=================

I understand that you want to keep your EtotalEM function secret, but it would help to determine what's wrong there. Can you produce another, perhaps similar, function where the same happens?
– Julien Kluge
Sep 29 at 21:21

Yes, please see the edits.
– Felix
Sep 29 at 21:50

Related: mathematica.stackexchange.com/questions/42999/… There are other related Q&A, too, I think.
– Michael E2
Sep 29 at 21:51

=================

1 Answer

=================

NMinimize tries to enforce constraints by adding a penalty function to the objective function (see Why does NMaximize not follow the constraints that were given to it?). So it doesn't strictly observe the constraints in its search for the minimum. One workaround is to use Piecewise to return a very high value outside the domain.
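To see this soft-constraint behavior directly, you can log the points NMinimize actually samples (a quick check of my own, not from the original thread; the sampled points vary by method and version):

```mathematica
(* Collect every x at which the objective is evaluated. *)
{result, {points}} = Reap[
   NMinimize[{x^2, x >= 1}, x, EvaluationMonitor :> Sow[x]]
   ];
(* Some of the sampled points typically violate the constraint x >= 1. *)
Select[points, # < 1 &]
```

Any points returned by the Select show that the constraint is treated as a penalty rather than a hard barrier, which is exactly why the Piecewise workaround below is needed.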

obj[x_, y_] :=
  Piecewise[
   {{(1/Sqrt[x - y] Exp[x]) (1/y Exp[y]),
     And @@ {x > y, y > 0, x < 10000, x > 0}}},
   Exp[10000]
   ];
NMinimize[{obj[x, y], {x > y, y > 0, x < 10000, x > 0}}, {x, y}]
(* {12.6761, {x -> 1., y -> 0.5}} *)
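A variant of the same idea (my own sketch, not part of the original answer) restricts the definition to numeric arguments, so it is never evaluated symbolically, and uses a large machine-real penalty such as 10.^300 instead of the exact Exp[10000]:

```mathematica
(* Defined only for explicit numbers, so symbolic pre-evaluation is avoided. *)
obj2[x_?NumericQ, y_?NumericQ] :=
  If[x > y && y > 0,
   (1/Sqrt[x - y] Exp[x]) (1/y Exp[y]),
   10.^300  (* large machine-real penalty outside the domain *)
   ];
NMinimize[{obj2[x, y], {x > y, y > 0}}, {x, y}]
```

Whether this is noticeably faster than the Piecewise version will depend on the method and on how costly the real objective is.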


Yes, it now converges to the correct result without error messages, however at a super slow speed (which becomes apparent when using a more costly function). To check, see ListPlot[Last[Reap[NMinimize[{obj[x, y]}, {x, y}, EvaluationMonitor :> Sow[x]]]]] and compare to ListPlot[Last[Reap[NMinimize[{(1/Sqrt[x - y] Exp[x]) (1/y Exp[y]), {x > y, y > 0}}, {x, y}, EvaluationMonitor :> Sow[x]]]]]
– Felix
Sep 29 at 22:22


Maybe this can be optimized by using the mysterious function "NonlinearInteriorPoint" instead of Exp[10000]. Unfortunately, I have no idea how to use this function.
– Felix
Sep 29 at 22:48


@Felix I think your two calls are using different methods. Try passing the same method to each, such as Method -> "NelderMead". I get what look like identical results.
– Michael E2
Sep 29 at 23:19


Yes, indeed, with the method specified, both give a good result. Thanks.
– Felix
Sep 29 at 23:41