Minimization problem with constraint

Hello, the problem is as follows: minimize R subject to (x - a_i)^2 + (y - b_i)^2 ≤ R^2 for all i.
I am looking to find x, y, and R, knowing that
  • a_i and b_i are known values (each a 100×1 vector).
  • x and y lie between the min and max of a_i and b_i.
Is it possible to find a solution with the optimization tools of MATLAB? If not, any suggestions?

Accepted Answer

Alan Weiss on 13 Sep 2018

0 votes

I haven't tried this, but it sounds straightforward.
Decision variables: x(1) = x, x(2) = y, x(3) = R.
100 nonlinear inequality constraints: (x(1) - a(i))^2 + (x(2) - b(i))^2 - x(3)^2 <= 0
Objective function: R = x(3)
Bounds: ll = min(min([a,b])), mm = max(max([a,b]))
lb = [ll,ll,0] , ub = [mm,mm,Inf]
Call fmincon from a reasonable start point, such as [(ll+mm)/2,(ll+mm)/2,abs(mm)]
Alan Weiss
MATLAB mathematical toolbox documentation
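The recipe above can be sketched in code. This is a minimal, untested illustration assuming placeholder data for a and b; the variable names (objfun, nonlcon, opts, sol) are my own:

```matlab
% a, b: the known 100-by-1 vectors (placeholder data for illustration)
a = rand(100,1)*10;
b = rand(100,1)*10;

ll = min(min([a,b]));        % common lower bound for x and y
mm = max(max([a,b]));        % common upper bound

% Decision variables: x(1) = x, x(2) = y, x(3) = R
objfun = @(x) x(3);          % minimize R

% 100 nonlinear inequalities: (x - a_i)^2 + (y - b_i)^2 - R^2 <= 0
% fmincon expects [c, ceq]; there are no equality constraints here.
nonlcon = @(x) deal((x(1) - a).^2 + (x(2) - b).^2 - x(3)^2, []);

lb = [ll, ll, 0];            % the bound R >= 0 keeps the problem bounded
ub = [mm, mm, Inf];
x0 = [(ll+mm)/2, (ll+mm)/2, abs(mm)];   % reasonable start point

opts = optimoptions('fmincon', 'Display', 'none');
sol = fmincon(objfun, x0, [],[],[],[], lb, ub, nonlcon, opts);
% sol(1), sol(2): center of the smallest enclosing circle; sol(3): its radius
```

Note the lower bound of 0 on x(3): as Bruno points out below the problem is unbounded without it.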

More Answers (2)

Bruno Luong on 13 Sep 2018
Edited: Bruno Luong on 13 Sep 2018
Here is a solution to the problem exactly as stated:
R = -Inf
x = anything between min(a_i, b_i) and max(a_i, b_i)
y = anything between min(a_i, b_i) and max(a_i, b_i)
Since only R^2 appears in the constraints, R is unbounded below unless you add the bound R ≥ 0.
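Bruno's point is that the constraints only involve R^2, so any sufficiently negative R is feasible and the minimum runs off to -Inf. A quick check, with illustrative data of my own:

```matlab
% Any data in [0,1] and any center inside the data range
a = rand(100,1);  b = rand(100,1);
x = 0.5;  y = 0.5;
R = -1e6;                               % wildly negative R

% Every constraint (x - a_i)^2 + (y - b_i)^2 <= R^2 is satisfied,
% because R^2 is huge regardless of the sign of R.
all((x - a).^2 + (y - b).^2 <= R^2)     % returns true (logical 1)
```

The fix is simply the lower bound R ≥ 0 in the fmincon call, as in the accepted answer.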
Matt J on 13 Sep 2018
Edited: Matt J on 13 Sep 2018

0 votes

You could probably use minboundcircle in this FEX distribution.

Version: R2015b
Asked: 13 Sep 2018
Edited: 13 Sep 2018
