python - Fit plane to a set of points in 3D: scipy.optimize.minimize vs scipy.linalg.lstsq

Given a set of points in 3D, the general problem is to find the a, b, c coefficients of a plane equation in the form:

z = a*x + b*y + c

such that the resulting plane is the best fit possible to that set of points.

  1. In this SO answer, the function scipy.optimize.minimize is used to solve this problem.

    It relies on initial guesses for the coefficients, and minimizes an error function that sums the distances of each point to the surface of the plane.

  2. In this code (based on this other SO answer), the scipy.linalg.lstsq function is used to tackle the same problem (when restricted to a first-order polynomial).

    It solves for C in the equation z = A*C, where A is the column-wise concatenation of the x, y coordinates of the points (plus a column of ones for the constant term), z is the vector of z coordinates, and C holds the a, b, c coefficients.

    Unlike the code in the method above, this one does not seem to require initial guesses for the plane coefficients (see the sketch after this list).
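
For concreteness, here is a minimal sketch of the second approach (the test data, variable names, and plane coefficients below are my own, not taken from the linked answers):

    import numpy as np
    from scipy.linalg import lstsq

    # Hypothetical test data: noisy samples of the plane z = 2x - 3y + 5
    rng = np.random.default_rng(0)
    x, y = rng.uniform(-10, 10, (2, 100))
    z = 2 * x - 3 * y + 5 + rng.normal(scale=0.1, size=100)

    # Each row of A is [x_i, y_i, 1], so that A @ [a, b, c] approximates z
    A = np.c_[x, y, np.ones_like(x)]

    # No initial guess is needed; lstsq solves A @ C ~= z directly
    C, residues, rank, singular_values = lstsq(A, z)
    a, b, c = C
    print(a, b, c)  # close to 2, -3, 5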

Since the minimize function requires an initial guess, it may or may not converge to the optimal solution, depending on how good that guess is. Does the second method have a similar caveat, or will it always return an exact solution?


1 Reply


Least squares (scipy.linalg.lstsq) is guaranteed to converge. In fact, there is a closed-form analytical solution, given by (A^T A)^-1 A^T b (where ^T is the matrix transpose and ^-1 is matrix inversion).
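
For illustration, the closed form can be computed directly; this is just a sketch with made-up data, and in practice you would let lstsq do the work (and solve the normal equations rather than form an explicit inverse):

    import numpy as np

    # Hypothetical test data: noisy samples of the plane z = 2x - 3y + 5
    rng = np.random.default_rng(0)
    x, y = rng.uniform(-10, 10, (2, 100))
    z = 2 * x - 3 * y + 5 + rng.normal(scale=0.1, size=100)
    A = np.c_[x, y, np.ones_like(x)]

    # Closed-form least-squares solution C = (A^T A)^-1 A^T z,
    # computed with solve() instead of an explicit matrix inverse
    C = np.linalg.solve(A.T @ A, A.T @ z)
    print(C)  # matches scipy.linalg.lstsq(A, z)[0] up to numerical precision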

The general optimization problem, however, is not always solvable; we are not guaranteed to find a minimizing value. However, for the given problem (find some a, b, c such that z = a*x + b*y + c), the model is linear in the variables we are optimizing over, so the resulting least-squares objective is convex. Problems of this kind are reliably solvable, so scipy.optimize.minimize should converge to the optimal value.
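
As a rough sketch of the minimize route (the error function and data below are my own assumptions, not necessarily what the linked answer uses), even a deliberately poor initial guess converges for this problem:

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical test data: noisy samples of the plane z = 2x - 3y + 5
    rng = np.random.default_rng(0)
    x, y = rng.uniform(-10, 10, (2, 100))
    z = 2 * x - 3 * y + 5 + rng.normal(scale=0.1, size=100)

    def sq_error(params):
        # Sum of squared vertical distances from the points to the plane
        a, b, c = params
        return np.sum((a * x + b * y + c - z) ** 2)

    # A deliberately bad starting point still converges, because the objective is convex
    result = minimize(sq_error, x0=[100.0, -100.0, 50.0])
    print(result.x)  # close to [2, -3, 5]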

Note: the problem stays linear in the coefficients even if we fit z = a*x + b*y + d*x^2 + e*y^2 + f; we don't have to restrict ourselves to a model that is linear in (x, y), since the values (x, y, x^2, y^2) are already known for each point. To the algorithm, these just look like additional columns in the matrix A. So we can actually fit a higher-order polynomial using least squares!
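
For example, here is a sketch (again with made-up data) of fitting that quadratic surface with the same lstsq machinery:

    import numpy as np
    from scipy.linalg import lstsq

    # Hypothetical test data: noisy samples of a quadratic surface
    rng = np.random.default_rng(0)
    x, y = rng.uniform(-5, 5, (2, 200))
    z = 2 * x - 3 * y + 0.5 * x**2 - 0.25 * y**2 + 5 + rng.normal(scale=0.1, size=200)

    # Same machinery, just extra columns for x^2 and y^2;
    # the problem is still linear in the coefficients [a, b, d, e, f]
    A = np.c_[x, y, x**2, y**2, np.ones_like(x)]
    coeffs, *_ = lstsq(A, z)
    print(coeffs)  # close to [2, -3, 0.5, -0.25, 5]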

A brief aside: in reality, solvers that don't use an exact analytical solution generally stop once they are within some acceptable tolerance of the true answer, so we rarely get the exact solution, but it tends to be so close that we accept it as exact in practice. Additionally, even least-squares solvers rarely form the analytical solution (A^T A)^-1 A^T b directly; they instead rely on faster, more numerically stable factorizations such as QR or the SVD (which is what scipy.linalg.lstsq does under the hood via LAPACK).

If you were to change the optimization problem, this would not be true. There are certain classes of problems for which we can reliably find an optimal value; the largest such class is convex optimization problems, although there are many non-convex problems for which we can find an optimal value under certain conditions.

If you're interested in learning more, take a look at Convex Optimization by Boyd and Vandenberghe. The first chapter doesn't require much mathematical background, and it gives an overview of the general optimization problem and how it relates to tractable classes such as linear and convex programming.

