fmin [2013/11/19 15:23] – created awf
Thoughts on minimization.

* Just seen [[http://www.cs.toronto.edu/~adyecker/download/EckerJepsonPolySFS.pdf|Ecker and Jepson]]... We often optimize models which are polynomial in the unknowns, e.g. matrix factorization is biquadratic. Line search in such models can be solved in closed form... Don't know if it helps yet.
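The closed-form line search can be sketched concretely. Taking matrix factorization f(U, V) = ||A − U Vᵀ||²_F as the biquadratic instance (my choice of example; the setup and names below are illustrative, not taken from Ecker and Jepson), the objective restricted to a line (U + a·dU, V + a·dV) is a quartic in the step a, so the exact minimizer is among the real roots of its cubic derivative:

```python
import numpy as np

def quartic_line_search(A, U, V, dU, dV):
    """Exact step a minimizing ||A - (U + a dU)(V + a dV)^T||_F^2."""
    R0 = A - U @ V.T                      # residual at a = 0
    P1 = dU @ V.T + U @ dV.T              # coefficient of a
    P2 = dU @ dV.T                        # coefficient of a^2
    # f(a) = ||R0 - a P1 - a^2 P2||_F^2; expand via Frobenius inner products.
    ip = lambda X, Y: np.sum(X * Y)
    c = [ip(P2, P2),                      # a^4
         2 * ip(P1, P2),                  # a^3
         ip(P1, P1) - 2 * ip(R0, P2),     # a^2
         -2 * ip(R0, P1),                 # a^1
         ip(R0, R0)]                      # a^0
    # Critical points are real roots of the cubic derivative; pick the best.
    roots = np.polyder(np.poly1d(c)).roots
    cands = [r.real for r in roots if abs(r.imag) < 1e-8]
    return min(cands, key=lambda a: np.polyval(c, a))
```

Since the derivative is a cubic it always has at least one real root, and (the leading coefficient being nonnegative) the global minimizer of the quartic is among the real critical points.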
* If I'm solving a nonlinear least squares problem, e.g. by Levenberg-Marquardt, and the Jacobian is such that multiplication by it can be most easily implemented by recalculating it at every call, and we're doing PCG to solve the augmented system, is there any benefit to be gained by reevaluating the Jacobian at the same point each time?
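A minimal sketch of the matrix-free setting in that question (all names are mine; forward differencing stands in for "recalculating the Jacobian at every call", and plain unpreconditioned CG stands in for PCG). The inner solve is (JᵀJ + λI) dx = −Jᵀr, where every matvec rebuilds the Jacobian action from scratch:

```python
import numpy as np

def lm_step_matrix_free(r, x, lam, eps=1e-6, cg_iters=50):
    """One LM step where J is never stored: its action is recomputed
    (by forward differences) on every call inside the CG matvec."""
    rx = r(x)
    n = x.size
    I = np.eye(n)

    def Jv(v):                               # J(x) v via forward difference
        nv = np.linalg.norm(v)
        if nv == 0:
            return np.zeros_like(rx)
        return (r(x + (eps / nv) * v) - rx) * (nv / eps)

    def JTu(u):                              # J(x)^T u, one re-evaluation per coord
        return np.array([Jv(e).dot(u) for e in I])

    def Av(v):                               # (J^T J + lam I) v; J rebuilt each call
        return JTu(Jv(v)) + lam * v

    b = -JTu(rx)                             # right-hand side -J^T r
    dx = np.zeros(n)                         # plain CG on the SPD system
    res = b - Av(dx)
    p = res.copy()
    rs = res.dot(res)
    for _ in range(cg_iters):
        Ap = Av(p)
        alpha = rs / p.dot(Ap)
        dx += alpha * p
        res -= alpha * Ap
        rs_new = res.dot(res)
        if np.sqrt(rs_new) < 1e-10:
            break
        p = res + (rs_new / rs) * p
        rs = rs_new
    return dx
```

Adding a preconditioner turns the loop into PCG; the point of the sketch is that `Av` never forms or caches J, so every inner iteration pays the full cost of re-evaluating it at the same linearization point.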