# R Optimization

Check the eigenvalues of the Hessian of the optimized sum of squares to detect a singular gradient matrix. A singular gradient matrix means there are infinitely many solutions, so the best you can do is report the optimized parameter vector plus a linearly scaled vector. That vector is the eigenvector associated with the eigenvalue that is numerically indistinguishable from zero.
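Here is a minimal self-contained sketch of the idea, using a hypothetical toy model (not the `reg.data` fit below) that is deliberately over-parameterized, so its Hessian has a near-zero eigenvalue by construction:

```r
# Toy model: y ~ (a + b) * x, so only the sum a + b is identifiable.
# (Hypothetical example, just to show the near-zero eigenvalue.)
set.seed(1)
x <- 1:20
y <- 3 * x + rnorm(20, sd = 0.1)

sumsq <- function(p) sum((y - (p[1] + p[2]) * x)^2)

fit <- optim(c(1, 1), sumsq, hessian = TRUE)
eigen(fit$hessian)$values
# one eigenvalue is numerically near zero: the flat direction a + b = const
```

The flat direction along the eigenvector of the near-zero eigenvalue is exactly the set of (a, b) pairs with the same sum, all of which fit equally well.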

So if you have

```r
optimfunc <- function(x, data, sevcalcfunc, optimgoal, ...) {
  calcsev <- sevcalcfunc(x, data, ...)
  sumsqerr <- sum((optimgoal - calcsev)^2)
  return(sumsqerr)
}
```

and some severity calculation function, then you can optimize

```r
system.time(optimsimple <- optim(c(.3, -12614.716, .1), optimfunc, hessian = TRUE,
                                 data = reg.data, sevcalcfunc = calclosssimple,
                                 optimgoal = reg.data$maxloss))
```

and

```r
> eigen(optimsimple$hessian)$values
[1] 1.498617e+16 1.868481e+14 8.412726e+04
```

Since the third eigenvalue is tiny compared to the first two (roughly eleven orders of magnitude smaller), adding any constant z times the third eigenvector to the optimized solution leaves the optimized sum of squares essentially unchanged, so the shifted parameter vector can be considered a solution as well.

```r
> eigen(optimsimple$hessian)$vectors[,3]
[1] 1.956755e-06 1.000000e+00 4.573782e-06
```

```r
> optimfunc(optimsimple$par, reg.data, calclosssimple, reg.data$maxloss)
[1] 3.793891e+14

> optimfunc(optimsimple$par + 1000 * eigen(optimsimple$hessian)$vectors[,3], reg.data, calclosssimple, reg.data$maxloss)
[1] 3.794212e+14

> optimfunc(optimsimple$par - 1000 * eigen(optimsimple$hessian)$vectors[,3], reg.data, calclosssimple, reg.data$maxloss)
[1] 3.794211e+14
```
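The same check can be swept over a range of scale factors z. This sketch uses a hypothetical self-contained toy model (not the `reg.data` fit above, which isn't reproducible here) to show that the objective is flat along the near-null eigenvector:

```r
# Toy over-parameterized model: only a + b is identifiable.
set.seed(1)
x <- 1:20
y <- 3 * x + rnorm(20, sd = 0.1)
sumsq <- function(p) sum((y - (p[1] + p[2]) * x)^2)

fit <- optim(c(1, 1), sumsq, hessian = TRUE)
v <- eigen(fit$hessian)$vectors[, 2]  # eigenvector of the near-zero eigenvalue

# sum of squares at the optimum shifted by z * v, for several z:
sapply(c(-10, 0, 10), function(z) sumsq(fit$par + z * v))
# the values are essentially identical: every shifted vector is a solution
```

Shifting along any other direction (say, the eigenvector of the large eigenvalue) would blow the sum of squares up immediately, which is what distinguishes a genuinely flat direction from ordinary curvature.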
