Thursday, September 1, 2011

Negative estimated variances from the likelihood surface

A user recently reported a problem with estimation of the variances of parameter estimates from the likelihood surface in evol.vcv(). This is done by first computing the Hessian matrix (the matrix of second partial derivatives) of the likelihood surface at the optimum, and then calculating the negative inverse of this matrix. The resulting matrix is the (asymptotic) variance-covariance matrix of our ML parameter estimates, and the square roots of its diagonal elements should then provide the (asymptotic) standard errors.
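As a toy illustration of the general procedure (not evol.vcv itself, and with all names made up for the example), here is how one might extract asymptotic standard errors from a numerically estimated Hessian in R, fitting the mean and (log) standard deviation of a normal sample:

```r
## toy example: asymptotic SEs from the Hessian of a likelihood surface
set.seed(1)
x <- rnorm(100, mean = 5, sd = 2)

## negative log-likelihood; sd is optimized on the log scale
nll <- function(p) -sum(dnorm(x, mean = p[1], sd = exp(p[2]), log = TRUE))

## optim() minimizes, so with hessian=TRUE it returns the Hessian of -logL,
## i.e. already the *negative* Hessian of logL
fit <- optim(c(mean(x), log(sd(x))), nll, method = "BFGS", hessian = TRUE)

## its plain inverse is therefore the asymptotic variance-covariance
## matrix of the ML estimates
vcv <- solve(fit$hessian)
se  <- sqrt(diag(vcv))  # asymptotic standard errors
```

When everything is well behaved, diag(vcv) is strictly positive and the square roots are real; the pathology described above corresponds to one of those diagonal elements coming out negative.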

The problem the user reported is that, in some rare cases, an estimated variance obtained this way is negative. Through troubleshooting I have managed to eliminate some possible causes (for instance, failure to converge or errors in the calculation of the Hessian), but I still have not figured out why this happens. I have not ruled out the possibility that it has something to do with reparameterizing the likelihood optimization in terms of the Cholesky matrices. [Note that I then compute the Hessian from the likelihood function in terms of the VCV matrix.] Any suggestions (e.g., Carl?) are welcome!
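For readers unfamiliar with the trick: the Cholesky reparameterization mentioned above is a standard way of keeping a VCV matrix valid during optimization. The free parameters are the entries of a lower-triangular matrix L, and the VCV is rebuilt as L %*% t(L), which is positive semi-definite by construction. A minimal sketch with arbitrary example values (the exact parameterization inside evol.vcv may differ):

```r
## free parameters: entries of a lower-triangular matrix L
L <- matrix(c(1.2, 0.0,
              0.5, 0.8), 2, 2, byrow = TRUE)

## rebuild the VCV; symmetric and positive semi-definite by construction,
## so the optimizer can never wander into an invalid covariance matrix
V <- L %*% t(V <- L)  # i.e. V <- L %*% t(L)
V <- L %*% t(L)
eigen(V)$values  # all non-negative
```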


  1. hey Liam,

sorry I can't help, and instead I'm bringing you another issue... I am trying to use the evol.vcv function and I keep getting the error:

    Error in x$hessian : object of type 'closure' is not subsettable

If I use v0.1 it runs OK... any ideas? Thanks!

  2. Hi Rafael.

I have not been able to reproduce your error. If you send me your data and tree, I will try with those.

    - Liam

  3. in case anyone runs into the same problem, in my case it was because I was using the hessian() function from the maxLik package, and not from the numDeriv package as Liam suggested in his post (I ran evol.vcv and got an error that the function wasn't found, so I searched ??hessian and the hit I got was for the maxLik package). Once I loaded the numDeriv package, everything ran smoothly. Thanks!