Thursday, April 9, 2020

More fixes for custom models in mcmcMk

Yesterday I claimed to have fixed a bug in mcmcMk (which does Bayesian MCMC for the Mk model); however, as it turns out, I was mistaken. The case I showed worked fine, but I had not fixed the bug in the function for all use cases. This was discovered and kindly pointed out to me this morning by Jose Eduardo (“Dudu”) Meireles at the University of Maine.

Basically, the bug has to do with how the likelihood function exported by geiger::fitDiscrete (which mcmcMk now uses internally by default) compares to the one exported by fitMk (which mcmcMk used previously). Without getting into detail, the two functions index the parameters of the model in very different ways by default.
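To illustrate the kind of mismatch involved, here is a hypothetical sketch (not the actual phytools or geiger internals): suppose one likelihood function orders the off-diagonal rates of Q row-wise while another orders them column-wise. A parameter vector built under one convention must be permuted before it can be passed to the other, and getting that permutation wrong is exactly the sort of indexing bug described here.

```r
## HYPOTHETICAL sketch, not the actual phytools/geiger code: suppose one
## likelihood function orders the off-diagonal rates of Q row-wise
## ([a,b],[a,c],[b,a],...) while another orders them column-wise
## ([b,a],[c,a],[a,b],...).
states <- letters[1:3]
labs <- outer(states, states, paste, sep = ",")  # labs[i,j] is "i,j"
colwise <- labs[row(labs) != col(labs)]          # column-wise off-diagonals
rowwise <- t(labs)[row(labs) != col(labs)]       # row-wise off-diagonals
## permutation taking a row-wise parameter vector to column-wise order
perm <- match(colwise, rowwise)
q.rowwise <- c(0, 2, 1, 0, 0, 0.5)               # rates in row-wise order
q.colwise <- q.rowwise[perm]                     # same rates, reindexed
```

Here `q.rowwise` holds the rates of the generating Q used below in the order [a,b], [a,c], [b,a], [b,c], [c,a], [c,b], and `q.colwise` is the same set of rates reordered for the other (hypothetical) convention.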

The specific update (which is quite simple, really, as it just had to do with indexing) can be viewed here. To get it, I recommend just updating phytools from GitHub using devtools.
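For completeness, updating phytools to the current development version from GitHub using devtools is just:

```r
## install the latest development version of phytools from GitHub
library(devtools)
install_github("liamrevell/phytools")
```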

Here's a demo in which I'll use a weird transition matrix & then compare fits obtained using ML to those from mcmcMk, to make sure it's now working.

First, I'll load phytools & simulate some data. I'm going to use a weird model in which the transitions c \(\to\) b, b \(\to\) a, and a \(\to\) c can all occur, though at different rates, while all other types of change are not allowed.

library(phytools)
packageVersion("phytools")
## [1] '0.7.26'
## simulate tree & data
tree<-pbtree(n=500)
Q<-matrix(c(-2,0,2,
    1,-1,0,
    0,0.5,-0.5),3,3,byrow=TRUE,
    dimnames=list(letters[1:3],letters[1:3]))
Q
##    a    b    c
## a -2  0.0  2.0
## b  1 -1.0  0.0
## c  0  0.5 -0.5
x<-sim.Mk(tree,Q,anc="c")

I'll start out by fitting the ARD ("all-rates-different") model using Maximum Likelihood (in fitMk) and then using mcmcMk.

fitARD<-fitMk(tree,x,model="ARD")
fitARD
## Object of class "fitMk".
## 
## Fitted (or set) value of Q:
##           a         b         c
## a -2.182931  0.000000  2.182931
## b  0.926885 -0.938648  0.011763
## c  0.006106  0.384836 -0.390942
## 
## Fitted (or set) value of pi:
##         a         b         c 
## 0.3333333 0.3333333 0.3333333 
## 
## Log-likelihood: -391.228308 
## 
## Optimization method used was "nlminb"
par(cex=1.5)
plot(fitARD,show.zeros=FALSE,mar=rep(0,4))

[Figure: the fitted ARD model plotted as a graph of states & transition rates]

Cool: ML gets pretty close to the generating model. Now, let's fit the same model using MCMC and see how we do:

mcmcARD<-mcmcMk(tree,x,model="ARD",ngen=200000,
    plot=FALSE,prior=list(alpha=0.1),
    auto.tune=0.23,print=2000)
## Running MCMC....
## gen  [a,b]   [a,c]   [b,a]   [b,c]   [c,a]   [c,b]   logLik  accept
## 1    0.2394  0.2394  0.2394  0.2394  0.2394  0.2394  -433.2326
## 2001 0.0355  1.6343  0.6641  0.4271  0.0064  0.4586  -393.5216   0.09
## 4001 1.0006  0.7051  0.8515  0.249   0.0731  0.3344  -400.0403   0.17
## 6001 2.06    0.4998  0.7174  0.6889  0.3327  0.0663  -397.0039   0.15
## 8001 0.2087  1.7205  1.1202  0.0072  0.0383  0.3778  -393.8621   0
## 10001    0.405   1.8779  0.5252  0.4085  0.0453  0.257   -397.4406   0.17
## 12001    0.0654  3.3548  0.6647  0.0229  0.0623  0.3114  -398.8199   0.4
## 14001    0.1351  2.0866  0.9033  1e-04   0.0027  0.3116  -392.1486   0
## 16001    0.3397  1.7696  1.0745  0.3229  0.0326  0.508   -395.3114   0.41
## 18001    1e-04   2.4052  1.3211  0.0991  0.0068  0.5453  -395.4835   0.11
## 20001    0.0529  2.1223  1.1661  0.1471  0.0392  0.5739  -396.4925   0.53
## 22001    0.9631  3.0569  1.4886  0.0368  0.2295  0.3416  -397.6864   0.31
## 24001    1.3685  1.5114  0.6248  0.3379  0.0675  0.259   -396.1328   0.35
## 26001    1.4781  0.981   1.2162  0.4226  0.0637  0.2895  -394.7209   0.3
## 28001    1.2811  1.3335  0.5666  0.3989  0.148   0.1457  -395.2675   0.3
## 30001    1.8177  2.4393  1.4417  0.0949  0.345   0.2103  -396.9524   0.32
## 32001    1.1229  0.9655  0.5646  0.5988  0.1442  0.25    -392.8853   0.15
## 34001    0.0566  1.7278  0.6094  0.4759  0.0204  0.4289  -393.9948   0.29
## 36001    0.4249  2.7001  0.9092  0.0797  0.1553  0.4112  -394.6042   0.29
## 38001    0.2251  2.3163  0.8171  0.0748  6e-04   0.3072  -393.5481   0.23
## 40001    0.0085  2.9373  1.5021  0.0489  3e-04   0.7159  -399.1921   0.21
## 42001    0.1648  1.9905  0.8469  0.2595  0.009   0.497   -393.3668   0.5
## 44001    0.4062  1.3011  0.9068  0.1345  0   0.3436  -393.3044   0
## 46001    0.0839  2.0849  1.0659  0.0448  0.001   0.426   -391.9067   0.31
## 48001    1e-04   1.6547  0.6366  0.4547  0.0632  0.5112  -395.4476   0.12
## 50001    0.437   2.1685  1.3103  0.0023  0.0054  0.3226  -394.6798   0.17
## 52001    0.4134  1.8812  0.874   0.3571  0   0.4298  -393.3216   0
## 54001    0.0911  2.5639  0.6294  0.0025  0.1345  0.2959  -395.2391   0.25
## 56001    0.2205  1.8539  0.8815  0.0124  0.0181  0.2975  -392.2555   0.25
## 58001    0.0785  2.0366  0.9895  0.2416  0.0795  0.4306  -394.3615   0.23
## 60001    0.0062  2.2564  1.0367  0.0751  0.0045  0.2484  -400.0448   0.16
## 62001    0.5995  1.6334  1.0894  0.0506  0.0117  0.3797  -392.7542   0.21
## 64001    0.0079  3.1095  0.9727  0.0043  0.0014  0.411   -393.7935   0.02
## 66001    0.0663  3.2821  0.8964  0.0134  0.0535  0.2267  -400.736    0.65
## 68001    0.2723  1.6223  0.8087  0.6835  0.0996  0.5653  -398.9767   0.31
## 70001    0   1.8501  0.8627  0.0063  0.073   0.2861  -393.856    0
## 72001    1.0759  1.5586  0.947   0.1521  0.1552  0.2615  -393.3886   0.18
## 74001    0.8192  2.54    0.6672  0.347   0.1779  0.2981  -396.247    0.33
## 76001    1.2682  2.8382  0.7989  0.0899  0.3169  0.0808  -401.1158   0.33
## 78001    0.8726  2.4258  0.9226  0.2388  0.2876  0.2463  -395.7843   0.24
## 80001    0.2066  3.9423  0.8164  0.088   0.1794  0.3534  -398.0631   0.08
## 82001    0.4157  1.9027  1.1953  0.253   0.0037  0.3987  -394.8138   0.22
## 84001    0.003   2.3144  1.1209  0.0363  1e-04   0.4126  -392.3617   0
## 86001    0.0321  1.4339  0.7075  0.0333  0.0289  0.3526  -394.8363   0.25
## 88001    0.3491  1.812   1.3892  0.0058  0.0089  0.368   -395.5763   0.23
## 90001    0.3451  1.4012  0.8528  0.5282  0.0023  0.5254  -395.5484   0.46
## 92001    0.196   1.125   0.6498  0.4478  0.0101  0.4404  -394.8839   0.23
## 94001    0.0113  2.6859  0.8931  0.0222  0.2331  0.3929  -395.1396   0.44
## 96001    0.8078  1.2137  0.4966  0.4294  0.1437  0.2194  -392.8927   0.36
## 98001    0.5607  1.7523  0.9561  0.031   0.0262  0.2824  -392.4398   0.37
## 100001   0.585   2.2673  1.1719  0.1103  0.0018  0.3555  -393.5428   0.15
## 102001   1.5149  0.5379  0.0076  0.8237  0.4258  0.0628  -400.4975   0.21
## 104001   2.7855  0.8951  0.3542  0.692   0.382   0.0373  -398.9953   0.28
## 106001   2.475   1.5866  1.3775  0.0202  0.2843  0.0497  -399.1934   0.17
## 108001   1.3336  0.1337  0.1361  0.979   0.1096  0.2059  -401.318    0.19
## 110001   0.5859  1.7219  0.6708  0.0529  0.1642  0.1749  -396.1678   0.16
## 112001   0.002   2.2001  0.5789  0.2121  0.0549  0.383   -393.6431   0.32
## 114001   0.3662  2.0026  1.0399  0.0164  0.0138  0.3109  -392.2474   0.15
## 116001   2.3048  3.427   1.6336  0.0128  0.1751  0.274   -399.1746   0.41
## 118001   2.1795  0.4938  0.6506  0.677   0.1108  0.2728  -395.2656   0.22
## 120001   1.9419  1.3633  1.2794  0.5892  0.0623  0.3875  -395.9455   0.29
## 122001   0.6827  2.0316  0.8006  0.1388  0.0468  0.3555  -393.4905   0.1
## 124001   0.4078  2.0196  0.5691  0.1672  0.0609  0.2224  -396.155    0.26
## 126001   1.6036  2.1045  0.6787  0.3125  0.3678  0.2656  -400.0396   0.39
## 128001   0.0347  2.4217  0.9346  0.0012  0.0561  0.3619  -391.5799   0.07
## 130001   0.441   1.4162  0.8661  0.7994  0.0854  0.3787  -401.3852   0.43
## 132001   0.0353  2.9087  1.0613  0.0098  0.0314  0.3836  -393.0533   0.48
## 134001   0.2526  3.3762  1.1082  0.0139  0.1314  0.5553  -397.9016   0.38
## 136001   0.6705  1.9755  1.0858  0.0029  3e-04   0.3357  -392.1889   0
## 138001   0.5895  1.0178  0.2649  0.4834  0.0136  0.3082  -404.0795   0.42
## 140001   1.3004  0.335   0.4352  0.9506  0.0657  0.5181  -400.2082   0.11
## 142001   0.7747  1.1322  0.7422  0.0863  0.2706  0.1418  -402.6567   0.31
## 144001   6e-04   2.3278  1.3561  0.0197  0.0446  0.478   -396.0254   0.14
## 146001   0.4997  1.4058  0.5935  0.476   0.0628  0.3187  -393.2015   0.25
## 148001   1.0477  1.0441  1.0415  0.3811  0.0474  0.3659  -394.3784   0.37
## 150001   2e-04   1.9853  0.9395  0.0126  0.0218  0.3662  -391.6284   0.31
## 152001   2.1385  1.8712  1.3807  0.1784  0.1889  0.2426  -394.7626   0.27
## 154001   2.5524  0.006   0.4175  0.899   0.1762  0.1909  -397.4173   0.02
## 156001   3.0862  1.3384  0.8986  0.2026  0.21    0.1686  -402.4173   0.15
## 158001   3.3364  2.3727  1.8661  0.0042  0.0032  0.329   -398.4107   0
## 160001   3.4753  1.4637  2.6145  6e-04   0.0175  0.3679  -399.1451   0
## 162001   2.673   2.4554  2.1342  0.0058  0.0686  0.3545  -396.8483   0.27
## 164001   3.7604  0.4316  0.915   0.7235  0.2416  0.0305  -399.8443   0.19
## 166001   1.1617  2.2026  1.077   0.138   0   0.4339  -395.4851   0
## 168001   0.407   1.5563  0.7539  6e-04   0.0926  0.2958  -395.0838   0.06
## 170001   0.5992  1.5538  0.8548  0.3618  0.0024  0.3327  -393.5775   0.15
## 172001   1.6465  0.9991  1.1493  0.264   0.0087  0.3495  -395.4684   0.14
## 174001   4.0412  1.0312  2.0863  0.549   0.0074  0.4023  -397.914    0.01
## 176001   3.7394  0.1042  0.3867  0.8843  0.3898  0.0035  -400.1002   0.04
## 178001   2.0915  1.2567  1.4815  0.5687  0.0117  0.4867  -396.9679   0.15
## 180001   0.6156  2.4757  1.4667  0.105   0.0227  0.6184  -398.1782   0.12
## 182001   0.9893  1.0498  0.6733  0.587   0.0771  0.3287  -393.1038   0.09
## 184001   1.6615  1.1097  0.1536  0.5622  0.312   0.079   -400.15 0.34
## 186001   1.2144  1.1732  0.5195  0.5778  0.05    0.3542  -396.5859   0.25
## 188001   0.8906  1.8051  1.2541  0.0014  0.0057  0.3171  -392.7109   0
## 190001   0.6456  2.2247  1.0399  0.0078  0.0337  0.2867  -392.7961   0.17
## 192001   6e-04   2.1492  0.8851  0.0387  0.1523  0.5361  -399.4342   0.71
## 194001   1.4413  2.1379  1.4293  0.0081  0.0027  0.3101  -394.1837   0.53
## 196001   2.7784  3.6261  1.6657  0.0873  0.0835  0.4646  -402.9422   0.26
## 198001   2.3004  1.0822  1.0996  0.4932  0.3431  0.0549  -397.4566   0.21
## Done.
plot(mcmcARD)

[Figure: likelihood profile & parameter traces from the MCMC]

Based on the likelihood profile & the profile plots of our parameters, this looks like reasonably good convergence. Let's look at the parameter estimates using summary:

summary(mcmcARD)
## Assuming 20% burn-in as no burn-in was specified....
## 
## Mean value of Q from the post burn-in posterior sample:
##            a          b          c
## a -2.8672826  1.0843348  1.7829477
## b  0.9860650 -1.2687875  0.2827225
## c  0.1122233  0.3196107 -0.4318340
## 
## Median value of Q from the post burn-in posterior sample:
##             a          b          c
## a -2.48439826  0.6736814  1.8107168
## b  0.92338590 -1.1005437  0.1771578
## c  0.06965303  0.3288264 -0.3984795
## 
## 95% HPD interval computed either from the post burn-in
## samples or using 'coda':
##              lower     upper
## [a,b] 2.488534e-05 3.4721228
## [a,c] 2.962950e-04 3.2906002
## [b,a] 2.436985e-01 1.9284228
## [b,c] 5.030798e-06 0.8989150
## [c,a] 9.346201e-06 0.3736453
## [c,b] 1.768073e-04 0.5154772

Note that we should not be too alarmed to find that rates whose ML estimates were near the lower bound of \(q_{i,j}\) (i.e., zero) come out higher when estimated from the Bayesian posterior sample. The posterior mean of a rate that is bounded below at zero must be positive, even when the likelihood is maximized at the boundary.
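As a toy illustration of why a rate with an ML estimate at the zero boundary still gets a positive Bayesian estimate (this is a simplified grid calculation, not the mcmcMk machinery, and the exponential prior here is an arbitrary choice):

```r
## Toy illustration (not mcmcMk internals): suppose no changes of some
## type are observed over a total time at risk of 10, so the likelihood
## of that rate q is (roughly) exp(-10*q), maximized at the boundary
## q = 0. Because q >= 0, the posterior mean is nonetheless positive.
total.time <- 10
q <- seq(0, 2, length.out = 2001)              # grid of candidate rates
prior <- dexp(q, rate = 1)                     # an arbitrary exponential prior
post <- exp(-q * total.time) * prior           # unnormalized posterior
post <- post / sum(post)                       # normalize over the grid
q.mle <- q[which.max(exp(-q * total.time))]    # 0: the ML estimate
q.bayes <- sum(q * post)                       # about 0.09: pulled off zero
```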

We can also plot the posterior density using density and our corresponding plot method:

plot(density(mcmcARD))
## Assuming 20% burn-in as no burn-in was specified....

[Figure: posterior densities of the six transition rates]

Now, let's repeat this whole analysis, but specifying the correct generating model in which only the transitions c \(\to\) b, b \(\to\) a, and a \(\to\) c are allowed to occur. The design matrix for this model can be specified as follows:

model<-matrix(c(0,0,1,
    2,0,0,
    0,3,0),3,3,byrow=TRUE,
    dimnames=list(levels(x),levels(x)))
model
##   a b c
## a 0 0 1
## b 2 0 0
## c 0 3 0

I'll fit this model using ML just as we did with the ARD model.

fitModel<-fitMk(tree,x,model=model)
fitModel
## Object of class "fitMk".
## 
## Fitted (or set) value of Q:
##           a         b         c
## a -2.193256  0.000000  2.193256
## b  0.945358 -0.945358  0.000000
## c  0.000000  0.388236 -0.388236
## 
## Fitted (or set) value of pi:
##         a         b         c 
## 0.3333333 0.3333333 0.3333333 
## 
## Log-likelihood: -391.232567 
## 
## Optimization method used was "nlminb"

Now, let's proceed to run the MCMC. This model has only three free parameters rather than six, so the problem is not as hard as the ARD model, and I'm not going to run the MCMC for nearly as many generations.

mcmcModel<-mcmcMk(tree,x,model=model,ngen=100000,
    plot=FALSE,prior=list(alpha=0.1),
    auto.tune=0.4,print=1000)
## Running MCMC....
## gen  [a,c]   [b,a]   [c,b]   logLik  accept
## 1    0.2394  0.2394  0.2394  -448.1148
## 1001 2.0245  0.7487  0.3587  -392.6804   0.19
## 2001 1.8109  0.9614  0.3162  -393.066    0.17
## 3001 1.993   0.9061  0.4256  -391.9479   0.44
## 4001 3.2672  1.2138  0.5628  -395.2097   0.37
## 5001 2.0784  0.9966  0.4341  -391.7218   0.55
## 6001 2.4412  0.9065  0.4065  -391.7019   0.41
## 7001 2.3611  1.1701  0.4212  -392.4499   0.45
## 8001 1.5823  0.9288  0.4481  -394.8203   0.3
## 9001 2.2225  0.923   0.3113  -392.5594   0.39
## 10001    2.5966  1.4255  0.5136  -395.2551   0.44
## 11001    1.8495  0.9584  0.3179  -392.855    0.45
## 12001    2.7542  0.7718  0.4649  -396.5221   0.38
## 13001    2.2099  0.9172  0.3788  -391.2682   0.42
## 14001    2.117   1.0579  0.429   -391.781    0.33
## 15001    2.4798  0.9203  0.3219  -392.7284   0.27
## 16001    2.3455  0.9703  0.2889  -394.2497   0.32
## 17001    2.5171  0.834   0.36    -392.4534   0.52
## 18001    1.8323  1.1282  0.3773  -393.6565   0.44
## 19001    2.638   1.0241  0.5278  -393.7198   0.13
## 20001    1.5004  0.7644  0.4204  -395.8565   0.38
## 21001    2.4316  0.9435  0.4419  -391.8917   0.4
## 22001    2.8198  1.1393  0.5976  -395.4506   0.5
## 23001    2.1888  0.9124  0.3211  -392.1385   0.42
## 24001    1.9351  0.7157  0.259   -395.0836   0.26
## 25001    1.1723  0.6451  0.265   -399.9841   0.2
## 26001    2.8863  1.3019  0.4082  -395.2335   0.48
## 27001    3.2268  1.0189  0.5215  -395.3091   0.36
## 28001    3.6091  1.5132  0.5604  -397.7624   0.59
## 29001    2.3252  0.881   0.3274  -392.1309   0.22
## 30001    1.9606  1.2053  0.4215  -393.7736   0.37
## 31001    3.2893  0.9678  0.4306  -394.7221   0.47
## 32001    2.2813  0.936   0.3967  -391.2895   0.32
## 33001    2.4911  1.0014  0.411   -391.5435   0.44
## 34001    2.2364  0.8805  0.3864  -391.4396   0.36
## 35001    1.9219  0.8868  0.3851  -391.634    0.32
## 36001    1.5514  0.5903  0.3171  -397.3683   0.31
## 37001    2.3074  1.1265  0.4038  -392.1529   0.32
## 38001    2.3048  0.8971  0.3452  -391.7044   0.46
## 39001    1.9141  1.0952  0.3753  -392.8628   0.47
## 40001    1.7878  0.7597  0.3718  -392.9641   0.35
## 41001    1.3276  0.7263  0.3556  -396.3436   0.43
## 42001    2.1523  0.9068  0.4357  -391.915    0.32
## 43001    1.7596  0.899   0.3415  -392.2757   0.24
## 44001    1.8433  1.066   0.3357  -393.6269   0.27
## 45001    2.3495  1.2222  0.4349  -392.9647   0.59
## 46001    2.1805  0.8764  0.3828  -391.4129   0.35
## 47001    1.8514  0.9814  0.3791  -392.0007   0.3
## 48001    2.3129  0.9823  0.4084  -391.3131   0.29
## 49001    2.8666  0.9072  0.4117  -393.1268   0.42
## 50001    1.9917  0.8876  0.3679  -391.4399   0.37
## 51001    1.9227  0.8544  0.3302  -391.927    0.5
## 52001    2.1888  1.0741  0.459   -392.0554   0.48
## 53001    2.7164  0.9607  0.3938  -392.2064   0.39
## 54001    2.1129  0.8835  0.3421  -391.5787   0.36
## 55001    2.6412  1.243   0.5002  -393.2448   0.51
## 56001    1.7964  0.7065  0.3858  -394.2487   0.5
## 57001    2.1029  0.9874  0.3962  -391.3612   0.51
## 58001    2.7452  0.9228  0.4308  -392.6784   0.33
## 59001    3.2376  0.7379  0.4101  -398.7406   0.31
## 60001    2.7315  0.9172  0.3941  -392.4459   0.31
## 61001    3.2406  1.0662  0.5181  -394.8512   0.51
## 62001    2.3151  1.0862  0.3311  -393.5319   0.4
## 63001    1.8247  0.9982  0.3549  -392.4396   0.39
## 64001    1.8385  0.6593  0.32    -394.5783   0.51
## 65001    2.6945  1.1752  0.4912  -392.8085   0.33
## 66001    2.059   0.8549  0.3642  -391.5008   0.28
## 67001    2.4009  0.821   0.354   -392.2588   0.4
## 68001    1.7228  0.802   0.2886  -393.5751   0.3
## 69001    3.1803  1.1802  0.4785  -394.0006   0.48
## 70001    1.9188  0.6589  0.3154  -394.592    0.29
## 71001    1.9475  1.0498  0.4735  -393.0513   0.5
## 72001    2.7221  1.4401  0.5221  -395.4086   0.39
## 73001    2.2013  1.1763  0.3378  -394.6947   0.48
## 74001    2.6558  1.2252  0.4181  -393.3831   0.42
## 75001    3.2679  1.0941  0.4968  -394.4599   0.4
## 76001    2.6678  1.083   0.383   -392.5739   0.44
## 77001    2.4616  0.822   0.3185  -392.9762   0.45
## 78001    2.3768  1.0703  0.3408  -393.008    0.57
## 79001    1.9389  0.9984  0.3993  -391.7831   0.41
## 80001    2.242   0.9716  0.4405  -391.6404   0.47
## 81001    1.5054  0.6764  0.2705  -396.2069   0.42
## 82001    2.9465  0.9854  0.4299  -392.9968   0.45
## 83001    2.635   1.0487  0.5166  -393.2294   0.35
## 84001    1.422   1.0945  0.3823  -396.7386   0.5
## 85001    1.9318  0.9562  0.5031  -394.2419   0.58
## 86001    2.4296  0.8684  0.4229  -392.2506   0.4
## 87001    2.3078  0.9475  0.3622  -391.4785   0.4
## 88001    2.2326  0.8908  0.2875  -393.4744   0.33
## 89001    1.9534  0.8837  0.4102  -391.8793   0.36
## 90001    2.4186  0.8787  0.4001  -391.8196   0.41
## 91001    1.7618  0.9298  0.381   -392.2039   0.39
## 92001    2.0285  0.8511  0.35    -391.589    0.4
## 93001    2.1447  0.6792  0.3052  -394.4921   0.35
## 94001    2.4626  1.1002  0.4264  -391.8554   0.54
## 95001    2.6518  0.8478  0.3448  -392.9866   0.42
## 96001    2.6387  1.0209  0.4792  -392.5011   0.38
## 97001    1.8533  1.0082  0.3476  -392.5504   0.38
## 98001    3.2761  0.8285  0.3952  -396.3493   0.4
## 99001    2.2419  1.208   0.345   -394.9628   0.43
## Done.
plot(mcmcModel)

[Figure: likelihood profile & parameter traces for the three-rate model]

summary(mcmcModel)
## Assuming 20% burn-in as no burn-in was specified....
## 
## Mean value of Q from the post burn-in posterior sample:
##            a          b          c
## a -2.2538776  0.0000000  2.2538776
## b  0.9791789 -0.9791789  0.0000000
## c  0.0000000  0.4023976 -0.4023976
## 
## Median value of Q from the post burn-in posterior sample:
##            a          b          c
## a -2.2129832  0.0000000  2.2129832
## b  0.9615335 -0.9615335  0.0000000
## c  0.0000000  0.3950336 -0.3950336
## 
## 95% HPD interval computed either from the post burn-in
## samples or using 'coda':
##           lower     upper
## [a,c] 1.3765155 3.2159464
## [b,a] 0.6428080 1.3326689
## [c,b] 0.2549093 0.5552811
plot(density(mcmcModel))
## Assuming 20% burn-in as no burn-in was specified....

[Figure: posterior densities of the three transition rates]

Ooh. That's pretty cool.

Finally, it doesn't matter in which order we number the rates in our design matrix. For instance, the following matrix should give us the same result:

model<-matrix(c(0,0,3,
    1,0,0,
    0,2,0),3,3,byrow=TRUE,
    dimnames=list(levels(x),levels(x)))
model
##   a b c
## a 0 0 3
## b 1 0 0
## c 0 2 0
mcmc<-mcmcMk(tree,x,model=model,ngen=100000,
    plot=FALSE,prior=list(alpha=0.1),
    auto.tune=0.4,print=FALSE)
## Running MCMC....
## Done.
summary(mcmc)
## Assuming 20% burn-in as no burn-in was specified....
## 
## Mean value of Q from the post burn-in posterior sample:
##           a          b          c
## a -2.235269  0.0000000  2.2352686
## b  0.963232 -0.9632320  0.0000000
## c  0.000000  0.3960514 -0.3960514
## 
## Median value of Q from the post burn-in posterior sample:
##            a          b          c
## a -2.1841703  0.0000000  2.1841703
## b  0.9464934 -0.9464934  0.0000000
## c  0.0000000  0.3887074 -0.3887074
## 
## 95% HPD interval computed either from the post burn-in
## samples or using 'coda':
##           lower     upper
## [a,c] 1.3633020 3.1565336
## [b,a] 0.6558864 1.3026019
## [c,b] 0.2594952 0.5511159

(That's good.)
