Notes on Chapter 7

Back to notes on chapter 6

• p. 232: other simulated annealing codes exist. For example, Canham and Uriarte cite Goffe, W. L., G. D. Ferrier, and J. Rogers. 1994. Global optimization of statistical functions with simulated annealing. Journal of Econometrics 60:65-99 (link to PDF). They also provide R code.
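For readers who just want to experiment, base R's optim() also offers a simple simulated-annealing option (method = "SANN"). A minimal sketch, using a bumpy one-dimensional test function of my own invention (not an example from the book):

```r
## Minimize a function with several local minima via simulated annealing.
f <- function(x) (x - 2)^2 + 3 * sin(5 * x)

set.seed(101)
fit <- optim(par = 0, fn = f, method = "SANN",
             control = list(maxit = 20000, temp = 10))
fit$par    ## approximate global minimizer
fit$value  ## objective value at that point
```

Note that SANN's performance depends strongly on the control settings (temp, tmax, maxit) and on the random seed, so treat any single run with caution.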
• p. 239 footnote: "a text file description" is not terribly clear. I mean definitions of statistical models coded as program statements in text files, rather than (e.g.) a graphical interface like DoodleBUGS.
• p. 240 See McCarthy 2007 for much more on WinBUGS.
• p. 242: I should give an example of defining the gradient in an optimization problem. (I'm also not 100% sure this really works in mle2, although at least one person seems to have gotten it working.)
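In the meantime, here is a hedged sketch of the general idea using base R's optim(), whose gr argument is the documented way to supply an analytic gradient to gradient-based methods such as BFGS. The Normal-likelihood example is my own, not one from the book:

```r
## Negative log-likelihood for iid Normal(mu, sd = 1) data, plus its
## analytic gradient with respect to mu.
set.seed(1)
x <- rnorm(20, mean = 5, sd = 1)
nll  <- function(mu) sum((x - mu)^2) / 2 + length(x) * log(2 * pi) / 2
gnll <- function(mu) sum(mu - x)   ## d(nll)/d(mu)

fit <- optim(par = 0, fn = nll, gr = gnll, method = "BFGS")
fit$par   ## should be very close to mean(x), the analytic MLE
```

Supplying the gradient mainly buys you fewer (and more accurate) function evaluations; for cheap likelihoods like this one the difference is negligible, but for expensive ones it can matter.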
• p. 243: a (sort of) competitor to EMD in R exists, also in press from Princeton University Press: Ben Klemens's book (PUP link, book web site, PDF). It assumes considerably more computational and mathematical background than EMD in R, but if you're comfortable with that you may find it interesting; it also emphasizes computational efficiency more than I do.
• p. 255: to be honest, I'm not really comfortable with my definition of "population prediction intervals". The term is sometimes used to distinguish statements of uncertainty that reflect only model (parameter) uncertainty from those that also include sampling/error uncertainty. If anyone has a better name for the result of estimating uncertainty by resampling the sampling distribution of the parameters (based on an approximate variance-covariance matrix), I'd love to know.
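Whatever we end up calling it, the procedure itself is easy to sketch. The toy linear-model setup below is my own illustration (not the book's example): draw parameter sets from the multivariate normal implied by the estimates and their approximate variance-covariance matrix, add observation-level noise, and take quantiles of the simulated predictions:

```r
library(MASS)   ## for mvrnorm

## Simulate data from a known linear model and fit it.
set.seed(1)
x <- runif(50, 0, 10)
y <- rnorm(50, mean = 1 + 0.5 * x, sd = 1)
fit <- lm(y ~ x)

## Resample parameters from their approximate sampling distribution.
pars <- mvrnorm(1000, mu = coef(fit), Sigma = vcov(fit))
## Predicted values at x = 5, including observation error.
pred <- rnorm(1000, mean = pars[, 1] + pars[, 2] * 5,
              sd = summary(fit)$sigma)
ppi <- quantile(pred, c(0.025, 0.975))
ppi   ## the interval in question
```

Dropping the rnorm() step (using just the parameter draws) gives the narrower interval that reflects parameter uncertainty alone, which is exactly the distinction the naming question above is about.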
• p. 259, last code chunk: the code

```
obs.dev = 2 * (logLik(fit.nb) - logLik(fit.pois))
```

is redundant with the code chunk on the previous page:

```
devdiff = 2 * (logLik(fit.nb) - logLik(fit.pois))
```

(oops.)
• p. 262 the optimization notes should really come before the R supplement to this chapter …

On to notes on chapter 8

page revision: 1, last edited: 03 Jul 2008 16:10