FAQ (Frequently asked questions)
  • Q: Can negative log-likelihoods (or AICs) be negative?

A: Technically no, but there are two cases where the "negative log-likelihood" function we actually compute (which in these cases isn't exactly a negative log-likelihood) can take negative values. These in turn can give rise to negative AICs (i.e., $-2\log(L)+2k<0$).

1. For continuous response variables, what we are writing down is really the negative log of a probability *density*, rather than of a probability — and densities, unlike probabilities, can exceed 1. For example, here's a picture of the normal density with $\mu=0, \sigma=0.1$.

[Figure: dnorm.png — the normal density with $\mu=0$, $\sigma=0.1$, which peaks at about 4]

You can see that it goes above 1, which means that its log is $>0$, which means that the negative log-likelihood is negative. This will happen any time the density is concentrated on a narrow range (here, small $\sigma$), so that its peak exceeds 1.
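To make case 1 concrete, here is a minimal Python sketch (the helper `norm_nll` is made up for illustration, not from any particular package): with $\sigma=0.1$ the normal density at its mode is $1/(0.1\sqrt{2\pi}) \approx 3.99 > 1$, so the negative log-likelihood of a single observation at the mode is negative.

```python
import math

def norm_nll(x, mu, sigma):
    """Negative log of the normal density at x -- a sketch of the
    quantity an optimizer minimizes for one continuous observation."""
    return 0.5 * math.log(2 * math.pi * sigma**2) + (x - mu)**2 / (2 * sigma**2)

# The density at the mode is 1/(0.1*sqrt(2*pi)) ~ 3.99 > 1, so its
# negative log is below zero:
nll = norm_nll(0.0, mu=0.0, sigma=0.1)
print(nll)  # ≈ -1.384
```

With a wider density (say $\sigma=1$), the peak is about 0.4 and the same calculation gives a positive value, which is why this only bites when the density is narrow.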

2. Sometimes we drop an additive constant from the negative log-likelihood expression; since the constant does not depend on the parameters, dropping it affects neither the maximum likelihood estimate nor the confidence intervals. For example, the binomial likelihood is

(1)
\begin{align} \binom{N}{k} p^k (1-p)^{N-k}; \end{align}

the negative log-likelihood is

(2)
\begin{align} -\log \binom{N}{k} - k \log p - (N-k) \log (1-p) \end{align}

The first term, $-\log \binom{N}{k}$, depends on the data but not at all on the parameters. Since likelihood inference depends only on differences between log-likelihoods, not on their overall magnitude, leaving it out doesn't change anything.
(In the binomial case, dropping the constant can't actually produce a negative value: the dropped term $-\log \binom{N}{k}$ is $\le 0$, and the remaining terms $-k \log p - (N-k)\log(1-p)$ are both $\ge 0$ for $0<p<1$. But it can happen for other distributions. For example, the Poisson negative log-likelihood is $\lambda - k \log \lambda + \log(k!)$; if we drop the constant $\log(k!)$, the remainder $\lambda - k \log \lambda$ is negative whenever $k \log \lambda > \lambda$, e.g. for $\lambda = k = 10$.)
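The invariance of the MLE to the dropped constant can be checked numerically. Below is a minimal Python sketch (the helper `binom_nll` and the grid search are made up for illustration): minimizing the binomial negative log-likelihood for $k=7$ successes out of $N=10$, with and without the $-\log \binom{N}{k}$ term, lands on the same estimate $\hat p = k/N = 0.7$.

```python
import math

def binom_nll(k, N, p, drop_constant=False):
    """Binomial negative log-likelihood in p; optionally drop the
    -log(N choose k) term, which does not depend on p."""
    nll = -k * math.log(p) - (N - k) * math.log(1 - p)
    if not drop_constant:
        nll -= math.log(math.comb(N, k))
    return nll

# Minimize both versions over a grid of p values for k = 7, N = 10:
grid = [i / 1000 for i in range(1, 1000)]
full = min(grid, key=lambda p: binom_nll(7, 10, p))
dropped = min(grid, key=lambda p: binom_nll(7, 10, p, drop_constant=True))
print(full, dropped)  # both 0.7: the constant shifts the curve, not its minimum
```

The two curves differ by the constant $\log \binom{10}{7}$ at every $p$, so their shapes — and hence the location of the minimum and any likelihood-ratio confidence intervals — are identical.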

The bottom line is that the value of the negative log-likelihood doesn't really matter, only differences between the negative log-likelihoods.
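The same point holds for AICs: since $\mathrm{AIC} = 2(-\log L) + 2k$, any additive constant shared by the models cancels out of AIC differences. A quick numerical check in Python (the NLL values here are made up purely for illustration):

```python
def aic(nll, k):
    """AIC from a negative log-likelihood and a parameter count."""
    return 2 * nll + 2 * k

nll_a, nll_b = 12.3, 15.8  # made-up NLLs for two competing models
c = -20.0                  # shared constant, big enough to push the AICs negative

delta = aic(nll_a, 2) - aic(nll_b, 3)
delta_shifted = aic(nll_a + c, 2) - aic(nll_b + c, 3)
print(delta, delta_shifted)  # the two differences are equal,
                             # even though the shifted AICs are negative
```

So a negative AIC is harmless in itself; only $\Delta \mathrm{AIC}$ between models fitted to the same data carries information.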

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License