more on log determinants

You see, the full likelihood of the model I’m working with is something like:

$$|I - \rho W| \times \mathcal{N}(X\beta + \rho W y, \sigma^2) $$
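On the log scale, which is what PyMC3 actually works with, the determinant is its own additive term next to the Gaussian part:

$$\log L(\rho, \beta, \sigma^2 \mid y) = \log|I - \rho W| - \frac{N}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\,(y - X\beta - \rho W y)^\top (y - X\beta - \rho W y)$$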

And, since $W$ and $I$ are constants, I actually need the gradient of the log determinant with respect to $\rho$, not with respect to $A = I - \rho W$, the full Laplacian matrix. Since the derivative of the log determinant with respect to $A$ is an $N \times N$ matrix, I was getting a conformability error in PyMC3, which wanted the derivative with respect to $\rho$ and was expecting a scalar.
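For the record, the scalar derivative I actually want follows from Jacobi's formula, $d \log|A| = \operatorname{tr}(A^{-1}\, dA)$, applied to $A = I - \rho W$:

$$\frac{\partial}{\partial \rho} \log|I - \rho W| = \operatorname{tr}\!\left[(I - \rho W)^{-1}\,\frac{\partial (I - \rho W)}{\partial \rho}\right] = -\operatorname{tr}\!\left[(I - \rho W)^{-1} W\right],$$

which is a single number, not an $N \times N$ matrix.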

So, if I just declare $W$ as a constant and write the Op as something like a Sparse_LapDet, to denote that I'm actually interested in the derivative of a Laplacian log determinant where the Laplacian is defined in terms of a scalar variable, this should work.
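Roughly, what I have in mind is the usual pattern for custom Theano Ops that need gradients: one Op for the value and a second Op for its derivative. $W$ gets baked into the Op as a constant, the only symbolic input is the scalar $\rho$, and the gradient Op just returns the scalar trace from the formula above. The sketch below is only that, a sketch; the gradient Op's name (`Sparse_LapDetGrad`) and the column-by-column sparse-LU trace computation are placeholders, not the final implementation.

```python
import numpy as np
import scipy.sparse as sps
import scipy.sparse.linalg as spla
import theano.tensor as tt


class Sparse_LapDet(tt.Op):
    """log|I - rho*W| for a fixed sparse W; the only symbolic input is rho."""
    itypes = [tt.dscalar]  # rho
    otypes = [tt.dscalar]  # log determinant

    def __init__(self, W):
        self.W = sps.csc_matrix(W)
        self.I = sps.identity(self.W.shape[0], format="csc")
        self.grad_op = Sparse_LapDetGrad(self.W)  # hypothetical companion Op, defined below

    def perform(self, node, inputs, outputs):
        rho = float(inputs[0])
        A = (self.I - rho * self.W).tocsc()
        # sparse LU: log|det A| = sum(log|diag(U)|); L has a unit diagonal
        lu = spla.splu(A)
        logdet = np.sum(np.log(np.abs(lu.U.diagonal())))
        outputs[0][0] = np.array(logdet)

    def grad(self, inputs, g_outputs):
        (rho,) = inputs
        (gz,) = g_outputs
        # chain rule: upstream scalar gradient times d log|I - rho*W| / d rho
        return [gz * self.grad_op(rho)]


class Sparse_LapDetGrad(tt.Op):
    """d/d(rho) log|I - rho*W| = -tr[(I - rho*W)^{-1} W], a scalar."""
    itypes = [tt.dscalar]
    otypes = [tt.dscalar]

    def __init__(self, W):
        self.W = sps.csc_matrix(W)
        self.I = sps.identity(self.W.shape[0], format="csc")

    def perform(self, node, inputs, outputs):
        rho = float(inputs[0])
        A = (self.I - rho * self.W).tocsc()
        solve = spla.splu(A).solve
        # tr(A^{-1} W), column by column -- crude, but fine for moderate N
        trace = sum(solve(self.W[:, j].toarray().ravel())[j]
                    for j in range(self.W.shape[0]))
        outputs[0][0] = np.array(-trace)
```

Hooking this into the model should then just be a matter of instantiating `Sparse_LapDet(W)` once and adding its output to the log-likelihood through a `pm.Potential`, alongside the Gaussian part of the likelihood above.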

What a frustrating time troubleshooting this was, though. Theano definitely feels like a different frame of mind sometimes.
