It’s been a while since I’ve posted anything, so here are a few quick notes about things I’ve been learning and working on. I’ll probably write about these fairly soon.
1. Orthogonal polynomials are helpful when looking at Gaussian quadrature, an elegant extension of the standard quadrature rules I was taught. Not quite statistics, but approximation theory is close enough.
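A quick illustration in Python rather than R (NumPy's Legendre module is an assumption of mine, not something from the post): the nodes of an n-point Gauss-Legendre rule are the roots of the degree-n Legendre polynomial, and the rule integrates polynomials up to degree 2n - 1 exactly.

```python
import numpy as np

# 3-point Gauss-Legendre rule on [-1, 1]: nodes are the roots of the
# degree-3 Legendre polynomial, with matching weights.
nodes, weights = np.polynomial.legendre.leggauss(3)

# A 3-point rule is exact up to degree 2*3 - 1 = 5, so the degree-4
# polynomial x^4 is integrated exactly: its true integral on [-1, 1] is 2/5.
approx = np.sum(weights * nodes**4)
```

With only three function evaluations the quadrature reproduces the integral to machine precision, which is the payoff of placing the nodes at orthogonal-polynomial roots instead of spacing them evenly.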
2. Karhunen-Loève expansions, which can be thought of as the equivalent of Principal Component Analysis for continuous stochastic processes. Instead of finding the eigenvectors of a covariance matrix, in order to put data along uncorrelated axes ordered by the amount of variance they account for, you’re finding the eigenfunctions of an autocovariance function, with independent random weights. This leads to slightly strange formulations: Brownian motion, a process that is almost surely nowhere differentiable, can be written as an infinite series of sine functions, each partial sum of which is infinitely differentiable, yet the series converges to the Brownian motion almost surely.
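To make the sine-series claim concrete, here is a small Python check (the function name and truncation level are mine, for illustration). The Karhunen-Loève expansion of standard Brownian motion on [0, 1] is W(t) = √2 Σₖ Zₖ sin((k − ½)πt) / ((k − ½)π) with independent standard normal weights Zₖ, and summing the squared scaled eigenfunctions recovers the autocovariance Cov(W(s), W(t)) = min(s, t):

```python
import numpy as np

def kl_covariance(s, t, n_terms=2000):
    """Truncated KL reconstruction of Brownian motion's autocovariance.

    Each eigenfunction sqrt(2) * sin((k - 1/2) * pi * t) contributes
    variance 1 / ((k - 1/2) * pi)^2, so the series below converges to
    min(s, t) as n_terms grows.
    """
    k = np.arange(1, n_terms + 1)
    freq = (k - 0.5) * np.pi
    return np.sum(2.0 * np.sin(freq * s) * np.sin(freq * t) / freq**2)

# Truncating at 2000 terms already lands very close to min(0.3, 0.7) = 0.3.
approx = kl_covariance(0.3, 0.7)
```

Each truncation is a finite sum of sines, hence smooth; the roughness of Brownian motion only appears in the limit, which is exactly the "slightly strange" feature above.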
3. I’ve tried adding another distribution to the R script for plotting ABC posteriors. Specifically, I’ve added the case with a Poisson likelihood, since this is the only other one that can be done without adding distributions from other packages. The sliders I’m currently using make discrete distributions rather odd to work with, though, so a new version’s on hold until I get around to adding options for other continuous cases.
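For readers unfamiliar with the setup, here is a minimal ABC rejection sketch for the Poisson case, in Python rather than the post's R, with prior parameters and summary choice that are purely illustrative. A Gamma prior is conjugate to the Poisson likelihood, so the exact posterior after observing data summing to s over n points is Gamma(a + s, b + n), which gives something to compare the accepted draws against:

```python
import numpy as np

rng = np.random.default_rng(42)

a, b = 2.0, 1.0                   # Gamma prior: shape a, rate b (illustrative)
observed = np.array([3, 4, 2, 5, 3])
n, s = len(observed), observed.sum()

accepted = []
for _ in range(20000):
    lam = rng.gamma(a, 1.0 / b)   # draw a rate from the prior
    sim = rng.poisson(lam, size=n)  # simulate a data set of the same size
    if sim.sum() == s:            # accept on an exact match of the summary
        accepted.append(lam)

accepted = np.array(accepted)
# accepted now approximates draws from the exact Gamma(a + s, b + n)
# posterior, whose mean is (a + s) / (b + n).
```

Because the sum is a sufficient statistic here, exact matching gives draws from the true posterior; the usual ABC tolerance only matters once the summary or the match is approximate.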
4. The series on our paper from last year hit a roadblock when I started fussing over how much detail to give concerning asymptotics and Taylor’s theorem. The answer for the latter is probably a lot less than I’m trying to do, so once I’ve done a few quick posts on the topics above, I’ll get back to finishing it off.
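For reference, the statement at issue is the standard one, Taylor's theorem with the Lagrange form of the remainder: for f with n + 1 derivatives near a,

$$
f(x) = \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!}(x-a)^k + \frac{f^{(n+1)}(\xi)}{(n+1)!}(x-a)^{n+1}
$$

for some $\xi$ between $a$ and $x$. The asymptotic shorthand $f(x) = \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!}(x-a)^k + O\!\left((x-a)^{n+1}\right)$ then follows whenever $f^{(n+1)}$ is bounded near $a$, which is usually all the detail a derivation needs.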