Recursively updating the eigenvalue decomposition of a covariance matrix
A similar situation arises with the singular value decomposition (SVD).
It is often the case that only a Partial SVD or Truncated SVD is needed, and moreover the matrix is usually stored in sparse format.
Base R does not provide functions suitable for these special needs.
This is why the RSpectra package was developed.
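RSpectra exposes this functionality in R through functions such as `svds()` and `eigs()`. As an executable illustration of what a truncated SVD of a sparse matrix looks like, here is a minimal sketch using the analogous SciPy routine (Python and SciPy are my choice for illustration, not part of the post):

```python
import numpy as np
from scipy.sparse import random as sprandom
from scipy.sparse.linalg import svds

# A sparse 200 x 100 matrix; only 5% of entries are nonzero.
A = sprandom(200, 100, density=0.05, random_state=42, format="csr")

# Truncated SVD: compute only the k largest singular triplets,
# without ever forming A as a dense matrix.
k = 5
U, s, Vt = svds(A, k=k)

# Cross-check against the top singular values from a dense SVD.
dense_s = np.linalg.svd(A.toarray(), compute_uv=False)
assert np.allclose(np.sort(s)[::-1], dense_s[:k])
```

The point is the same as RSpectra's: for large sparse matrices, computing only the leading singular triplets is far cheaper than a full dense decomposition.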
My first thought was to apply the Bunch-Nielsen-Sorensen (BNS) formula for a rank-1 update to a spectral decomposition, which allows one to compute the eigenvalues and eigenvectors of $$ A = D + \rho zz^\top $$ for $D$ diagonal and $\rho\in\mathbb{R}$. (Some algebra extends this to updating $\Sigma$.) However, BNS requires that $z_i > 0 ~\forall ~ i$, and that does not hold for the mean vector $\mu$ in general.
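To make the rank-1 update concrete: for $\rho > 0$ and distinct $d_i$, the eigenvalues of $D + \rho zz^\top$ are the roots of the secular equation $1 + \rho\sum_i z_i^2/(d_i - \lambda) = 0$, with exactly one root between consecutive diagonal entries and one above the largest. A minimal NumPy/SciPy sketch (the bracketing tolerances and test values here are my own choices, purely illustrative):

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)
n = 5
d = np.sort(rng.uniform(0.0, 10.0, n))   # diagonal of D, distinct entries
z = rng.uniform(0.5, 1.5, n)             # z_i > 0, as BNS requires
rho = 2.0

def secular(lam):
    # f(lam) = 1 + rho * sum_i z_i^2 / (d_i - lam); its roots are the
    # eigenvalues of D + rho * z z^T.
    return 1.0 + rho * np.sum(z**2 / (d - lam))

# For rho > 0 the updated eigenvalues interlace the d_i:
# d_i < lam_i < d_{i+1}, and the largest lies in (d_n, d_n + rho * ||z||^2).
upper = np.append(d[1:], d[-1] + rho * np.dot(z, z))
eps = 1e-9
lams = np.array([brentq(secular, lo + eps, hi - eps)
                 for lo, hi in zip(d, upper)])

# Cross-check against a dense eigendecomposition.
ref = np.linalg.eigvalsh(np.diag(d) + rho * np.outer(z, z))
assert np.allclose(np.sort(lams), np.sort(ref), atol=1e-6)
```

Solving the secular equation costs $O(n)$ per root, so the full update is $O(n^2)$ rather than the $O(n^3)$ of a fresh eigendecomposition.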
As a work-around, I could apply two successive BNS updates: one for a $\tilde{\mu}$ that shifts the nonpositive elements of $\mu$ to be positive, and a second that offsets that shift.
$$ \Sigma = \frac{1}{n-1}\left(M^\top M - n\mu\mu^\top\right) = V\frac{D^2}{n-1} V^\top - \frac{n}{n-1}\mu\mu^\top $$ From there, I can compute the spectral decomposition explicitly.
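A quick numerical check of this identity, assuming $M$ is the $n\times p$ data matrix with observations in rows and $M = U D V^\top$ its thin SVD (Python/NumPy used here purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 4
M = rng.normal(size=(n, p))          # data matrix, rows are observations
mu = M.mean(axis=0)                  # sample mean vector

# Sample covariance via the identity
# Sigma = (1/(n-1)) * (M^T M - n * mu mu^T)
Sigma = (M.T @ M - n * np.outer(mu, mu)) / (n - 1)
assert np.allclose(Sigma, np.cov(M, rowvar=False))

# The same expression written in terms of the SVD M = U diag(d) V^T,
# using M^T M = V diag(d^2) V^T.
U, dvals, Vt = np.linalg.svd(M, full_matrices=False)
V = Vt.T
Sigma_svd = V @ np.diag(dvals**2 / (n - 1)) @ V.T \
    - n / (n - 1) * np.outer(mu, mu)
assert np.allclose(Sigma, Sigma_svd)
```

So the first term of $\Sigma$ is already diagonalized by the SVD of $M$; it is only the rank-1 term $-\frac{n}{n-1}\mu\mu^\top$ that disturbs the decomposition.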
But it seems like there should be a better way, one that doesn't involve two separate decomposition steps.
Eigenvalue decomposition is a commonly used technique in numerous statistical problems.
For example, principal component analysis (PCA) essentially performs an eigenvalue decomposition of the sample covariance matrix of a data matrix: the eigenvalues are the component variances, and the eigenvectors are the variable loadings.
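As a concrete sketch of that correspondence (Python/NumPy for illustration; variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(2)
# Anisotropic data: three variables with very different scales.
X = rng.normal(size=(100, 3)) @ np.diag([3.0, 1.0, 0.3])

S = np.cov(X, rowvar=False)          # sample covariance matrix
evals, evecs = np.linalg.eigh(S)     # eigendecomposition (ascending order)
order = np.argsort(evals)[::-1]
var_components = evals[order]        # PCA component variances
loadings = evecs[:, order]           # variable loadings

# Cross-check against the SVD of the centered data matrix:
# singular values squared over (n - 1) give the same variances.
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
assert np.allclose(var_components, s**2 / (len(X) - 1))
```

The cross-check also shows why the SVD route is often preferred in practice: it avoids forming the covariance matrix at all.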