Wednesday 18 October 2006

pr.probability - Brownian Bridge under observational error

If you consider a Gaussian vector $V=(X,Y) \in \mathbb{R}^{d=m+n}$, you know how to find the conditional distribution of $X$ given the value $Y=y$, right? This is exactly the same thing here.
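For reference, in the centered case the classical result being invoked reads
$$X \mid Y=y \;\sim\; \mathcal{N}\left(\Sigma_{X,Y}\,\Sigma_Y^{-1}\,y,\;\; \Sigma_X - \Sigma_{X,Y}\,\Sigma_Y^{-1}\,\Sigma_{X,Y}^t\right),$$
with the notation $\Sigma_X = E[XX^t]$, $\Sigma_Y = E[YY^t]$, $\Sigma_{X,Y} = E[XY^t]$.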



For example, let us suppose that $a=0, b=N+1$:



  • you have a noisy observation $Y=(y_1, y_2)=(O_a, O_b)$ with known covariance matrix $\Sigma_Y$

  • the data you are looking for, $X=(z_1, \ldots, z_N) \in \mathbb{R}^N$, has a known covariance matrix $\Sigma_X$

  • the covariance matrix $E[X Y^t] = \Sigma_{X,Y}$ is also known.
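As a concrete illustration (this setup is an assumption, not given in the post): if the hidden values $z_i = W(i)$ come from a standard Brownian motion $W$ with $\mathrm{cov}(W(s),W(t))=\min(s,t)$, and the two endpoint observations carry independent Gaussian noise of variance $\sigma^2$, the three covariance matrices can be assembled as follows:

```python
import numpy as np

# Hypothetical setup (not from the post): z_i = W(i) for i = 1..N,
# observations O_a = W(a) + eps_a, O_b = W(b) + eps_b with independent
# N(0, sigma2) noise, and a = 0, b = N + 1 as in the example above.
N, sigma2 = 5, 0.1
a, b = 0, N + 1
t = np.arange(1, N + 1)                      # hidden observation times 1..N

# Sigma_X[i, j] = cov(z_i, z_j) = min(t_i, t_j)  (Brownian covariance)
Sigma_X = np.minimum.outer(t, t).astype(float)

# Sigma_Y = cov of (O_a, O_b): Brownian covariance plus independent noise
Sigma_Y = np.array([[min(a, a), min(a, b)],
                    [min(a, b), min(b, b)]], float) + sigma2 * np.eye(2)

# Sigma_XY[i, k] = cov(z_i, O_k) = min(t_i, observation time), noise-free
Sigma_XY = np.stack([np.minimum(t, a), np.minimum(t, b)], axis=1).astype(float)
```

Any other Gaussian model for the signal and the noise only changes how these three matrices are filled in; the conditioning recipe below is unchanged.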

A quick way to find the conditional distribution of $X$ knowing $Y$ is to write
$$X = AU + BV$$
$$Y=CU$$
where $U,V$ are independent standard Gaussian random vectors of size $2$ and $N$ respectively, while $A \in M_{N,2}(\mathbb{R})$, $B \in M_{N,N}(\mathbb{R})$ and $C \in M_{2,2}(\mathbb{R})$. Because



  • $CC^t = \Sigma_Y$ gives you $C=\Sigma_Y^{\frac{1}{2}}$ (the symmetric square root),

  • $AC^t = \Sigma_{X,Y}$ then gives you $A=\Sigma_{X,Y}\Sigma_Y^{-\frac{1}{2}}$,

  • $AA^t + BB^t = \Sigma_X$ then gives you $B=(\Sigma_X-\Sigma_{X,Y}\Sigma_Y^{-1}\Sigma_{X,Y}^t)^{\frac{1}{2}}$,

these three matrices are easily computable, and $C$ is invertible in the case you are considering. This shows that, if you know that $Y=y$, the conditional law of $X \mid Y=y$ is given by
$$X = AC^{-1}y + BV,$$
which is a Gaussian vector with mean $AC^{-1}y = \Sigma_{X,Y}\Sigma_Y^{-1}y$ and covariance $BB^t = \Sigma_X-\Sigma_{X,Y}\Sigma_Y^{-1}\Sigma_{X,Y}^t$.
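A minimal numerical sketch of the whole recipe (the covariance values below are made up for illustration, and `sqrtm_psd` is a hypothetical helper computing the symmetric square root):

```python
import numpy as np

def sqrtm_psd(M):
    """Symmetric square root of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

# Toy covariances (illustrative numbers, not from the post): N = 3 hidden
# values, 2 observations, matching the shapes in the derivation.
Sigma_X  = np.array([[1.0, 1.0, 1.0],
                     [1.0, 2.0, 2.0],
                     [1.0, 2.0, 3.0]])          # cov of X (Brownian-like)
Sigma_Y  = np.array([[0.1, 0.0],
                     [0.0, 4.1]])               # cov of Y (noisy endpoints)
Sigma_XY = np.array([[0.0, 1.0],
                     [0.0, 2.0],
                     [0.0, 3.0]])               # cross-covariance E[X Y^t]

C = sqrtm_psd(Sigma_Y)                          # C C^t = Sigma_Y
A = Sigma_XY @ np.linalg.inv(C)                 # A C^t = Sigma_XY (C symmetric)
cond_cov = Sigma_X - Sigma_XY @ np.linalg.inv(Sigma_Y) @ Sigma_XY.T
B = sqrtm_psd(cond_cov)                         # B B^t = conditional covariance

y = np.array([0.0, 2.0])                        # observed value of Y
cond_mean = A @ np.linalg.inv(C) @ y            # = Sigma_XY Sigma_Y^{-1} y
sample = cond_mean + B @ np.random.default_rng(0).standard_normal(3)
```

The last line draws one sample from the conditional law $X \mid Y=y$ via $AC^{-1}y + BV$; any matrix $C$ with $CC^t = \Sigma_Y$ (e.g. a Cholesky factor) would serve equally well, since only $CC^t$ enters the final formulas.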
