Wednesday, 18 October 2006

pr.probability - Brownian Bridge under observational error

If you consider a Gaussian vector $V=(X,Y)\in\mathbb{R}^d$ with $d=m+n$, you know how to find the conditional distribution of $X$ knowing the value of $Y=y$, right? This is exactly the same thing here.



For example, let us suppose that $a=0$, $b=N+1$:



  • you have a noisy observation $Y=(y_1,y_2)=(O_a,O_b)$ with known covariance matrix $\Sigma_Y$

  • the data you are looking for, $X=(z_1,\ldots,z_N)\in\mathbb{R}^N$, has a known covariance matrix $\Sigma_X$

  • the covariance matrix $E[XY^t]=\Sigma_{X,Y}$ is also known.
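For concreteness, in a Brownian-motion setting (which the title suggests — the exact setup of the original question is not quoted here, so the grid, observation times and noise variance below are all assumptions), the three covariance blocks could be built from $\mathrm{Cov}(W_s,W_u)=\min(s,u)$, with independent observation noise of variance $\sigma^2$ added on the diagonal of $\Sigma_Y$:

```python
import numpy as np

# Assumed setup (illustrative, not quoted from the question):
# z_i = W_{t_i} on a unit-spaced grid t_1..t_N, and O_a, O_b are the
# values of W at times a = 0 and b = N + 1, observed with independent
# N(0, sigma^2) noise.
N, sigma2 = 5, 0.1
t = np.arange(1, N + 1, dtype=float)        # interior times t_1..t_N
t_obs = np.array([0.0, N + 1.0])            # observation times a = 0, b = N+1

cov = lambda s, u: np.minimum.outer(s, u)   # Cov(W_s, W_u) = min(s, u)
Sigma_X = cov(t, t)                         # covariance of the hidden values
Sigma_XY = cov(t, t_obs)                    # cross-covariance E[X Y^t]
Sigma_Y = cov(t_obs, t_obs) + sigma2 * np.eye(2)  # noise adds sigma^2 I
```

Note that the observation noise is what keeps $\Sigma_Y$ invertible here, since $W_0=0$ makes the noiseless endpoint degenerate.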

A quick way to find the conditional distribution of $X$ knowing $Y$ is to write
$$X = AU + BV,$$
$$Y = CU,$$
where $U, V$ are independent standard Gaussian random vectors of size $2$ and $N$ respectively, while $A \in M_{N,2}(\mathbb{R})$, $B \in M_{N,N}(\mathbb{R})$ and $C \in M_{2,2}(\mathbb{R})$. Because



  • $CC^t=\Sigma_Y$ gives you $C=\Sigma_Y^{1/2}$,

  • $AC^t=\Sigma_{X,Y}$ then gives you $A=\Sigma_{X,Y}\,\Sigma_Y^{-1/2}$,

  • $AA^t+BB^t=\Sigma_X$ then gives you $B=\left(\Sigma_X-\Sigma_{X,Y}\,\Sigma_Y^{-1}\,\Sigma_{X,Y}^t\right)^{1/2}$,

the three matrices are easily computable, and $C$ is invertible in the case you are considering. This shows that if you know that $Y=y$, the conditional law of $X \mid Y=y$ is given by
$$X = AC^{-1}y + BV,$$
which is a Gaussian vector with mean $AC^{-1}y=\Sigma_{X,Y}\,\Sigma_Y^{-1}\,y$ and covariance $BB^t=\Sigma_X-\Sigma_{X,Y}\,\Sigma_Y^{-1}\,\Sigma_{X,Y}^t$.
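As a sanity check, the three matrices and the resulting conditional mean and covariance can be computed numerically. Here is a minimal NumPy sketch with a made-up $5\times 5$ joint covariance ($N=3$ hidden values, two observations — all names and numbers are illustrative):

```python
import numpy as np

def psd_sqrt(S):
    """Symmetric square root of a positive semidefinite matrix."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

# Made-up positive-definite joint covariance of (X, Y) for illustration.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
Sigma = M @ M.T
Sigma_X = Sigma[:3, :3]             # Cov(X), known
Sigma_XY = Sigma[:3, 3:]            # E[X Y^t], known
Sigma_Y = Sigma[3:, 3:]             # Cov(Y), known

# The three matrices from the construction above:
C = psd_sqrt(Sigma_Y)                               # C C^t = Sigma_Y
A = Sigma_XY @ np.linalg.inv(C)                     # A C^t = Sigma_{X,Y}
B = psd_sqrt(Sigma_X - Sigma_XY @ np.linalg.solve(Sigma_Y, Sigma_XY.T))

# Conditional law of X given Y = y:
y = np.array([0.5, -1.0])
cond_mean = A @ np.linalg.solve(C, y)   # = Sigma_{X,Y} Sigma_Y^{-1} y
cond_cov = B @ B.T                      # = Sigma_X - Sigma_{X,Y} Sigma_Y^{-1} Sigma_{X,Y}^t
```

Since $C$ is taken symmetric, $AC^t=\Sigma_{X,Y}C^{-1}C=\Sigma_{X,Y}$ and $AA^t+BB^t=\Sigma_X$ hold by construction, which is easy to verify on the computed matrices.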
