I am not entirely certain what your question is. It might be
(i) Is it always possible to find a joint distribution of $(\alpha, \beta)$ for any prescribed distributions of $\alpha$, $\beta$, and $\alpha/\beta$?
(ii) Is it possible to find or calculate the joint distribution from the three marginal distributions when you know a joint distribution exists, e.g. because these are observations from an experiment?
(i) is not possible in general. Set $\alpha = \exp(X)$ and $\beta = \exp(-Y)$; then $\log(\alpha/\beta) = \log\alpha - \log\beta = X + Y$.
Now let $X$ and $Y$ be uniform on $[0,1]$ and prescribe a distribution for $X+Y$ with $P(X+Y < 0.5) = 1$. Since $Y \ge 0$, the event $\{X+Y < 0.5\}$ is contained in $\{X < 0.5\}$, so this would force $P(X \ge 0.5) = 0$, contradicting the uniform marginal of $X$. A way to visualize this is to look at mass distributions on the square $[0,1]\times[0,1]$: prescribing the margins (here uniform) restricts the projections onto the axes (i.e. $\{0\}\times[0,1]$ and $[0,1]\times\{0\}$), and the remaining freedom is how the mass is distributed inside the square.
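As a small numerical sketch of this point (the coupling choices below are my own illustrations, not part of the original argument): whatever joint distribution you put on $(X, Y)$ with uniform margins, $\{X+Y < 0.5\} \subseteq \{X < 0.5\}$, so $P(X+Y < 0.5) \le 0.5$ and can never equal $1$. The snippet estimates $P(X+Y < 0.5)$ under a few common couplings.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X is uniform on [0,1]; each coupling below keeps the marginal of Y
# uniform on [0,1] as well, but changes the joint distribution.
x = rng.uniform(size=n)

couplings = {
    "independent": rng.uniform(size=n),  # Y drawn independently of X
    "comonotone":  x.copy(),             # Y = X
    "antithetic":  1.0 - x,              # Y = 1 - X
}

for name, y in couplings.items():
    p = np.mean(x + y < 0.5)
    print(f"{name:12s}  P(X+Y < 0.5) ~ {p:.3f}")

# In every case {X+Y < 0.5} implies {X < 0.5} because Y >= 0,
# so P(X+Y < 0.5) <= P(X < 0.5) = 0.5: no coupling can push it to 1.
```

The printed estimates (roughly 0.125, 0.25, and 0 for the three couplings) all sit well below the bound of 0.5, consistent with the contradiction above.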
(ii) looks more like a statistics question than a probability one. There are a number of ways of coming up with a joint distribution, but you would need to specify more context to find a reasonable approach.