
Suppose I have two discrete random variables $X$ and $Y$ with different distributions, whose probability mass functions are related as follows:

$P(X=x_i) = \lambda\frac{P(Y=x_i)}{x_i}$

what can I infer from this equation? Are there any interesting observations or properties that follow from this relation?

What if,

$P(X=x_i) = \lambda\sqrt{\frac{P(Y=x_i)}{x_i}}$

In both cases, $\lambda$ is a constant.
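For concreteness, here is a minimal sketch in Python of both transformations (the support and the pmf of $Y$ below are arbitrary example values; $\lambda$ is chosen so that the result sums to $1$):

```python
import numpy as np

# Arbitrary example values: a pmf for Y on support x_i.
x = np.array([1, 2, 3, 4])
p_y = np.array([0.1, 0.2, 0.3, 0.4])

# Relation 1: P(X = x_i) proportional to P(Y = x_i) / x_i
w1 = p_y / x
p_x1 = w1 / w1.sum()                   # lambda_1 = 1 / w1.sum()

# Relation 2: P(X = x_i) proportional to sqrt(P(Y = x_i) / x_i)
w2 = np.sqrt(p_y / x)
p_x2 = w2 / w2.sum()                   # lambda_2 = 1 / w2.sum()

print("P(X) under relation 1:", np.round(p_x1, 4))
print("P(X) under relation 2:", np.round(p_x2, 4))
```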

  • It would be clearer to write this with two random variables and one measure $P$; i.e. $P(X = x_i) = \lambda P(Y=x_i)/x_i$. Also, "what can I infer" is very vague; what sorts of things would you like to infer? (2010-11-04)
  • @Nate: Any observation will do. I am applying a polynomial decay on a pmf to get another pmf. Something like "A uniform $Y$ will result in a geometric distribution(?) for $X$." (2010-11-04)
  • Also on MathOverflow: http://mathoverflow.net/questions/44902/relationship-between-these-two-probability-mass-functions (repaired clipboard failure) (2010-11-05)

3 Answers


If you rewrite the equation as $P(Y = x) = xP(X = x)/\lambda$, then the distribution of $Y$ is called the length-biased distribution for $X$. It arises, for example, when one has a bunch of sticks in a bag and reaches in to select one at random, where the probability that a particular stick is selected is proportional to its length. If the lengths of the sticks are realizations of the random variable $X$, then the distribution of the length of the selected stick is that of $Y$.
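A quick simulation sketch of the sticks-in-a-bag picture (the lengths and the pmf of $X$ below are arbitrary example values): picking a stick with probability proportional to its length reproduces the length-biased pmf $x\,P(X=x)/E[X]$.

```python
import numpy as np

rng = np.random.default_rng(0)

lengths = np.array([1, 2, 3, 4])          # possible stick lengths x_i
p_x = np.array([0.4, 0.3, 0.2, 0.1])      # P(X = x_i), arbitrary example values

# Length-biased pmf of the selected stick: P(Y = x) = x * P(X = x) / E[X]
p_y = lengths * p_x / np.sum(lengths * p_x)

# Simulate: fill a bag with sticks drawn from X, then pick sticks with
# probability proportional to their length.
n = 200_000
sticks = rng.choice(lengths, size=n, p=p_x)
picked = rng.choice(sticks, size=n, p=sticks / sticks.sum())

empirical = np.array([(picked == x).mean() for x in lengths])
print("length-biased pmf:", np.round(p_y, 4))
print("empirical pmf:    ", np.round(empirical, 4))
```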


I don't have enough reputation yet, or this would be a comment.

A random variable can have only one probability mass function, so it is not clear what you are asking. Where do these equations come from?

  • I am considering two different systems which operate on the same random variable $X$ and have two different pmfs $u$ and $v$. I edited the question to reflect this. (2010-11-04)

First, you don't have the freedom to choose $\lambda$. (Maybe you already realize that?) In order for $P(X = x_i)$ to be a genuine probability mass function, the values must sum to $1$, which forces $$\lambda = \left(\sum_{x_i} \frac{P(Y = x_i)}{x_i}\right)^{-1}.$$

Given that, if $Y$ is geometric with success probability $p$ and $P(X = x_i) = \lambda P(Y = x_i)/x_i$, then $X$ has the logarithmic distribution with parameter $q = 1-p$.

Here's why: $P(Y = k) = (1-p)^{k-1}p$, for $k = 1, 2, \ldots$. Thus $$P(X = k) = \lambda \frac{P(Y = k)}{k} = \lambda \frac{q^{k-1} p}{k} = \frac{\lambda p}{q} \frac{q^k}{k}.$$ The expression on the right is the logarithmic probability mass function, $-\frac{1}{\ln(1-q)}\frac{q^k}{k}$, provided $\frac{\lambda p}{q} = \frac{-1}{\ln(1-q)}$, i.e. $\lambda = \frac{-q}{p \ln p}$.
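A quick numerical check of this claim, using SciPy's `geom` and `logser` distributions (the value of $p$ below is an arbitrary choice):

```python
import numpy as np
from scipy.stats import geom, logser

p = 0.3                                 # success probability of Y (arbitrary)
q = 1 - p
k = np.arange(1, 30)                    # truncated support; the tail is negligible

p_y = geom.pmf(k, p)                    # P(Y = k) = (1 - p)^(k-1) * p
weights = p_y / k                       # un-normalized P(X = k)
lam = 1 / weights.sum()                 # lambda = (sum_k P(Y = k)/k)^(-1)
p_x = lam * weights

print("lambda (numeric):           ", lam)
print("lambda = -q/(p ln p):       ", -q / (p * np.log(p)))
print("max |P(X = k) - logser pmf|:", np.abs(p_x - logser.pmf(k, q)).max())
```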