5

Suppose I have a stochastic matrix $X$: the sum of each row is equal to $1$ and all elements are non-negative.

Given this property, how can I show that:

$$x'X = x', \quad x \geq 0$$

has a non-zero solution?

I'm assuming this has something to do with proving a feasible dual, but I may be wrong.

3 Answers

5

Here's the linear programming approach (with considerable help from another user, Fanfan). Consider the LP

$$\min 0^T y$$ subject to $$(X - I)y \geq 1,$$ where $0$ and $1$ are, respectively, vectors containing all 0's and all 1's.

By Fanfan's answer to my question this LP is infeasible. Thus its dual, $$\max 1^T x$$ subject to $$x^T (X-I) = 0^T,$$ $$x \geq 0,$$ is either infeasible or unbounded. But $x = 0$ is a solution, and so this dual problem is feasible. Thus it must be unbounded. But that means there must be a nonzero solution $x_1$ in its feasible region. Thus we have $x_1^T X = x_1^T$ with $x_1 \geq 0$, $x_1 \neq 0$.
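As a quick numerical sanity check of this duality argument, here is a sketch using `scipy.optimize.linprog`. The $2\times 2$ stochastic matrix is just an illustrative choice; `linprog` minimizes, so the dual $\max 1^T x$ is posed as $\min -1^T x$:

```python
import numpy as np
from scipy.optimize import linprog

# An illustrative row-stochastic matrix (rows sum to 1, entries non-negative).
X = np.array([[0.9, 0.1],
              [0.5, 0.5]])
n = X.shape[0]

# Dual LP: max 1^T x  s.t.  x^T (X - I) = 0^T,  x >= 0.
# linprog minimizes, so use objective -1^T x; the equality
# constraint in column form is (X - I)^T x = 0.
res = linprog(c=-np.ones(n),
              A_eq=(X - np.eye(n)).T,
              b_eq=np.zeros(n),
              bounds=[(0, None)] * n,
              method="highs")

# scipy reports status 3 when the LP is unbounded: any stationary
# vector can be scaled up, which is exactly the unboundedness used above.
print(res.status)
```

An unbounded status here is the numerical counterpart of the argument that the feasible region contains a nonzero (indeed, arbitrarily large) solution $x$.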

(Fanfan's answer to my question also includes another answer to your question - one that uses Farkas's Lemma rather than LP duality. It ends up being quite similar to my answer here, as, of course, Farkas's Lemma and LP duality are basically equivalent.)

3

If your matrix $X$ is $n\times n$, then $X$ is the transition matrix of a Markov chain with state space $\lbrace 1 , 2 , \dots , n \rbrace$. The vector $x$ that you are looking for is a stationary measure for the Markov chain.

The existence of such a non-zero $x$ can be proven using probability theory, but a purely linear-algebraic approach falls under Perron–Frobenius theory.

For instance, the Cesàro averages ${1\over n}\sum_{k=1}^n X^k$ of the powers of $X$ converge to a matrix $M$ (this statement needs proof!). Then it is easy to see that any row of $M$ can serve as $x'$, solving $x'X = x'$. Note that the row sums of $M$ all equal 1, so $x \neq 0$.
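The Cesàro-average construction can be sketched numerically; the $2\times 2$ matrix and the truncation at $N = 2000$ terms are illustrative assumptions:

```python
import numpy as np

# Illustrative row-stochastic matrix.
X = np.array([[0.9, 0.1],
              [0.5, 0.5]])
N = 2000

# Cesàro average M ≈ (1/N) * sum_{k=1}^{N} X^k.
M = np.zeros_like(X)
P = np.eye(2)
for _ in range(N):
    P = P @ X
    M += P
M /= N

x = M[0]  # any row of the (approximate) limit is a candidate stationary vector
print(np.allclose(x @ X, x, atol=1e-3))  # True: x'X = x' up to truncation error
print(np.isclose(x.sum(), 1.0))          # True: row sums of M are 1, so x != 0
```

Note that $x X - x = \frac{1}{N}(X^{N+1} - X)$ applied to a row, so the residual shrinks like $1/N$ even before the powers themselves converge.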

As you suggest, maybe there is also an approach via linear programming.

1

Here is a hint for you:

Take $v = (1, 1, 1, \dots, 1)'$, the vector of all ones. By definition of your matrix, you have $Xv = v$. Thus $v$ is an eigenvector with eigenvalue 1.

Now change the first basis vector of your vector space to $v$ and keep the others unchanged. In this new basis, what will $X$ look like?
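The hint can be checked concretely. In this sketch (the $2\times 2$ matrix is an illustrative assumption), we take $v$ as the first basis vector, keep the remaining standard basis vectors, and conjugate:

```python
import numpy as np

# Illustrative row-stochastic matrix.
X = np.array([[0.9, 0.1],
              [0.5, 0.5]])
n = X.shape[0]

v = np.ones(n)
print(np.allclose(X @ v, v))  # True: v is a right eigenvector with eigenvalue 1

# Change of basis: first basis vector is v, the others stay e_2, ..., e_n.
B = np.eye(n)
B[:, 0] = v
X_new = np.linalg.inv(B) @ X @ B

# In the new basis, the first column of X is (1, 0, ..., 0)^T,
# since X maps v to v.
print(np.allclose(X_new[:, 0], np.eye(n)[:, 0]))  # True
```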

  • 0
So you're saying to replace row 1 of $A$ with $v$? Then, when looking at $x'X$, we'll have the sum of all $x$ as the resulting first row. I'm still not sure how this fits in. 2010-11-02
  • 0
@Mark: Keep the same notation, please. The matrix is $X$, not $A$. I'm not suggesting replacing the first row with the vector $v$. A square matrix is always given relative to a basis of the vector space: each column represents the image of a basis vector under the linear map, expressed as a combination of the basis vectors. Since $Xv = v$, the first column of the matrix would be $(1, 0, 0, \dots, 0)$ if you take $v$ as the first vector of the basis. 2010-11-02