
In least-squares approximation the normal equations project a vector in $N$-dimensional space onto a lower-dimensional subspace, where our problem actually lies, thus providing the "best" solution we can hope for: the orthogonal projection of the $N$-vector onto our solution space. This "best" solution is the one that minimizes the Euclidean distance (two-norm) between the $N$-dimensional vector and our lower-dimensional subspace.
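To make the projection picture concrete, here is a minimal numerical sketch (made-up random data, using NumPy): solving the normal equations $A^\mathsf{T}A x = A^\mathsf{T}b$ and checking that the residual $b - Ax$ is orthogonal to the columns of $A$, which is exactly what makes $Ax$ the orthogonal projection of $b$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2))  # N = 6 observations, 2-dimensional solution space
b = rng.standard_normal(6)       # the N-vector to be projected

# Normal equations: A^T A x = A^T b gives the least-squares coefficients.
x = np.linalg.solve(A.T @ A, A.T @ b)

# The residual b - Ax is orthogonal to the column space of A,
# so Ax is the orthogonal projection of b onto that space.
residual = b - A @ x
print(np.allclose(A.T @ residual, 0.0))
```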

There exist other norms, and other spaces besides $\mathbb{R}^d$. What are the analogues of least-squares under a different norm, or in a different space?

  • If you consider it equivalent to say you are minimizing the error induced by noise, then Euclidean distance is correct when the noise is Gaussian, but if it is Laplacian then the absolute value would be better. I'm not sure whether you would consider this answer valid, but if you want I can expand it into a full answer. – 2010-07-25
  • I would like to see that whenever you get a chance. (This is a real question of mine, btw.) – 2010-07-25
  • If no one beats me to it, I'll put it together after I rewrite my bent-coin question. – 2010-07-25
  • Great, I want to see a solution to that before I want to see one here – that problem looked outrageous! – 2010-07-25
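The trade-off raised in the comments can be sketched with a scalar location estimate (made-up data with one outlier): the mean minimizes the sum of squared deviations (the maximum-likelihood estimate under Gaussian noise), while the median minimizes the sum of absolute deviations (the maximum-likelihood estimate under Laplacian noise), so the median is far less sensitive to the outlier.

```python
import numpy as np

# Data clustered near 1.0 with one outlier at 10.0.
data = np.array([1.0, 1.1, 0.9, 1.0, 10.0])

# argmin_c sum((data - c)**2): pulled toward the outlier.
mean = data.mean()

# argmin_c sum(|data - c|): ignores the outlier's magnitude.
median = np.median(data)

print(mean, median)
```

Here the mean lands at 2.8 while the median stays at 1.0, illustrating why an absolute-value (one-norm) criterion is the natural analogue of least squares for heavy-tailed noise.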

3 Answers