5

The Wronskian lets us determine if a set of functions (possibly the solutions to a differential equation) are linearly dependent or not. But, for every example in the book, it is very obvious if one of the functions is a linear combination of the others. The examples in the book use 3-5 functions. What would be an example of a small number of functions where this isn't obvious?

Or is the application of the Wronskian mostly to deal with large sets of functions... where the sheer number makes it hard to tell if they are dependent or not?

3 Answers

6

It can be concealed in various ways. If the functions are expressed in terms of trig functions, the dependence is easily hidden: think of $\sin^2(x)$ and $\cos(2x)$. The messier the functions get, the easier it is to hide.

The problem with the Wronskian is that the functions must be sufficiently differentiable and you need to be able to calculate it. Think of the indicator function on the rationals and the indicator function on the irrationals. These are dependent, but the Wronskian won't help.
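To make the trig example concrete, here is a quick check with SymPy (the software is my addition; the answer itself only states the facts): the Wronskian of the dependent set $\{\sin^2 x,\ \cos 2x,\ 1\}$ vanishes identically, while the Wronskian of the independent pair $\{\sin x, \cos x\}$ does not.

```python
# Sketch using SymPy (not part of the original answer) to check the trig
# example: since cos(2x) = 1 - 2 sin^2(x), the set {sin^2 x, cos 2x, 1}
# is linearly dependent, so its Wronskian vanishes identically.
from sympy import symbols, sin, cos, simplify, wronskian

x = symbols('x')

# Dependent set: the Wronskian simplifies to 0.
W_dep = simplify(wronskian([sin(x)**2, cos(2*x), 1], x))
print(W_dep)  # 0

# Independent pair for contrast: W(sin, cos) = -sin^2 - cos^2 = -1.
W_ind = simplify(wronskian([sin(x), cos(x)], x))
print(W_ind)  # -1
```

A nonzero Wronskian at even one point certifies independence; the identically zero Wronskian here reflects the hidden identity.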

  • 0
Thanks! For $\sin^2(x)$ and $\cos(2x)$ to be in a dependent set, we need $\cos^2(x)$ too, right? 2010-10-20
  • 1
No, $\cos(2x)=1-2\sin^2(x)$, which was exactly the point: trig identities can hide dependence. If the functions have to add to zero, you need a constant as well. 2010-10-20
  • 2
Well, including $\cos^2 x$ is equivalent to including a constant, since $\sin^2 x + \cos^2 x = 1$. 2010-10-20
  • 0
Yes, I should have included the constant function in both examples. 2010-10-20
  • 1
Trig identities can conceal linear dependence, and more advanced relationships -- say, between various Bessel functions -- might be even harder to spot. A related problem in applications is numerical dependence: functions that are theoretically independent but cannot be distinguished numerically, for example $\sinh(x)$ and $\cosh(x)$ for large arguments. 2010-10-20
  • 0
To expand on John's comment, this is the reason one talks about "numerically satisfactory" solutions to a differential equation. Using $y^{\prime\prime}=y$ as an example, $\exp(x)$ and $\exp(-x)$ are a numerically satisfactory pair, while $\cosh(x)$ and $\sinh(x)$ aren't. The more complicated the ODE, the less likely you are to see at a mere glance whether the Wronskian is singular. 2010-10-21
  • 0
When you say the indicator functions of the rationals and the irrationals are dependent, does that mean $c_1\chi_{\Bbb{Q}}+c_2\chi_{\Bbb{R}\setminus\Bbb{Q}}=0$? But picking a rational value tells you that $c_1=0$, and an irrational value tells you $c_2=0$. (Here the indicator function of $I$ is $\chi_I(x):=\begin{cases}1 & x\in I \\ 0 & x\not\in I \end{cases}$.) 2016-10-03
  • 0
As with the trig example, you need a constant function for them to sum to zero. And you can't compute the Wronskian here, because you can't compute the derivatives. 2016-10-03
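The numerical point raised in the comments above can be seen directly in floating point (this check is my addition, not part of the original thread): $\cosh(x)$ and $\sinh(x)$ are independent, but for large $x$ both equal $e^x/2$ to machine precision, so as doubles they are indistinguishable.

```python
# Illustrating the "numerically satisfactory" remark from the comments:
# cosh(30) and sinh(30) differ by exactly e^{-30} ~ 9.4e-14, which is far
# below one ulp of their common magnitude ~5.3e12, so the subtraction
# cannot recover it (catastrophic cancellation).
import math

x = 30.0
print(math.cosh(x))                 # ~5.34e12
print(math.sinh(x))                 # ~5.34e12
print(math.cosh(x) - math.sinh(x))  # lost: nowhere near e^{-30}
print(math.exp(-x))                 # 9.357622968840175e-14
# The pair {e^x, e^{-x}} keeps both scales representable, which is why it
# is the numerically satisfactory choice for y'' = y.
```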
6

The cute application of the Wronskian I like to keep in mind is that the invertibility of the Vandermonde matrix implies that the set $\{ e^{\lambda z} : \lambda \in \mathbb{C} \}$ is linearly independent. The Wronskian also shows up, for example, in the method of variation of parameters.
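The connection can be made concrete: the Wronskian of $e^{\lambda_1 z},\dots,e^{\lambda_n z}$ factors as $e^{(\lambda_1+\cdots+\lambda_n)z}$ times the Vandermonde determinant $\prod_{i<j}(\lambda_j-\lambda_i)$, which is nonzero for distinct exponents. A sketch checking this for $n=3$ with SymPy (the choice of tool is mine; the answer only states the fact):

```python
# Check (for n = 3) that the Wronskian of e^{az}, e^{bz}, e^{cz} equals
# e^{(a+b+c)z} times the Vandermonde determinant (b-a)(c-a)(c-b).
from sympy import symbols, exp, simplify, wronskian

z, a, b, c = symbols('z a b c')

W = wronskian([exp(a*z), exp(b*z), exp(c*z)], z)
vandermonde = (b - a)*(c - a)*(c - b)

# The ratio of the Wronskian to the claimed factorization simplifies to 1.
ratio = simplify(W / (exp((a + b + c)*z) * vandermonde))
print(ratio)  # 1
```

Since the Vandermonde factor is nonzero whenever the $\lambda_i$ are distinct, any finite subset of $\{ e^{\lambda z} \}$ has nonvanishing Wronskian, hence is linearly independent.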

  • 0
My book said they had to be "real-valued" functions. Now I'll need to look up why they said that before I mess with this. 2010-10-20
  • 0
No, they can be complex-valued functions of a real variable. Or you can pretend I said $\lambda \in \mathbb{R}$. (This is less fun, though, because there's a fairly simple way to prove that case directly: given any purported linear dependence, taking $z$ large enough yields a contradiction.) 2010-10-20
3

Have you tried to prove by hand (i.e., only using the definition of linear independence) that $\sin \theta$ and $ \cos \theta$ are linearly independent? Of course, this can be done with the help of the Wronskian.

And what about $e^{i\theta}$ and $ e^{i (\theta + \frac{\pi}{2})}$? This is geometrically clear, but: can you see the difference between linear independence over the real and complex numbers?

EDIT. Just to add a still more elementary example: what about $\sin^2\theta$ and $\cos^2\theta$?
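The "by hand" approach amounts to evaluating a supposed linear relation at well-chosen points. A mechanical version of that hand computation, using SymPy (my addition, not part of the answer):

```python
# Working the exercise "by hand", but mechanically: assume a relation
# c1*sin(t) + c2*cos(t) = 0 for all t, evaluate it at t = 0 and t = pi/2,
# and solve the resulting linear system for the coefficients.
from sympy import symbols, sin, cos, pi, solve, simplify

t, c1, c2 = symbols('t c1 c2')
relation = c1*sin(t) + c2*cos(t)

sols = solve([relation.subs(t, 0), relation.subs(t, pi/2)], [c1, c2])
print(sols)  # {c1: 0, c2: 0} -- only the trivial relation, so independent

# The EDIT's pair, by contrast, is dependent together with the constant 1:
print(simplify(sin(t)**2 + cos(t)**2 - 1))  # 0
```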

  • 0
For both questions it suffices to evaluate at $\theta = 0$ and $\theta = \pi/2$. Not sure I follow. 2010-10-20
  • 0
@Qiaochu: isn't that doing it by hand? :) 2010-10-20
  • 1
Well, I thought the point of the answer was that it is hard to prove these things by hand. But for just two particular functions it really seems quite straightforward to me. 2010-10-20
  • 0
@Qiaochu. No, that wasn't my point. I'm not sure about a little don's level of knowledge, so I proposed the most elementary exercises I know about linear independence in spaces of functions. 2010-10-20
  • 0
@Qiaochu. Btw, yours is a nice example with the Vandermonde matrix. 2010-10-20