
If I know the means and standard deviations of two sets of data, and I know the slope of the regression line, how can I find the correlation?

Edit:

Sample 1

SD: 12.37

Sample 2

SD: 7.00

Slope of regression line: 0.789

  • There is actually a statistics Stack Exchange -- http://stats.stackexchange.com (2010-12-09)

1 Answer


The formula is $$r = b_1 \frac{s_x}{s_y},$$ where $r$ is the correlation, $b_1$ is the slope, and $s_x$ and $s_y$ are the standard deviations of the independent $(x)$ and dependent $(y)$ variables, respectively.

A reference is Wikipedia's page on simple linear regression. See the formula for $\hat{\beta}$.
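As a quick numerical sanity check, the identity $r = b_1 \, s_x / s_y$ can be verified with NumPy on synthetic data (the data below is made up for illustration, not the asker's samples):

```python
import numpy as np

# Synthetic data: y depends linearly on x plus noise.
rng = np.random.default_rng(42)
x = rng.normal(loc=10.0, scale=3.0, size=500)
y = 0.8 * x + rng.normal(scale=2.0, size=500)

# Least-squares slope b1 of the regression of y on x.
slope = np.polyfit(x, y, 1)[0]

# r = b1 * s_x / s_y  (use the same ddof for both SDs; the choice cancels).
r_from_slope = slope * x.std(ddof=1) / y.std(ddof=1)

# Direct Pearson correlation for comparison.
r_direct = np.corrcoef(x, y)[0, 1]

print(abs(r_from_slope - r_direct) < 1e-9)  # True
```

The agreement is exact up to floating-point error, because the least-squares slope is $b_1 = r \, s_y / s_x$ by construction.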

  • So the means don't matter if I already know the standard deviations? (2010-12-08)
  • Also, I'm getting a value > 1 when I use that formula... must be something wrong? (2010-12-08)
  • @Sev: The means are part of the slope calculation, so all the information you need from the means to find the correlation is already contained in the slope. (2010-12-08)
  • @Sev: What are your numbers for the standard deviations and the slope? (Edit them into your original question.) (2010-12-08)
  • @Mike: Done. (2010-12-08)
  • @Sev: There must be a calculation error somewhere, since (as you pointed out) you can't have $r > 1$. Try rechecking your calculations. Or did you regress Sample 1 on Sample 2 rather than Sample 2 on Sample 1? (2010-12-08)
  • I'm going to assume something is wrong with the data. Thanks for the help! (2010-12-08)
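Plugging the posted numbers into $r = b_1 \, s_x / s_y$ in both orientations shows why the direction of the regression matters here. Treating Sample 1 as the independent variable gives
$$r = 0.789 \times \frac{12.37}{7.00} \approx 1.39,$$
which is impossible for a correlation, while the reverse orientation gives
$$r = 0.789 \times \frac{7.00}{12.37} \approx 0.446,$$
a valid value. So a slope of $0.789$ is only consistent with these standard deviations if Sample 1 was regressed on Sample 2.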