ADJUSTED R-SQUARE
R-Square (the Coefficient of Determination) is the percent of the Total Sum
of Squares that is explained; i.e., the Regression Sum of Squares (explained
deviation) divided by the Total Sum of Squares (total deviation). This
calculation yields a percentage. It also has a weakness. The denominator is
fixed (unchanging) and the numerator can ONLY increase. Therefore, each
additional variable used in the equation will at worst leave the numerator
unchanged and will usually increase it at least slightly, resulting in a
higher R-Square even when the new variable makes the equation less
efficient (worse).
In theory, using enough independent variables (as many as the number of
observations less one) to explain the change in a dependent variable would
result in an R-Square of ONE. In other words, the R-Square value can be
manipulated and should be suspect.
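A quick numerical sketch of both effects in Python. The data here are purely
random (the seed and sizes are illustrative assumptions), so every regressor
is meaningless; R-Square still climbs toward ONE as columns are added:

import numpy as np

rng = np.random.default_rng(0)          # illustrative seed
n = 20
y = rng.normal(size=n)                  # dependent variable: pure noise
X = np.ones((n, 1))                     # design matrix, intercept only

for k in range(1, n):                   # add one random regressor at a time
    X = np.column_stack([X, rng.normal(size=n)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sse = np.sum((y - X @ beta) ** 2)   # unexplained deviation
    sst = np.sum((y - y.mean()) ** 2)   # total deviation
    print(k, 1 - sse / sst)             # R-Square never falls; at k = n - 1
                                        # the fit is exact and R-Square = 1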
The Adjusted R-Square value is an attempt to correct this shortcoming by
adjusting both the numerator and the denominator by their respective degrees
of freedom.
Adjusted R² = 1 - (1 - R²)((n - 1)/(n - k - 1))

where: R² = Coefficient of Determination
       Adjusted R² = Adjusted Coefficient of Determination
       n = number of observations
       k = number of Independent Variables

for example: when R² = .9, n = 100, and k = 5, then

Adjusted R² = 1 - (1 - .9)((100 - 1)/(100 - 5 - 1))
            = 1 - (1 - .9)(99/94)
            = 1 - (.1)(1.05319)
            = 1 - .105319
            = .89468
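The same calculation as a small Python function (the name adjusted_r2 is my
own, chosen for illustration), verifying the worked example:

def adjusted_r2(r2, n, k):
    """Adjusted Coefficient of Determination from R-Square, n, and k."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

print(adjusted_r2(0.9, 100, 5))   # 0.89468..., matching the result above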
Unlike the R-Square, the Adjusted R-Square can decline in value if the
contribution to the explained deviation by the additional variable is less
than the impact on the degrees of freedom. This means that the Adjusted
R-Square will react to alternative equations for the same dependent variable
in a manner similar to the Standard Error of the Estimate; i.e., for the same
dependent variable, the equation with the smallest Standard Error of the
Estimate will also have the highest Adjusted R-Square.
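The decline is easy to see with the adjusted_r2 function above. Suppose a
sixth variable raises the R-Square only from .9000 to .9005 (numbers invented
for illustration); the Adjusted R-Square goes down because the gain in
explained deviation is smaller than the cost of a degree of freedom:

print(adjusted_r2(0.9000, 100, 5))   # 0.894681
print(adjusted_r2(0.9005, 100, 6))   # 0.894081, lower despite a higher R-Square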
A final caution, however, is that while the R-Square is a percent, the
Adjusted R-Square is NOT: once the sums of squares are adjusted by their
degrees of freedom, the result is no longer a simple ratio of explained to
total deviation (it can even be negative), so it should be referred to as an
index value.
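One consequence, again using the illustrative adjusted_r2 function: with a
weak fit and many variables the index falls below zero, something a true
percentage can never do:

print(adjusted_r2(0.10, 20, 5))   # -0.2214..., impossible for a percent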