After running a repeated measures ANOVA model, we are often interested in performing post hoc contrasts between various cell means or effects to estimate specific effects or differences in effects. To specify a contrast, some of the coefficients are given positive weights (e.g. 1) and others negative weights (e.g. -1).
In SPSS, the repeated measures ANOVA model is a multivariate model (multiple outcomes).
To demonstrate, we will run a repeated measures ANOVA model with one between-subjects factor with 2 levels, A, and one within-subjects factor with 4 levels, which we will call time. The levels of the within-subjects factor, time, vary across the 4 measured outcomes.
Below are the data:
| s | a | y1 | y2 | y3 | y4 |
|---|---|----|----|----|----|
| 1 | 1 | 3 | 4 | 7 | 7 |
| 2 | 1 | 6 | 5 | 8 | 8 |
| 3 | 1 | 3 | 4 | 7 | 9 |
| 4 | 1 | 3 | 3 | 6 | 8 |
| 5 | 2 | 1 | 2 | 5 | 10 |
| 6 | 2 | 2 | 3 | 6 | 10 |
| 7 | 2 | 2 | 4 | 5 | 9 |
| 8 | 2 | 2 | 3 | 6 | 11 |
The variables are:
 s: subject identifier
 a: between-subjects factor
 y1, y2, y3, y4: repeated measures of the outcome, representing the 4 levels of the within-subjects factor time
To perform a repeated measures ANOVA in SPSS, we will use this syntax:
GLM y1 y2 y3 y4 BY a
  /WSFACTOR=time 4
  /WSDESIGN=time
  /DESIGN=a
  /PRINT=parameter.
Let’s look at the Parameter Estimates output table:
| Dependent Variable | Parameter | B | Std. Error | t | Sig. | 95% CI Lower Bound | 95% CI Upper Bound |
|---|---|---|---|---|---|---|---|
| y1 | Intercept | 1.750 | .559 | 3.130 | .020 | .382 | 3.118 |
| | [a=1] | 2.000 | .791 | 2.530 | .045 | .066 | 3.934 |
| | [a=2] | 0^a | . | . | . | . | . |
| y2 | Intercept | 3.000 | .408 | 7.348 | .000 | 2.001 | 3.999 |
| | [a=1] | 1.000 | .577 | 1.732 | .134 | -.413 | 2.413 |
| | [a=2] | 0^a | . | . | . | . | . |
| y3 | Intercept | 5.500 | .354 | 15.556 | .000 | 4.635 | 6.365 |
| | [a=1] | 1.500 | .500 | 3.000 | .024 | .277 | 2.723 |
| | [a=2] | 0^a | . | . | . | . | . |
| y4 | Intercept | 10.000 | .408 | 24.495 | .000 | 9.001 | 10.999 |
| | [a=1] | -2.000 | .577 | -3.464 | .013 | -3.413 | -.587 |
| | [a=2] | 0^a | . | . | . | . | . |

a. This parameter is set to zero because it is redundant.
From the Parameter Estimates table, we can see that SPSS parameterizes this model like so:
$$y_{1i} = \beta_{01} + \beta_{11}(a_i=1) + \beta_{21}(a_i=2) + \epsilon_{1i}$$ $$y_{2i} = \beta_{02} + \beta_{12}(a_i=1) + \beta_{22}(a_i=2) + \epsilon_{2i}$$ $$y_{3i} = \beta_{03} + \beta_{13}(a_i=1) + \beta_{23}(a_i=2) + \epsilon_{3i}$$ $$y_{4i} = \beta_{04} + \beta_{14}(a_i=1) + \beta_{24}(a_i=2) + \epsilon_{4i}$$
or for short:
$$y_{ti} = \beta_{0t} + \beta_{1t}(a_i=1) + \beta_{2t}(a_i=2) + \epsilon_{ti}$$
Each of the \(\hat{\beta}_{2t}\) is constrained to be 0 because it is redundant (collinear with the intercept and the \(a=1\) dummy).
We can represent the model in matrix form as:
$$\mathbf{Y = XB + E}$$
where \(\mathbf{Y}\) is \(n\times t\), \(\mathbf{X}\) is \(n\times p\), \(\mathbf{B}\) is \(p\times t\), and \(\mathbf{E}\) is \(n\times t\), and here
 \(n=8\), number of subjects
 \(t=4\), number of time points (levels of the within-subjects factor)
 \(p=3\), number of predictors (including intercept and coefficients constrained to 0)
We will be using the matrix \(\mathbf{B}\) for our custom contrasts. For our model above, \(\mathbf{B}\) looks like this:
$$\mathbf{B} = \begin{bmatrix}\hat{\beta}_{01} & \hat{\beta}_{02} & \hat{\beta}_{03} & \hat{\beta}_{04}\\\hat{\beta}_{11} & \hat{\beta}_{12} & \hat{\beta}_{13} & \hat{\beta}_{14}\\\hat{\beta}_{21} & \hat{\beta}_{22} & \hat{\beta}_{23} & \hat{\beta}_{24}\end{bmatrix}$$
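As a check, \(\mathbf{B}\) can be reconstructed directly from the raw data. Below is a minimal Python sketch (not SPSS output; the variable names are my own) that computes the cell means and mean differences implied by the dummy coding above, with a=2 as the reference level:

```python
# Rebuild the coefficient matrix B (p x t) from the raw data, using the
# dummy coding described above: a=2 is the reference level, so each
# beta_2t is constrained to 0 and beta_1t is the a=1 vs a=2 difference.

data_a1 = [  # y1, y2, y3, y4 for the subjects with a=1
    [3, 4, 7, 7], [6, 5, 8, 8], [3, 4, 7, 9], [3, 3, 6, 8],
]
data_a2 = [  # y1, y2, y3, y4 for the subjects with a=2
    [1, 2, 5, 10], [2, 3, 6, 10], [2, 4, 5, 9], [2, 3, 6, 11],
]

def col_means(rows):
    """Mean of each column (time point) across subjects."""
    return [sum(r[j] for r in rows) / len(rows) for j in range(len(rows[0]))]

m1, m2 = col_means(data_a1), col_means(data_a2)

B = [
    m2,                               # intercepts beta_0t: cell means for a=2
    [x - y for x, y in zip(m1, m2)],  # beta_1t: a=1 minus a=2 mean differences
    [0.0] * 4,                        # beta_2t: constrained to 0 (redundant)
]
print(B[0])  # [1.75, 3.0, 5.5, 10.0] -- the Intercept row of the table
print(B[1])  # [2.0, 1.0, 1.5, -2.0]  -- the [a=1] row of the table
```

The two printed rows match the Intercept and [a=1] rows of the Parameter Estimates table.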
Now we are ready to use LMATRIX and MMATRIX to perform our contrasts.
The matrix equation that guides the coding of our contrasts is this:
$$\mathbf{L'BM = K}$$
where for a single contrast,
 \(\mathbf{L}\) is a \(p\times 1\) vector of contrast coefficients, primarily used to specify between-subjects contrasts
 \(\mathbf{B}\) is the \(p\times t\) matrix of model parameters (regression coefficients)
 \(\mathbf{M}\) is a \(t\times 1\) vector of “transformation” coefficients, which correspond to the repeated outcomes and is primarily used to specify within-subjects contrasts
 \(\mathbf{K}\) is a \(1\times 1\) matrix holding the contrast result, the value against which the contrast is tested, typically zero
Each of \(\mathbf{L}\), \(\mathbf{M}\), and \(\mathbf{K}\) can be expanded to have \(c\) columns to represent \(c\) simultaneous contrasts.
Between-subjects contrast
Let’s start with a between-subjects contrast. It is not terribly interesting, since the regression table already contains all of these contrasts, but it is still instructive as to how to use LMATRIX and MMATRIX.
Imagine we wanted to test whether the mean of $y$ for A=1 differs from the mean for A=2 at time=1 (the first time point).
Our predicted value for \(y_1\) given A=1 is:
$$\hat{y}_{1A=1} = \hat{\beta}_{01} + \hat{\beta}_{11}$$
The predicted value for \(y_1\) given A=2 is:
$$\hat{y}_{1A=2} = \hat{\beta}_{01} + \hat{\beta}_{21}$$
The effect of A for t=1 is just the difference between these 2 predicted values:
\begin{align} \hat{y}_{1A=2} - \hat{y}_{1A=1} & = (\hat{\beta}_{01} + \hat{\beta}_{21}) - (\hat{\beta}_{01} + \hat{\beta}_{11}) \\ & = \hat{\beta}_{21} - \hat{\beta}_{11} \end{align}
Intuitively, then, this contrast is between the second and third parameters of the first equation. If we set up the \(\mathbf{L}\) matrix to reflect this contrast, knowing that it is used for between-subjects contrasts, it would look like this:
$$\mathbf{L} = \begin{bmatrix}0\\-1\\1\end{bmatrix}$$
Then multiplying \(\mathbf{L'B}\) we get this:
\begin{align} \mathbf{L'B} & = \begin{bmatrix}0 & -1 & 1 \end{bmatrix}\times\begin{bmatrix}\hat{\beta}_{01} & \hat{\beta}_{02} & \hat{\beta}_{03} & \hat{\beta}_{04}\\\hat{\beta}_{11} & \hat{\beta}_{12} & \hat{\beta}_{13} & \hat{\beta}_{14}\\\hat{\beta}_{21} & \hat{\beta}_{22} & \hat{\beta}_{23} & \hat{\beta}_{24}\end{bmatrix} \\ & = \begin{bmatrix}(\hat{\beta}_{21} - \hat{\beta}_{11}) & (\hat{\beta}_{22} - \hat{\beta}_{12}) & (\hat{\beta}_{23} - \hat{\beta}_{13}) & (\hat{\beta}_{24} - \hat{\beta}_{14})\end{bmatrix} \end{align}
We see the quantity we want (\(\hat{\beta}_{21} - \hat{\beta}_{11}\)) as the first element of \(\mathbf{L'B}\). We can use the matrix \(\mathbf{M}\) to select this first element like so:
$$\mathbf{M} = \begin{bmatrix}1\\0\\0\\0\end{bmatrix}$$
The result of \(\mathbf{L'BM}\) is thus:
\begin{align} \mathbf{L'BM} & = \begin{bmatrix}(\hat{\beta}_{21} - \hat{\beta}_{11}) & (\hat{\beta}_{22} - \hat{\beta}_{12}) & (\hat{\beta}_{23} - \hat{\beta}_{13}) & (\hat{\beta}_{24} - \hat{\beta}_{14})\end{bmatrix} \times \begin{bmatrix}1\\0\\0\\0\end{bmatrix}\\ & = \begin{bmatrix}(\hat{\beta}_{21} - \hat{\beta}_{11})\end{bmatrix} \end{align}
Now that we have the quantity we want isolated, we can test it against zero with the \(\mathbf{K}\) matrix, which looks like this:
$$\mathbf{K} = \begin{bmatrix}0\end{bmatrix}$$
So then the test we have is this:
\begin{align} \mathbf{L'BM} & = \mathbf{K} \\ \begin{bmatrix}(\hat{\beta}_{21} - \hat{\beta}_{11})\end{bmatrix} & = 0 \end{align}
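To make the algebra concrete, here is a small Python sketch of this \(\mathbf{L'BM}\) computation (not SPSS; the coefficient values are transcribed from the Parameter Estimates table above, and the \(-1\) in L implements the subtraction \(\hat{\beta}_{21} - \hat{\beta}_{11}\)):

```python
# Plain-Python sketch of L'BM for the between-subjects contrast.
# B is transcribed from the Parameter Estimates table above.

B = [
    [1.75, 3.0, 5.5, 10.0],  # intercepts beta_0t
    [2.0, 1.0, 1.5, -2.0],   # [a=1] effects beta_1t
    [0.0, 0.0, 0.0, 0.0],    # [a=2] effects beta_2t, constrained to 0
]
L = [0, -1, 1]    # contrast beta_21 - beta_11 (A=2 minus A=1)
M = [1, 0, 0, 0]  # select the first outcome, y1 (time=1)

# L'B: a 1 x 4 row vector, one element per outcome
LtB = [sum(L[i] * B[i][j] for i in range(3)) for j in range(4)]
# L'BM: the scalar contrast estimate
contrast = sum(LtB[j] * M[j] for j in range(4))
print(contrast)  # -2.0 (= beta_21 - beta_11, i.e. -beta_11)
```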
Since all of the \(\hat{\beta}_{2t}\) are constrained to be \(0\), this is essentially a test of \(-\hat{\beta}_{11}\) (equivalently, of \(\hat{\beta}_{11}\)) against zero.
Here is the SPSS code:
GLM y1 y2 y3 y4 BY a
  /WSFACTOR=time 4
  /WSDESIGN=time
  /DESIGN=a
  /PRINT=parameter
  /LMATRIX = intercept 0 a -1 1
  /MMATRIX = y1 1 y2 0 y3 0 y4 0.
And the resulting output:
| L1 | Transformed Variable: T1 |
|---|---|
| Contrast Estimate | -2.000 |
| Hypothesized Value | 0 |
| Difference (Estimate - Hypothesized) | -2.000 |
| Std. Error | .791 |
| Sig. | .045 |
| 95% CI for Difference: Lower Bound | -3.934 |
| 95% CI for Difference: Upper Bound | -.066 |

a. Based on the user-specified contrast coefficients (L') matrix number 1
We can see that the “Contrast Estimate” is indeed equal to \(\hat{\beta}_{21} - \hat{\beta}_{11}\), i.e., \(-\hat{\beta}_{11}\).
Within-subjects contrasts
Now let’s look at the effect of the withinsubjects factor, time.
We’ll start simple, looking at the mean of the outcome at time=1 vs the mean of the outcome at time=4, given that a=2.
We are testing this hypothesis then:
$$H_0: \mu_{4A=2} - \mu_{1A=2} = 0$$
Our predicted values for $\mu_{4A=2}$ and $\mu_{1A=2}$ are: $$\hat{y}_{4A=2} = \hat{\beta}_{04} + \hat{\beta}_{24}$$ $$\hat{y}_{1A=2} = \hat{\beta}_{01} + \hat{\beta}_{21}$$
To estimate the contrast then, we want:
$$\hat{y}_{4A=2} - \hat{y}_{1A=2} = \hat{\beta}_{04} + \hat{\beta}_{24} - (\hat{\beta}_{01} + \hat{\beta}_{21})$$
Regarding the $\hat{\beta}$ matrix, we see that we want to select and sum the first and third elements of the first and fourth columns. To extract those elements with the $L$ matrix, we specify $L$ as:
$$\mathbf{L} = \begin{bmatrix}1\\0\\1\end{bmatrix}$$
Multiplying $L'B$:
\begin{align} \mathbf{L’B} & = \begin{bmatrix}1 & 0 & 1 \end{bmatrix}\times\begin{bmatrix}\hat{\beta}_{01} & \hat{\beta}_{02} & \hat{\beta}_{03} & \hat{\beta}_{04}\\\hat{\beta}_{11} & \hat{\beta}_{12} & \hat{\beta}_{13} & \hat{\beta}_{14}\\\hat{\beta}_{21} & \hat{\beta}_{22} & \hat{\beta}_{23} & \hat{\beta}_{24}\end{bmatrix} \\ & = \begin{bmatrix}(\hat{\beta}_{01} + \hat{\beta}_{21}) & (\hat{\beta}_{02} + \hat{\beta}_{22}) & (\hat{\beta}_{03} + \hat{\beta}_{23}) & (\hat{\beta}_{04} + \hat{\beta}_{24})\end{bmatrix} \end{align}
In the resulting $L'B$ matrix (a $1\times 4$ row vector), we see the two quantities of our contrast as the first and fourth elements. We want to subtract the first element from the fourth, which we specify with this $M$ matrix:
$$\mathbf{M} = \begin{bmatrix}-1\\0\\0\\1\end{bmatrix}$$
Multiplying $L'B$ by $M$:
\begin{align} \mathbf{L'BM} & = \begin{bmatrix}(\hat{\beta}_{01} + \hat{\beta}_{21}) & (\hat{\beta}_{02} + \hat{\beta}_{22}) & (\hat{\beta}_{03} + \hat{\beta}_{23}) & (\hat{\beta}_{04} + \hat{\beta}_{24})\end{bmatrix} \times \begin{bmatrix}-1\\0\\0\\1\end{bmatrix} \\ & = \begin{bmatrix}-(\hat{\beta}_{01} + \hat{\beta}_{21}) + (\hat{\beta}_{04} + \hat{\beta}_{24})\end{bmatrix} \\ & = (\hat{\beta}_{04} + \hat{\beta}_{24}) - (\hat{\beta}_{01} + \hat{\beta}_{21}) \end{align}
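Here too we can sketch the computation numerically in Python (not SPSS; the coefficient values are transcribed from the Parameter Estimates table above, and the \(-1\) in M implements the subtraction of the time=1 mean):

```python
# Plain-Python sketch of L'BM for the time=4 vs time=1 contrast at A=2.
# B is transcribed from the Parameter Estimates table above.

B = [
    [1.75, 3.0, 5.5, 10.0],  # intercepts beta_0t
    [2.0, 1.0, 1.5, -2.0],   # [a=1] effects beta_1t
    [0.0, 0.0, 0.0, 0.0],    # [a=2] effects beta_2t, constrained to 0
]
L = [1, 0, 1]      # predicted cell means for A=2: beta_0t + beta_2t
M = [-1, 0, 0, 1]  # subtract the time=1 mean from the time=4 mean

LtB = [sum(L[i] * B[i][j] for i in range(3)) for j in range(4)]  # 1 x 4
contrast = sum(LtB[j] * M[j] for j in range(4))
print(contrast)  # 8.25 (= 10 - 1.75)
```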
That is the contrast that we want.
Let’s perform this contrast in SPSS. Again we specify the weights for LMATRIX and MMATRIX, and we also output the estimated cell means with EMMEANS to confirm our contrast.

GLM y1 y2 y3 y4 BY a
  /WSFACTOR=time 4
  /WSDESIGN=time
  /DESIGN=a
  /PRINT=parameter
  /LMATRIX = intercept 1 a 0 1
  /EMMEANS = TABLES(a*time)
  /MMATRIX = y1 -1 y2 0 y3 0 y4 1.
| L1 | Transformed Variable: T1 |
|---|---|
| Contrast Estimate | 8.250 |
| Hypothesized Value | 0 |
| Difference (Estimate - Hypothesized) | 8.250 |
| Std. Error | .692 |
| Sig. | .000 |
| 95% CI for Difference: Lower Bound | 6.556 |
| 95% CI for Difference: Upper Bound | 9.944 |

a. Based on the user-specified contrast coefficients (L') matrix number 1
The contrast estimate is 8.25. Let's confirm this with the EMMEANS table:

| a | time | Mean | Std. Error | 95% CI Lower Bound | 95% CI Upper Bound |
|---|---|---|---|---|---|
| 1 | 1 | 3.750 | .559 | 2.382 | 5.118 |
| | 2 | 7.000 | .354 | 6.135 | 7.865 |
| | 3 | 4.000 | .408 | 3.001 | 4.999 |
| | 4 | 8.000 | .408 | 7.001 | 8.999 |
| 2 | 1 | 1.750 | .559 | .382 | 3.118 |
| | 2 | 5.500 | .354 | 4.635 | 6.365 |
| | 3 | 3.000 | .408 | 2.001 | 3.999 |
| | 4 | 10.000 | .408 | 9.001 | 10.999 |

The estimated cell means are $\hat{y}_{4A=2}=10$ and $\hat{y}_{1A=2}=1.75$, yielding the difference estimate: $\hat{y}_{4A=2} - \hat{y}_{1A=2} = 8.25$
Simple effect contrasts: differences of differences
The GLM command will automatically interact the between-subjects factors with the within-subjects factors in a repeated-measures design, allowing the effect of the between-subjects factor to differ across levels of the within-subjects factor and vice versa. We can see the test of the interaction in the output table "Tests of Within-Subjects Effects". Here is the table from the model above:


| Source | | Type III Sum of Squares | df | Mean Square | F | Sig. |
|---|---|---|---|---|---|---|
| time | Sphericity Assumed | 194.500 | 3 | 64.833 | 127.890 | .000 |
| | Greenhouse-Geisser | 194.500 | 1.752 | 110.992 | 127.890 | .000 |
| | Huynh-Feldt | 194.500 | 2.830 | 68.738 | 127.890 | .000 |
| | Lower-bound | 194.500 | 1.000 | 194.500 | 127.890 | .000 |
| time * a | Sphericity Assumed | 19.375 | 3 | 6.458 | 12.740 | .000 |
| | Greenhouse-Geisser | 19.375 | 1.752 | 11.056 | 12.740 | .002 |
| | Huynh-Feldt | 19.375 | 2.830 | 6.847 | 12.740 | .000 |
| | Lower-bound | 19.375 | 1.000 | 19.375 | 12.740 | .012 |
| Error(time) | Sphericity Assumed | 9.125 | 18 | .507 | | |
| | Greenhouse-Geisser | 9.125 | 10.514 | .868 | | |
| | Huynh-Feldt | 9.125 | 16.978 | .537 | | |
| | Lower-bound | 9.125 | 6.000 | 1.521 | | |
We see that the test of the interaction is significant with p < 0.05, suggesting that the effect of A (the between-subjects factor) differs across time points and, equivalently, that the effect of time differs between the levels of A.
Let's do a contrast of the simple effect of A at time=2 vs the simple effect of A at time=4. This tests whether the effect of A is different at the two time points. We'll first formulate the contrast in terms of the regression coefficients.
The simple effect of A at time=2 is this difference: \begin{align} \hat{y}_{2A=2} - \hat{y}_{2A=1} & = (\hat{\beta}_{02} + \hat{\beta}_{22}) - (\hat{\beta}_{02} + \hat{\beta}_{12}) \\ & = \hat{\beta}_{22} - \hat{\beta}_{12} \end{align}
The simple effect of A at time=4 is this difference: \begin{align} \hat{y}_{4A=2} - \hat{y}_{4A=1} & = (\hat{\beta}_{04} + \hat{\beta}_{24}) - (\hat{\beta}_{04} + \hat{\beta}_{14}) \\ & = \hat{\beta}_{24} - \hat{\beta}_{14} \end{align}
The contrast of these simple effects is another difference: $$(\hat{y}_{4A=2} - \hat{y}_{4A=1}) - (\hat{y}_{2A=2} - \hat{y}_{2A=1}) = (\hat{\beta}_{24} - \hat{\beta}_{14}) - (\hat{\beta}_{22} - \hat{\beta}_{12}) $$
This contrast, then, is a contrast of two between-subjects contrasts. Because the individual elements are between-subjects contrasts, we will need to specify contrast coding in LMATRIX (i.e. use a matrix with 1 and -1 elements). But then those contrasts are themselves contrasted, which can be accomplished with contrast coding in MMATRIX.
The individual simple effects of A for time=2 and time=4 can be specified with matrix $L$ that subtracts the second element from the third:
$$\mathbf{L} = \begin{bmatrix}0\\-1\\1\end{bmatrix}$$
Multiplying $L'B$ (we have seen this before):
\begin{align} \mathbf{L'B} & = \begin{bmatrix}0 & -1 & 1 \end{bmatrix}\times\begin{bmatrix}\hat{\beta}_{01} & \hat{\beta}_{02} & \hat{\beta}_{03} & \hat{\beta}_{04}\\\hat{\beta}_{11} & \hat{\beta}_{12} & \hat{\beta}_{13} & \hat{\beta}_{14}\\\hat{\beta}_{21} & \hat{\beta}_{22} & \hat{\beta}_{23} & \hat{\beta}_{24}\end{bmatrix} \\ & = \begin{bmatrix}(\hat{\beta}_{21} - \hat{\beta}_{11}) & (\hat{\beta}_{22} - \hat{\beta}_{12}) & (\hat{\beta}_{23} - \hat{\beta}_{13}) & (\hat{\beta}_{24} - \hat{\beta}_{14})\end{bmatrix} \end{align}
Now we want to subtract the second element of $L'B$ from the fourth. We can specify this contrast in $M$:
$$\mathbf{M} = \begin{bmatrix}0\\-1\\0\\1\end{bmatrix}$$
Multiplying $L'B$ by $M$:
\begin{align} \mathbf{L'BM} & = \begin{bmatrix}(\hat{\beta}_{21} - \hat{\beta}_{11}) & (\hat{\beta}_{22} - \hat{\beta}_{12}) & (\hat{\beta}_{23} - \hat{\beta}_{13}) & (\hat{\beta}_{24} - \hat{\beta}_{14})\end{bmatrix} \times \begin{bmatrix}0\\-1\\0\\1\end{bmatrix}\\ & = \begin{bmatrix}-(\hat{\beta}_{22} - \hat{\beta}_{12}) + (\hat{\beta}_{24} - \hat{\beta}_{14})\end{bmatrix}\\ & = (\hat{\beta}_{24} - \hat{\beta}_{14}) - (\hat{\beta}_{22} - \hat{\beta}_{12}) \end{align}
That is the contrast (of contrasts) that we want.
Let's try it in SPSS:
GLM y1 y2 y3 y4 BY a
  /WSFACTOR=time 4
  /WSDESIGN=time
  /DESIGN=a
  /PRINT=parameter
  /LMATRIX = intercept 0 a -1 1
  /EMMEANS = TABLES(a*time)
  /MMATRIX = y1 0 y2 -1 y3 0 y4 1.
Our contrast estimate:
| L1 | Transformed Variable: T1 |
|---|---|
| Contrast Estimate | 3.500 |
| Hypothesized Value | 0 |
| Difference (Estimate - Hypothesized) | 3.500 |
| Std. Error | .645 |
| Sig. | .002 |
| 95% CI for Difference: Lower Bound | 1.921 |
| 95% CI for Difference: Upper Bound | 5.079 |

a. Based on the user-specified contrast coefficients (L') matrix number 1
The contrast estimate is 3.5. Let's confirm with the EMMEANS table.

| a | time | Mean | Std. Error | 95% CI Lower Bound | 95% CI Upper Bound |
|---|---|---|---|---|---|
| 1 | 1 | 3.750 | .559 | 2.382 | 5.118 |
| | 2 | 7.000 | .354 | 6.135 | 7.865 |
| | 3 | 4.000 | .408 | 3.001 | 4.999 |
| | 4 | 8.000 | .408 | 7.001 | 8.999 |
| 2 | 1 | 1.750 | .559 | .382 | 3.118 |
| | 2 | 5.500 | .354 | 4.635 | 6.365 |
| | 3 | 3.000 | .408 | 2.001 | 3.999 |
| | 4 | 10.000 | .408 | 9.001 | 10.999 |
From the table above: \begin{align} (\hat{y}_{4A=2} - \hat{y}_{4A=1}) - (\hat{y}_{2A=2} - \hat{y}_{2A=1}) & = (10 - 8) - (5.5 - 7)\\ & = 3.5 \end{align}
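The same arithmetic as a quick Python check (the cell means are transcribed from the EMMEANS table above; the variable names are my own):

```python
# Quick arithmetic check of the difference-of-differences, using the
# estimated marginal means transcribed from the EMMEANS table above.

emmeans = {  # (a, time) -> estimated cell mean
    (1, 2): 7.0, (1, 4): 8.0,
    (2, 2): 5.5, (2, 4): 10.0,
}

effect_a_at_t4 = emmeans[(2, 4)] - emmeans[(1, 4)]  # 10 - 8 = 2.0
effect_a_at_t2 = emmeans[(2, 2)] - emmeans[(1, 2)]  # 5.5 - 7 = -1.5
print(effect_a_at_t4 - effect_a_at_t2)  # 3.5
```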
Confirmed!