Wald test

These functions grew out of my frustration with the lack of simple Wald testing options in R. Some existing implementations have had bugs, so I wrote my own, following the formula in Cameron and Trivedi (2005). There are three versions of the Wald test in tstools.

As of this writing (November 2021) these functions have only been lightly tested. There is no guarantee the output is correct, though I have run plenty of tests on the common use cases. Please let me know of any problems you encounter.

testzero

This is the simplest to use but also the most limited. You are testing if a group of coefficients in a linear regression model is equal to zero. Suppose you have \[y_{t} = \beta_{0} + \beta_{1} x_{1t} + \beta_{2} x_{2t} + \beta_{3} x_{3t} + \varepsilon_{t}\]

If the output of that regression is in fit, you can do a test of \(\beta_{2} = \beta_{3} = 0\) using the testzero function:

testzero(fit, c(3,4))

The second argument holds the indices of the coefficients that equal zero under the null hypothesis. \(\beta_{2}\) is the third coefficient and \(\beta_{3}\) is the fourth, so the second argument is c(3,4).
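A sketch of the full workflow, using simulated data (the data, the lm call, and the seed are illustrative assumptions; testzero is the tstools function described above):

```r
# Simulated data for illustration only: x2 and x3 have no true effect,
# so the null beta_2 = beta_3 = 0 holds in this example.
set.seed(123)
n  <- 200
x1 <- rnorm(n); x2 <- rnorm(n); x3 <- rnorm(n)
y  <- 1.0 + 0.5 * x1 + rnorm(n)

fit <- lm(y ~ x1 + x2 + x3)

# Coefficients are ordered (Intercept), x1, x2, x3, so beta_2 and
# beta_3 are the third and fourth entries.
testzero(fit, c(3, 4))
```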

The above uses the OLS covariance matrix of the coefficient estimates, so it may not be valid in the presence of heteroskedasticity or serial correlation. To use the default White correction for heteroskedasticity, set white=TRUE:

testzero(fit, c(3,4), white=TRUE)

If you want the default Newey-West correction, use

testzero(fit, c(3,4), nw=TRUE)

wald

You can always write a set of linear restrictions on the coefficients in the form \[R \theta = q,\] where \(R\) is a matrix, \(q\) is a vector, and \(\theta\) is the vector of coefficients. Once you’ve done that, you can call the wald function like this:

wald(fit, R, q)
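As a sketch, the null \(\beta_{2} = \beta_{3} = 0\) from the regression above can be written in \(R \theta = q\) form like this (the construction of R and q is standard; wald is the tstools function described here):

```r
# Each row of R encodes one restriction. With coefficients ordered
# (Intercept), beta_1, beta_2, beta_3, the null beta_2 = beta_3 = 0 is:
#   row 1: 0*b0 + 0*b1 + 1*b2 + 0*b3 = 0
#   row 2: 0*b0 + 0*b1 + 0*b2 + 1*b3 = 0
R <- rbind(c(0, 0, 1, 0),
           c(0, 0, 0, 1))
q <- c(0, 0)

wald(fit, R, q)  # should agree with testzero(fit, c(3, 4))
```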

To correct for heteroskedasticity, use

wald(fit, R, q, white=TRUE)

To do a Newey-West correction, use

wald(fit, R, q, nw=TRUE)

wald.test

This is the lowest-level of the three functions. It is the most flexible, in the sense that you can provide the coefficients from any estimated model (not just a linear regression) and any covariance matrix (with any correction you want). You need not even estimate the model in R to use wald.test. If you write your constraint in the form \[R \theta = q\] and have covariance matrix \(V\), you call

wald.test(R, q, theta, V)
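For a linear regression you can extract theta and V yourself and pass them in, which reproduces the uncorrected test above (a sketch; coef and vcov are the standard R extractors, and any other covariance estimate could be substituted for V):

```r
# Pull the pieces out of the fitted model. vcov(fit) is the OLS
# covariance matrix; a sandwich-type estimate could be supplied instead.
theta <- coef(fit)
V     <- vcov(fit)

# Same restriction as before: beta_2 = beta_3 = 0.
R <- rbind(c(0, 0, 1, 0),
           c(0, 0, 0, 1))
q <- c(0, 0)

wald.test(R, q, theta, V)
```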

Which to use?

In terms of convenience, testzero is the easiest, wald the second easiest, and wald.test not convenient at all. Note, however, that the reverse ordering applies to generality: wald.test can be used for any linear restrictions on any estimated model, while testzero can only test that coefficients in a linear regression model are equal to zero.