
IV/GMM Inference. Identification and Overidentification
Walter Sosa-Escudero
Econ 507. Econometric Analysis. Spring 2009

March 8, 2009


Standard tests

Standard inference comes directly from the asymptotic normality result

$$\sqrt{n}\,(\hat\beta_g - \beta_0) \xrightarrow{d} N\bigl(0, AV(\hat\beta_g)\bigr)$$

Then it is easy to see that, under $H_0: \beta_k = \beta_{k0}$,

$$t_k = \sqrt{n}\,\frac{\hat\beta_{gk} - \beta_{k0}}{\sqrt{\widehat{AV}(\hat\beta_{gk})}} \xrightarrow{d} N(0,1)$$

where $\widehat{AV}(\hat\beta_{gk})$ is any consistent estimator. Under $H_0: R\beta_0 - r = 0$,

$$W = n\,(R\hat\beta_g - r)'\,\bigl[R\,\widehat{AV}(\hat\beta_g)\,R'\bigr]^{-1}(R\hat\beta_g - r) \xrightarrow{d} \chi^2(q)$$

with $R$, $r$ and $q$ defined as in our original large sample inference problem.
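As a concrete illustration, here is a minimal sketch (Python/NumPy with SciPy, not from the original slides) of how the t and Wald statistics above would be computed; the inputs `beta_hat`, `avar_hat`, `R`, `r` and `n` are hypothetical and assumed to come from a previous GMM estimation step.

```python
import numpy as np
from scipy import stats

def t_stat(beta_hat, avar_hat, k, beta_k0, n):
    """t statistic for H0: beta_k = beta_k0, using a consistent
    estimate avar_hat of the asymptotic variance matrix AV(beta_hat)."""
    return np.sqrt(n) * (beta_hat[k] - beta_k0) / np.sqrt(avar_hat[k, k])

def wald_stat(beta_hat, avar_hat, R, r, n):
    """Wald statistic for H0: R beta_0 - r = 0; asymptotically chi^2(q),
    with q equal to the number of rows of R."""
    diff = R @ beta_hat - r
    W = n * diff @ np.linalg.solve(R @ avar_hat @ R.T, diff)
    q = R.shape[0]
    p_value = 1 - stats.chi2.cdf(W, df=q)
    return W, p_value
```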


Overidentification: Prelude

Suppose there are two instruments $z_1$ and $z_2$ for a model $y = x\beta_0 + u$, where $x$ is a single explanatory variable. Suppose 'invalid instrument' means that there is no $\beta$ for which $E(z_{ij} u_i) = 0$, $j = 1, 2$. You use only one instrument, say $z_1$, and get $\hat u_i = y_i - x_i\hat\beta$, where $\hat\beta$ is the IV estimator using $z_1$ as instrument. What can you learn about 'instrument validity' from the correlation between $\hat u_i$ and $z_{2i}$?
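A minimal simulation of this thought experiment (Python/NumPy sketch, not part of the original slides; the data-generating process is purely illustrative). Here both instruments are valid, so the sample correlation between the residuals and the unused instrument $z_2$ should be close to zero; an invalid $z_2$ would show up as a correlation bounded away from zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# one endogenous regressor, two candidate instruments
z1 = rng.normal(size=n)
z2 = rng.normal(size=n)
u = rng.normal(size=n)
x = z1 + z2 + 0.5 * u            # x is correlated with u (endogenous)
y = 1.0 * x + u                  # true beta_0 = 1

# IV estimator using z1 only: beta_hat = (z1'x)^{-1} z1'y
beta_hat = (z1 @ y) / (z1 @ x)
u_hat = y - x * beta_hat

# correlation between the residuals and the unused instrument z2
print(np.corrcoef(u_hat, z2)[0, 1])   # close to 0 when z2 is valid
```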


Identification and Overidentification

Recall that our moment conditions for the IV case are $E[z_i(y_i - x_i'\beta_0)] = 0$, and our problem was that the sample counterpart

$$\frac{1}{n}\sum_{i=1}^{n} z_i(y_i - x_i' b) = 0$$

implies a system of p linear equations with K unknowns, which cannot produce a solution when p > K. If all the moment conditions are valid, we could discard p − K moment conditions in any arbitrary way and produce a MM estimator using the K moment conditions retained.


Another thing we can do is to combine the p moment conditions linearly so as to produce K linearly independent moment conditions, that is, use:

$$B\,E[z_i(y_i - x_i'\beta_0)] = 0$$

where $B$ is any $K \times p$ matrix with $\rho(B) = K$. In fact, 'throwing away' moment conditions implies a particular choice of $B$! (which one?). Note that the sample counterpart

$$B\,\frac{1}{n}\sum_{i=1}^{n} z_i(y_i - x_i' b) = 0$$

produces a single solution (why?).
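To make the role of $B$ concrete, here is a small sketch (Python/NumPy, not from the slides; the simulated data are purely illustrative) showing that choosing $B$ as a selection matrix amounts to throwing away moment conditions, while any full-rank $K \times p$ matrix $B$ yields a just-identified system with a single solution.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, K = 2000, 3, 1

Z = rng.normal(size=(n, p))                 # p instruments
u = rng.normal(size=n)
x = Z @ np.ones(p) + 0.5 * u                # single endogenous regressor (K = 1)
y = 2.0 * x + u                             # true beta_0 = 2
X = x.reshape(-1, 1)

def mm_estimator(B):
    """Solve B * (1/n) Z'(y - X b) = 0 for b, given a K x p matrix B with rank K."""
    return np.linalg.solve(B @ Z.T @ X / n, B @ Z.T @ y / n)

# B as a selection matrix: keep only the first moment condition (= IV with z1 only)
B_select = np.array([[1.0, 0.0, 0.0]])
# B as an arbitrary full-rank linear combination of the three conditions
B_mix = np.array([[1.0, 2.0, -1.0]])

print(mm_estimator(B_select), mm_estimator(B_mix))   # both close to 2
```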


GMM as a particular case of MM?

GMM is choosing a particular B. Let $u_i(b) \equiv y_i - x_i'b$ and $u(b) \equiv Y - Xb$. Recall that the GMM estimator minimizes:

$$J(b) = n\left(\frac{1}{n}\,u(b)'Z\right) W_n \left(\frac{1}{n}\,Z'u(b)\right)$$

so the FOC's are:

$$\frac{X'Z}{n}\, W_n\, \frac{Z'u(b)}{n} = 0$$

which can be seen as the sample counterpart of:

$$\underbrace{E[x_i z_i']}_{K\times p}\; W\; \underbrace{E[z_i u_i(\beta_0)]}_{p\times 1} = 0$$

which are K moment conditions!


$$\underbrace{E[x_i z_i']}_{K\times p}\; W\; \underbrace{E[z_i u_i(\beta_0)]}_{p\times 1} = 0$$

Then GMM is choosing a particular way of linearly combining the p moment conditions so the system becomes exactly identified.
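In the linear case these K combined moment conditions can be solved in closed form, $b = (X'Z\,W\,Z'X)^{-1} X'Z\,W\,Z'y$. A minimal sketch (Python/NumPy, illustrative only; the simulated design and the 2SLS-type weighting matrix are assumptions, not from the slides):

```python
import numpy as np

def linear_gmm(y, X, Z, W):
    """Linear GMM estimator solving (X'Z/n) W (Z'(y - X b)/n) = 0:
    b = (X'Z W Z'X)^{-1} X'Z W Z'y."""
    XZ = X.T @ Z
    A = XZ @ W @ XZ.T          # K x K
    c = XZ @ W @ (Z.T @ y)     # K-vector
    return np.linalg.solve(A, c)

# illustrative data: K = 2 regressors, p = 3 instruments
rng = np.random.default_rng(2)
n = 2000
Z = rng.normal(size=(n, 3))
u = rng.normal(size=n)
X = np.column_stack([Z[:, 0] + 0.5 * u, Z[:, 1] + Z[:, 2] + 0.5 * u])
y = X @ np.array([1.0, -1.0]) + u

W = np.linalg.inv(Z.T @ Z / n)   # 2SLS weighting matrix as an example
print(linear_gmm(y, X, Z, W))    # close to (1, -1)
```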


Identifying and Overidentifying Restrictions

It is very illuminating to explore how these linear combinations are produced.

First note that $W$ can be written as $W = W^{1/2\prime}\, W^{1/2}$, where $W^{1/2}$ is an invertible $p \times p$ matrix. It will be more convenient to re-express our original p moment conditions as follows:

$$W^{1/2}\, E[z_i u_i(\beta_0)] = 0$$

Note that this does not alter at all the informational structure of the problem (why?).
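One concrete choice of $W^{1/2}$ is a Cholesky factor (a Python/NumPy sketch, purely illustrative; any other invertible square root of $W$ would do just as well):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.normal(size=(4, 4))
W = A @ A.T + np.eye(4)                    # some symmetric positive definite weighting matrix

L = np.linalg.cholesky(W)                  # lower-triangular L with W = L L'
W_half = L.T                               # take W^{1/2} = L', so W = W^{1/2}' W^{1/2}
print(np.allclose(W_half.T @ W_half, W))   # True
print(np.abs(np.linalg.det(W_half)) > 0)   # W^{1/2} is invertible
```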


Now write the GMM version of the moment conditions as:

$$E[x_i z_i']\, W\, E[z_i u_i(\beta_0)] = 0$$
$$E[x_i z_i']\, W^{1/2\prime}\, W^{1/2}\, E[z_i u_i(\beta_0)] = 0$$
$$F'\, W^{1/2}\, E[z_i u_i(\beta_0)] = 0$$

where $F' \equiv E[x_i z_i']\, W^{1/2\prime}$. Note $F'$ is $K \times p$ with rank $K$. Let $h \equiv W^{1/2}\, E[z_i u_i(\beta_0)]$, so the moment conditions imply $h = 0$.


$h = W^{1/2}\, E[z_i u_i(\beta_0)]$ is a $p$-vector, and it can be decomposed orthogonally as

$$h = Ph + Mh$$

where $P$ and $M$ are any pair of projection matrices that project $h$ orthogonally onto some subspace and its orthogonal complement. Note that the moment condition implies $Ph = 0$ and $Mh = 0$. Also, note that the decomposition holds for any such pair of projection matrices.


In particular, take $P = F(F'F)^{-1}F' \equiv P_f$, where $F' = E[x_i z_i']\, W^{1/2\prime}$.

Note $F'$ is $K \times p$ with rank $K$, hence $P_f$ is the projection matrix that projects vectors of dimension $p$ onto the span of the $K$ column vectors in $F$; hence $P_f$ has rank $K$ and $M_f = I - F(F'F)^{-1}F'$ has rank $p - K$ (remember the dimension theorem...).
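A quick numerical check of these claims about $P_f$ and $M_f$ (Python/NumPy sketch; $F$ here is just an arbitrary full-column-rank matrix, not estimated from data):

```python
import numpy as np

rng = np.random.default_rng(3)
p, K = 4, 2

F = rng.normal(size=(p, K))                     # any p x K matrix with rank K
Pf = F @ np.linalg.inv(F.T @ F) @ F.T           # projection onto span(F)
Mf = np.eye(p) - Pf                             # projection onto the orthogonal complement

print(np.linalg.matrix_rank(Pf), np.linalg.matrix_rank(Mf))   # K and p - K
print(np.allclose(Pf @ Pf, Pf), np.allclose(Pf @ Mf, 0))      # idempotent and orthogonal
h = rng.normal(size=p)
print(np.allclose(Pf @ h + Mf @ h, h))                         # h = Pf h + Mf h
```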


Then $F(F'F)^{-1}F'h = 0$ forms a system of $K$ linearly independent equations with $K$ unknowns. Now note that since $F(F'F)^{-1}F'$ has rank $K$, $F(F'F)^{-1}F'h = 0$ whenever

$$F'h = 0$$

which are the 'MM' conditions derived from the GMM procedure. This leads us to a very important result.


What GMM is doing when $p > K$ is decomposing the moment conditions, $h$, into two orthogonal parts: one that is used to exactly identify the relevant parameters ($P_f h = 0$) and another part which is left unused ($M_f h = 0$). The moment conditions $P_f h = 0$ are called the identifying conditions and the conditions $M_f h = 0$ are called the overidentifying conditions. In a certain sense, we can see GMM as using a particular set of $K$ moment conditions and discarding the remaining $p - K$.


Another interesting intuition is the following.

Let $\hat h \equiv W_n^{1/2}\, n^{-1}\sum_{i=1}^{n} z_i u_i(\hat\beta_g)$. Then the minimized GMM objective function can be expressed as:

$$J(\hat\beta_g) = n\, \hat h'\hat h$$

Let $F_n' \equiv n^{-1} X'Z\, W_n^{1/2\prime}$, $P_{fn} = F_n(F_n'F_n)^{-1}F_n'$ and $M_{fn} = I_p - P_{fn}$.

By orthogonal decomposition, $\hat h = (P_{fn} + M_{fn})\hat h$. Replacing above and using the properties of these matrices:

$$J(\hat\beta_g) = n\left[\hat h'(P_{fn} + M_{fn})(P_{fn} + M_{fn})\hat h\right] = n\left[\hat h'(P_{fn} + M_{fn})\hat h\right] = n\left[\hat h' P_{fn}\hat h + \hat h' M_{fn}\hat h\right]$$


$$J(\hat\beta_g) = n\left[\hat h' P_{fn}\hat h + \hat h' M_{fn}\hat h\right]$$

The FOC of the GMM problem sets $P_{fn}\hat h = 0$, which leads to a very useful intuition about how GMM works and its usefulness for testing:

The GMM estimator satisfies strictly the identifying restrictions and tries to make the overidentifying restrictions as small as possible.

The minimized value of $J(\hat\beta_g) = n\, \hat h' M_{fn}\hat h$ measures how far the sample is from satisfying the overidentifying restrictions.
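A sketch of this decomposition in the linear case (Python/NumPy; the simulated data, the 2SLS-type weighting matrix and the Cholesky square root are illustrative assumptions, not from the slides). At the GMM estimator, $P_{fn}\hat h$ is numerically zero, so the whole minimized objective comes from the overidentifying part:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, K = 2000, 3, 1

Z = rng.normal(size=(n, p))
u = rng.normal(size=n)
X = (Z @ np.ones(p) + 0.5 * u).reshape(-1, 1)
y = 2.0 * X[:, 0] + u

W = np.linalg.inv(Z.T @ Z / n)                      # example weighting matrix
W_half = np.linalg.cholesky(W).T                    # W = W_half' W_half

# linear GMM estimator b = (X'Z W Z'X)^{-1} X'Z W Z'y
XZ = X.T @ Z
b = np.linalg.solve(XZ @ W @ XZ.T, XZ @ W @ (Z.T @ y))

h_hat = W_half @ (Z.T @ (y - X @ b)) / n            # h_hat = W_n^{1/2} Z'u(b)/n
Fn = W_half @ Z.T @ X / n                           # p x K
Pfn = Fn @ np.linalg.inv(Fn.T @ Fn) @ Fn.T
Mfn = np.eye(p) - Pfn

J = n * h_hat @ h_hat
print(np.allclose(Pfn @ h_hat, 0))                  # identifying part is (numerically) zero
print(np.isclose(J, n * h_hat @ Mfn @ h_hat))       # J equals the overidentifying part
```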


Testing overidentifying restrictions

Intuition: the fact that $J(\hat\beta_g)$ measures 'how far the sample is from satisfying the overidentifying restrictions' can be exploited to design a formal specification test.

If all the assumptions of the overidentified GMM model hold, then asymptotically the overidentifying conditions should be satisfied: the identifying conditions succeed in producing a consistent estimator and hence force all moment conditions to hold. If $J(\hat\beta_g)$ is too large, then some of the conditions that guarantee consistency and asymptotic normality are likely to be false (more on this later).


The 'J' test: the test statistic for the overidentifying restrictions test is:

$$J = J_n(\hat\beta_g) = n\; \frac{u(\hat\beta_g)'Z}{n}\; \hat S_n^{-1}\; \frac{Z'u(\hat\beta_g)}{n}$$

and under $H_0: E[z_i u_i(\beta_0)] = 0$ it converges in distribution to $\chi^2(p - K)$.

First note that the test uses the optimal GMM estimator, that is, $W_n = \hat S_n^{-1}$.

Intuition: the J test checks whether the minimized GMM objective function is small. According to our previous result, this means checking whether the overidentifying restrictions are small. Under the null that the model is correctly specified (all GMM assumptions hold), GMM is consistent and hence the overidentifying restrictions should be close to zero.
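A minimal sketch of the J test for the linear IV model (Python/NumPy with SciPy, not from the slides; the two-step recipe, the heteroskedasticity-robust estimate of $S$ and all variable names are illustrative assumptions):

```python
import numpy as np
from scipy import stats

def j_test(y, X, Z):
    """Two-step linear GMM with the J (overidentification) test.
    Returns the second-step estimate, the J statistic and its p-value."""
    n, K = X.shape
    p = Z.shape[1]

    def gmm(W):
        XZ = X.T @ Z
        return np.linalg.solve(XZ @ W @ XZ.T, XZ @ W @ (Z.T @ y))

    # step 1: 2SLS-type weighting matrix, then estimate S from the residuals
    b1 = gmm(np.linalg.inv(Z.T @ Z / n))
    u1 = y - X @ b1
    S_hat = (Z * u1[:, None]**2).T @ Z / n        # (1/n) sum u_i^2 z_i z_i'

    # step 2: optimal GMM with W_n = S_hat^{-1}, then the J statistic
    W_opt = np.linalg.inv(S_hat)
    b2 = gmm(W_opt)
    g_bar = Z.T @ (y - X @ b2) / n                # sample moments at b2
    J = n * g_bar @ W_opt @ g_bar
    p_value = 1 - stats.chi2.cdf(J, df=p - K)
    return b2, J, p_value
```

A large J (small p-value) relative to the $\chi^2(p-K)$ reference distribution is then taken as evidence against the overidentifying restrictions.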


What happens under the alternative hypothesis? Suppose the rank condition holds. The sample version of the identifying conditions must be satisfied, so $\hat\beta_g \xrightarrow{p} \beta_{+}$, some pseudo-true value. Asymptotic normality of the moment conditions also holds, but centered at some other place that is not zero. Rewrite the statistic as:

$$J = \frac{u(\hat\beta_g)'Z}{\sqrt{n}}\; \hat S_n^{-1}\; \frac{Z'u(\hat\beta_g)}{\sqrt{n}}$$

Then, when $H_0$ does not hold, the $\sqrt{n}$ factors diverge while $Z'u(\hat\beta_g)/n$ converges to something that is not zero, so $J \to \infty$. It is a global test for misspecification: a rejection may mean that some moment conditions are invalid and/or that the model is misspecified.
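To see this divergence numerically, here is a self-contained simulation sketch (Python/NumPy; the data-generating process with one invalid instrument is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(6)
for n in (500, 5000, 50000):
    u = rng.normal(size=n)
    z1 = rng.normal(size=n)
    z2 = rng.normal(size=n) + u                  # invalid instrument: correlated with u
    Z = np.column_stack([z1, z2])
    x = z1 + 0.5 * u
    X, y = x.reshape(-1, 1), 2.0 * x + u
    XZ = X.T @ Z

    # two-step GMM: first step with W = (Z'Z/n)^{-1}, second step with S_hat^{-1}
    W1 = np.linalg.inv(Z.T @ Z / n)
    b1 = np.linalg.solve(XZ @ W1 @ XZ.T, XZ @ W1 @ (Z.T @ y))
    u1 = y - X @ b1
    W_opt = np.linalg.inv((Z * u1[:, None]**2).T @ Z / n)
    b2 = np.linalg.solve(XZ @ W_opt @ XZ.T, XZ @ W_opt @ (Z.T @ y))
    g_bar = Z.T @ (y - X @ b2) / n
    print(n, n * g_bar @ W_opt @ g_bar)          # J grows with n when a moment condition fails
```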


Proof: By our previous result

$$J_n(\hat\beta_g) = n\left[\hat h' M_{fn}\hat h\right]$$

with $\hat h \equiv W_n^{1/2}\, \dfrac{Z'u(\hat\beta_g)}{n}$, so

$$J_n = \left\{W_n^{1/2}\, \frac{Z'u(\hat\beta_g)}{\sqrt{n}}\right\}' M_{fn} \left\{W_n^{1/2}\, \frac{Z'u(\hat\beta_g)}{\sqrt{n}}\right\}$$

From the asymptotic normality proof of GMM we got

$$\frac{Z'u(\hat\beta_g)}{\sqrt{n}} \xrightarrow{d} N(0, S).$$

If $W_n = \hat S_n^{-1}$ and $W_n^{1/2} = \hat S_n^{-1/2}$, then

$$W_n^{1/2}\, \frac{Z'u(\hat\beta_g)}{\sqrt{n}} = \hat S_n^{-1/2}\, \frac{Z'u(\hat\beta_g)}{\sqrt{n}} \xrightarrow{d} N(0, I_p)$$


Hence, the J statistic is asymptotically a quadratic form in a $p$-vector of independent standard normal variables, normed by an idempotent matrix with rank $p - K$, so the result follows.

Recall that if $y \sim N(0, I_p)$ and $A$ is an idempotent matrix with rank $q$, then $y'Ay \sim \chi^2(q)$ (Hayashi (2000, p. 37)).
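A quick Monte Carlo check of this fact (Python/NumPy sketch, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
p, q, reps = 5, 3, 20000

# an idempotent matrix of rank q: projection onto the span of q random columns
F = rng.normal(size=(p, q))
A = F @ np.linalg.inv(F.T @ F) @ F.T

draws = np.array([y @ A @ y for y in rng.normal(size=(reps, p))])
print(draws.mean(), draws.var())   # approximately q and 2q, as for a chi-squared(q)
```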
