We then use Slutsky's theorem to conclude that
$$\sqrt{n}\,\bigl(\hat{\theta}(X^n) - \theta_0\bigr) \xrightarrow{D} N\bigl(0,\, I(\theta_0)^{-1}\bigr).$$
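As a quick illustration of this limit (my own example, not from the notes), consider an Exponential model with rate $\theta$: the MLE is $\hat{\theta} = 1/\bar{X}$ and the Fisher information is $I(\theta) = 1/\theta^2$, so the theorem predicts $\sqrt{n}(\hat{\theta} - \theta_0) \xrightarrow{D} N(0, \theta_0^2)$. A minimal Monte Carlo sketch:

```python
import numpy as np

# Illustrative check of MLE asymptotic normality for an Exponential(theta)
# model (a worked example chosen here; the notes treat the general case).
# MLE: theta_hat = 1 / mean(X); Fisher information: I(theta) = 1/theta^2.
rng = np.random.default_rng(0)
theta0 = 2.0
n, reps = 2000, 5000

# Draw `reps` independent samples of size n and compute the MLE for each.
samples = rng.exponential(scale=1 / theta0, size=(reps, n))
theta_hat = 1 / samples.mean(axis=1)

# The standardized errors should be approximately N(0, I(theta0)^{-1}),
# i.e. mean near 0 and variance near theta0**2 = 4.
z = np.sqrt(n) * (theta_hat - theta0)
print(z.mean(), z.var(), theta0**2)
```

The empirical variance of the standardized errors should land close to the theoretical limit $I(\theta_0)^{-1} = \theta_0^2$ for large $n$.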
Establishing F1

The random vectors $\psi(X_1;\theta_0), \dots, \psi(X_n;\theta_0)$ are i.i.d. We need to show that they have mean zero. Then $I(\theta_0)$ will be the covariance matrix of $\psi(X;\theta_0)$, and an application of the multivariate central limit theorem for i.i.d. random vectors gives the desired result.

We will show something stronger, namely $E_\theta[\psi(X;\theta)] = 0$ for all $\theta \in N$. Condition v. guarantees that we can interchange integration and differentiation. Consider the case where $k = 1$. We know that $1 = \int p(x;\theta)\,d\mu(x)$ for all $\theta \in N$. This implies that $0 = \frac{d}{d\theta}\int p(x;\theta)\,d\mu(x)$. Let us show that $\frac{d}{d\theta}\int p(x;\theta)\,d\mu(x) = \int \frac{d}{d\theta}\,p(x;\theta)\,d\mu(x)$. Choose a sequence $\theta_n \in N$ such that $\theta_n \to \theta$. Then, by the definition of the derivative,
$$\frac{d\,p(x;\theta)}{d\theta} = \lim_{n\to\infty} \frac{p(x;\theta_n) - p(x;\theta)}{\theta_n - \theta} \quad \text{for all } x \in \mathcal{X}.$$
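The identity $E_\theta[\psi(X;\theta)] = 0$, with $\operatorname{Var}_\theta(\psi(X;\theta)) = I(\theta)$, can be checked numerically for a concrete model (my own example, chosen for illustration): for the Exponential($\theta$) density $p(x;\theta) = \theta e^{-\theta x}$, the score is $\psi(x;\theta) = \frac{d}{d\theta}\log p(x;\theta) = 1/\theta - x$, so its mean is $1/\theta - 1/\theta = 0$ and its variance is $\operatorname{Var}(X) = 1/\theta^2 = I(\theta)$.

```python
import numpy as np

# Monte Carlo check (illustrative example, not from the notes) of the
# identities E_theta[psi(X; theta)] = 0 and Var(psi(X; theta)) = I(theta)
# for the Exponential(theta) model, whose score function is
# psi(x; theta) = d/dtheta log(theta * exp(-theta * x)) = 1/theta - x.
rng = np.random.default_rng(1)
theta = 2.0
x = rng.exponential(scale=1 / theta, size=200_000)

score = 1 / theta - x
print(score.mean())                # should be close to 0
print(score.var(), 1 / theta**2)   # should be close to I(theta) = 1/theta^2
```

The sample mean of the score vanishes up to Monte Carlo error, and its sample variance matches the Fisher information, exactly as the argument above predicts.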