dealing with high dimensional fixed effects

dealing with high dimensional fixed effects

statalist
I am looking for suggestions for estimating LSDV/fixed effect models for which there are potentially large numbers of fixed effects.

I have a very large dataset, and estimating all these parameters, even after differencing out one set (areg/xtreg), is not possible on my UNIX system. All this being said, I am not really interested in the parameter estimates as much as controlling for unobserved heterogeneity.
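
For concreteness, the one-way version with everything I need (weights and clustering) runs fine; a minimal sketch, where y, x, weight, and group are placeholder names:

* one absorbed set of fixed effects, analytic weights, clustered SEs
areg y x [aweight=weight], absorb(workerid) vce(cluster group)

The trouble starts as soon as a second high-dimensional set has to be absorbed.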

Do people have suggestions for this? I know about the various two-way FE estimators (felsdvreg, a2reg, gpreg), but these are missing various features that I am interested in preserving (namely clustering and weighting). One could possibly cluster ("wild bootstrap") with these estimators, but weighting is not possible as far as I can tell.
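
The resampling itself is not hard to wire up by hand. Below is a minimal sketch of an unrestricted wild cluster bootstrap, wrapped around a one-way -areg- for brevity and left unweighted; the variable names (y, x, workerid, group), the seed, the file name wildboot, and the 499 replications are all placeholders, and the same loop could wrap one of the two-way estimators instead:

* fit once, keep fitted values and residuals
set seed 12345                          // placeholder seed
areg y x, absorb(workerid)
predict double xbhat, xbd               // linear prediction including the absorbed effects
predict double uhat, residuals

tempname sims
postfile `sims' double bstar using wildboot, replace
forvalues r = 1/499 {
    quietly {
        * Rademacher draw, constant within cluster
        bysort group: gen byte flip = cond(runiform() < .5, 1, -1) if _n == 1
        bysort group: replace flip = flip[1]
        gen double ystar = xbhat + flip*uhat
        areg ystar x, absorb(workerid)
        post `sims' (_b[x])
        drop flip ystar
    }
}
postclose `sims'

* the sd of the posted draws is the bootstrap standard error of _b[x]
preserve
use wildboot, clear
summarize bstar
restore

But as noted, this only patches the clustering side; it does nothing for the weighting.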

Perhaps there is an easier way? I have not been able to reproduce the estimates using a two-step Frisch-Waugh-Lovell approach with weights, and I am wondering whether I am missing something here:

* partial the worker effects out of y (-ue- keeps the combined residual u_i + e_it)
xtreg y [aweight=weight], i(workerid) fe
predict y_res, ue
* partial the worker effects out of x
xtreg x [aweight=weight], i(workerid) fe
predict x_res, ue
* regress the residuals on each other, absorbing the firm effects
xtreg y_res x_res [aweight=weight], i(firmid) fe cluster(group)
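
One possible reason a single pass like this cannot match the joint estimates, even unweighted, is that the two sets of dummies are not orthogonal, so the two within-transformations have to be alternated until convergence. A rough sketch of that iteration, with a fixed number of sweeps standing in for a proper convergence check and variable names as above:

* alternate within-worker and within-firm demeaning (unweighted sketch;
* 50 fixed sweeps are a crude stand-in for a real convergence test)
gen double yt = y
gen double xt = x
forvalues s = 1/50 {
    foreach v in yt xt {
        quietly {
            bysort workerid: egen double m = mean(`v')
            replace `v' = `v' - m
            drop m
            bysort firmid: egen double m = mean(`v')
            replace `v' = `v' - m
            drop m
        }
    }
}
regress yt xt, vce(cluster group)

The slope from the final regression converges to the two-way fixed-effects estimate, though the reported standard errors do not account for the absorbed parameters, and with aweights every mean would need to be weight-weighted, which is where my two-step attempt may be going wrong.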

Any other suggestions for dealing with these (seemingly more and more common) issues?



Re: dealing with high dimensional fixed effects

priyanka
I am having a similar problem: I am running an OLS regression with two high-dimensional fixed effects, but I would like to weight the analysis using pweights and cluster my standard errors. Does anybody know how to estimate this? It appears that a2reg and reg2hdfe do not allow weights.

Thank you!