How can I adjust a coefplot for the constant value of categorical variable estimation? - stata

I have a dataset in Stata that looks something like this
Variable | Obs Mean Std. dev. Min Max
-------------+---------------------------------------------------------
dv2 | 1,904 .5395645 .427109 -1.034977 1.071396
xvar | 1,904 3.074055 1.387308 1 5
with xvar being a categorical independent variable and dv2 a dependent variable of interest.
I am estimating a simple model with the categorical variable entered as a set of dummies:
reg dv2 ib4.xvar
eststo myest
Source | SS df MS Number of obs = 1,904
-------------+---------------------------------- F(4, 1899) = 13.51
Model | 9.60846364 4 2.40211591 Prob > F = 0.0000
Residual | 337.540713 1,899 .177746558 R-squared = 0.0277
-------------+---------------------------------- Adj R-squared = 0.0256
Total | 347.149177 1,903 .182422058 Root MSE = .4216
------------------------------------------------------------------------------
dv2 | Coefficient Std. err. t P>|t| [95% conf. interval]
-------------+----------------------------------------------------------------
xvar |
A | .015635 .0307356 0.51 0.611 -.044644 .075914
B | .1435987 .029325 4.90 0.000 .0860861 .2011113
C | .1711176 .0299331 5.72 0.000 .1124124 .2298228
E | .1337754 .0295877 4.52 0.000 .0757477 .1918032
|
_cons | .447794 .020191 22.18 0.000 .4081952 .4873928
------------------------------------------------------------------------------
These are the results. As you can see, B, C, and E have larger effects than D, the excluded (base) category.
However, coefplot does not account for the fact that, with a categorical variable, each reported coefficient is relative to the base, so the true level effect is composite: true_A = D + A.
coefplot myest, scheme(s1color) vert
As you can see, the plot shows the constant as the largest coefficient, while the others appear smaller.
Is there a systematic way I can adjust for this problem and plot the true coefficients and SEs of each category?
Thanks a lot for your help

In response to your second comment, here is an example of how you can use marginsplot to plot estimated effects from a linear regression.
sysuse auto, clear
replace price = price/100
reg price i.rep78, cformat(%9.2f)
------------------------------------------------------------------------------
price | Coefficient Std. err. t P>|t| [95% conf. interval]
-------------+----------------------------------------------------------------
rep78 |
2 | 14.03 23.56 0.60 0.554 -33.04 61.10
3 | 18.65 21.76 0.86 0.395 -24.83 62.13
4 | 15.07 22.21 0.68 0.500 -29.31 59.45
5 | 13.48 22.91 0.59 0.558 -32.28 59.25
|
_cons | 45.65 21.07 2.17 0.034 3.55 87.74
------------------------------------------------------------------------------
margins i.rep78, cformat(%9.2f)
------------------------------------------------------------------------------
| Delta-method
| Margin std. err. t P>|t| [95% conf. interval]
-------------+----------------------------------------------------------------
rep78 |
1 | 45.65 21.07 2.17 0.034 3.55 87.74
2 | 59.68 10.54 5.66 0.000 38.63 80.73
3 | 64.29 5.44 11.82 0.000 53.42 75.16
4 | 60.72 7.02 8.64 0.000 46.68 74.75
5 | 59.13 8.99 6.58 0.000 41.18 77.08
------------------------------------------------------------------------------
marginsplot
Note that these values are the constant plus the appropriate coefficient.
Running the marginsplot command then produces a plot of the marginal estimates with their confidence intervals.
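As a quick arithmetic check outside Stata (plain Python; the numbers are copied from the two tables above), each margin is just the constant plus the corresponding level's coefficient:

```python
# Reconstruct the margins from the regression table: each level's margin
# equals the constant (_cons) plus that level's coefficient (0 for the base).
cons = 45.65
coefs = {1: 0.0, 2: 14.03, 3: 18.65, 4: 15.07, 5: 13.48}

margins = {level: cons + b for level, b in coefs.items()}
for level, m in margins.items():
    print(level, round(m, 2))
```

Small last-digit differences from the margins table (e.g. 64.30 vs 64.29) are just display rounding of the underlying full-precision estimates.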

Related

How can I store estimates for one of the equations contained in a SUR estimation

I need to compare two different estimation methods and see whether they are statistically the same. However, one of my estimation methods is SUR (Seemingly Unrelated Regression), and I estimated my 11 different models using
sureg (Y1 trend X1 .... X106) (Y2 trend X1..... X181) ..... (Y11 trend X1 .... X130)
Then I estimated a single OLS model as shown in the following:
glm(Y1 trend X1 ...... X106)
Now I need to test whether the parameter estimates of X1 to X106 coming from sureg are equal to the glm estimates of the same variables. I need to use the Hausman specification test, but I couldn't figure out how to store the parameter estimates for a specific equation in a SUR system estimation.
I couldn't find which object to give to estimates store XXX in order to keep only part of the SUR estimates.
It's not easy to give a working example using my own crowded data, so let me present the same problem using Stata's auto data.
. sysuse auto
(1978 automobile data)
. sureg (price mpg headroom) (trunk weight length) (gear_ratio turn headroom)
Seemingly unrelated regression
------------------------------------------------------------------------------
Equation             Obs   Params       RMSE   "R-squared"       chi2    P>chi2
------------------------------------------------------------------------------
price                 74        2    2576.37       0.2266      21.24    0.0000
trunk                 74        2   2.912933       0.5299      82.93    0.0000
gear_ratio            74        2   .3307276       0.4674      65.12    0.0000
------------------------------------------------------------------------------
------------------------------------------------------------------------------
| Coefficient Std. err. z P>|z| [95% conf. interval]
-------------+----------------------------------------------------------------
price        |
mpg | -258.2886 57.06953 -4.53 0.000 -370.1428 -146.4344
headroom | -419.4592 390.4048 -1.07 0.283 -1184.639 345.7201
_cons | 12921.65 2025.737 6.38 0.000 8951.277 16892.02
-------------+----------------------------------------------------------------
trunk        |
weight | -.0010525 .0013499 -0.78 0.436 -.0036983 .0015933
length | .1735274 .0471176 3.68 0.000 .0811785 .2658762
_cons | -15.6766 5.182878 -3.02 0.002 -25.83485 -5.518345
-------------+----------------------------------------------------------------
gear_ratio   |
turn | -.0652416 .0097031 -6.72 0.000 -.0842594 -.0462238
headroom | -.0601831 .0505198 -1.19 0.234 -.1592001 .0388339
_cons | 5.781748 .3507486 16.48 0.000 5.094293 6.469202
------------------------------------------------------------------------------
. glm (price mpg headroom)
Iteration 0: log likelihood = -686.17715
Generalized linear models                         Number of obs   =         74
Optimization     : ML                             Residual df     =         71
                                                  Scale parameter =    6912463
Deviance         =  490784895.4                   (1/df) Deviance =    6912463
Pearson          =  490784895.4                   (1/df) Pearson  =    6912463

Variance function: V(u) = 1                       [Gaussian]
Link function    : g(u) = u                       [Identity]

                                                  AIC             =   18.62641
Log likelihood   = -686.1771533                   BIC             =   4.91e+08
------------------------------------------------------------------------------
| OIM
price | Coefficient std. err. z P>|z| [95% conf. interval]
-------------+----------------------------------------------------------------
mpg | -259.1057 58.42485 -4.43 0.000 -373.6163 -144.5951
headroom | -334.0215 399.5499 -0.84 0.403 -1117.125 449.082
_cons | 12683.31 2074.497 6.11 0.000 8617.375 16749.25
------------------------------------------------------------------------------
As you can see, in the price model the glm parameter estimate of the mpg coefficient is -259.10, while the parameter for the same variable estimated in the SUR system is -258.29.
Now I want to test whether the GLM and SUR parameter estimates are statistically equal.

How to include dummy variables in ivreg model?

I have the following model:
ivreg ldemand social_housing transport year (lprice = utilities)
However, I want to treat year as a dummy (indicator) variable.
How can I do it in Stata?
Using i.year doesn't work for the ivreg command.
Cross-posted on Statalist.
The command ivreg does not allow factor variables:
. webuse hsng2, clear
. ivreg rent pcturban i.region (hsngval = faminc)
factor variables not allowed
r(101);
However, you can still use the xi prefix to create dummies on the fly:
. xi: ivreg rent pcturban i.region (hsngval = faminc)
i.region _Iregion_1-4 (naturally coded; _Iregion_1 omitted)
Instrumental variables (2SLS) regression
Source | SS df MS Number of obs = 50
-------------+---------------------------------- F(5, 44) = 9.10
Model | 12735.4667 5 2547.09334 Prob > F = 0.0000
Residual | 48507.6533 44 1102.44667 R-squared = 0.2079
-------------+---------------------------------- Adj R-squared = 0.1179
Total | 61243.12 49 1249.85959 Root MSE = 33.203
------------------------------------------------------------------------------
rent | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
hsngval | .0038683 .0008958 4.32 0.000 .0020629 .0056737
pcturban | -.4980121 .5179779 -0.96 0.342 -1.541928 .5459039
_Iregion_2 | 1.528672 15.14086 0.10 0.920 -28.98572 32.04306
_Iregion_3 | 7.74279 15.10906 0.51 0.611 -22.70752 38.1931
_Iregion_4 | -40.61235 19.60999 -2.07 0.044 -80.13369 -1.091002
_cons | 88.26681 31.69154 2.79 0.008 24.39671 152.1369
------------------------------------------------------------------------------
Instrumented: hsngval
Instruments: pcturban _Iregion_2 _Iregion_3 _Iregion_4 faminc
------------------------------------------------------------------------------
It is important to note that according to the command's help file:
Out-of-date command
ivreg is an out-of-date command as of Stata 10. ivreg has been replaced with the ivregress command.
Thus, it is best to switch to ivregress instead:
. ivregress 2sls rent pcturban i.region (hsngval = faminc), small
Instrumental variables (2SLS) regression
Source | SS df MS Number of obs = 50
-------------+------------------------------ F( 5, 44) = 9.10
Model | 12735.4667 5 2547.09334 Prob > F = 0.0000
Residual | 48507.6533 44 1102.44667 R-squared = 0.2079
-------------+------------------------------ Adj R-squared = 0.1179
Total | 61243.12 49 1249.85959 Root MSE = 33.203
------------------------------------------------------------------------------
rent | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
hsngval | .0038683 .0008958 4.32 0.000 .0020629 .0056737
pcturban | -.4980121 .5179779 -0.96 0.342 -1.541928 .5459039
|
region |
N Cntrl | 1.528672 15.14086 0.10 0.920 -28.98572 32.04306
South | 7.74279 15.10906 0.51 0.611 -22.70752 38.1931
West | -40.61235 19.60999 -2.07 0.044 -80.13369 -1.091002
|
_cons | 88.26681 31.69154 2.79 0.008 24.39671 152.1369
------------------------------------------------------------------------------
Instrumented: hsngval
Instruments: pcturban 2.region 3.region 4.region faminc
Type help ivregress from Stata's command prompt for more details.
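If you want to see what ivregress 2sls is doing mechanically with the dummies, here is a minimal two-stage least squares sketch in plain Python/NumPy. The data are simulated (not the hsng2 dataset), and the variable roles only mirror the ones above: z plays faminc (excluded instrument), region the categorical, x the endogenous hsngval.

```python
import numpy as np

# Simulated data with a known structure: x is endogenous through u,
# z is a valid instrument, and the true coefficient on x is 1.5.
rng = np.random.default_rng(0)
n = 1000
z = rng.normal(size=n)
region = rng.integers(0, 4, size=n)             # 4-level categorical
u = rng.normal(size=n)                          # confounder making x endogenous
x = 0.8 * z + 0.5 * u + rng.normal(size=n)
y = 1.5 * x + u + rng.normal(size=n)

# Expand the categorical into dummy columns, dropping the first level --
# this is what xi / factor-variable notation does behind the scenes.
dummies = (region[:, None] == np.arange(1, 4)).astype(float)
exog = np.column_stack([np.ones(n), dummies])   # constant + included exogenous vars
instruments = np.column_stack([exog, z])        # included exog + excluded instrument

# Stage 1: regress x on all instruments; Stage 2: replace x with its fitted values.
xhat = instruments @ np.linalg.lstsq(instruments, x, rcond=None)[0]
beta = np.linalg.lstsq(np.column_stack([xhat, exog]), y, rcond=None)[0]
print(beta[0])   # 2SLS estimate of the coefficient on x, close to 1.5
```

ivregress additionally corrects the second-stage standard errors; this sketch only reproduces the point estimates.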

Stata Predict GARCH

I want to do something very easy, but it doesn't work!
I need to see the predictions (and errors) of a GARCH model. The main variable is "dowclose", and my idea is to check whether the GARCH model fits this variable well.
I'm using this simple code, but the predictions are just 0's:
webuse dow1.dta
arch dowclose, noconstant arch(1) garch(1)
predict dow_hat, y
ARCH Results:
ARCH family regression
Sample: 1 - 9341 Number of obs = 9341
Distribution: Gaussian Wald chi2(.) = .
Log likelihood = -76191.43 Prob > chi2 = .
------------------------------------------------------------------------------
| OPG
dowclose | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
arch |
L1. | 1.00144 6.418855 0.16 0.876 -11.57929 13.58217
|
garch |
L1. | -.001033 6.264372 -0.00 1.000 -12.27898 12.27691
|
_cons | 56.60589 620784.7 0.00 1.000 -1216659 1216772
------------------------------------------------------------------------------
This is to be expected: you have no covariates and no intercept, so there's nothing to predict.
Here's a simple OLS regression that makes the problem apparent:
. sysuse auto
(1978 Automobile Data)
. reg price, nocons
Source | SS df MS Number of obs = 74
-------------+------------------------------ F( 0, 74) = 0.00
Model | 0 0 . Prob > F = .
Residual | 3.4478e+09 74 46592355.7 R-squared = 0.0000
-------------+------------------------------ Adj R-squared = 0.0000
Total | 3.4478e+09 74 46592355.7 Root MSE = 6825.9
------------------------------------------------------------------------------
price | Coef. Std. Err. t P>|t| [95% Conf. Interval]
------------------------------------------------------------------------------
. predict phat
(option xb assumed; fitted values)
. sum phat
Variable | Obs Mean Std. Dev. Min Max
-------------+--------------------------------------------------------
phat | 74 0 0 0 0
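The same mechanics can be reproduced outside Stata. A small Python/NumPy illustration (made-up numbers) of why the linear predictor xb is identically zero when the design matrix has no columns:

```python
import numpy as np

y = np.array([4099.0, 4749.0, 3799.0, 4816.0, 7827.0])  # a few price-like values

# With no covariates and no constant, the design matrix has zero columns,
# so the coefficient vector is empty and the fitted values xb are all zero.
X = np.empty((len(y), 0))      # n x 0 design matrix
beta = np.zeros(X.shape[1])    # nothing to estimate
fitted = X @ beta              # identically zero, like phat above
print(fitted)                  # [0. 0. 0. 0. 0.]
```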

Retrieving standard errors after the command nlcom

In Stata the command nlcom employs the delta method to test nonlinear hypotheses about estimated coefficients. The command displays the standard errors in the results window, though unfortunately does not save them anywhere.
What is available after estimation is just the matrix r(V), but I cannot figure out how to use it to compute the standard errors.
You need to use the post option, like this:
. sysuse auto
(1978 Automobile Data)
. reg price mpg weight
Source | SS df MS Number of obs = 74
-------------+------------------------------ F( 2, 71) = 14.74
Model | 186321280 2 93160639.9 Prob > F = 0.0000
Residual | 448744116 71 6320339.67 R-squared = 0.2934
-------------+------------------------------ Adj R-squared = 0.2735
Total | 635065396 73 8699525.97 Root MSE = 2514
------------------------------------------------------------------------------
price | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
mpg | -49.51222 86.15604 -0.57 0.567 -221.3025 122.278
weight | 1.746559 .6413538 2.72 0.008 .467736 3.025382
_cons | 1946.069 3597.05 0.54 0.590 -5226.245 9118.382
------------------------------------------------------------------------------
. nlcom ratio: _b[mpg]/_b[weight], post
ratio: _b[mpg]/_b[weight]
------------------------------------------------------------------------------
price | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ratio | -28.34844 58.05769 -0.49 0.625 -142.1394 85.44254
------------------------------------------------------------------------------
. di _se[ratio]
58.057686
This standard error is the square root of the entry from the variance matrix r(V):
. matrix list r(V)
symmetric r(V)[1,1]
ratio
ratio 3370.6949
. di sqrt(3370.6949)
58.057686
Obviously you need to take square roots of the diagonal elements of r(V). Here's an approach that returns the standard errors as variables in a one-observation data set.
sysuse auto, clear
reg mpg weight turn
nlcom (v1: 1/_b[weight]) (v2: _b[weight]/_b[turn])
mata: se = sqrt(diagonal(st_matrix("r(V)")))'
clear
getmata (se1 se2 ) = se /* supply names as needed */
list
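The square-root-of-the-diagonal step is the same in any language; here is a quick check in plain Python using the r(V) value shown above:

```python
import numpy as np

# nlcom's r(V) from above: a 1x1 variance matrix for the single quantity "ratio".
V = np.array([[3370.6949]])

# Standard errors are the square roots of the diagonal entries, which is
# exactly what the Mata line sqrt(diagonal(st_matrix("r(V)"))) computes.
se = np.sqrt(np.diag(V))
print(se[0])   # ≈ 58.057686, matching _se[ratio]
```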

Using user-written command chest in Stata for change-in-estimate effects

I'm using the user-written command chest in Stata to look at the change-in-estimate with the variables in my model.
After running the linear regression of
regress age allelecount gender htn_g dm_g lipid_g i.hx_smoking b_bmi hx_med_asa if cadhx2==0
I run the chest command
chest allelecount, backward nograph
but I only get output for one variable
chest allelecount, backward
Change-in-estimate
regress regression. Outcome: age
number of obs = 476 Exposure: allelecount
----------------------------------------------------------
Variables |
removed | Coef. [95% Conf. Interval] Change, %
----------+-----------------------------------------------
Adj.All | -0.3691 -0.6819 -0.0564
-lipid_g | -0.3688 -0.6804 -0.0571 -0.0996
----------------------------------------------------------
Can anyone explain this?
Using the auto data of Stata, I find no problem:
sysuse auto
regress price mpg rep78 headroom
Source | SS df MS Number of obs = 69
-------------+------------------------------ F( 3, 65) = 7.51
Model | 148497605 3 49499201.8 Prob > F = 0.0002
Residual | 428299354 65 6589220.82 R-squared = 0.2575
-------------+------------------------------ Adj R-squared = 0.2232
Total | 576796959 68 8482308.22 Root MSE = 2566.9
------------------------------------------------------------------------------
price | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
mpg | -289.3462 62.53921 -4.63 0.000 -414.2456 -164.4467
rep78 | 670.8971 343.5213 1.95 0.055 -15.16242 1356.957
headroom | -300.0293 398.0516 -0.75 0.454 -1094.993 494.9346
_cons | 10921.33 2153.003 5.07 0.000 6621.487 15221.17
chest mpg, backward
Change-in-estimate
regress regression. Outcome: price
number of obs = 69 Exposure: mpg
----------------------------------------------------------
Variables |
removed | Coef. [95% Conf. Interval] Change, %
----------+-----------------------------------------------
Adj.All | -289.3462 -411.9208 -166.7715
-headroom | -271.6425 -384.8719 -158.4132 -6.1185
-rep78 | -226.3607 -332.1613 -120.5600 -16.6697
----------------------------------------------------------
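For what it's worth, chest's "Change, %" column can be reproduced by hand: it is the percent change in the exposure coefficient relative to the previous row. A quick check in plain Python, with the coefficients copied from the auto example above (small last-digit differences are expected since the displayed coefficients are rounded):

```python
# Exposure (mpg) coefficient as variables are removed, from the table above:
# Adj.All, after dropping headroom, after additionally dropping rep78.
coefs = [-289.3462, -271.6425, -226.3607]

# Each row's "Change, %" is the percent change relative to the previous row.
changes = [100 * (b1 - b0) / b0 for b0, b1 in zip(coefs, coefs[1:])]
print([round(c, 4) for c in changes])   # approximately [-6.1185, -16.6697]
```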