Econometrics 1 lecture

(Data) A panel is called balanced if all micro-units (cross-sectional data) have measurements in all periods.
TRUE
(Components of the regression model) In the model yi = β1 + β2xi + ei, the variable x can be called a dependent variable.
FALSE, In this model, y is the dependent variable, while x is the independent (explanatory) variable.
(Components of the regression model) In the model yi = β1 + β2xi + ei, β1 is the slope.
FALSE, β1 is the intercept in this model, while β2 is the slope.
(Assumptions of the regression model) Multicollinearity of explanatory variables is one of the assumptions underlying a multiple regression model.
FALSE, Multicollinearity is not an assumption of the regression model; it is a data problem that arises when the explanatory variables are highly correlated. The assumption is only that there is no exact (perfect) collinearity among them.
(The Gauss-Markov theorem) The Gauss-Markov theorem states that the OLS estimator is best because, under specific assumptions, it is unbiased.
FALSE, The theorem states that, under its assumptions, the OLS estimator is best in the sense of having the smallest variance among all linear unbiased estimators; it is not called best because it is unbiased.
(Ordinary least squares) OLS estimates are selected in such a way that the sum of residuals is the smallest.
FALSE, OLS minimizes the sum of the squared residuals, not just the sum of the residuals.
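For reference, a minimal sketch of the least squares criterion in the notation of the earlier cards (b1, b2 denote the estimates of β1, β2):
\[ \min_{b_1, b_2} \; \sum_{i=1}^{N} \hat{e}_i^{\,2} \;=\; \sum_{i=1}^{N} \left( y_i - b_1 - b_2 x_i \right)^2 \]
Minimizing the plain sum of residuals would not pin the line down, since positive and negative residuals cancel; with an intercept, the OLS residuals in fact sum to zero.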
(Coefficient of determination) If the model does not contain an intercept parameter, SST ≠ SSR+SSE.
TRUE
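For reference, the decomposition this card refers to, in standard textbook notation (ŷi are fitted values, êi residuals):
\[ SST = \sum_i (y_i - \bar{y})^2, \quad SSR = \sum_i (\hat{y}_i - \bar{y})^2, \quad SSE = \sum_i \hat{e}_i^{\,2}, \qquad SST = SSR + SSE \]
The identity relies on the residuals summing to zero and being uncorrelated with the fitted values, which is guaranteed only when an intercept is included; without it, R2 computed as 1 − SSE/SST can even turn out negative.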
(Statistical tests) The level of significance of a test is the probability of committing the error of rejecting a null hypothesis that is true.
TRUE
(t-tests) When testing the null hypothesis H0: βk = c against the alternative hypothesis H1: βk > c, you should reject the null hypothesis if the test statistic t ≤ t(1−α; N−K).
FALSE, You reject the null hypothesis if the test statistic t is greater than or equal to the critical value t(1−α; N−K) in this one-tailed test, not less than or equal to it.
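A minimal sketch of this rejection rule in Python with scipy; all numbers below are hypothetical, chosen only to illustrate the mechanics:

from scipy import stats

# hypothetical estimation results, for illustration only
b_k, se_bk, c = 1.8, 0.4, 1.0      # estimate, its standard error, hypothesized value
N, K = 50, 3                        # observations and number of parameters
alpha = 0.05

t_stat = (b_k - c) / se_bk                   # t = (b_k - c) / se(b_k)
t_crit = stats.t.ppf(1 - alpha, df=N - K)    # critical value t(1 - alpha; N - K)

# H0: beta_k = c vs H1: beta_k > c  ->  reject H0 when t >= t_crit
print(t_stat, t_crit, t_stat >= t_crit)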
(Prediction) For a simple regression model: the variance of the forecast error depends on the variation in the explanatory variable.
TRUE
(F-tests) In general, an F-test statistic value depends on restricted estimation results only.
FALSE, The F-test statistic depends on both restricted and unrestricted models since it compares the two.
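The usual form of the statistic makes this explicit (SSE_R and SSE_U are the sums of squared residuals from the restricted and unrestricted models, J the number of restrictions, K the number of parameters in the unrestricted model):
\[ F = \frac{(SSE_R - SSE_U)/J}{SSE_U/(N - K)} \]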
(Restricted estimation) The restricted least squares estimator stays unbiased, even if the constraints that are imposed are false.
FALSE, If the imposed constraints are incorrect, the estimator will generally be biased because the true model is mis-specified.
(Nonlinear models) In the log-log model the slope is constant.
FALSE, In a log-log model, the elasticity (percentage change in y with respect to percentage change in x) is constant, but the slope itself is not constant.
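A one-line check, ignoring the error term:
\[ \ln y = \beta_1 + \beta_2 \ln x \;\Rightarrow\; \frac{dy}{dx} = \beta_2 \frac{y}{x}, \qquad \varepsilon = \frac{dy}{dx}\cdot\frac{x}{y} = \beta_2 \]
So the slope β2·y/x changes along the curve, while the elasticity β2 is constant.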
(The Jarque-Bera test) The Jarque-Bera test statistic depends on skewness and kurtosis of the data.
TRUE
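For reference, a common textbook form of the statistic, computed from the residuals (S is sample skewness, K sample kurtosis):
\[ JB = \frac{N}{6}\left( S^2 + \frac{(K-3)^2}{4} \right) \]
Under the null of normality, JB is asymptotically chi-squared with 2 degrees of freedom.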
(Specification errors) The omitted-variable bias occurs if the omitted variable is correlated with the variables included in the model.
TRUE
(Collinearity) One of the consequences of strong linear dependencies between explanatory variables is that the standard errors are small.
FALSE, Strong multicollinearity actually leads to inflated (large) standard errors, making it harder to detect significant relationships.
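The variance formula for a coefficient in a multiple regression shows the mechanism (R_k² is the R² from regressing x_k on the other explanatory variables):
\[ \operatorname{var}(b_k) = \frac{\sigma^2}{(1 - R_k^2)\sum_i (x_{ik} - \bar{x}_k)^2} \]
As R_k² approaches 1, the denominator shrinks and the standard errors inflate; the factor 1/(1 − R_k²) is the variance inflation factor.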
(Heteroskedasticity) The Breusch-Pagan test uses a variance function including all explanatory variables from the model under investigation.
TRUE
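A minimal sketch of running the test in Python with statsmodels; the simulated data and variable names are made up purely for illustration:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

# simulated data, for illustration only
rng = np.random.default_rng(0)
x = rng.uniform(1, 10, size=200)
y = 2 + 3 * x + rng.normal(scale=x)          # error variance grows with x -> heteroskedastic

X = sm.add_constant(x)                        # regressors of the model under investigation
res = sm.OLS(y, X).fit()

# the variance function uses the same explanatory variables as the model
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(res.resid, res.model.exog)
print(lm_stat, lm_pvalue)                     # small p-value -> reject homoskedasticity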
(Dummy variables) A slope-indicator variable allows for a change in the intercept.
FALSE, A slope-indicator variable allows for a change in the slope, not the intercept. It interacts with an explanatory variable to change the slope for different groups.
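To make the distinction concrete, a sketch of a model with a dummy D (notation assumed here, not taken from the lecture):
\[ y_i = \beta_1 + \delta_1 D_i + \beta_2 x_i + \delta_2 (D_i x_i) + e_i \]
D_i alone (an intercept-indicator) shifts the intercept from β1 to β1 + δ1, while the interaction D_i·x_i (the slope-indicator) shifts the slope from β2 to β2 + δ2.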
(Dummy variables) The value 0 for a dummy variable defines the reference group, or base group.
TRUE
(Autocorrelation) One consequence of autocorrelated errors is that the least squares estimator is no longer best.
TRUE
(Types of data) Annual profit for each of 400 randomly chosen micro enterprises from Poland for the year 2022 is an example of cross-sectional data.
TRUE
(Components of the regression model) A regressand can also be referred to as an explanatory variable.
FALSE, Regressand refers to the dependent variable, not the explanatory variable
(Components of the regression model) In the model: yi = β1 +β2xi +ei, β1 and β2 are random variables.
FALSE, β1 and β2 are parameters, not random variables.
(Assumptions of the regression model) Homoskedasticity of the error term is one of the assumptions underlying a multiple regression model.
TRUE
(The Gauss-Markov theorem) The Gauss-Markov theorem implies that the OLS estimator is better than any nonlinear unbiased estimator.
FALSE, The Gauss-Markov theorem only applies to linear unbiased estimators, and it does not state that OLS is better than any nonlinear estimator.
(Ordinary least squares) Standard errors are square roots of estimated variances of the OLS estimators.
TRUE
(Coefficient of determination) The value of R2 can decrease if we add an insignificant explanatory variable to the model.
FALSE, The value of R2 cannot decrease when an explanatory variable is added to the model, even if it is insignificant.
(Confidence intervals) For a given dataset and model, a 99% interval estimate of a parameter of the model is wider than a 95% interval.
TRUE
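The interval estimate formula makes this clear; only the critical value changes with the confidence level:
\[ b_k \pm t_{(1-\alpha/2;\; N-K)} \,\operatorname{se}(b_k) \]
Since t(0.995; N−K) > t(0.975; N−K), the 99% interval is wider than the 95% interval for the same estimate and standard error.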
(t-tests) Using a t-test we can test whether all the variables in the multiple regression model are jointly insignificant.
FALSE, A t-test tests the significance of individual variables, while an F-test is used to test whether all variables in the model are jointly insignificant
(Prediction) For a simple regression model: the variance of the forecast error depends on the value of the explanatory variable used to compute the prediction.
TRUE
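For reference, the standard expression for the forecast error variance in the simple regression model, with x0 the value used for the prediction:
\[ \operatorname{var}(f) = \sigma^2\left[ 1 + \frac{1}{N} + \frac{(x_0 - \bar{x})^2}{\sum_i (x_i - \bar{x})^2} \right] \]
Both this card and the earlier one on the variation in x follow from this formula: the forecast is less precise the further x0 lies from x̄, and the more so the smaller the variation Σ(xi − x̄)².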
(Testing) In an F-test a p-value of 0.02 leads to the rejection of the null hypothesis at 5% significance level.
TRUE
(Scaling the variables) In the simple regression model: if the scale of y and x is changed by the same factor then the estimated intercept will change.
TRUE
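A quick check using the OLS formulas for the simple regression, writing c for the common scaling factor (x* = cx, y* = cy):
\[ b_2^{*} = \frac{\sum (cx_i - c\bar{x})(cy_i - c\bar{y})}{\sum (cx_i - c\bar{x})^2} = b_2, \qquad b_1^{*} = c\bar{y} - b_2^{*}\, c\bar{x} = c\, b_1 \]
So the slope is unchanged while the intercept is multiplied by c, i.e. it changes whenever c ≠ 1.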
(Nonlinear models) In the model ln(yi) = β1 + β2ln(xi) + ei, the parameter β2 is the elasticity.
TRUE
(The Jarque-Bera test) The null hypothesis in the Jarque-Bera test concerns the normal distribution of the variable being tested.
TRUE
(Specification errors) Including some unnecessary regressors in the multiple regression model produces biased estimators of the coefficients of the regressors that belong in the equation.
FALSE, Including unnecessary variables in a regression model increases the variances of the estimators but does not bias the estimates of the coefficients of the variables that belong in the model.
(Multicollinearity) It is not possible to estimate the model by least squares when there is exact multicollinearity.
TRUE
(Model selection) The AIC would choose, from models with the same sum of squared residuals, the model with the smallest number of parameters.
TRUE, The AIC penalizes the number of parameters; with the sum of squared residuals held fixed, only the penalty term differs across models, so the model with the fewest parameters has the smallest AIC.
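One common textbook form of the criterion (K is the number of parameters):
\[ AIC = \ln\!\left(\frac{SSE}{N}\right) + \frac{2K}{N} \]
With SSE and N fixed, only 2K/N varies, so the smallest K gives the smallest AIC; the trade-off between fit and parsimony only matters when the models differ in SSE as well.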
(Heteroskedasticity) Heteroskedasticity tests include: the Breusch-Pagan test and the Durbin-Watson test.
FALSE, The Durbin-Watson test is for autocorrelation, not heteroskedasticity. Breusch-Pagan is a test for heteroskedasticity
(Heteroskedasticity) One consequence of heteroskedasticity is that the usual standard errors are incorrect and should not be used.
TRUE
(Dummy variables) A dummy variable trap means that the model cannot be estimated using ordinary least squares because of an incorrect use of indicator variables.
TRUE
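A concrete illustration of the trap (the variable names are made up): with an intercept and a dummy for every category, the regressor columns are exactly collinear:
\[ y_i = \beta_1 + \delta_1\, MALE_i + \delta_2\, FEMALE_i + e_i, \qquad MALE_i + FEMALE_i = 1 \;\text{ for all } i \]
The two dummy columns sum to the column of ones used for the intercept, so this is exact multicollinearity and OLS has no unique solution; dropping one of the dummies (the base group) removes the trap.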
