Schedule 

Lectures
Tuesday 15:40 - 17:10 K3  
Friday 13:10 - 14:40 K2  
Tutorial Classes (Link to Moodle)
Wednesday   9:00 - 10:30 K11 Instructor: Šárka Hudecová
Wednesday 10:40 - 12:10 K11 Instructor: Marek Omelka

Course Materials

The course content has been revamped since its previous form (2021 and earlier). Hence, no course notes adapted to the current syllabus will be available this year. Students can use the link below to download the course notes from the previous year. Almost all of the topics are covered there, although in a different order and at a different level of detail.

Requirements for Credit/Exam 

Tutorial Credit:

The credit for the tutorial sessions will be awarded to students who satisfy the following two conditions:

  1. Regular small assignments: A student needs to prepare acceptable solutions to at least 10 out of 12 tutorial class assignments. An assignment can be solved either during the corresponding tutorial class, or the solution can be submitted by a pre-specified deadline.
  2. Project: A student needs to submit a project satisfying the requirements given in the assignment. A corrected version of an unsatisfactory project can be resubmitted once.

The nature of these requirements precludes any additional attempts to obtain the tutorial credit (apart from the exceptions listed above).

Exam:

The exam has two parts: written and oral, both conducted on the same day.

Detailed Course Syllabus  

  1. Introduction
    • Simple linear regression: technical and historical view
      Lecture 1, Sep. 30
  2. Linear regression model
    • Definition, assumptions
      Lecture 1, Sep. 30
    • Interpretation of regression parameters
      Lecture 2, Oct. 4
    • Least squares estimation (LSE)
      Lecture 2, Oct. 4
    • Residual sums of squares, fitted values, hat matrix
      Lecture 3, Oct. 7
    • Geometric interpretation of LSE
      Lecture 3, Oct. 7
    • Equivalence of LR models
      Lecture 3, Oct. 7
    • Model with centered covariates
      Lecture 4, Oct. 11
    • Decomposition of sums of squares, coefficient of determination
      Lectures 4-5, Oct. 11 and 14
    • LSE under linear restrictions
      Lecture 5, Oct. 14
  3. Properties of LS estimates
    • Moment properties
      Lecture 6, Oct. 18
    • Gauss-Markov theorem
      Lecture 6, Oct. 18
    • Properties under normality
      Lecture 6, Oct. 18
  4. Statistical inference in LR model
    • Exact inference under normality
      Lecture 7, Oct. 21
    • Submodel testing
      Lecture 8, Oct. 25
    • One-way ANOVA model
      Lecture 8, Oct. 25
    • Connections to maximum likelihood theory
      Lecture 9, Nov. 1
    • Asymptotic inference with random covariates
      Lectures 9-10, Nov. 1 and 4
    • Asymptotic inference with fixed covariates
      Lecture 10, Nov. 4
  5. Predictions
    • Possible objectives of regression analysis
      Lecture 10, Nov. 4
    • Pitfalls of predictions
      Lecture 10, Nov. 4
    • Confidence interval for the estimated conditional mean of an existing/future observation
      Lecture 10, Nov. 4
    • Confidence interval for the response of a future observation
      Lecture 11, Nov. 8
  6. Model Checking and Diagnostic Methods I.
    • Residuals, standardized/studentized residuals
      Lecture 11, Nov. 8
    • Residual plots, QQ plots
      Lecture 11, Nov. 8
    • Checking homoskedasticity
      Lecture 11, Nov. 8
  7. Transformation of the response
    • Interpretation of log-transformed model
      Lecture 12, Nov. 11
    • Box-Cox transformation
      Lecture 12, Nov. 11
  8. Parametrization of a single covariate
    • Single factor covariate (one-way ANOVA model)
      Lectures 12-13, Nov. 11 and 15
    • Single numerical covariate
      Lectures 14-15, Nov. 18 and 22
  9. Multiple tests and simultaneous confidence intervals
    • Bonferroni method
      Lectures 15-16, Nov. 22 and 25
    • Tukey method
      Lectures 16-17, Nov. 25 and 29
    • Scheffé method
    • Confidence band for the whole regression surface
  10. Interactions
    • Interactions of two factors: two-way ANOVA
    • Interactions of two numerical covariates
    • Interactions of a numerical covariate with a factor
  11. Regression model with multiple covariates
    • Decomposition of the model with additional covariate
    • Effects on fitted values, residuals, RSS, coef. of determination
    • Effects on parameter estimates
    • Orthogonal covariates
    • Decomposition of regression sum of squares
    • Multicollinearity
    • Confounding bias
    • Mediation
    • Assessment of causality
  12. Analysis of variance (ANOVA) models
    • One-way ANOVA
    • Two-way ANOVA without interactions
    • Two-way ANOVA with interactions
  13. Model-building strategies
    • Model choice based on sequential submodel testing
    • Functional form of numerical covariates
    • Inclusion of interactions
    • Goodness of fit measures
    • Step-wise procedures
    • Comparison to AI methods
  14. Model Checking and Diagnostic Methods II.
    • Independence of error terms
    • Leverage points, outliers
    • Influential observations
    • Jackknife residuals
    • DFBetas
    • Cook's distance
  15. Weighted least squares
  16. Dealing with heteroskedasticity: sandwich estimation
    • Linear model without equal variance assumption - asymptotics
    • White estimator
  17. Covariate measurement errors
  18. Missing data issues in regression models