TU Wien:Advanced Methods for Regression and Classification VU (Filzmoser)/Oral exam 2019-12-19

11:00

Participant 1:

  • Trees: regression trees, classification trees. General idea, criterion to be minimized in both cases. What measures of node impurity are available? How to avoid overfitting (pruning)? (Criteria sketched below.)
  • SVM: criterion for the linearly separable and the non-linearly separable case. Kernel trick, kernel functions. (See the sketch below.)
  • GAM for regression: what the model looks like, which functions minimize the criterion. (See the sketch below.)
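
A rough sketch of the tree criteria in the usual textbook formulation (my summary, not the exact exam wording):

    Regression tree (split variable j, split point s, constant fit per region):
      \min_{j,s}\Big[\min_{c_1}\sum_{x_i\in R_1(j,s)}(y_i-c_1)^2+\min_{c_2}\sum_{x_i\in R_2(j,s)}(y_i-c_2)^2\Big],\qquad \hat c_m=\operatorname{ave}(y_i\mid x_i\in R_m)

    Classification tree, node impurity with class proportions \hat p_{mk}:
      misclassification error: 1-\max_k\hat p_{mk}
      Gini index: \sum_k\hat p_{mk}(1-\hat p_{mk})
      cross-entropy (deviance): -\sum_k\hat p_{mk}\log\hat p_{mk}

    Pruning via cost-complexity: C_\alpha(T)=\sum_{m=1}^{|T|}N_m Q_m(T)+\alpha|T|, with \alpha chosen by cross-validation.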
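
The SVM optimization problems as usually stated (again only a sketch):

    Linearly separable case (maximum margin):
      \min_{\beta,\beta_0}\tfrac12\|\beta\|^2 \quad\text{s.t.}\quad y_i(x_i^\top\beta+\beta_0)\ge 1,\; i=1,\dots,n

    Non-separable case (soft margin with slack variables \xi_i and cost C):
      \min_{\beta,\beta_0,\xi}\tfrac12\|\beta\|^2+C\sum_{i=1}^n\xi_i \quad\text{s.t.}\quad y_i(x_i^\top\beta+\beta_0)\ge 1-\xi_i,\;\xi_i\ge 0

    Kernel trick: the dual problem only involves inner products x_i^\top x_j, which are replaced by K(x_i,x_j),
    e.g. polynomial K(x,x')=(1+x^\top x')^d or radial basis K(x,x')=\exp(-\gamma\|x-x'\|^2).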
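
The additive model and its penalized criterion in the standard form:

    Model: Y=\alpha+\sum_{j=1}^p f_j(X_j)+\varepsilon

    Penalized residual sum of squares:
      \sum_{i=1}^n\Big(y_i-\alpha-\sum_{j=1}^p f_j(x_{ij})\Big)^2+\sum_{j=1}^p\lambda_j\int f_j''(t_j)^2\,dt_j

    Each minimizing f_j is a natural cubic spline with knots at the observed x_{ij}; the model is fitted by backfitting.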

Participant 2:

  • Multiple regression model: ordinary LS solution, how to arrive at it, what to do in case of near singularity of X^T X, R. (See the sketch below.)
  • Ridge regression, Lasso regression: what the criteria look like, how they differ from OLS. (See the sketch below.)
  • Spline regression: criterion to minimize (with penalization of curvature), which functions minimize it (natural cubic splines). (See the sketch below.)
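
The least-squares solution and the problem with a near-singular X^T X (sketch):

    \hat\beta_{LS}=\arg\min_\beta\|y-X\beta\|^2=(X^\top X)^{-1}X^\top y

    If X^\top X is (nearly) singular, the inverse is numerically unstable and the coefficient estimates get very large variance;
    remedies discussed in the course include ridge regression, the Lasso and principal component regression.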
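
Ridge and Lasso criteria compared with OLS (sketch):

    Ridge: \hat\beta_{ridge}=\arg\min_\beta\|y-X\beta\|^2+\lambda\|\beta\|_2^2=(X^\top X+\lambda I)^{-1}X^\top y
    Lasso: \hat\beta_{lasso}=\arg\min_\beta\|y-X\beta\|^2+\lambda\|\beta\|_1 (no closed form; some coefficients become exactly zero)

    Both shrink the coefficients (biased, but lower variance than OLS); the L1 penalty of the Lasso additionally performs variable selection, and \lambda is typically chosen by cross-validation.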
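
The smoothing-spline criterion with curvature penalty (univariate sketch):

    \min_f\;\sum_{i=1}^n(y_i-f(x_i))^2+\lambda\int f''(t)^2\,dt

    The unique minimizer is a natural cubic spline with knots at the distinct x_i; \lambda controls the bias-variance trade-off (\lambda\to 0: interpolation, \lambda\to\infty: linear fit).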

11:30

Participant 1:

  • Define PCR and weighted least squares. Compare these two methods. (See the sketch below.)
  • Define the SVM for the linearly separable and for the inseparable case (cf. the SVM sketch in the 11:00 protocol above).
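
A sketch of the two definitions and the comparison (my reading of what was asked, in the standard formulation):

    PCR: compute principal components z_j=Xv_j, with v_j the eigenvectors of X^\top X (X centered), and regress y on the first q components:
      \hat y=\bar y\,\mathbf 1+\sum_{j=1}^q\hat\theta_j z_j,\qquad \hat\theta_j=\frac{z_j^\top y}{z_j^\top z_j}

    WLS: \hat\beta_{WLS}=\arg\min_\beta\sum_{i=1}^n w_i(y_i-x_i^\top\beta)^2=(X^\top WX)^{-1}X^\top Wy,\qquad W=\operatorname{diag}(w_1,\dots,w_n)

    Comparison: PCR replaces the (possibly collinear) regressors by orthogonal directions of maximal variance and drops the last components, while WLS keeps the regressors and instead reweights the observations, e.g. with w_i=1/\sigma_i^2 under heteroscedastic errors.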