TU Wien:Advanced Methods for Regression and Classification VU (Filzmoser)/Oral exam 2020-02-04
04.02.20 - 10:00
Participant 1:
- SVM: linearly separable and non-separable case. Explain the optimization problem and how you arrive at it for both, L_P, L_D, with constraints (sketched below).
- Splines & smoothing splines: explain splines in general, i.e. knots, knot selection and degrees of freedom. Explain smoothing splines, the optimization criterion and the general idea (criterion sketched below).
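For quick review, a sketch of the SVM optimization problems in the usual Hastie/Tibshirani/Friedman notation; the exact form asked in the exam is an assumption here:

    Separable case (primal):
    \min_{\beta,\beta_0} \tfrac{1}{2}\lVert\beta\rVert^2 \quad \text{s.t. } y_i(x_i^T\beta+\beta_0) \ge 1,\ i=1,\dots,N

    Non-separable case (soft margin):
    \min_{\beta,\beta_0,\xi} \tfrac{1}{2}\lVert\beta\rVert^2 + C\sum_{i=1}^N \xi_i \quad \text{s.t. } y_i(x_i^T\beta+\beta_0) \ge 1-\xi_i,\ \xi_i \ge 0

    Lagrange primal and (Wolfe) dual:
    L_P = \tfrac{1}{2}\lVert\beta\rVert^2 + C\sum_i \xi_i - \sum_i \alpha_i\bigl[y_i(x_i^T\beta+\beta_0) - (1-\xi_i)\bigr] - \sum_i \mu_i\xi_i
    L_D = \sum_i \alpha_i - \tfrac{1}{2}\sum_i\sum_{i'} \alpha_i\alpha_{i'} y_i y_{i'} x_i^T x_{i'} \quad \text{with } 0 \le \alpha_i \le C,\ \sum_i \alpha_i y_i = 0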
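Likewise, the smoothing-spline criterion (penalized residual sum of squares) as it is usually written; the lecture's notation may differ slightly:

    \min_f \sum_{i=1}^N \bigl(y_i - f(x_i)\bigr)^2 + \lambda \int f''(t)^2\,dt

The minimizer is a natural cubic spline with knots at the unique x_i; lambda steers the trade-off (lambda -> 0 interpolation, lambda -> infinity linear least-squares fit), and the effective degrees of freedom are df_lambda = trace(S_lambda).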
Participant 2:
- Logistic regression: explain logistic regression with a focus on the two-group case (model sketched below).
- Explain regression trees with a focus on the splitting criterion (sketched below).
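A sketch of the two-group logistic model in standard notation (assumed to match the lecture):

    \log\frac{P(G=1\mid x)}{P(G=2\mid x)} = \beta_0 + x^T\beta, \qquad P(G=1\mid x) = \frac{\exp(\beta_0 + x^T\beta)}{1+\exp(\beta_0 + x^T\beta)}

The parameters are estimated by maximizing the binomial log-likelihood, typically via Newton-Raphson / iteratively reweighted least squares.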
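For the splitting criterion, the standard greedy formulation for regression trees (assumed to be the one meant here): for each candidate split variable j and split point s, with R_1(j,s) = {x : x_j <= s} and R_2(j,s) = {x : x_j > s}, solve

    \min_{j,s}\Bigl[\min_{c_1}\sum_{x_i\in R_1(j,s)}(y_i-c_1)^2 + \min_{c_2}\sum_{x_i\in R_2(j,s)}(y_i-c_2)^2\Bigr]

where c_m is the mean of the y_i in R_m. For classification trees the squared error is replaced by Gini index, misclassification rate or deviance.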
Participant 3:
- Explain Lasso and Ridge regression: formulas, the lambda parameter, why the Lasso sets coefficients exactly to 0, show it visually. The formula for beta^hat_ridge (both criteria sketched below).
- LDA, QDA, GDA: What is the idea of LDA? What is Bayes' theorem, what are its assumptions and its formula (the phi formula)? Formulas for LDA and GDA. What are the components of LDA and how do you estimate them? Why can you estimate them? (sketch below)
- Random forests: explain the random forest algorithm, bagging, the random selection of features, T_b, pi_bj, theta_bj (ensemble sketched below).
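The two penalized criteria and the closed-form ridge solution in standard notation (assumed consistent with the lecture; predictors centered/standardized for the matrix form):

    \hat\beta_{\text{ridge}} = \arg\min_\beta \Bigl\{\sum_{i=1}^N\bigl(y_i - \beta_0 - x_i^T\beta\bigr)^2 + \lambda\sum_{j=1}^p \beta_j^2\Bigr\} = (X^TX + \lambda I)^{-1}X^Ty
    \hat\beta_{\text{lasso}} = \arg\min_\beta \Bigl\{\sum_{i=1}^N\bigl(y_i - \beta_0 - x_i^T\beta\bigr)^2 + \lambda\sum_{j=1}^p |\beta_j|\Bigr\}

Visual argument for exact zeros: the lasso constraint region |beta_1| + |beta_2| <= t is a diamond with corners on the coordinate axes, so the elliptical RSS contours often first touch it at a corner, setting a coefficient exactly to zero; the circular ridge region has no corners, so coefficients only shrink.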
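The Bayes-theorem backbone of discriminant analysis and the LDA ingredients, sketched in standard notation (phi_k taken to be the Gaussian class densities; the GDA variant asked in the exam is not reproduced here):

    P(G=k\mid x) = \frac{\pi_k\,\phi_k(x)}{\sum_{l=1}^K \pi_l\,\phi_l(x)}, \qquad \phi_k \text{ Gaussian with mean } \mu_k \text{ and covariance } \Sigma_k
    \text{LDA } (\Sigma_k \equiv \Sigma):\quad \delta_k(x) = x^T\Sigma^{-1}\mu_k - \tfrac{1}{2}\mu_k^T\Sigma^{-1}\mu_k + \log\pi_k

Components to estimate: pi_hat_k = N_k/N, mu_hat_k = the class means, Sigma_hat = the pooled within-class covariance; estimation is possible because the training observations carry class labels. QDA keeps class-specific Sigma_k and yields quadratic boundaries.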
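A sketch of the random forest ensemble; pi_bj and theta_bj are the lecture's own notation for the split variables/points of tree b, so only the generic structure is reproduced:

    \hat f_{\text{rf}}(x) = \frac{1}{B}\sum_{b=1}^B T_b(x;\Theta_b) \ \ \text{(regression)}, \qquad \hat G_{\text{rf}}(x) = \text{majority vote of } \{\hat G_b(x)\}_{b=1}^B \ \ \text{(classification)}

For b = 1, ..., B: draw a bootstrap sample of the training data (bagging), grow tree T_b on it, and at every split consider only m randomly chosen predictors out of the p available; Theta_b collects the split variables, split points and terminal-node values of tree b.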