TU Wien:Machine Learning VU (Musliu)/Exam 2019-10-18

From VoWi

1) True/False questions (not recalled exactly; not all of them remembered). Scoring per question: 2 points for a correct answer, -1 point for a wrong answer, 0 points if not answered.

  • Are random forests a heterogeneous ensemble learner?
  • Does a perceptron with a softmax output solve the XOR problem?
  • Is d-separation used for neighbourhood searches in a Bayesian network?
  • Are paired t-tests used to verify the folds in the holdout method (train/test split)?
  • Is information gain a method for unsupervised feature selection?
  • Is Bayesian optimization used for constructing Bayesian networks?
  • SVM: does the SMO (?) algorithm guarantee convergence to the globally optimal solution?
  • ?
  • ?
  • ?

Regular questions, with some space for a written answer (these may also not be worded 100% exactly like this):

2) Explain Rice's framework for algorithm selection. Can it be used for (Bayesian?) optimization of hyperparameters?

3) Difference between Ridge regression and Lasso regression
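For question 3, one way to see the difference is in the orthonormal-design case, where both penalized regressions have textbook closed forms in terms of the ordinary least-squares coefficients: ridge (L2 penalty) rescales every coefficient toward zero, while lasso (L1 penalty) soft-thresholds and can set coefficients exactly to zero. A minimal sketch (the exact scaling of `lam` depends on how the objective is written):

```python
import numpy as np

# Closed-form solutions for an orthonormal design matrix, given the
# OLS coefficients beta_ols (scaling convention of lam is assumed):
#   ridge: beta_j = beta_ols_j / (1 + lam)               -> shrinks, never exactly 0
#   lasso: beta_j = sign(beta_ols_j) * max(|beta_ols_j| - lam, 0)  -> can be exactly 0

def ridge_shrink(beta_ols, lam):
    return beta_ols / (1.0 + lam)

def lasso_shrink(beta_ols, lam):
    return np.sign(beta_ols) * np.maximum(np.abs(beta_ols) - lam, 0.0)

beta_ols = np.array([3.0, -0.5, 0.2])
print(ridge_shrink(beta_ols, 1.0))  # all coefficients halved, none zero
print(lasso_shrink(beta_ols, 1.0))  # small coefficients snapped to exactly zero
```

This is why lasso is often described as performing feature selection, while ridge only shrinks.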

4) Given were 4 plots of the same dataset: two concentric rings of data points, where the points of each ring constitute one class. Draw the decision boundaries that i) a perceptron, ii) an SVM with a quadratic kernel, iii) 1-NN (i.e. k-NN with k = 1) and iv) a decision tree would learn.

5) About boosting. A "stump" classifier is given, i.e. a 1-level decision tree (it can make only one split). Given was an x-axis with 3 points x1 = 1, x2 = 3, x3 = 5 (or something like that), with associated class labels -1, 1, -1.

  • what is the weight of each data point before the first classification round?
  • let the first stump classifier split the axis into two regions, i.e. draw a decision boundary
  • circle the data point that will get a higher weight for the second boosting stage

6) Given was a 2D plot with 4 data points from 2 classes (2 points per class). An SVM classifier is available.

  • circle the data points that will be the support vectors.
  • draw the decision boundary of the SVM and give its slope.
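Since the exam plot is not reproduced here, a hedged sketch with made-up coordinates: when the problem reduces to two support vectors of opposite classes, the linear SVM boundary is the perpendicular bisector of the segment joining them, which gives both the boundary and its slope.

```python
# Hypothetical support vectors (the actual exam points are unknown):
p = (1.0, 1.0)  # assumed support vector of class -1
q = (3.0, 3.0)  # assumed support vector of class +1

# Max-margin boundary = perpendicular bisector of segment pq
mid = ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)
dx, dy = q[0] - p[0], q[1] - p[1]
slope = -dx / dy  # perpendicular to the direction connecting the classes
print("boundary through", mid, "with slope", slope)  # slope -1.0 here
```

The same reasoning answers both bullets: the two closest opposite-class points are the support vectors, and the bisector's slope is the answer to the second part.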

7) Example data set, with some calculations to do: 9 data points given, plus 3 new data points. It was explicitly asked to show the calculations necessary for arriving at the results.

  • for the 3 new data points, use 3-NN to predict the class labels. Define a suitable distance function, and explain why you chose it. (5pts)
  • one specific variable was to be removed; now predict again. Does anything change? (this was worded differently, but that's basically what was asked) (2pts)
  • Use a Naive Bayes classifier to classify the 3 new observations, then calculate precision and recall of this classifier. Laplace correction should not be used. (6pts)
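The exam's 9 training points are not reproduced here, so the 3-NN part can only be illustrated on a made-up numeric toy set. Euclidean distance is a natural choice to justify when all attributes are numeric and on comparable scales (with mixed or differently scaled attributes, a normalized or mixed-type distance would need arguing instead):

```python
from collections import Counter

# Made-up toy training set (the real exam data is unknown): (point, label)
train = [((1, 1), "A"), ((2, 1), "A"), ((1, 2), "A"),
         ((5, 5), "B"), ((6, 5), "B"), ((5, 6), "B")]

def euclidean(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def knn_predict(x, k=3):
    # take the k training points closest to x, then majority-vote the labels
    neighbours = sorted(train, key=lambda t: euclidean(x, t[0]))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

print(knn_predict((2, 2)))  # "A"
print(knn_predict((5, 4)))  # "B"

# For part c, the definitions needed once the Naive Bayes predictions exist:
#   precision = TP / (TP + FP),  recall = TP / (TP + FN)
```

Removing one variable (the second bullet) just means dropping that coordinate from the tuples before computing distances; predictions change only if the nearest-neighbour sets change.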