TU Wien:Machine Learning VU (Musliu)/Exam 2021-10-21


Multiple Choice (Answer: True/False):

1. Is ensemble boosting easily parallelizable?

2. State-of-the-art AutoML systems usually use grid search to find the best hyperparameters.

3. In AdaBoost, the weights are uniformly initialised


Free Text:

1. Describe a local search algorithm for Bayesian Network creation.

2. You are given 1000 observations from which you want to train a decision tree. For pre-pruning, the following parameters are set: the minimum number of observations required to split a node is 200, and the minimum leaf size (number of observations) is 300.
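
A minimal sketch of how these two pre-pruning parameters interact, assuming they correspond to scikit-learn's min_samples_split and min_samples_leaf; the 1000-observation data set here is synthetic:

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# 1000 synthetic observations stand in for the data set of the question.
X, y = make_classification(n_samples=1000, n_features=5, random_state=0)

tree = DecisionTreeClassifier(
    min_samples_split=200,  # a node needs >= 200 observations before it may be split
    min_samples_leaf=300,   # every leaf must keep >= 300 observations
    random_state=0,
)
tree.fit(X, y)

# With 1000 observations and leaves of at least 300 each, at most 3 leaves are
# possible (4 * 300 > 1000), so the pre-pruned tree stays very shallow.
print("depth:", tree.get_depth(), "leaves:", tree.get_n_leaves())
```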

3. Describe the goal and setting of classification. To which other machine learning tasks is it related, and how does it differ from them?

4. What is the Chain rule and how is it used in Bayesian Networks?
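
A minimal illustration with a hypothetical three-node network A → B → C: the chain rule P(X1, ..., Xn) = P(X1) · P(X2 | X1) · ... · P(Xn | X1, ..., Xn-1) always holds, and the network's independence assumptions reduce each factor to conditioning on the node's parents only:

```python
# Hypothetical CPTs for A -> B -> C; the joint factorises as P(A) * P(B|A) * P(C|B).
p_a = {True: 0.3, False: 0.7}
p_b_given_a = {True: {True: 0.9, False: 0.1}, False: {True: 0.2, False: 0.8}}
p_c_given_b = {True: {True: 0.5, False: 0.5}, False: {True: 0.4, False: 0.6}}

def joint(a, b, c):
    """Joint probability of one assignment via the factorised chain rule."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# Sanity check: the factorisation sums to 1 over all eight assignments.
total = sum(joint(a, b, c) for a in (True, False)
            for b in (True, False)
            for c in (True, False))
print(joint(True, True, False), total)
```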

5. Which approaches can be used for linear regression? Describe them.

6. What methods exist for combating overfitting in neural networks?

7. What is the difference between Lasso and Ridge regression?
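
A minimal sketch on synthetic data: the L1 penalty of Lasso tends to set coefficients of uninformative features exactly to zero, while the L2 penalty of Ridge only shrinks them towards zero (data and alpha values are arbitrary choices for illustration):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge

# 10 features, but only 3 of them actually carry signal.
X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)

lasso = Lasso(alpha=1.0).fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

print("coefficients set exactly to zero by Lasso:", int(np.sum(lasso.coef_ == 0)))
print("coefficients set exactly to zero by Ridge:", int(np.sum(ridge.coef_ == 0)))
```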

8. Describe at least three methods for hyperparameter optimization.
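
A minimal sketch of two of the standard methods, assuming scikit-learn's GridSearchCV and RandomizedSearchCV on a hypothetical SVM tuning problem; Bayesian optimisation, a third option, needs an extra library (e.g. Optuna) and is not shown:

```python
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Grid search: exhaustively evaluates every combination on the grid.
grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}, cv=5)

# Random search: samples a fixed number of configurations from distributions.
rand = RandomizedSearchCV(SVC(), {"C": loguniform(1e-2, 1e2),
                                  "gamma": loguniform(1e-3, 1e1)},
                          n_iter=20, cv=5, random_state=0)

print(grid.fit(X, y).best_params_)
print(rand.fit(X, y).best_params_)
```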

9. What are the implications of the No Free Lunch theorem?

10. Given the XOR dataset: which of a perceptron, a decision tree, an SVM, and 1-NN can achieve a 0 error rate?
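
A minimal sketch using scikit-learn on the four XOR points: a single perceptron (a linear classifier) cannot reach a 0 error rate, whereas a decision tree, an SVM with a non-linear (here RBF) kernel, and 1-NN can:

```python
import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])  # XOR labels

models = {
    "perceptron": Perceptron(),
    "decision tree": DecisionTreeClassifier(),
    "SVM (RBF kernel)": SVC(kernel="rbf"),
    "1-NN": KNeighborsClassifier(n_neighbors=1),
}
for name, model in models.items():
    acc = model.fit(X, y).score(X, y)  # training accuracy on the four XOR points
    print(f"{name}: error rate = {1 - acc:.2f}")
```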

11. Describe in detail the random forest algorithm.
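
A minimal from-scratch sketch of the core idea (not the course's reference implementation): trees are trained on bootstrap samples, each split considers only a random subset of the features (max_features="sqrt"), and predictions are combined by majority vote; the data set and parameters are arbitrary:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

def fit_forest(X, y, n_trees=25):
    forest = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), size=len(X))          # bootstrap sample
        tree = DecisionTreeClassifier(max_features="sqrt")  # random feature subset per split
        forest.append(tree.fit(X[idx], y[idx]))
    return forest

def predict_forest(forest, X):
    votes = np.array([tree.predict(X) for tree in forest])  # shape (n_trees, n_samples)
    return (votes.mean(axis=0) >= 0.5).astype(int)          # majority vote (binary labels)

forest = fit_forest(X, y)
print("training accuracy:", np.mean(predict_forest(forest, X) == y))
```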