NIU CSCI 490 Quiz 3
Between Random Forests and Gradient Boosting, which algorithm is generally faster to train? a. Random Forests b. Gradient Boosting c. Both have similar training times d. None of these options are correct
a
Which of the following is/are true about Random Forests? 1. In Random Forests, each decision tree is independent of the others. 2. Random Forests enhance the accuracy of the model by combining the results of the individual decision trees. a. 1 and 2 b. 1 c. None of them d. 2
a
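The "combining independent trees" idea behind answer (a) can be sketched in plain Python. This is not scikit-learn's implementation, just a toy illustration with three hypothetical trees that each err on a different sample; majority voting over their independent predictions corrects all of the individual mistakes.

```python
# Toy sketch (not scikit-learn's code): three independent "trees", each
# wrong on a different sample. Majority voting over their predictions
# recovers the correct label everywhere, which is the intuition behind
# Random Forests combining independent trees.

true_labels = [1, 0, 1, 0, 1, 0]

# Each hypothetical tree errs on a different sample index.
tree_preds = [
    [0, 0, 1, 0, 1, 0],  # tree 1 wrong on sample 0
    [1, 1, 1, 0, 1, 0],  # tree 2 wrong on sample 1
    [1, 0, 0, 0, 1, 0],  # tree 3 wrong on sample 2
]

def majority_vote(preds_per_tree):
    """Combine per-tree predictions by majority vote for each sample."""
    n_samples = len(preds_per_tree[0])
    combined = []
    for i in range(n_samples):
        votes = [preds[i] for preds in preds_per_tree]
        combined.append(max(set(votes), key=votes.count))
    return combined

def accuracy(pred, truth):
    return sum(p == t for p, t in zip(pred, truth)) / len(truth)

ensemble = majority_vote(tree_preds)
# Each individual tree is 5/6 correct; the voted ensemble is 6/6 correct.
```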
Which one of these classification algorithms is relatively better calibrated in predicting probabilities of the output class? a. Logistic Regression b. Decision Tree c. Gradient Boosting d. Random Forest
a
Select the algorithm(s) that is/are sensitive to data scaling. a. Logistic Regression b. K-Nearest Neighbors c. Decision Tree d. Random Forests
a and b
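Why K-Nearest Neighbors is scale-sensitive can be shown with a small distance calculation. The feature values and ranges below are made up for illustration: a large-range feature dominates the Euclidean distance until the features are rescaled, at which point the nearest-neighbor ranking can flip.

```python
# Toy sketch: KNN's Euclidean distance is dominated by whichever feature
# has the largest raw range, so scaling changes which point is "nearest".

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Hypothetical points: feature 0 has range ~5, feature 1 has range ~10000.
query = [0.0, 50000.0]
near_in_feat0 = [0.1, 60000.0]  # close in feature 0, far in raw feature 1
far_in_feat0 = [5.0, 50500.0]   # far in feature 0, close in raw feature 1

def scale(point, ranges):
    """Crude rescaling for illustration: divide each feature by its range."""
    return [x / r for x, r in zip(point, ranges)]

ranges = [5.0, 10000.0]  # assumed per-feature ranges

# Unscaled: feature 1 dominates, so far_in_feat0 looks nearer.
# Scaled: feature 0 differences count again and the ranking flips.
```

Tree-based models split on one feature at a time, so a monotonic rescaling of a feature leaves the splits (and hence the predictions) unchanged, which is why (c) and (d) are not sensitive.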
What is/are the advantages of polynomial features? a. Help in classifying a non-linear dataset with a linear boundary by projecting it into a higher-dimensional subspace b. Generally improve the accuracy c. Reduce the training time d. None of them
a and b
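The projection in option (a) can be sketched with a hand-rolled degree-2 expansion, a plain-Python stand-in for what sklearn.preprocessing.PolynomialFeatures computes: [x1, x2] becomes [1, x1, x2, x1^2, x1*x2, x2^2], and a boundary that is non-linear in the original space (e.g. the circle x1^2 + x2^2 = 1) becomes linear in the expanded space.

```python
# Toy sketch of degree-2 polynomial feature expansion:
# [x1, x2] -> [1, x1, x2, x1^2, x1*x2, x2^2].
from itertools import combinations_with_replacement

def poly_features(x, degree=2):
    """Expand a feature vector with all monomials up to `degree`."""
    feats = [1.0]  # bias term
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(len(x)), d):
            term = 1.0
            for i in idx:
                term *= x[i]
            feats.append(term)
    return feats

# The circle x1^2 + x2^2 = 1 is non-linear in (x1, x2), but in the
# expanded features it is the linear rule: weights (0, 0, 0, 1, 0, 1)
# dotted with the feature vector, thresholded at 1.
```

Note that the expansion grows the feature count, so it cannot reduce training time, ruling out (c).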
What happens when you don't prune a decision tree? a. The model will most likely overfit b. The model will most likely underfit c. Training accuracy will be 100% d. Validation accuracy will be 100%
a and c
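The overfitting symptom in answers (a) and (c) can be caricatured in a few lines. An unpruned tree can keep splitting until every leaf holds a single training sample, which amounts to memorizing the training set; the toy "tree" below is just a lookup table, not a real tree learner, but it shows the same symptom: 100% training accuracy with poor accuracy on unseen data.

```python
# Toy sketch: a fully grown (unpruned) tree behaves like a lookup table
# over the training samples — perfect on training data, weak on new data.

train = [((0,), 1), ((1,), 1), ((2,), 0), ((3,), 1)]
test = [((4,), 0), ((5,), 0)]

class FullyGrownTree:
    """Stand-in for an unpruned tree: one leaf per training sample."""
    def fit(self, data):
        self.leaves = {x: y for x, y in data}
        labels = [y for _, y in data]
        self.majority = max(sorted(set(labels)), key=labels.count)
        return self

    def predict(self, x):
        # Unseen inputs fall back to the majority class.
        return self.leaves.get(x, self.majority)

def accuracy(model, data):
    return sum(model.predict(x) == y for x, y in data) / len(data)

model = FullyGrownTree().fit(train)
# Training accuracy is 100% by construction; test accuracy is not.
```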
Which of the following algorithms are not ensemble learning algorithms? a. Decision Tree b. Extra Trees c. Random Forests d. Support Vector Machine
a and d
Select the algorithm that is more stable in estimating feature importance. a. Decision Tree b. Random Forests c. Both are stable d. None of them
b
Which of the following is/are true about the gradient boosting algorithm? 1. The individual decision trees are independent of each other. 2. It improves performance by combining the results of the individual trees. a. 1 and 2 b. 1 c. 2 d. None of them
c
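The sequential dependence behind answer (c) can be sketched for regression with squared loss: each stage fits the residuals of the ensemble built so far, so no tree can be grown before the previous ones. The "weak learner" below is a one-split threshold stump, a hand-rolled stand-in for a shallow decision tree; the `learning_rate` scaling of each stage is the same idea the next question asks about.

```python
# Toy sketch of gradient boosting for regression with squared loss:
# each stage fits a stump to the current residuals (the negative
# gradient of squared loss), then adds a learning-rate-scaled copy of
# it to the ensemble. Stages are sequential, never independent.

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 1.0, 3.0, 3.0]

def fit_stump(xs, residuals):
    """Best single split: predict the mean residual on each side."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, residuals) if x < t] or [0.0]
        right = [r for x, r in zip(xs, residuals) if x >= t] or [0.0]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - (lm if x < t else rm)) ** 2
                  for x, r in zip(xs, residuals))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x < t else rm

def boost(xs, ys, n_stages=10, learning_rate=0.5):
    preds = [0.0] * len(xs)
    for _ in range(n_stages):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        preds = [p + learning_rate * stump(x) for p, x in zip(preds, xs)]
    return preds

preds = boost(xs, ys)
# After a few stages the ensemble's predictions approach ys.
```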
Which one of these doesn't use a learning rate during training? a. Gradient Boosting b. Logistic Regression c. Random Forest d. None of them
c
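To see where the learning rate enters logistic regression, here is a bare-bones gradient descent loop on a made-up 1-D dataset (the data, step count, and rate are illustrative, not any library's defaults). Random Forests have no analogous iterative, step-scaled update: each tree is grown directly from its bootstrap sample.

```python
# Toy sketch: training logistic regression by gradient descent, where
# the learning rate scales every weight update.
import math

# Hypothetical 1-D dataset: x < 0 -> class 0, x > 0 -> class 1.
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, learning_rate=0.1, n_steps=200):
    w, b = 0.0, 0.0
    for _ in range(n_steps):
        gw = gb = 0.0
        for x, y in data:
            err = sigmoid(w * x + b) - y  # gradient of log-loss wrt the logit
            gw += err * x
            gb += err
        w -= learning_rate * gw  # the learning rate scales each step
        b -= learning_rate * gb
    return w, b

w, b = train(data)
# The learned weight is positive, so the boundary separates the classes.
```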