Practice Quiz Module 9


Which of the following are advantages of stacking?
1. More robust model
2. Better prediction
3. Lower execution time
Options: 1 and 2 / 2 and 3 / 1 and 3 / All of the above

1 and 2

Which of the following options is/are correct regarding the benefits of ensemble models?
1. Better performance
2. More generalized models
3. Better interpretability
Options: 1 and 2 / 1 and 3 / 2 and 3 / 1, 2 and 3

1 and 2

Which of the following can be true when selecting base learners for an ensemble?
1. Different learners can come from the same algorithm with different hyperparameters
2. Different learners can come from different algorithms
3. Different learners can come from different training spaces
Options: 1 and 3 / 1 / 1, 2 and 3 / 2

1, 2 and 3

Suppose you are working on a binary classification problem, and there are 3 models, each with 70% accuracy. If you ensemble these models using the majority voting method, what is the maximum accuracy you can get?
Options: 70% / 100% / 44% / 78.38%

100%. The 78.38% option is the expected accuracy if the three models make independent errors: P(at least 2 of 3 correct) = 0.7^3 + 3 * 0.7^2 * 0.3 ≈ 0.784. But the question asks for the maximum: if the models' errors never overlap on the same instance, then on every instance at most one model is wrong, so at least two of the three are correct and majority voting reaches 100%.
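To see where the 78.38% figure comes from, here is a small sketch in plain Python (no external libraries) computing the probability that the majority of 3 independent 70%-accurate models is correct:

```python
from math import comb

p = 0.7   # accuracy of each of the 3 models
n = 3

# Majority voting is correct when at least 2 of the 3 models are correct.
# Under the assumption of INDEPENDENT errors, this is a binomial sum:
expected = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(2, n + 1))
print(round(expected, 4))  # 0.784, i.e. ~78.4%
```

This 78.4% is only the expected accuracy under independence; the theoretical maximum, with perfectly complementary errors, is 100%.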

Which of the following is a correct statement about stacking?
Options: A machine learning model is trained on the predictions of multiple machine learning models / Logistic regression will definitely work better in the second stage compared to other classification methods

A machine learning model is trained on predictions of multiple machine learning models
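A minimal sketch of that idea in plain Python. The toy data, the two rule-based "base models", and the lookup-table meta-model are all hypothetical stand-ins for trained learners (in practice the second stage would be, e.g., logistic regression):

```python
import random
from collections import Counter, defaultdict

random.seed(0)
# Hypothetical toy data: true label is 1 when x + y > 1
data = [(random.random(), random.random()) for _ in range(200)]
labels = [1 if x + y > 1 else 0 for x, y in data]

# Two weak base models (hand-crafted rules standing in for trained learners)
base_models = [
    lambda p: 1 if p[0] > 0.5 else 0,  # looks only at x
    lambda p: 1 if p[1] > 0.5 else 0,  # looks only at y
]

# Level-1 training set: the base models' PREDICTIONS become the features
meta_features = [tuple(m(p) for m in base_models) for p in data]

# Meta-model trained on those predictions: here, a lookup table mapping each
# prediction pattern to the majority true label observed for that pattern
votes = defaultdict(Counter)
for feats, y in zip(meta_features, labels):
    votes[feats][y] += 1
meta = {pattern: c.most_common(1)[0][0] for pattern, c in votes.items()}

def stacked_predict(p):
    return meta[tuple(m(p) for m in base_models)]
```

Because the meta-model picks the best label for each prediction pattern, its training accuracy is at least as high as either base model's, which is the point of stacking.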

Which of the following algorithms corrects its predecessor by paying a bit more attention to the training instances that the predecessor underfitted?
Options: Gradient Boosting / AdaBoost / Bagging / Random forests / Pasting

AdaBoost
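The core of AdaBoost is this reweighting step: after each weak learner, the weights of the instances it misclassified are increased so the next learner pays more attention to them. A one-round sketch in plain Python (the toy data and the stump threshold are hypothetical):

```python
import math

# Hypothetical toy 1-D training set, labels in {-1, +1}
X = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
y = [  1,   1,   1,  -1,  -1,   1,  -1,  -1]

w = [1 / len(X)] * len(X)              # start with uniform instance weights

# First weak learner: a decision stump "predict +1 if x < 0.5, else -1"
pred = [1 if x < 0.5 else -1 for x in X]

err = sum(wi for wi, p, t in zip(w, pred, y) if p != t)   # weighted error (0.25)
alpha = 0.5 * math.log((1 - err) / err)                   # this learner's say

# Reweighting: grow the weights of misclassified instances, shrink the rest,
# then renormalize so the weights sum to 1
w = [wi * math.exp(-alpha * t * p) for wi, p, t in zip(w, pred, y)]
total = sum(w)
w = [wi / total for wi in w]

# The two misclassified points (x = 0.4 and x = 0.7) now carry more weight
```

After the update, the misclassified instances together carry half the total weight, so the next stump is forced to focus on them.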

Which of the following algorithms is not an example of an ensemble method?
Options: Gradient Boosting / Random Forest / Decision Tree / Bagging

Decision Tree

If we don't assign a base estimator to the bagging classifier, which estimator will it use by default?
Options: Linear regression / KNN / Logistic regression / Decision tree

Decision tree

Ensembles will yield bad results when there is significant diversity among the models. (Note: all individual models make meaningful and good predictions.)
Options: True / False

False

In boosting, individual base learners can be trained in parallel.
Options: True / False

False

Which of the following algorithms tries to fit the new predictor to the residual errors made by the previous predictor?
Options: Random forests / Gradient Boosting / Bagging / AdaBoost / Pasting

Gradient Boosting
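A minimal plain-Python sketch of that idea: start from a constant prediction, then at each stage fit a new regression stump to the residuals (true value minus current prediction) left by the previous stage. The toy data is hypothetical and the learning rate is 1 for brevity:

```python
# Hypothetical 1-D regression data
X = [1, 2, 3, 4, 5, 6]
y = [1.2, 1.9, 3.1, 3.9, 5.2, 5.8]

def fit_stump(X, residuals):
    """Fit a one-split regression stump minimizing squared error."""
    best = None
    for t in X[:-1]:                      # candidate split points
        left = [r for x, r in zip(X, residuals) if x <= t]
        right = [r for x, r in zip(X, residuals) if x > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

pred = [sum(y) / len(y)] * len(y)         # stage 0: predict the mean
losses = []
for stage in range(3):
    residuals = [t - p for t, p in zip(y, pred)]   # what is left to explain
    h = fit_stump(X, residuals)                    # new predictor fits residuals
    pred = [p + h(x) for p, x in zip(pred, X)]
    losses.append(sum((t - p) ** 2 for t, p in zip(y, pred)))
```

Each stage can only shrink the squared error, since subtracting a per-side mean from the residuals never increases their sum of squares.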

Generally, an ensemble method works better if the individual base models have ____________.
Options: Less correlation among predictions / High correlation among predictions / Correlation does not have any impact on ensemble output / None of the above

Less correlation among predictions

Which of the following algorithms introduces extra randomness when growing trees: instead of searching for the very best feature when splitting a node, it searches for the best feature among a random subset of features?
Options: Pasting / Bagging / Boosting / Random forests

Random forests
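That distinguishing trick, sketched in plain Python: at each node, only a random subset of k features is considered for the split. The feature names and the gain scores below are hypothetical:

```python
import random

def choose_split_feature(features, score, k, rng):
    """Random-forest-style split: search only a random subset of k
    features for the best one, instead of all features."""
    subset = rng.sample(features, k)
    return max(subset, key=score)

rng = random.Random(0)
features = [f"f{i}" for i in range(10)]
gain = {f: i for i, f in enumerate(features)}   # hypothetical split gains
picked = choose_split_feature(features, gain.get, k=3, rng=rng)
# 'picked' is the best of the 3 sampled features, not necessarily "f9"
# (the globally best one) -- that is the extra randomness
```

Restricting the search decorrelates the trees, which is exactly the "less correlation among predictions" property asked about above.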

True or False: In bagging, individual base learners can be trained in parallel.

True

In which of the following methods is sampling performed with replacement?
Options: pasting / bagging

bagging

In an election, N candidates are competing against each other and people vote for one of the candidates. Voters don't communicate with each other while casting their votes. Which of the following ensemble methods works like this election procedure? (Hint: voters are like the base models of an ensemble method.)
Options: Bagging / Boosting / A or B / None of these

Bagging. Like the voters, the base models make their predictions independently (each is trained in parallel on its own bootstrap sample, with no communication between them), and the final answer is the aggregate of their votes. Boosting does not fit the analogy: its learners are trained sequentially, each one depending on the mistakes of its predecessor.

In which of the following methods is sampling performed without replacement?
Options: pasting / bagging

pasting
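The difference between the two answers above in one plain-Python sketch (the toy dataset is hypothetical):

```python
import random

rng = random.Random(42)
data = list(range(10))   # hypothetical training-set indices

# Bagging (bootstrap aggregating): sample WITH replacement --
# the sample has the same size as the data and may contain duplicates
bag_sample = [rng.choice(data) for _ in data]

# Pasting: sample WITHOUT replacement -- a smaller subset, no duplicates
paste_sample = rng.sample(data, 7)
```

Because each learner sees its own independently drawn sample, both methods can train their base learners in parallel, as the bagging question above notes.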

