# Stanford University Statistical Learning Quiz Answer | Tree-Based Methods


In this article I share the answers to the Stanford University Statistical Learning quiz on Tree-Based Methods.

**Tree-Based Methods Quiz**

### 8.1.R1

**Using the decision tree on page 5 of the notes, what would you predict for the log salary of a player who has played for 4 years and has 150 hits?:**

- **5.11** (with 4 years, the first split Years < 4.5 sends the player to the left leaf, so the number of hits never comes into play)
- 5.55
- 6.0
- 6.74

**More Details on Trees**

### 8.2.R1

**Imagine that you are doing cost complexity pruning as defined on page 18 of the notes. You fit two trees to the same data: T_1 is fit at alpha = 1 and T_2 is fit at alpha = 2. Which of the following is true?**

- **T_1 will have at least as many nodes as T_2** (a larger alpha penalizes tree size more heavily, so it can only yield a smaller or equally sized tree)
- T_1 will have at most as many nodes as T_2
- Not enough information is given in the problem to decide

**Classification Trees Quiz**

### 8.3.R1

**You have a bag of marbles with 64 red marbles and 36 blue marbles.**

**What is the value of the Gini Index for that bag? Give your answer to the nearest hundredth:**

- .4608
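As a quick sanity check, the Gini index for a two-class bag can be computed directly; here is a minimal Python sketch (the R labs use the `tree` package, but the arithmetic is the same):

```python
# Gini index for a bag with 64 red and 36 blue marbles.
# G = sum_k p_k * (1 - p_k) = 1 - sum_k p_k^2
counts = {"red": 64, "blue": 36}
total = sum(counts.values())
props = [v / total for v in counts.values()]

gini = 1 - sum(p ** 2 for p in props)
print(round(gini, 4))  # 0.4608
```

That is, 1 - (0.64² + 0.36²) = 1 - 0.5392 = 0.4608.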

### 8.3.R2

**What is the value of the Cross-Entropy? Give your answer to the nearest hundredth (using log base e, as in R):**

- .653
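The cross-entropy for the same bag follows the same pattern; a minimal Python sketch using the natural log (as R's `log()` does):

```python
import math

# Cross-entropy (deviance) for 64 red and 36 blue marbles:
# D = -sum_k p_k * log(p_k), with the natural log.
props = [64 / 100, 36 / 100]
cross_entropy = -sum(p * math.log(p) for p in props)
print(round(cross_entropy, 3))  # 0.653
```

That is, -(0.64 ln 0.64 + 0.36 ln 0.36) ≈ 0.653.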

**Bagging and Random Forest Quiz**

### 8.4.R1

**Suppose we produce ten bootstrap samples from a data set containing red and green classes. We then apply a classification tree to each bootstrap sample and, for a specific value of X, produce 10 estimates of P(Class is Red|X):**

0.1, 0.15, 0.2, 0.2, 0.55, 0.6, 0.6, 0.65, 0.7, and 0.75

There are two common ways to combine these results together into a single class prediction.

One is the majority vote approach discussed in the notes. The second approach is to classify based on the average probability.

**What is the final classification under the majority vote method?:**

- red

### 8.4.R2

**What is the final classification under the average probability method?:**

- green
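Both combination rules from 8.4.R1 and 8.4.R2 can be checked with a few lines of Python (a sketch of the arithmetic, not the lab's R code):

```python
# The ten bootstrap estimates of P(Class = Red | X) from the question.
probs = [0.1, 0.15, 0.2, 0.2, 0.55, 0.6, 0.6, 0.65, 0.7, 0.75]

# Majority vote: each tree predicts Red when its probability exceeds 0.5.
votes_red = sum(p > 0.5 for p in probs)
majority = "red" if votes_red > len(probs) / 2 else "green"

# Average probability: classify Red when the mean probability exceeds 0.5.
avg = sum(probs) / len(probs)
average = "red" if avg > 0.5 else "green"

print(majority, average)  # red green
```

Six of the ten trees vote Red, so majority vote gives red; but the mean probability is only 0.45, so averaging gives green.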

**Boosting Quiz**

### 8.5.R1

**In order to perform Boosting, we need to select 3 parameters: the number of trees B, the tree depth d, and the step size lambda.**

**How many parameters do we need to select in order to perform Random Forests?:**

- 2 (the number of trees B and the number m of predictors considered at each split)

**Tree-Based Methods in R**

### 8.R.R1

**You are trying to reproduce the results of the R labs, so you run the following command in R:**

`library(tree)`

As a response, you see the following error message:

`Error in library(tree) : there is no package called ‘tree’`

**What went wrong?**

- You meant to use ‘require(tree)’
- You meant to use ‘library(“tree”)’
- **The tree package is not installed on your computer** (running `install.packages("tree")` first resolves the error)
- Nothing is wrong, that error message could not be produced by R

**Chapter 8 Quiz**

### 8.Q1

**The tree building algorithm given on pg 13 is described as a Greedy Algorithm. Which of the following is also an example of a Greedy Algorithm?:**

- The Lasso
- Support Vector Machines
- The Bootstrap
- **Forward Stepwise Selection** (like tree growing, it makes the locally optimal choice at each step and never revisits earlier decisions)

### 8.Q2

**Examine the plot on pg 23. Assume that we wanted to select a model using the one-standard-error rule on the Cross-Validated error. What tree size would end up being selected?:**

- **1** (the smallest tree whose cross-validated error is within one standard error of the minimum)
- 2
- 3
- 10

### 8.Q3

**Suppose I have two qualitative predictor variables, each with three levels, and a quantitative response. I am considering fitting either a tree or an additive model. For the additive model, I will use a piecewise-constant function for each variable, with a separate constant for each level. Which model is capable of fitting a richer class of functions:**

- **Tree** (a fully grown tree can assign a separate prediction to each of the 3 × 3 = 9 level combinations, while the additive model is restricted to sums of per-variable effects)
- Additive Model
- They are equivalent