ISLR - An Introduction to Statistical Learning
Sample (from 182 notes)
|Front||When making predictions based on a model, confidence intervals account for the reducible error in the model, but not for the irreducible error in the model. True or False?|
|Front||Describe the main element of bagging / bootstrapped aggregation for decision trees.|
|Back||1) Create several bootstrapped training sets. 2) Fit a separate decision tree on each bootstrapped training set (these individual trees have low bias but high variance). 3) Average over the separate trees to obtain a model with lower variance.|
|Front||Backward stepwise model selection: Describe the main process.|
|Back||1. Start with the model that includes all predictors. 2. Compute all models with one predictor fewer and select the one with the smallest residual sum of squares. (This identifies the predictor with the least additional value.) 3. Repeat the second step until reaching a model with no predictors. 4. Select the best overall model from this sequence based on cross-validation prediction error, AIC/BIC, or a similar measure.|
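As a concrete illustration of the confidence-interval card above (a numpy-only sketch on simulated data, not from the book's labs): in simple linear regression, the standard error of the estimated mean response at a point reflects only the reducible error in the fitted model, while a prediction interval's standard error additionally includes the irreducible noise variance, so prediction intervals are always wider.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.uniform(0, 10, n)
y = 2 + 3 * x + rng.normal(0, 1.5, n)   # true irreducible error sd = 1.5

# Ordinary least squares fit of y = b0 + b1 * x
xbar, ybar = x.mean(), y.mean()
Sxx = ((x - xbar) ** 2).sum()
b1 = ((x - xbar) * (y - ybar)).sum() / Sxx
b0 = ybar - b1 * xbar
s2 = ((y - b0 - b1 * x) ** 2).sum() / (n - 2)  # estimated error variance

x0 = 5.0
# Confidence interval SE: uncertainty in the fitted mean (reducible error only)
se_ci = np.sqrt(s2 * (1 / n + (x0 - xbar) ** 2 / Sxx))
# Prediction interval SE: same, plus the irreducible noise variance s2
se_pi = np.sqrt(s2 * (1 + 1 / n + (x0 - xbar) ** 2 / Sxx))
```

Since `se_pi**2 == se_ci**2 + s2`, the prediction interval is strictly wider, which is exactly why the card's statement is about confidence intervals only.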
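The three bagging steps in the card above can be sketched end to end. For brevity this toy version bags depth-1 regression stumps rather than full decision trees; `fit_stump`, `bagged_stumps`, and `bagged_predict` are hypothetical helper names for this sketch, not ISLR or scikit-learn functions.

```python
import numpy as np

def fit_stump(x, y):
    """Depth-1 regression tree on one feature: best single split by SSE."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_sse, best = np.inf, None
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse = sse
            best = ((xs[i - 1] + xs[i]) / 2, left.mean(), right.mean())
    return best  # (threshold, left_value, right_value)

def predict_stump(stump, x):
    t, lo, hi = stump
    return np.where(x <= t, lo, hi)

def bagged_stumps(x, y, B=50, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    # Steps 1-2: draw B bootstrap samples and fit one stump on each
    return [fit_stump(x[idx], y[idx])
            for idx in (rng.integers(0, n, n) for _ in range(B))]

def bagged_predict(stumps, x):
    # Step 3: average the individual predictions to reduce variance
    return np.mean([predict_stump(s, x) for s in stumps], axis=0)
```

Each individual stump is a high-variance fit to its bootstrap sample; averaging the B predictions keeps the (low) bias while shrinking the variance, which is the whole point of the recipe.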
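The backward stepwise procedure in the card above can be sketched with plain least squares. This assumes the design matrix already contains any needed intercept column (or that the data are centered); `backward_stepwise` and `rss` are illustrative names for this sketch.

```python
import numpy as np

def rss(X, y, cols):
    """Residual sum of squares of OLS on the given columns (empty -> null model)."""
    if not cols:
        return float(y @ y)  # no predictors; assumes no separate intercept
    beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    r = y - X[:, cols] @ beta
    return float(r @ r)

def backward_stepwise(X, y):
    """Best model found at each size, from all p predictors down to zero."""
    cols = list(range(X.shape[1]))   # step 1: start with all predictors
    path = [list(cols)]
    while cols:
        # step 2: among all one-fewer-predictor models, keep the smallest-RSS one
        cols = min(([c for c in cols if c != j] for j in cols),
                   key=lambda cs: rss(X, y, cs))
        path.append(list(cols))      # step 3: repeat until no predictors remain
    return path  # step 4: choose among these by CV error, AIC/BIC, etc.
```

Note that step 4 is deliberately left to the caller: RSS always improves with more predictors, so the final choice among the p + 1 candidate models must use a criterion that penalizes model size.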
After the file is downloaded, double-click on it to open it in the desktop program.
At this time, it is not possible to add shared decks directly to your AnkiWeb account - they need to be added in the desktop program and then synchronized to AnkiWeb.