## Course: Essentials of Probability and Statistical Inference IV

## Schedule

Session | Topic | Activities
---|---|---
N/A | Review | Lecture: Stuff you should know: basics of probability, the central limit theorem, and inference.
1 | Introduction to Regression and Prediction | Lecture: We will describe linear regression in the context of a prediction problem.
2 | Overview of Supervised Learning | Lecture: Regression for predicting bivariate data, K-nearest neighbors (KNN), bin smoothers, and an introduction to the bias/variance trade-off.
3 | Linear Methods for Regression | Lecture: Subset selection and ridge regression. We will use the singular value decomposition (SVD) and principal component analysis (PCA) to understand these methods.
4 | Linear Methods for Regression | Lecture: Subset selection and ridge regression. We will use the singular value decomposition (SVD) and principal component analysis (PCA) to understand these methods.
5 | Linear Methods for Classification | Lecture: Linear regression, linear discriminant analysis (LDA), and logistic regression.
6 | Kernel Methods | Lecture: Kernel smoothers, including loess. We will briefly describe two-dimensional smoothers. We will also define degrees of freedom in the context of smoothing and learn about density estimators.
7 | Model Assessment and Selection | Lecture: We revisit the bias/variance trade-off. We describe how Monte Carlo simulations can be used to assess bias and variance. We then introduce cross-validation, AIC, and BIC.
8 | The Bootstrap | Lecture: We give a short introduction to the bootstrap and demonstrate its utility in smoothing problems.
9 | Splines, Wavelets, and Friends | Lecture: We give intuitive and mathematical descriptions of splines and wavelets. We use the SVD to understand these better and see connections with signal processing methods.
10 | Splines, Wavelets, and Friends | Lecture: We give intuitive and mathematical descriptions of splines and wavelets. We use the SVD to understand these better and see connections with signal processing methods.
11 | Additive Models, GAM, and Neural Networks | Lecture: We move back to cases with many covariates. We introduce projection pursuit and additive models, as well as generalized additive models. We briefly describe neural networks and explain the connection to projection pursuit.
12 | Additive Models, GAM, and Neural Networks | Lecture: We move back to cases with many covariates. We introduce projection pursuit and additive models, as well as generalized additive models. We briefly describe neural networks and explain the connection to projection pursuit.
13 | Model Averaging | Lecture: Bayesian statistics, boosting, and bagging.
14 | CART, Boosting, and Additive Trees | Lecture: We introduce classification and regression trees (CART) as well as more modern versions such as random forests.
15 | CART, Boosting, and Additive Trees | Lecture: We introduce classification and regression trees (CART) as well as more modern versions such as random forests.
16 | Clustering Algorithms | Lecture
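Sessions 3 and 4 use the SVD to understand ridge regression. As a flavor of what that connection looks like (this is an illustrative sketch, not course code; the simulated data are made up), writing `X = U D V^T` gives the ridge solution in closed form and shows that ridge shrinks each SVD direction by the factor `d_j^2 / (d_j^2 + lambda)`:

```python
import numpy as np

def ridge_svd(X, y, lam):
    """Ridge coefficients via the SVD X = U D V^T:
    beta_hat = V diag(d_j / (d_j^2 + lam)) U^T y.
    Each SVD direction is shrunk by d_j^2 / (d_j^2 + lam)."""
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt.T @ ((d / (d**2 + lam)) * (U.T @ y))

# Sanity check against the direct normal-equations solution
# (X^T X + lam I)^{-1} X^T y on simulated data.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
y = rng.normal(size=30)
beta = ridge_svd(X, y, lam=1.0)
direct = np.linalg.solve(X.T @ X + 1.0 * np.eye(5), X.T @ y)
print(np.allclose(beta, direct))  # → True
```

The two formulas agree because `X^T X + lam I = V (D^2 + lam I) V^T`, so the SVD form is just the normal equations diagonalized.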
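Sessions 2 and 7 both center on the bias/variance trade-off and on using Monte Carlo simulation to assess it. The sketch below is our own illustration (the simulated regression function and all names are assumptions, not course material): it repeatedly simulates training sets and measures the squared bias and variance of a k-nearest-neighbors prediction at a single point, showing that small k gives low bias but high variance, while large k gives the reverse.

```python
import numpy as np

def knn_bias_variance(k, x0=0.25, n=100, n_sim=500, noise_sd=0.3, seed=0):
    """Monte Carlo estimate of the squared bias and variance of a
    k-NN regression prediction at the point x0."""
    rng = np.random.default_rng(seed)
    true_f = lambda x: np.sin(2 * np.pi * x)  # hypothetical true regression function
    preds = np.empty(n_sim)
    for i in range(n_sim):
        x = rng.uniform(0, 1, n)                      # simulate a training set
        y = true_f(x) + rng.normal(0, noise_sd, n)
        nearest = np.argsort(np.abs(x - x0))[:k]      # indices of the k nearest x's
        preds[i] = y[nearest].mean()                  # k-NN prediction at x0
    bias = preds.mean() - true_f(x0)
    return bias**2, preds.var()

for k in (1, 10, 50):
    b2, v = knn_bias_variance(k)
    print(f"k={k:2d}  squared bias={b2:.4f}  variance={v:.4f}")
```

The evaluation point `x0 = 0.25` sits at a peak of the true function, so averaging over a wide neighborhood (large k) pulls the prediction down, making the bias visible.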
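Session 8 introduces the bootstrap. A minimal sketch of the core idea (again our own illustration on made-up data, not course code): resample the observed data with replacement many times, recompute the statistic on each resample, and use the spread of those replicates as a standard-error estimate, which works even for statistics like the median that lack a simple formula.

```python
import numpy as np

def bootstrap_se(data, statistic, n_boot=2000, seed=0):
    """Bootstrap standard error: resample `data` with replacement,
    recompute `statistic`, and return the std. dev. of the replicates."""
    rng = np.random.default_rng(seed)
    n = len(data)
    stats = np.array([statistic(data[rng.integers(0, n, n)])
                      for _ in range(n_boot)])
    return stats.std()

# Standard error of the sample median for a skewed (exponential) sample.
rng = np.random.default_rng(1)
sample = rng.exponential(scale=2.0, size=50)
se = bootstrap_se(sample, np.median)
print(f"bootstrap SE of the median: {se:.3f}")
```

The same `bootstrap_se` call works unchanged for any statistic (`np.mean`, a trimmed mean, a smoother's fitted value at a point), which is what makes the bootstrap useful in the smoothing problems mentioned in the session description.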