Gelman schedules his posts a week, even weeks, in advance! I would link to his post on this, but I can't find it. Good luck! This is definitely better than Twitter, and I'm learning a lot.
Weeks in advance? Yikes. Gelman is a machine.
I'll keep at it. Thanks for reading!
This blog is packed. Sorry for my terse response: I'm eagerly awaiting your posts on the workshop. In two weeks, I'll begin teaching, for the fourth time, my (large) introductory course on stat ML here at ASU. Thanks for the book plug; I'll be checking it out (I'm sorry to say I wasn't aware of it!) and may request a copy.
Thanks, Lalitha! What other texts do you use for your class?
Quite a few, as no one text covers the set of topics I wish to present to the students. I use Murphy for the Bayesian aspects, including maximum-likelihood estimators and their properties; Hastie et al. for the bulk of it (though I am not a fan of how the book organizes the material); and Shalev-Shwartz and Ben-David for GD, SGD, and the associated proofs, including an introduction to Rademacher complexity. I wish there were one book that covered it all. As an introductory course, I spend ample time on linear and logistic regression, walking the students through the value of loss functions, feature choice (ridge, lasso), kernels (including your test-of-time work on RBF/Gaussian kernels), SVMs (realizable and with errors), their connections to the perceptron algorithm, and the differences between these and logistic regression. This is followed by Bayesian approaches, including hypothesis testing and a quick review of p-values. PCA, clustering, spectral methods, and an intro to DNNs wrap it up. How do you teach a course like this? I hardly get time to discuss my own work on robust losses or privacy and fairness. But that's where term papers come in. (I don't expect them to read my papers, but I offer a wide range of topics with some classic suggestions.)
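A minimal sketch of the ridge/lasso "feature choice" contrast mentioned above, assuming scikit-learn and a made-up sparse regression problem (purely illustrative, not from the course materials):

```python
# Illustrative only (assumed scikit-learn setup, not from the course):
# lasso's L1 penalty tends to zero out irrelevant coefficients ("feature choice"),
# while ridge's L2 penalty only shrinks them.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
beta = np.zeros(10)
beta[:2] = [3.0, -2.0]                  # only the first two features matter
y = X @ beta + 0.1 * rng.normal(size=200)

print(np.round(Ridge(alpha=1.0).fit(X, y).coef_, 2))  # small but nonzero everywhere
print(np.round(Lasso(alpha=0.1).fit(X, y).coef_, 2))  # noise features driven to exactly zero
```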
The book with Moritz is arranged more or less how we taught the class, though we added some extra material to the text that we didn't cover in lecture. A free version is here if you'd like to have a look.
https://mlstory.org/
I'll be redoing the class in real time this fall, so stay tuned for more updates.
Congrats Ben! I’m glad you decided to give this a try.
Thanks for the steadfast encouragement, Brian!