"all observations are generated by having god randomly generate an iid sample from a probability distribution governed by a few parameters. " This world view is so confusing. "Random variables" in statistical world view seem to be super zombie which make everything rv. Any constant/object + random variable is a random variable. Random variable infects everything ! This worldview is good for mathematical analysis or exploration in some context. Generalizing this idea is so weird.
Interesting read. Would you mind going a little deeper on your last paragraph?
I’m interested in what the notion of MLE being on unstable ground implies philosophically for, say, the Standard Model in physics. Is that, too, non-rigorous from this perspective, for the following reasons: 1) the assumptions of the Standard Model itself represent an oversimplification of the world, and 2) it is experimentally verified using the methods of maximum likelihood inference (which, as you say, are unreliable)?
I guess the question is how much of science becomes non-rigorous when fields other than statistics are held to these standards.
Hey Ben, I was recently discussing the use of personal belief probabilities with someone espousing Bayesian reasoning (i.e. “My prediction that X candidate will win Y race is 30%,” with implied conditioning) and was delighted to learn about the Stanford Encyclopedia of Philosophy. Have you seen or engaged with this: https://plato.stanford.edu/entries/probability-interpret/ ? It covers a lot of your objections, and some that I expect you have but haven’t posted yet.
At the end of the frontmatter of "Elements of Statistical Learning" (Hastie, Tibshirani, Friedman) there is a quote from Ian Hacking: "The quiet statisticians have changed our world; not by discovering new facts or technical developments, but by changing the ways that we reason, experiment and form our opinions". The more you read this Substack series, the more double-sided the meaning of that quote becomes.
"We make some untestable assumptions about the world in order to tell a story about data." I feel this gives a good summary of typical social science. Whether that's deemed useful seems to vary a lot by opinion. That's a genuine issue, any research topic is important if a few smart people say it's important.
"The postulate of randomness thus resolves itself into the question, ‘Of what population is this a random sample?’" This quote is great too. You can always infer from your sample to The World by defining (or being purposefully vague about) your population as that which your sample is a random sample of. Hence, you do science.
Isn't basically all of ML based on the assumption that there exists some unknown distribution over basically everything?
"all observations are generated by having god randomly generate an iid sample from a probability distribution governed by a few parameters. " This world view is so confusing. "Random variables" in statistical world view seem to be super zombie which make everything rv. Any constant/object + random variable is a random variable. Random variable infects everything ! This worldview is good for mathematical analysis or exploration in some context. Generalizing this idea is so weird.
Interesting read. Would you mind going a little deeper on your last paragraph?
I’m interested in what the notion of MLE being on unstable ground implies about the philosophical implications of say the standard model in physics. Is that, too, non-rigorous from this perspective for the following reasons 1) the assumptions of the standard model itself represent an over simplification of the world and 2) it is experimentally verified using the methods of maximum likelihood inference (which as you say is unreliable).
I guess the question is how much of science becomes non-rigorous when these standards are held to different fields than statistics
I'm really enjoying your journey through these old papers, and the cogent summaries you add to them.
By the end of this road, we will hit the self-averaging argmin post.
(Fascinating as always. Love this new thread of posts on the mis-foundations of probabilistic thinking.)