8 Comments
Mar 13 · Liked by Ben Recht

"Better uncertainty quantification is not going to improve emergency surgeries. It’s not going to improve macroeconomic policy either." These statements seem too broad to me. To say that better uncertainty quantification can't improve settings like macroeconomic policy making sounds like you're saying there is no value in considering how predicted distributions across different models compare. But my experience with policy makers (such as central bankers, who have invited me to their conferences several times based on their interest in expressing uncertainty) is that decision makers often perceive value in attempts to quantify uncertainty, even if they know the assumptions behind any particular quantification can't be verified. I.e., the problems we tend to see with uncertainty quantification in some of these contexts are not that uncertainty quantification isn't useful or that we shouldn't try to improve our methods; it's that some people expect the "small world" view of uncertainty that we can quantify within some model to capture all of our uncertainty. So I wouldn't say better uncertainty quantification is necessarily useless in these settings.

author

I have a dimmer view of macroeconomics, where validation seems to be nonexistent. In most policy settings better uncertainty *articulation* would be valuable, but why should it be *quantifiable?*

Mar 15 · Liked by Ben Recht

If we allow that people might be helped by consulting predictions from (multiple, wrong) models in a case like macroeconomic policy making, then it seems hard to me to say that uncertainty information can’t help them, since it provides more information about what any given model is saying.

The spirit of your recent posts is reminding me of something I realized a few years ago, after I heard someone say uncertainty communication is a moral imperative and had a strong negative reaction. It got me thinking under what properties of a decision process uncertainty is not helpful or even harmful. I agree there are a lot of cases where it won't help, and macroeconomic policy and emergency surgery may be among them. But I have also seen modeling inform decision processes in a lot of ways that are more complex than people simply taking the model predictions (or uncertainty) at face value. So it seems hard to say it won't be useful without being precise about how exactly we think the uncertainty will be used. Formally working out those arguments could be valuable.

author

Yes, I totally agree with you that we need to be precise about how uncertainty quantification would be used. This is also why I'm trying to distinguish between uncertainty articulation, i.e., "we aren't really sure this will work," and uncertainty quantification, i.e., "there is a 25-75% chance this will work." Is the latter better than the former? Why, and in which cases?

We'd probably have to go through some concrete hypotheticals to make it a bit clearer where adding numbers is useful for decisions that have substantial impact and no possibility of recourse.


Like Ben, I have been thinking about these things for a long time and, as someone who works on stochastic systems, doing a lot of soul-searching about uses and misuses of probability. Also, like Ben, I think we should put a lot more effort into articulating uncertainty, but, unlike Ben, I see probability theory as a valuable metalanguage for appraising propositions about various scenarios pertaining to the system under consideration. As long as we are careful not to regard probabilities as some sort of an objective feature of observable phenomena, but rather as a convenient framework for assigning relative magnitudes to propositions about the world, I think there is value in quantification. I am still thinking through it, and wrote about some of my current thinking here: https://realizable.substack.com/p/probabilities-coherence-correspondence

author

With regards to "As long as we are careful not to regard probabilities as some sort of an objective feature of observable phenomena, but rather as a convenient framework for assigning relative magnitudes to propositions about the world, I think there is value in quantification."

I couldn't agree more. But you can't deny that once you let people argue with statistics, you start seeing probabilities quantified to three decimal points.

I would be less against prediction intervals and probabilistic uncertainty quantification if people would treat them as semi-quantitative. I only get annoyed when people start making extreme claims about probabilities of coverage, etc.
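[Editor's note: a minimal hypothetical sketch, not from the thread, of why exact coverage claims deserve skepticism. It fits a textbook 95% Gaussian prediction interval to data that are actually lognormal (the lognormal source is an assumption chosen for illustration) and checks the interval's empirical coverage against the nominal number.]

```python
# Sketch: nominal vs. empirical coverage of a Gaussian prediction
# interval when the data are not actually Gaussian.
import random
import statistics

random.seed(0)

def gaussian_interval(sample, z=1.96):
    """Nominal 95% prediction interval, assuming Gaussian data."""
    mu = statistics.mean(sample)
    sigma = statistics.stdev(sample)
    return mu - z * sigma, mu + z * sigma

# The "true" data source is lognormal -- an assumption for this sketch.
sample = [random.lognormvariate(0, 1) for _ in range(500)]
lo, hi = gaussian_interval(sample)

# Empirical coverage on fresh draws from the same source.
n = 100_000
empirical = sum(lo <= random.lognormvariate(0, 1) <= hi for _ in range(n)) / n
print(f"nominal coverage: 0.95, empirical coverage: {empirical:.3f}")
```

The gap between the nominal 0.95 and the observed coverage is the sense in which such intervals are "semi-quantitative": useful as a rough magnitude, misleading at three decimal places.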


Oh yeah, I agree with this completely. I believe that more effort should be put into constructing the information structure (sample space, sigma-algebra, etc.), as opposed to fixating on quantifying probabilities to three decimal points. I’m more interested in “the probability of what?” rather than in “what is the probability?”.

Mar 13 · Liked by Ben Recht

“At that point, the uncertainty that needs to be quantified is the value of the math itself.”

Love that quote
