Oz Blog News Commentary

The master, his emissary and the balance of risk

March 27, 2020 - 18:09 -- Admin

Is this a bunch of black patches on a white background? It is. It also depicts something you can’t unsee once you’ve seen it.

The performance of expertise is tangled up in status displays. Often that subtly displaces what should be the true object of inquiry. Thus, for instance, economists will often be drawn off into spinning their view of a future which G. L. S. Shackle engagingly called “kaleidic”. As I’ve argued, they should instead be focused, as weather forecasters are, on understanding how much they know – which in forecasting would actually involve understanding how little they know. Further:

  1. Without confidence intervals around the forecasts, they could do more harm than good and
  2. Forecasts about the major risks to the economy would probably be more useful than point forecasts. They should be issued in probabilistic form, such as “We estimate the chances of recession in the next 6 months have risen from 10 to 20%”.
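The second point can be sketched in code. Every number below is an illustrative assumption (a made-up growth model with made-up parameter uncertainty), not anything drawn from a real forecasting model; the point is only the shape of the output – an interval and a probability of the bad outcome, rather than a single number.

```python
import random

random.seed(0)

# Toy forecast (all numbers are illustrative assumptions):
# next-quarter growth = baseline + shock, where the baseline itself is
# uncertain because the model's parameters are only estimates.
def simulate_growth(n_draws=100_000):
    draws = []
    for _ in range(n_draws):
        baseline = random.gauss(0.5, 0.4)  # uncertain model parameter (% growth)
        shock = random.gauss(0.0, 0.6)     # irreducible quarter-to-quarter noise
        draws.append(baseline + shock)
    return sorted(draws)

draws = simulate_growth()
point = sum(draws) / len(draws)                       # the usual point forecast
lo = draws[int(0.05 * len(draws))]                    # 5th percentile
hi = draws[int(0.95 * len(draws))]                    # 95th percentile
p_contraction = sum(d < 0 for d in draws) / len(draws)  # chance growth is negative

print(f"point forecast: {point:.2f}%")
print(f"90% interval:   [{lo:.2f}%, {hi:.2f}%]")
print(f"P(contraction): {p_contraction:.0%}")
```

The interval and the contraction probability carry the information the point forecast hides: how much the forecaster doesn’t know.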

At the highest level of generality, these problems can be thought of in terms of Nietzsche’s story of the Master and his Emissary. In the story, the Master of a great kingdom can only run his empire by sending emissaries out to govern provinces. The emissary is a competent fellow, but the competence he’s shown his master is tightly defined in some domain – say accounting, and running committees. When the emissary usurps his master, his part of the kingdom declines because he lacks wisdom. You know – the wisdom that the master has – masters are like that.

In McGilchrist’s telling of the story the master is the ‘right-brain’ which is the ‘big picture’ thinker. The left brain is the special ops division – doing special tasks – learning how to put topspin on your backhand in tennis, learning accounting, building an epidemiological model, making sure people follow procedures, making sure academics hit their publication KPIs.

One of the most tangible functions of the right brain on the African savannah was scanning for predators. It’s the right brain’s job to frame a problem before and while the left brain analyses it with logic, models and whatever other tools it has developed, sending messages back to the master and awaiting further instructions.

But here’s the thing: we don’t teach right-brain skills at uni. We assume them. When you go to uni you learn left-brain stuff. We have whole disciplines dominated by left-brain toolmaking and tool-using primates. We give some of them Nobel Prizes. Unfortunately, some boosters of the humanities regard ‘the humanities’ as the solution to these problems. So long as they’ve not been colonised by ideology or other nonsense, these disciplines are all very well, but they’re not particularly focused on the kind of role I’m arguing we’re so desperately short of here: the role of the right brain in applying the tools of the left brain.1

The pandemic is providing plenty of teachable moments which show what a massive toll this bias is taking on human wellbeing right now. Nassim Nicholas Taleb and Yaneer Bar-Yam take up the story in this excellent piece:

The error in the UK is on two levels. Modelling and policymaking.

First, at the modelling level, the government relied at all stages on epidemiological models that were designed to show us roughly what happens when a preselected set of actions are made, and not what we should make happen, and how.

The modellers use hypotheses/assumptions, which they then feed into models, and use to draw conclusions and make policy recommendations. Critically, they do not produce an error rate. What if these assumptions are wrong? Have they been tested? The answer is often no. For academic papers, this is fine. Flawed theories can provoke discussion. Risk management – like wisdom – requires robustness in models.

Note something here which I highlighted in a recent post – the God’s eye view. The epidemiologists have a model that aspires to a God’s eye view. It, or models like it, should certainly be part of a sensible response. But

  1. Our use of the models should be guided by a wider understanding of how to use them – what blind spots they have, and therefore both what we should pay attention to in their outputs and what questions we should research further to refine our understanding.
  2. Our thinking shouldn’t be seduced by the God’s eye view. We’re not Gods. We’re little humangoes and we are trying to figure out what to do from our point of view.  (And ‘our’ point of view might involve our individual interest and/or the interests of those groups of which we consider ourselves a part.) As I suggested here regarding economics, the point of the discipline is to help us answer the question “what should we do?” We may well need to mount specific researches asking specific questions about the way the world is, but the primary motivation should be our need to build tools that can help us manage our world for the better.

At least as far as I understand it – and I may be being unfair in my ignorance – the Imperial College simulations were rolled out as a ‘take’ on the crisis pointing at what to do. But though what they suggest we do is of deep significance, the other thing of fantastic significance is the role they can play in surfacing precisely what we don’t know. If they came with a list of crucial assumptions that might be wrong – and I don’t know whether they did – was that list prominent in the executive summary of the document?

Off the top of my head, some of those assumptions were

  • the length of hospital stays indicated
  • the number of stays indicated over time
  • the interaction with the capacity of the system
  • the extent to which capacity could be augmented
  • the viability of the ‘hammer and dance’ strategy
  • the relevant economic cost of different options
  • the R0 of the virus in different populations – particularly children
  • the proportion of people who have the disease who are asymptomatic

And plenty of others. For each of these assumptions, appendices could have indicated what was known and what might be found out, and each might have come with blog posts for public comment (either by invitation or in some filtered way – for instance, you might require an academic email address – far from perfect, but we’re in a crisis here).

There are more lessons from this than I can elaborate here, but let me finish this post with a concrete suggestion which can function as an example of a different way of approaching our dilemmas – focused on the balance of risks rather than imagining we can simply head towards our preferred future. I recall the Productivity Commission arguing that where there’s price control of privately supplied infrastructure, it should not aim for the optimum price as it would come out of a model (the price at which the private provider receives just enough return to want to expand capacity when it should).

That doesn’t take into account the risk of getting it wrong – indeed the certainty of getting it wrong at least to some extent. Given this, the price controller should aim to set the price a bit above the optimum level. Why? Because the costs of having the price too high are small – and indeed vanishingly small if the margin is small. But the costs of having the price too low are underinvestment in the asset and so, inadequate infrastructure with all the congestion costs that generates.
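This asymmetry can be made concrete with a toy calculation. Everything in it is an illustrative assumption – a made-up distribution for the ‘true’ optimal price and a made-up ten-to-one cost asymmetry, not anything from the Commission’s actual analysis. The point is only that once the losses are asymmetric, the price that minimises expected loss sits above the central estimate.

```python
import random

random.seed(1)

# Illustrative assumption: the "true" optimal price is uncertain,
# centred on 100 with a standard deviation of 10.
true_optimum_draws = [random.gauss(100, 10) for _ in range(20_000)]

def loss(set_price, optimum):
    gap = set_price - optimum
    if gap >= 0:
        return 0.1 * gap ** 2  # too high: a small deadweight cost
    return 1.0 * gap ** 2      # too low: underinvestment and congestion, assumed 10x worse

def expected_loss(set_price):
    # Average the loss over our uncertainty about the true optimum.
    return sum(loss(set_price, opt) for opt in true_optimum_draws) / len(true_optimum_draws)

candidates = [90 + 0.5 * i for i in range(41)]  # grid of prices from 90 to 110
best = min(candidates, key=expected_loss)
print(f"price minimising expected loss: {best:.1f}")  # lands above the central estimate of 100
```

Aiming a bit above the model’s optimum is not a fudge; given the lopsided costs of error, it is the loss-minimising choice.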

As we come out of this disaster, governments will be stimulating the economy. In television studio land the balance of risks is set by body language. That’s how we’ve decided (disastrously as it’s turned out) to cut interest rates very slowly and reluctantly. Everyone knows that being more concerned about inflation than unemployment is responsible, adult, indeed masculine behaviour.2 Erring on the other side is for sissies. I watched it happen in 1991. We did much better in 2008-9. But even there with hindsight, one might think we could have gone a little stronger.

The ‘optimal aim’ with a stimulus is to land the economy as close as possible to its maximum sustainable growth possibility frontier. But you don’t know enough to know what that is – and even if you did, you wouldn’t know enough to hit the target precisely. If you’re too unambitious you forgo a lot of growth, unemployment is higher than it need be – with thousands suffering out of work and young lives blighted – and you miss your inflation target. If you’re too ambitious, inflation goes over the target. And if it becomes entrenched, we don’t know how to bring it down without a recession.

Paying close attention to the lie of the land intimated in my last paragraph leads me to suggest that, just as the PC suggests ‘aiming high’, we should do the same for our stimulus measures. If they’re too big, we’ll get inflation. But we’ve had too little inflation recently, and certainly when inflation is low (say below 4 percent), too little is substantially worse than too much – the obverse of the case with prices on private infrastructure. That may leave us the task of reining in inflation after the event, but there are no risk-free paths here, and my argument is that I’d rather have that problem than its obverse.

  1. Note parenthetically how the arts tried to muscle in on the STEM panic recently. Along they came and, Banksy-like, stuck an “A” for Arts into STEM, making it STEAM. Note that this injects something in a merely additive way – as if thinking that improves the world were just a whole lot of different kinds of thinking all added into the mix, without attending to their structural relations.
  2. People argue that the RBA’s reluctance has been attributable to its ‘leaning against the wind’ on asset prices. This is fine, but if that’s the case then, first, it should release some modelling taking us through the argument and identifying the risks, and second, had it done so, the assumptions its current strategy rests on would have been surfaced. A crucial assumption is that cutting reluctantly doesn’t create a greater risk of a bubble through the ‘one-way bet’ it gives markets, which know it’s likely to be years before rates rise – whereas the chances of rates rising after more aggressive action are much higher.