Nick Leeson – King of the Rational Emotional Decision Makers

Nick Leeson is a working-class lad made good. Then made bad. Then made good again. His story is famous the world over: he is the man who brought down an entire bank. Exactly how this was allowed to happen is of course a Black Swan in itself, but it did, and the fallout was immense – not least for Leeson himself, who ended up in a Singapore jail for several years before laudably coming back with a vengeance and writing a book about his recovery, the effects that stress can have, and how to beat them.

The exact details of what happened in 1995 are obviously complicated, but essentially the principle is exactly as the world understands it: by the end of 1992, the amount concealed in the error account was £2m; by the end of 1994 it was £208m; and by 23 February 1995 Leeson had fled the country, leaving a note saying “I’m sorry” and an account which held £827m in losses. Essentially, Leeson had got caught in a spiral of loss, partly due to his decision to use a system to try to correct what began as fairly modest errors.

The system – known as the Martingale system – is known to gamblers and traders the world over and must be the cause of more unhappy nights in Vegas than any other. Essentially, it says that if you lose your first bet (of, let’s say, $1) you simply double it on the next spin, or hand; if you lose that, you simply double it again, and so on and so on until you eventually win, as you eventually must. The inherent problem is that the system only works perfectly in theory. At some point within an infinite period of time, the monkey on the typewriter will bash out the works of Shakespeare, or at least a sonnet or two! What the laws of probability don’t say, however, is WHEN! Which means that an infinite bankroll is required to make the system work in practice – something that even Warren Buffett doesn’t have. And neither did Barings Bank!
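A quick simulation shows why the bankroll is the whole problem. This is just a sketch, assuming a fair 50/50 game and an illustrative $1,023 bankroll – enough to cover exactly ten doublings of a $1 bet:

```python
import random

def martingale_session(bankroll, base_bet=1, win_prob=0.5):
    """Double the stake after every loss until one win recoups
    everything plus base_bet, or the next bet is unaffordable."""
    bet = base_bet
    while bankroll >= bet:
        if random.random() < win_prob:
            return bankroll + bet      # one win pays back all prior losses, plus base_bet
        bankroll -= bet                # lose the stake...
        bet *= 2                       # ...and double up for the next round
    return bankroll                    # bust: can't cover the next doubled bet

random.seed(7)
start = 1023                           # covers bets of 1, 2, 4, ... 512
results = [martingale_session(start) for _ in range(100_000)]
busts = sum(1 for r in results if r < start)
print(f"Busted sessions: {busts / len(results):.3%}")   # roughly 0.1%
```

Almost every session ends a dollar up; roughly one in a thousand loses the lot – and that rare bust wipes out more than a thousand sessions’ worth of winnings, which is why the system’s expectation is no better than the game it is played on.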

But here’s the question: why – when he was £100 million down – did Leeson, an apparently intelligent, determined young man, keep going? Why not throw in the towel, concede defeat and do the time that he was ultimately sentenced to do? Why push the envelope to the point of no return and potentially destroy everything for everyone in the process?

The answer lies in the meanings or values that Leeson placed on the different possible outcomes of the potential opportunities available to him.

The only thing that meant anything at all to him at that point was just breaking even. Reducing the losses to £70 million or £40 million meant nothing: he was still going to lose his job and almost certainly go to jail in that eventuality. Increasing the losses to £200, £300, even £400 or £500 million was not really going to worsen his situation. He had got to a stage where the downside remained the same whatever happened, while the meaning of the upside (his possible reward) was everything. Effectively, doubling through and breaking even meant safety; it meant freedom; it meant employment; it meant an end to the stress which eventually gave him cancer. It meant getting the life he loved back.

In this way Nick Leeson was no different to the millions of gamblers who step inside the billion-dollar casinos in Las Vegas or Sun City or Mayfair or anywhere else around the world. Largely otherwise intelligent, rational people make decisions for an evening which deep down they know will cost them money. People who play slot machines, for the most part, know deep down that the house will win in the long run. I’m not saying that there aren’t people in those places who become convinced that they are “hot” or that they have a “system” – but most know better.

For Nick Leeson, whatever the probability of the downside, the meaning he placed on losing even more money was minimal. The only thing which meant anything at all to him was breaking even. This was immoral, certainly, given the consequences of his actions for thousands of people but rationally… well, rationally it made sense! Given a similar lack of moral fibre any one of us might well do the same in the same situation.

Certainly given a different situation which, for example, forced us to make a decision to do something which might kill us but which might also very well save the life of our child, which mother or father would not take this course of action? Naturally, this is a hypothetical situation with no detail at all and the first thing you would want to know would be the probabilities involved – but we’ve already covered that. The point is that the meaning we place on the possible upside as a parent would far outweigh the potential downside. Somewhat morbidly, in order to more closely simulate Nick Leeson’s emotional calculation we would probably have to assume that we were likely to die in either event – a factor which would make the ultimate decision a no-brainer.

The situation that Leeson found himself in was very different to going into a casino with £50 (or even £50 million, like Kerry Packer did) that you’re happy and prepared to lose. It’s more akin to the position that the poor sap who has heard about the Martingale system finds himself in after maxing out two credit cards in a desperate bid to win back that first $10 after a statistically freakish but eminently possible ten-spin losing run (1,023-1 against). Now, we are anything but happy. We are anything but prepared. But in much the same way, if for very different reasons, there is very little additional pain we can experience. It’s not just our credit cards that are maxed. So are our pain levels. And the only way is up.
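The arithmetic of that doomed chase is easy to check. A sketch – assuming an even-money, fair-coin game, which actually flatters real roulette, where the zero makes each spin slightly worse than 50/50:

```python
base = 10          # the first $10 bet
losses = 10        # a ten-spin losing run

next_bet = base * 2 ** losses           # the bet the system now demands
total_lost = base * (2 ** losses - 1)   # geometric series: 10 + 20 + ... + 5120
run_odds = 2 ** losses - 1              # odds against the run, fair even-money game

print(f"After {losses} straight losses you are ${total_lost:,} down,")
print(f"and the system demands a ${next_bet:,} bet to win back ${base}.")
print(f"Odds of the run: {run_odds:,}-1 against")
```

Ten losses chasing a $10 win leaves you $10,230 down with a $10,240 bet still to place – the maxed-out credit cards write themselves.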

Posted 02:47pm by Caspar and filed in Decision Making, Risk

The decision making process – calculating risk

I’m often minded to think back to the “financial crisis” and its causes and fallout. As an exercise in uncertainty, you don’t have to go much further than the life of Sir Fred Goodwin: once revered as the banker who led Royal Bank of Scotland from a relatively small provincial institution to the fifth largest bank on the face of the planet; now reviled as the man who bankrupted it in the process of doing so.

Getting to grips with the fundamental aspects of risk, what we now have to understand is that even if Fred Goodwin were the most competent and brilliant banker on the planet… the law of uncertainty says that there was still a probability that what happened would have happened. It has before and it will again. That’s not to say that Goodwin did all he could to prevent it. But it is to say that no-one could have done everything. For in any system there will always be things we cannot control: “A man does all he can and his destiny will reveal itself.”

The most important question as viewed through this lens is, therefore: did they do all they could? Were the models used by financial institutions accurate? Benoit Mandelbrot has been warning for a long time that they aren’t. According to the conventional wisdom taught around the world, he says, the chances of the fall in stocks on August 31st 1998 were 1 in 20 million; the odds of the Dow’s 7.7% fall the previous day 1 in 50 billion; and the odds of Black Monday 1987 1 in 10 to the power 50 – “odds so small,” he says, “they have no meaning”.

Nothing is certain. Everything has just a probability. The important thing, where the world’s finances are concerned, is to estimate the probabilities effectively. If bankers think the chances of meltdown are less likely than they actually are then obviously they’re going to take more risks than if they estimate them accurately. Likewise in life, if we don’t know the probabilities of outcomes associated with what we do then we’re going to make similar uneducated, potentially poor, decisions.

Ultimately, whenever we do anything our minds consider future possibilities much like an actuary calculating the value of an insurance policy. At a subconscious level we understand and accept uncertainty and do a multitude of tiny little calculations that will ultimately decide what we do. Specifically, we are always multiplying the probability of a possible event by its outcome or impact: we know the probability of our plane crashing is very low, but we feel that the actual impact of such an event would be huge, combining as it does not only the chance of dying in a fireball but also the 30,000 ft fall beforehand! Probability times potential outcome here (to give a result that will of course differ for each of us) usually creates a number high enough to cause concern but not high enough to stop us making the decision to “take the risk.”

Asking someone on a date is a very different kind of risk. On this occasion, the overall potential impact of the downside is indignity or shame of possible rejection. Not as great, perhaps, as dying in a fireball. But the probability of experiencing it is higher, hence the fear that many of us have of it and why many are less inclined to do it than get into an airplane – or even jump out of one.

The exact probability we’d put on being rejected is going to differ from person to person and situation to situation. In order to make this decision, then, we need not only to make some kind of assessment of the chances of a negative response but also how significant the pain of this possible rejection might be – something that will depend on a large number of private, personal, considerations, not least whether we’ve experienced it before.

But there’s another consideration we take into account in order to make this and every other decision that we ever make: for pain and hurt are not the only possible outcomes of this action. There’s also the potential reward of success – and of course the probability of experiencing it.

And by putting these two probabilities and outcomes together – success and failure – we come up with an expectation which ultimately will define what we decide to do. Whether or not our decision turns out for the best, of course, is a very different matter!
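The calculation itself is tiny. A sketch for the date-asking example above, with entirely made-up probabilities and an invented -100 to +100 pain/reward scale – everyone’s numbers will differ:

```python
def expectation(outcomes):
    """Weighted sum of (probability, value) pairs – the calculation
    our subconscious is said to run before every decision."""
    total_p = sum(p for p, _ in outcomes)
    assert abs(total_p - 1.0) < 1e-9, "probabilities must cover all outcomes"
    return sum(p * v for p, v in outcomes)

# Asking someone on a date, on a made-up -100..+100 scale:
ask = expectation([
    (0.4, +80),   # they say yes: large reward
    (0.6, -30),   # rejection: real but survivable pain
])
print(f"Expectation of asking: {ask:+.1f}")   # positive => worth the risk
```

With these invented numbers the expectation is +14, so we ask – raise the pain of rejection or lower the odds of a yes and the sign flips, and so does the decision.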

Probability in everyday conversation

The problem with probability as a language is that it doesn’t sit well in our minds. As human beings we tend to express ourselves in single-outcome terms: France will beat Scotland because they are by far the superior team… Milan will win the Champions League final now they are three goals up… Greece have no chance of winning the European Championships because they simply don’t have the players. And yet, in these and many other cases, the spectacularly unlikely happened and once again the single-outcome prediction fell down.

What’s fascinating, though, is that while our conscious minds feel uncomfortable with probability, our subconscious minds don’t appear to have a problem with the concept of uncertainty and probability in the same way.

Ask a room full of successful people what their biggest risk was (as I do as part of my work) and you will get an array of answers, from setting up their own company to moving to a different country. Every time, when you challenge them and ask whether they thought they would be successful, “Yes,” they answer, “otherwise I would not have done it.” But ask them why they chose to define it as a risk and they will very soon concede that – on some level – they knew that they were not definitely going to be successful, and that this chance of failure was precisely what defined it as a risk!

Posted 01:55pm by Caspar and filed in Risk

PERT analysis

The 1950s were a time of great innovation and advance. Some developments, like the Hula Hoop and the Hovercraft, may well have resulted from individual inspiration, but some – like the space race – were motivated by the need to win the Cold War. When the Soviets launched Sputnik in 1957, the Americans became convinced that their project management system was deeply flawed and launched a number of initiatives aimed at tackling this deficiency. One of these – instigated by the US Navy – was the Program Evaluation and Review Technique, or PERT.

After looking at the outcomes of hundreds of projects, PERT analysis concluded that – as Helmuth von Moltke the Elder had said 100 years previously – “No battle plan survives first contact with the enemy.” Another way of saying that single-outcome predictions are pointless. Moltke was himself drawing on Napoleon, 50 years earlier still, who wrote that “Plans are usually useless but time spent planning is invaluable” – a reference to the benefits of scenario planning for the variety of different possible outcomes which may befall any decision, action or endeavour.

Specifically, PERT analysis concluded that – while projects almost never hit their original targets (usually coming in over budget, rarely under) – a more effective way of estimating the time and money needed was to aggregate various optimistic, pessimistic and realistic estimates, giving each one a weighting according to its likelihood (or probability). While theoretically every organisation would have different probabilities, the ones created by the US Navy fit most purposes and are as follows.

Since people’s estimates were usually optimistic – because people naturally overestimate how much of any system they can control, and put the interventions of fate down as freak occurrences which, now the lessons have been learned, will not happen again – the weighted estimate was to be four times the pessimistic, plus the realistic, plus two times the optimistic, divided by seven, or:

(4P + R + 2O) / 7
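As a sketch in code, with invented estimates (in days) for a hypothetical project:

```python
def pert_estimate(optimistic, realistic, pessimistic):
    """Weighted estimate per the scheme above: (4P + R + 2O) / 7."""
    return (4 * pessimistic + realistic + 2 * optimistic) / 7

# Hypothetical project, estimates in days:
print(pert_estimate(optimistic=20, realistic=30, pessimistic=55))
```

(For what it’s worth, textbook PERT more commonly weights the middle estimate – (O + 4M + P) / 6 – but the scheme above deliberately leans pessimistic, for exactly the reasons just given.)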

Crucially, the point of this analysis was NOT to tame uncertainty in the sense that the result would magically predict the future; rather, the figure created – along with the figures for other projects – would correctly predict the average time and budget needed. The figure produced is essentially just an expectation calculation. Instead of two possible outcomes (win or lose), in this case there are three, and they are weighted according to their probabilities in exactly the same way as the win or lose figures were.

In this sense, though – since the actual result was going to be roughly one of the three figures – PERT analysis represents an industrial game of Deal or No Deal with seven boxes containing the following amounts:

Optimistic
Optimistic
Realistic
Pessimistic
Pessimistic
Pessimistic
Pessimistic

At some point one of the boxes will be opened and that will be the result, but until that time all we have is the multi-outcome expectation. Until then, there is no “one” future but many possibilities, each with its own attendant probability. This is one of the reasons for James Surowiecki’s observation in The Wisdom of Crowds that expert forecasts are usually less accurate than the combined guesses of a crowd of people who are diverse, independent and decentralized. The aggregation of such different opinions acts as a kind of Monte Carlo analysis that gives all possible futures consideration in the calculation and produces a kind of “expectation” or average of all possible futures. (See Chapter Two)
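The seven-box picture translates directly into a Monte Carlo sketch – again with invented figures (20, 30 and 55 days for the optimistic, realistic and pessimistic boxes):

```python
import random

# 2 optimistic boxes, 1 realistic, 4 pessimistic – per the list above
boxes = [20] * 2 + [30] * 1 + [55] * 4

random.seed(0)
draws = [random.choice(boxes) for _ in range(100_000)]   # open a box, many times
average = sum(draws) / len(draws)

print(f"Monte Carlo average: {average:.2f} days")
print(f"Weighted estimate:   {(4 * 55 + 30 + 2 * 20) / 7:.2f} days")
```

Any single draw is 20, 30 or 55 – never the weighted figure – yet the average across many draws converges on it, which is exactly the sense in which the expectation “correctly predicts the average”.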

Indeed, when people throughout history have made single-outcome predictions and prognostications, they have been setting themselves up to fail: “Who the hell wants to hear actors talk?” asked Jack Warner. “Everything that can be invented has been invented,” declared Charles H. Duell, Commissioner of Patents, in 1899. “640KB ought to be enough for anybody,” said Bill Gates in 1981.

And yet we shouldn’t laugh at any of these individuals: those were the best guesses we had from experts in their field. If someone had told you in August 2001 that the world’s airlines would lose more money in the next 18 months than they had collectively made in profit since their inception, you’d have thought they were crazy – and yet that is exactly what went on to happen. If someone had told you in spring 2008 that Britain’s banking system would be all but nationalised by Christmas…?

The only meaningful way of talking about the future is to use the language of probability. By doing so we not only avoid the pitfalls of “pinning our colours to the mast” we also manage to tame uncertainty as well as we can – just ask any project manager!

Posted 01:48pm by Caspar and filed in Innovation, Uncertainty

The Speed of Trust

It’s not easy being a professional poker player. On the plus side, admittedly there is an unparalleled amount of autonomy and an absence of accountability: no boss, no clients, no shareholders to report to. On the other hand, it’s one of the few jobs where you can do everything right and come home down overall on the day. It’s a world in which you can win at the table only to be mugged on the way back to your hotel room with your winnings in your money belt. The risks extend way beyond those which you can logically calculate in a 52 card deck. No wonder it was once described as a hard way to make an easy living!

The most trying tribulations are often the little things that you wouldn’t necessarily think about – like what you put as your profession on your passport if you want to go on holiday to Dubai. Getting car insurance is difficult, income replacement is obviously impossible! This week, as if all that were not enough, I was refused a bank account.

The meeting was embarrassing and awkward in the way that we’re all used to in the call centre economy of 2012. Obviously the person I was talking to was not to blame. He was not the decision maker in the way that he would have been 20 years ago. He just typed my details into a computer and it dutifully said “no”.

With the exponentially increasing processing power of computers, such impersonal, data-based decisions make perfect sense. In the aggregate, they trump intuition and, of course, they’re cheaper to churn out. The behavioural biases of human judgement are replaced by the cold, hard calculations of an algorithm which does not demand sick pay or annual leave.

Somewhere, a computer processed my credit score – a function of my living arrangements and financial repayments in the last 6 years – and reckoned that there was a high probability that I would run up an overdraft and abscond. It doesn’t know who I am or what I’ll actually do any more than Amazon knows what I want to read when it recommends books that I might enjoy. But – like Amazon – it compared the data it has on me with lots of people like me and made a probability assessment which, in my case, was not good.
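That “people like me” comparison is, at heart, a nearest-neighbour calculation. A deliberately toy sketch – the features, records and applicant here are all invented for illustration; real scoring models are vastly richer:

```python
# Toy history: (years_at_address, missed_payments_in_6yrs, repaid_ok)
history = [
    (10, 0, True), (8, 1, True), (1, 5, False),
    (12, 0, True), (2, 4, False), (0, 6, False),
]

def repay_probability(applicant, records, k=3):
    """Estimate P(repays) as the repaid fraction among the k past
    customers most similar to the applicant (squared distance)."""
    def dist(r):
        return (r[0] - applicant[0]) ** 2 + (r[1] - applicant[1]) ** 2
    nearest = sorted(records, key=dist)[:k]
    return sum(r[2] for r in nearest) / k

applicant = (1, 4)   # short address history, several missed payments
p = repay_probability(applicant, history)
print(f"Estimated P(repays) = {p:.2f}")   # the computer says no
```

The algorithm never asks who the applicant actually is; it only asks what people who look like them did – which is precisely why it can be both right in the aggregate and wrong about any individual.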

The link between such an assessment and the world of gambling is, historically, a close one. In the 16th Century, renaissance Italian mathematician, physician and compulsive gambler Geronimo Cardano laid the foundations of the language of probability in a work which gave more attention to his recommendations for ways to cheat at games of chance than it did to the new and revolutionary branch of mathematics that he’d produced. More famously, the creation of expectation theory was Fermat and Pascal’s solution to the Problem of the Points, posed to them by another gambling addict, in this case their friend the Chevalier de Méré.

It’s hard to overstate the importance of such works. In the creation and application of the language of probability, Cardano, Fermat and Pascal – along with numerous others in the years that followed (Laplace, Huygens, Bernoulli) – laid the foundations for the industrial development of the instruments of modern finance that have profoundly shaped the world in which we live. It is a long time now since their formulations, born in the lowest of all low culture, found their apotheosis in the world of high finance. So what on earth does the skillset of a professional poker player have to bring to the world of credit assessment which so thwarted me this month?

Ultimately of course, in a hand of poker, I’m trying to do exactly the same job as the bank manager or the algorithm that has replaced him: I’m trying to assess the honesty of the person sitting opposite me. It’s significant, of course, that the word credit itself derives from the Latin credere, to believe or trust, and poker is a rare – perhaps unique – activity in that it legalises lying, so assessments of honesty are fundamental to success. In this way, though, poker just distils a process of probability assessment in which we’re engaged in each and every interpersonal encounter: does this person threaten us as we encounter them in the street? Do we trust that they’ll pay us back if we lend them money?

As with most assessments that we make, the majority of the time we do it subconsciously: we match the patterns of a person’s dress, behaviour and facial movements with our database of information built up over the course of a lifetime – in the same way that Amazon matches our purchasing records with similar customers in order to assess what we’re likely to buy.

Sometimes, though, even Amazon gets it wrong and so do we. The other day I thought a guy was going to mug me on the way to my hotel room, only to find that he was running towards me to return my phone that I had dropped at the poker table! It all happened very quickly and I based my first assessment on a limited amount of information. As more “data” became available I revised my initial assessment and was more accurate as a result.

As accurate as my bank’s algorithm is in the aggregate, in my case it got it wrong. Far from posing a credit risk, I have no desire to take on any kind of debt and every intention of depositing a six-figure sum to open the account! I tried explaining this, but the man on the end of the line had no way of arguing with a computer that could not take on new information.

While we call this the age of big data, the amount of information contained in my credit score is actually a tiny proportion of the whole. This is not just personally frustrating for me; it is affecting the profits of companies who rely on such algorithmic assessments for their income. Sure, they’re better than the assessments made by the relatively low pay-grade personnel who inhabited these roles 20 years ago. But they can be better still – and they will be. The question is: who is going to steal a march on the competition and really open up this kind of assessment to a revolution by using the kind of granular and detailed information about me that exists out there in cyberspace?

In a world of scarcity, decisions must be made that maximise ROI and reduce risk and some people are definitely better “bets” than others. But turning me – and others who exist outside the norms of a system – down is like folding to a bluff. It’s poor play and players who can exploit this inefficiency can make a lot of money.

Posted 01:44pm by Caspar and filed in Decision Making, Risk

Life’s not fair

“That’s not fair!” I cried as my friend completed the set of Vine St, Marlborough St and Bow St that would eventually lead to victory that evening.

I wouldn’t have minded, of course, had he completed the transaction fair and square, but the deal was sealed with the offer of both four hundred pounds of Monopoly currency and twenty pounds of actual sterling! “That’s cheating,” I shouted again. And thus began an argument which still rages to this day, twenty years on. To me, there is no debate; what he did was outside the rules of the game. It was cheating.

I was reminded of this experience recently when, in April of this year, the FBI seized the internet domains of the two biggest online poker sites, PokerStars and Full Tilt. The action finally brought an end to their US operations, five years after the uncertainty initially created by the Unlawful Internet Gambling Enforcement Act of 2006.

Some have viewed this definitive action to be a good thing, but the unacceptable sting for many players active on the sites at the time is that whilst Full Tilt had a reported $400m in players’ funds it only had $60m in the bank – apparently placing its faith in a kind of fractional reserve system of deposits that is likely to leave thousands of players nearly bankrupt.

“That’s not fair,” they have understandably cried, as two of the site’s founders are alleged to have taken dividends of tens of millions from the site, leaving the vast majority with almost nothing to show for years of play/work!

Interestingly, while what has happened is almost certainly illegal, and therefore obviously much more serious and profound than a bit of cheating at monopoly, the central problem is very similar: what happened took place outside the basic rules of the game. It still happened, however, and no amount of shouting or complaining is likely to change that fact.

The common perception of poker is that it is broadly a game of epistemic risk. Complete novices sometimes think that it is a game of deductive logic, like blackjack, that starts and ends with the likelihood of a particular card coming down. After playing the game for even just a few hours, however, it becomes clear that this is only a very small part of the story and that in practice playing the game involves making a series of inductive assessments about our opponents – assessments which improve with the experience that leads to greater accomplishment and expertise. In this respect, the decision-making process of poker is very like that of any other realm of life.

Where poker is unlike life is that you are often forced to make such a decision every 90 seconds or so. Given that you cannot abdicate, delegate or procrastinate while making them, they are even less like many of the decisions we make in life, which are often left unaddressed for as long as possible! Similarly, the decisions we make in poker are often very one-dimensional, focusing merely on the allocation of money, in the form of poker chips, for the purpose of accumulating more in the long term. There is rarely any consideration of society or the greater good of those around you – which, ultimately, is why I stopped playing professionally.

But where poker is exactly the same as life is that – despite a recognized set of rules by which the game can be played reasonably and fairly on a daily basis – from time to time those rules will be breached. Often, such a contravention will be unfair to one or more parties: people may lose money, in the real world they may lose things much more valuable and precious.

In The Black Swan, Taleb defines the “Ludic Fallacy” as the misuse of games to model real-life situations, but in actual fact both are a perfect metaphor for the other and Taleb goes on to say as much. Each has a set of rules which – if and when adhered to – make their play relatively straightforward and enjoyable. In games, however, people cheat and poker sites go under and in real life governments default and whole currencies collapse.

He creates the hypothetical situation of a “fair” coin being flipped ninety-nine times and coming up heads every time. Two fictitious creations of the author – Fat Tony and Dr John – are asked for the chances that the next flip will also bring up heads. Dr John’s answer: “obviously fifty-fifty; independent events have no bearing on each other”. Fat Tony’s: “near enough one hundred percent heads”, on the grounds that any coin that flips heads ninety-nine times straight is obviously biased. In other words, just because a game appears to have rules doesn’t mean that they are – or always will be – adhered to.
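Fat Tony’s instinct is just Bayes’ theorem at work. A sketch with an invented prior – suppose we start out believing there is only a one-in-a-thousand chance the coin is a double-header:

```python
def posterior_biased(prior, heads_run, bias_heads_prob=1.0):
    """P(coin is biased | we saw heads_run heads in a row)."""
    p_run_if_biased = bias_heads_prob ** heads_run
    p_run_if_fair = 0.5 ** heads_run
    num = prior * p_run_if_biased
    return num / (num + (1 - prior) * p_run_if_fair)

print(f"{posterior_biased(prior=0.001, heads_run=99):.10f}")   # → 1.0000000000
```

Even a deeply sceptical prior is overwhelmed by ninety-nine straight heads – Fat Tony’s “near enough one hundred percent” is mathematically sound, and Dr John’s mistake is refusing to update his model of the game itself.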

(And even if they are, the probability is never an objective 50%. Actually, the uncertainty principle reigns even here, as Dr Jeffrey Hamilton demonstrated accidentally in October 1972 when, in a lecture on probability at the University of Warwick, he tossed a 2p coin and was as astonished as everyone else to witness it land on its edge!)

Such considerations have the interesting effect of making poker much more a game of aleatory or unpredictable – some might say unquantifiable – risk. The probability that the next card is an ace is arguably far less significant in our overall calculations than the chances that the game we’re playing is somehow rigged, or the likelihood of being robbed on our way to the car park, or the probability that the website on which we’re playing doesn’t actually have our money in it, or… who knows what?

When such breaches occur, we may feel aggrieved; we may be moved to protest; we may seek revenge; and we may be justified in any and all such responses. But, hopefully, there is one thing we know for sure above all else… from time to time such things will happen. They will. Life… is unfair. And any long-term strategy that does not account for such possibilities must surely be inherently flawed. If we play the game assuming that nothing unfair and outside the rules will ever happen, then we’re not playing the game particularly well.

The only question is what will we do in response? Will we have factored such events into our analysis and strategy? Or will the memory of such events still hurt twenty years on?

Posted 01:01pm by Caspar and filed in Decision Making, Uncertainty