The Most You Can Lose?
July 3, 2013
In yesterday’s Wall Street Journal, there’s an article entitled “Tracking Risk Isn’t So Easy” that is, overall, a solid description of some of the problems with measuring risk and comparing quantitative risk levels from one firm to another. The article is worth reading because it redeems itself quite nicely despite a major blunder early on. You’d think that an outfit like the Journal wouldn’t make a rookie mistake, yet in one of the first few paragraphs they wrote: “…value-at-risk, or VAR, [is] the most common yardstick of trading risk. VAR is designed to measure the maximum trading losses faced by a bank in a single day” (my emphasis). Wrong. Very, very wrong. Believing that VaR measures the most you can lose is a mistake that reveals just how misunderstood risk management is, even by experts at the WSJ.
What if someone asks for the risk of a river overflowing and flooding a city? One response might be a statistical analysis to calculate the “Floors-at-Risk” or FaR. I can just imagine the headlines: “This month’s flood had a FaR of 2.5,” and the article would go on to explain that experts expected water to rise to 2½ floors before receding. Would anyone ever suggest that the FaR number is the highest the water could ever reach? Of course not! Everyone understands that water can rise much higher than the expected level. It’s pointless to talk about the “maximum possible height” of the water. What matters is the likelihood associated with each possible water height, with the expectation (and hope!) that the likelihood drops off quickly with increasing height.
The Value-at-Risk number is similar: it’s an estimate of how bad things can get most of the time, with the same expectation of diminishing likelihoods for bigger losses. VaR is certainly not the most you can lose. In just about every case involving “plain vanilla” securities like stocks and bonds, the most you can lose is everything: 100% of your investment (with derivatives, you can actually lose more than that). So thinking about the most you can lose is not really informative — rather, risk managers talk about the probabilities associated with losing various amounts.
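To make that concrete, here is a minimal sketch of the simplest flavor of the calculation, historical-simulation VaR: sort the observed daily returns and read off the loss at the chosen percentile. The return series here is hypothetical, generated just for illustration.

```python
import random

random.seed(42)

# 1,000 hypothetical daily portfolio returns (illustrative data only)
returns = [random.gauss(0.0005, 0.01) for _ in range(1000)]

def historical_var(returns, confidence=0.95):
    """One-day historical-simulation VaR: the loss that only
    (1 - confidence) of observed days were worse than."""
    sorted_returns = sorted(returns)
    index = int((1 - confidence) * len(sorted_returns))
    return -sorted_returns[index]  # report the loss as a positive number

var_95 = historical_var(returns, confidence=0.95)
print(f"95% one-day VaR: {var_95:.2%} of portfolio value")
```

Note what this number is: a percentile of the loss distribution, nothing more. It says nothing about how bad the remaining 5% of days can be — and it is certainly not a maximum.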
Investor Analytics once had a client, years ago, confused by this: they called to ask how it was possible that they lost more than their “Value-at-Risk” number. I pointed out that since their VaR number was set (by them!) at the 95% threshold, they should expect to lose more than that VaR 5% of the time — that’s what the 95% level means: it holds 95% of the time. In other words, it fails 5% of the time, so they should expect to lose more than this number 5% of the time, which works out to about 1 day out of 20 (there are roughly 20 business/trading days in a month). In this case, losing more than the VaR amount should be about as common as the full moon! Unswayed by this line of reasoning, they again asked how it was possible to lose more than that number. I was tempted to point out that they could have sold all the securities, taken the cash and lit a match — that would certainly lose more than any reasonable estimate of risk! Of course, decorum dictated that I phrase it a bit more politely.
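The “1 day out of 20” arithmetic is easy to verify with a toy backtest: estimate a 95% VaR from one stretch of hypothetical returns, then count how often later losses exceed it. Again, all data below is simulated purely for illustration.

```python
import random

random.seed(7)

confidence = 0.95
n_days = 5000

# Hypothetical daily returns (illustrative data only)
returns = [random.gauss(0.0, 0.01) for _ in range(n_days)]

# Estimate the 95% VaR threshold from the first half of the sample...
history = sorted(returns[: n_days // 2])
var_95 = -history[int((1 - confidence) * len(history))]

# ...then count how often losses in the second half exceed it
out_of_sample = returns[n_days // 2:]
exceedances = sum(1 for r in out_of_sample if -r > var_95)
rate = exceedances / len(out_of_sample)
print(f"Exceedance rate: {rate:.1%}  (expected ~5%, i.e. ~1 day in 20)")
```

An exceedance rate near 5% is not a failure of the model — it is exactly what a 95% VaR promises. Regulators’ backtesting regimes work on the same logic: they flag a model only when exceedances come much more often than the confidence level implies.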
Although yesterday’s WSJ article got this one very important point dead wrong, the rest of the article is quite good at explaining the difficulties in comparing different banks’ VaR estimates, and even in properly interpreting a particular bank’s calculation. There are so many acceptable variations in VaR calculation methodology that the number simply cannot be relied on for meaningful comparisons. Unfortunately, that doesn’t stop everyone from trying. By attempting to impose VaR as a standard measure of risk, regulators have given the impression that there is consistency in interpretation. But because banks and other investment managers are given so much latitude in acceptable calculation approaches, time horizons and confidence intervals, setting that false expectation will inevitably result in bigger problems down the road.
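To see how much that latitude matters, consider one and the same hypothetical return series reported under three perfectly “acceptable” parameter choices. The square-root-of-time scaling used for the multi-day horizon is itself a common but contested convention, which only adds to the comparison problem.

```python
import math
import random

random.seed(1)

# One set of hypothetical daily returns (illustrative data only)
returns = sorted(random.gauss(0.0, 0.01) for _ in range(2000))

def historical_var(sorted_returns, confidence):
    """Historical-simulation VaR at the given confidence level."""
    return -sorted_returns[int((1 - confidence) * len(sorted_returns))]

# The same portfolio, three defensible parameterizations
for confidence, horizon_days in [(0.95, 1), (0.99, 1), (0.99, 10)]:
    daily = historical_var(returns, confidence)
    # Square-root-of-time scaling: a common convention, not a law
    scaled = daily * math.sqrt(horizon_days)
    print(f"{confidence:.0%} VaR over {horizon_days:>2} day(s): {scaled:.2%}")
```

Three numbers, one portfolio, and no way to tell from a headline figure which choices produced it — which is precisely why comparing one bank’s published VaR against another’s is so treacherous.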