Today on (Un)Calculated Risk we welcome Shaun Rai, a Managing Director at Montague DeRose and Associates, as our guest contributor (and another outstanding IA client!).
A bond salesman friend recently had his home on the market. It had been listed for a month or two when I asked him if there had been many showings, or any offers to buy the house. He indicated that there had been some interest but no actual offers as yet. I asked him what the house was worth. He responded, “Not sure, but I know what it’s not worth.”
In light of the recent increased focus on using municipal bond call option valuations to assess refunding opportunities and, potentially, award competitive bids via TIC+, this anecdote serves as a useful reminder that it is important to distinguish between “information” and “price.”
A price is the amount a willing buyer will actually pay a willing seller for a good or service. In this sense, prices do not exist for municipal bond call options, for there is no traded market for these options. An issuer cannot sell the call option embedded in its bonds. An investor cannot buy a call option on bonds it owns to cover the call option it has sold to the issuer. Callable and non-callable bonds of the same maturity with the same credit are very rarely offered to the same investors on the same day.
Thus, market participants can only estimate the “value” of municipal call options using option pricing models. And in doing so, they must input key pricing parameters which cannot be precisely extracted from actual, traded market prices. For example, there are no actively traded non-callable yield curves, nor is there a forward municipal bond market. Given these limitations, using theoretical option values to assess a refunding opportunity is, at its core, a convenient “short form” way to do probability-weighted scenario analysis in which the results are a function of the assumptions used.
This leads to the conclusion that the only “price” that can be established for a muni call option on an outstanding bond is the present value savings that an issuer is willing to accept to execute a refunding of that bond. If the issuer executes a refunding for present value savings of $5 million, that is the “price” of the call option on that day. If an option pricing model indicates that the theoretical value of the option is $6 million, that is “information,” but it is not a “price.”
Does this mean that using estimates of muni call option values is not useful? As a dyed-in-the-wool derivatives guy, my opinion is definitely not. Estimates of muni call option values can be very helpful in thinking about whether to pull the trigger on a refunding or if it makes sense to use lower coupon bonds to achieve a lower yield-to-maturity. However, it is important to emphasize that the call option valuation is “information” -- it is not a “price” -- and should be viewed and used in the same way an issuer would use more traditional scenario analysis.
Confusing “information” generated by models with “price” confirmed by the market can lead to poor decision making. My friend the bond salesman knows the difference – until he sells the house, he doesn’t know what it’s worth.
Shaun Rai is a Managing Director at Montague DeRose and Associates, a leading municipal financial advisory firm based in California, whose clients include many of the largest issuers of municipal bonds on the West Coast, including the State of California and the State of Washington. Shaun can be contacted at firstname.lastname@example.org or 805-319-4145.
"Despite its role in...finance, the expectations hypothesis (EH) of the term structure of interest rates has received virtually no empirical support." - Predictions of Short-Term Rates and the Expectations Hypothesis, Federal Reserve Bank of St. Louis
As we've written on these pages before, forecasting is a necessary evil in finance. It's uncertain by nature and of course the longer the horizon, the more difficult the job. The theory that forward rates are good predictors of future realized rates is called the expectations hypothesis and as one MIT professor put it, "If the attractiveness of an economic hypothesis is measured by the number of papers which statistically reject it, the expectations theory of the term structure is a knockout."
For fun (and to dust off my fast-fading coding skills) I went back and looked at how implied forward 10Y US Treasury rates have done in forecasting realized 10Y UST yields from July 1959 to the present. I used first-of-month data for 3-, 6- and 12-month T-bills as zero rates (making the appropriate daycount adjustments of course) and then 2-, 3-, 5-, 7-, 10-, 20-, and 30-year UST coupon instruments for the implied 10Y forward calculations. And this is what we get...
The red line is the actual 10Y yield over the period and the "hair" is the implied 10Y par yield 1, 2, 3, and 5 years forward. The way to read this, then, is to look at how often the hair tracks the actual realization of the 10Y yield shown by the red line. In general, during this single big rate cycle we've seen over the last 50 years, forward rates badly underpredicted when rates were going up (note the implied decreasing 10Y forwards during the 70s) and then overpredicted over the last 30 or so years as rates have fallen. How badly do forwards do? Well, over this 50-year span, and this holds over most subperiods as well, you'd be better off as a forecaster just assuming today's yield curve stays constant, i.e. treating rates as a random walk.
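For the curious, the core of the forward calculation is just compounding arithmetic. Here's a minimal Python sketch with made-up zero rates; my actual run stripped zeros from the full coupon curve, daycount adjustments and all:

```python
def implied_forward(z_short, t, z_long, T):
    """Implied annualized rate for the period t..T, backed out from two
    spot zero rates (annual compounding): investing out to T must equal
    investing to t and then rolling at the forward rate."""
    return ((1 + z_long) ** T / (1 + z_short) ** t) ** (1.0 / (T - t)) - 1

# Hypothetical zeros: a 2Y zero at 3% and a 12Y zero at 4% imply the
# 10Y rate, 2 years forward -- about 4.2%
fwd_10y_2y = implied_forward(0.03, 2, 0.04, 12)
```

With an upward-sloping curve the implied forward sits above today's spot rate, which is exactly why the "hair" keeps pointing up while realized rates spent 30 years drifting down.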
Let's look at the tax-exempt market. Analyzing today's tax-exempt yield curve (non-call) we see an implied increase in the curve over 10 years, though we think not in a particularly realistic way. The bottom line in the chart below is the current non-call tax-exempt curve from 1 month out to 30 years (labeled in green).
Each successive curve above it is the implied forward yield curve in 1-year forward increments from 1 year through 10. Over the 10-year horizon, you can see the 1-month tax-exempt rate smartly moving up over 500 basis points, equivalent to a 7% slam on the monetary brakes by the Fed. However, this is accompanied by only a 1.45% move in the long end of the curve, from 3.73% up to 5.18%. Realistic? Perhaps, but we'd probably expect to see a higher 30Y rate if the Fed were really that active over the next 10 years.
Don't get me wrong - if you're in a financial services environment as a trader or you're looking to perform a fair price analysis of an interest rate derivative using an interest rate model, you better use forward rates. If you've got complete and relatively efficient markets, you'll get your head removed if you don't. However, if you're an issuer or working with an issuer looking at some sort of scenario analysis on their debt portfolio, forward rates may be a "good to know" but probably not the end of the forecasting road.
If you've taken a break from the news lately, you may have missed the hot water Bloomberg's found themselves in over Bloomberg reporters accessing certain information about Bloomberg users. Finance types scouring for the proverbial free lunch in the markets are understandably private, and the prospect of some Bloomberg journalists looking over their shoulders from those comfy midtown offices is, well, unsettling. Of course this is likely overblown by the non-Bloomberg media, but we thought the message we got today after logging in to our own Bberg terminal (below) was particularly entertaining and candid...
Doesn't anyone watch Mad Men over there?
"Horse sense is the thing a horse has which keeps it from betting on people" - W.C. Fields
A bookie matches bets. If a bookie sees lopsided interest in bettors taking the Giants by 3 in Sunday’s big game against the Patriots, she’ll ultimately need to adjust the odds she is offering to get a matched book. The instant before the game starts, you could calculate an implied probability for each team winning based upon the bets in the book. Does the bookie care what these specific probabilities are? Absolutely not. Does the bookie even care which team wins? Not if she’s done her job right. The only thing the bookie cares about is that the sum total of the implied probabilities for the teams adds up to more than 100%. Because the amount over 100% is her vigorish – and that’s how she buys dinner. To borrow a term from asset pricing theory, her position is “risk-neutral”.
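For the numerically inclined, the bookie's arithmetic fits in a few lines of Python. The decimal odds below are hypothetical:

```python
def implied_probabilities(decimal_odds):
    """Implied probability of each outcome is 1/odds; the amount by which
    the probabilities sum past 100% is the bookie's vigorish."""
    probs = [1.0 / d for d in decimal_odds]
    vig = sum(probs) - 1.0
    return probs, vig

# A hypothetical two-way market: both sides offered at decimal odds of 1.91.
# Each side implies about a 52.4% "probability" -- the 4.7% excess over
# 100% is how the bookie buys dinner.
probs, vig = implied_probabilities([1.91, 1.91])
```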
Contrast the bookie now with bettors, who are in a different position entirely. Bettors might conduct fundamental research into the severity of center Ryan Wendell’s sprained ankle, or how wide receiver Victor Cruz’s hamstring is doing. Bettors are obviously interested in the odds and payout from the bookie, but they are concerned as much or more with the real world likelihood that the Pats will win the day. On this latter analysis of the real world rests the core betting decision. It would certainly be wrong to conclude that the bookie held all relevant analysis of the probabilities based on the bets in her book.
Of course, this describes our financial markets as well. Does a vanilla equity options trader care whether the Jan14 40 MSFT calls expire in the money? Not one bit. His money is made regardless of outcome: he runs a matched book functionally identical to the bookie's. But how about investors who hold a position in those same options? Absolutely they care. And in the case of options there is an important extension to consider. The somewhat radical and, at the time, very unintuitive conclusion of Fischer Black, Myron Scholes, and Robert Merton is that for purposes of pricing an equity option, the trader must assume the growth rate of MSFT to be the risk-free interest rate. The same "risk-neutral" term again applies: the options are fully hedgeable, and as such the underlying growth must be assumed to be the risk-free rate or arbitrageurs will enter the market and make it so. The investor, in order to evaluate a buy or sell decision, must of course examine the real-world expected growth rate of MSFT; it is the real-world growth that matters. Is there a difference? Over the last 5 years MSFT's annual ROE has averaged about 40.1%. Over the same time 6M T-bills have yielded 0.24% - big difference.
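To make the risk-neutral point concrete, here's a bare-bones Black-Scholes call sketch in Python. Note what's conspicuously absent: any estimate of MSFT's real-world growth rate. The drift is the risk-free rate, full stop. The inputs below are hypothetical, not actual market data for the Jan14 40 calls:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes call price. The stock's drift is r, the risk-free
    rate -- its real-world expected growth appears nowhere."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Hypothetical inputs: spot 35, strike 40, 1 year, 0.24% risk-free, 25% vol
price = bs_call(S=35.0, K=40.0, T=1.0, r=0.0024, sigma=0.25)
```

Change the assumed growth of the stock to 40% and the trader's price doesn't move; only r, the risk-free rate, enters the formula.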
At Intuitive Analytics we welcome the increased application of asset pricing and valuation theory to the municipal market. We believe it is a very positive and constructive development which ultimately can lead to better decisions under uncertainty. At the same time we believe, particularly in light of recent financial history, that the right models should be used the right way, for the right reasons and with well-vetted inputs. There is important context to consider in the use of any financial model. Though it might be nice if we all could live in the bookie's world to make our vig, states, municipalities and the dedicated debt managers who serve them live in the real one. They have to take real, unhedged positions; their job is harder.
"If people do not believe that mathematics is simple, it is only because they do not realize how complicated life is." - John von Neumann
I’ve landed on Earth from outer space, in these here United States (somehow feel right at home in Williamsburg…) and am trying to learn from you Earthlings how to calculate present value savings in public finance. During my journey here I tried to educate myself by reading a variety of materials.
In the end, I’ve seen 2 methods. The first and more commonly used to date I'll call the Single Rate Method (SRM) which usually uses the arbitrage yield for discounting and the second I call the Zero Method (ZM) where different zero rates from the term curve are used to discount each cash flow. In order to understand these two methods better, I ran some refunding numbers and to my surprise what I discovered is that both methods are WRONG. I’m confused. Here’s what I did…
I took a hypothetical 20-year, 5% municipal bond callable at 102 in 3 years and ran some refundings, 20 to be exact. I used a single maturity, non-call par bond for each refunding from year 1 to year 20. Coupons/yields for the 20 individual refundings are shown in the 2nd column in the table below.
The maximum hypothetical escrow yield is 2% so the first 4 scenarios limit the escrow yield to the arbitrage yield of the respective refunding bond. To the size of the escrow (remember, 3 year call at 102) we add 1% for costs of issuance to arrive at the new bond size in the 'New Bond Size' column. In order to calculate PV savings we first take the present value of the existing bonds to maturity then subtract the present value of the new bond which is of course just the 'New Bond Size' under both methods.
Now in order to compare the two methods, we need to generate zero rates from the par curve which I discovered is not a common procedure in public finance circles. Nevertheless I do this easily using the handy Bootstrap function I found in these public finance utilities. Results and a proof of equality between discounting at 4% (the arb yield) and zero rates for the 20 year refunding bond scenario are shown in a table at the end of this post.
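For fellow extraterrestrials without the utilities handy, a minimal annual-pay bootstrap looks like this in Python (the par curve below is made up; the real analysis used the Bootstrap function linked above):

```python
def bootstrap_zeros(par_rates):
    """Bootstrap zero rates from annual-pay par yields; par_rates[n-1]
    is the par coupon for an n-year bond. A par bond prices at 1, so
    1 = c * sum(df_1..df_n) + df_n, which we solve for each df_n in turn."""
    dfs = []
    for c in par_rates:
        dfs.append((1.0 - c * sum(dfs)) / (1.0 + c))
    zeros = [df ** (-1.0 / (n + 1)) - 1.0 for n, df in enumerate(dfs)]
    return zeros, dfs

# A hypothetical upward-sloping par curve: 2%, 3%, 4% for years 1-3
zeros, dfs = bootstrap_zeros([0.02, 0.03, 0.04])
```

Discounting a 3-year 4% par bond's cash flows with these factors gives back exactly par, the same equivalence the proof table at the end of this post shows for the 20-year bond.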
So how do these methods look? The first method, SRM (green above), produces wildly different present values of the current debt, ignoring the valuation of the call feature itself for the moment. The implication that the value of the existing bond to the borrower is as much as $17.2mm or as little as $11.4mm, depending on the tenor of the refunding, at best lacks intuitive appeal. But it is exactly this calculated change in value of current debt that leads to the huge $5.7mm of PV savings for the 1% 1-year bond versus the paltry $199k savings in the 4% 20-year bond scenario. This method must be wrong.
So let's look at the Zero Method (blue above). Using the zero rate discount factors (table below), the present value of existing debt is constant, which seems like an improvement. Even incorporating the call would be a constant offset against the noncall debt value for each scenario. But the present value of the new debt isn't changing either, except when the yield restriction kicks in. Therefore PV savings is identical in refunding scenarios from year 5 through year 20 but starts going DOWN as we refund with earlier maturities from year 4 and in. The larger present value of the new bond deal results from the arbitrage yield restriction on the ("100% efficient"!) escrow, hence decreasing savings. Therefore the Zero Method shows that refunding a 20-year 5% bond with a 1% 1-year bond actually creates negative savings. Though it may be in some sense more theoretically consistent, its performance in aiding refunding criteria selection just doesn't seem spot on.
I'm new to your planet but we're trying to (re)finance a lot of infrastructure back home. What am I missing? Where have I gone astray? Can someone please help? Obi-Wan was busy...
You can download the Excel spreadsheet, 'PV Savings for Refundings-Methods Compared' used for this article here.
PROOF OF ZERO RATE EQUIVALENCE TO 4% PAR RATE, 20Yr BOND
[Table: for each year, the Par Rate, bootstrapped Zero Rate, New DS (debt service), Zero DFs (discount factors), and PV @ Zero Rate, with totals demonstrating that discounting the 20-year refunding bond at the zero rates matches discounting at the 4% par rate.]
Calculated using Bootstrap in these Utilities.
My colleague David de la Nuez (PhD Operations Research) and I build on the linear algebra from our prior video to show how to set up sources and uses and cash flows for a hypothetical three bond deal. Dr. David then solves the linear program using a few simple lines of MATLAB code; doesn't get much juicier than this, don't miss it!
"Computer programming is an art, because it applies accumulated knowledge to the world, because it requires skill and ingenuity, and especially because it produces objects of beauty. A programmer who subconsciously views himself as an artist will enjoy what he does and will do it better." Donald Knuth
This video builds on the overwhelming popularity of our first video on using linear algebra to structure bond deals in public finance and lays out a technique for applying a basic optimization algorithm to simultaneously size and amortize a $100 million municipal bond deal, from 10 years out to 3,000 (just for fun).
Using this type of framework we are building towards some solutions which are definitely more sexy so stay tuned. In addition to this video, we've just added another few free public finance Excel models and companion videos to our Resources page so please take a look when you have a few between closings.
If you'd like a copy of the MATLAB code used in this video, shoot us an email at info(at)intuitive-analytics.com. Happy Sizing!
"If people do not believe that mathematics is simple, it is only because they do not realize how complicated life is." - John Louis von Neumann
Many masters-of-the-universe IB types think public finance is akin to Michael Lewis’ (in)famous "equities in Dallas" assignment offered to the lower-tier analysts post-training. But that perception persists largely because they don't know that some of the peculiarities that make munis munis also make the field interesting, unique, and frankly more quantitatively challenging than corporate finance.
This video from our Resources page introduces linear algebra (via MATLAB) as an integral part of understanding what’s really going on when sizing and structuring public finance bond deals. We quickly explain (perhaps review) a few basic concepts in linear algebra and then plunge headlong into sizing a simple 3 bond example using linear algebra and the canonical system of equations expressed by Ax=b.
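To give a flavor of the Ax=b setup outside MATLAB, here's a tiny pure-Python sketch. The three constraints below are made up for illustration (real deals stack many more rows: project fund deposits, debt service targets, and so on), and Cramer's rule is fine at this size, though MATLAB's backslash solves the same system via LU factorization:

```python
def solve3(A, b):
    """Solve a 3x3 linear system Ax = b by Cramer's rule."""
    def det3(M):
        (a, b_, c), (d, e, f), (g, h, i) = M
        return a * (e * i - f * h) - b_ * (d * i - f * g) + c * (d * h - e * g)
    D = det3(A)
    x = []
    for j in range(3):
        Mj = [row[:] for row in A]  # copy A, swap column j for b
        for i in range(3):
            Mj[i][j] = b[i]
        x.append(det3(Mj) / D)
    return x

# Hypothetical 3-bond sizing: each row of A encodes a constraint on the
# unknown par amounts x, and b holds the required amounts ($mm).
A = [[1.00, 1.00, 1.00],   # par amounts sum to the total issue size
     [0.05, 0.04, 0.03],   # first-year interest hits a target payment
     [1.00, 0.00, 0.00]]   # bond 1's par is fixed
b = [100.0, 4.2, 30.0]
pars = solve3(A, b)  # par amounts: [30, 60, 10]
```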
As this is our first entry in the “propeller head” category of resources, we’d love to hear your thoughts!
We are honored this week to have a guest post from Win Smith, author of The Well-Tempered Spreadsheet. Win is the president of Win Analytics LLC, an independent research and advisory firm.
In football, a quarterback “goes long” by heaving the ball down the field. If the play succeeds, the team advances many yards and might even score a touchdown. If the play fails, a down is wasted and there might even be an interception.
The U.S. Treasury's borrowing policies are not quite as exciting. Going long for the Treasury is not a risky call. Long-term borrowing protects against rising interest rates and reduces uncertainty. Although the longest security in the Treasury’s playbook is the 30-year bond, some commentators have urged the Treasury to go farther and lock in today’s low rates for a hundred years or beyond. On the other hand, the Fed has been buying up long treasurys, removing them from private hands in order to lower long rates and stimulate the economy. Some worry that the Treasury is undermining the Fed by flooding the market with long debt.
Many economists and financial journalists believe that the Treasury is going long, tilting its offerings toward more long-term debt and away from short-term debt. For example, the Wall Street Journal reported in June 2012 that “the Treasury Department has been ramping up its issuance of long-term debt to take advantage of historically low long-term rates.” This is both reassuring (the Treasury is guarding against higher rates), and worrying (the Treasury is impeding the Fed from boosting the economy).
The idea that the Treasury is going long owes to charts like this one from the Department:
The key terms here are “marketable U.S. Treasury securities” and “weighted average maturity.” The marketable securities are the short-term bills, medium-term notes, long-term bonds, and inflation-protected securities that the Treasury sells to fund the deficit and refinance older debt. The marketable securities make up the bulk of the national debt held by the public. About 350 such securities are currently outstanding.
The length of the marketable debt is usually measured by its “weighted average maturity.” This is the average remaining time to maturity for the securities (from a given date), weighted by their respective outstanding principal amounts. A $100 billion security carries ten times the weight of a $10 billion security in this calculation.
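The calculation is straightforward. Here is a short Python sketch with hypothetical securities (principal in billions, maturity in years):

```python
def weighted_average_maturity(securities):
    """Average remaining maturity weighted by outstanding principal:
    sum(principal * maturity) / sum(principal)."""
    total = sum(principal for principal, _ in securities)
    return sum(principal * maturity for principal, maturity in securities) / total

# A $100 billion 2-year note and a $10 billion 30-year bond: the note
# carries ten times the weight, so the average sits near the short end
# (about 4.5 years) despite the 30-year bond.
wam = weighted_average_maturity([(100.0, 2.0), (10.0, 30.0)])
```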
Looking at the chart above, it seems obvious that the Treasury is issuing more long-term debt. But there is an unrecognized assumption here: that increases in average maturity imply sales of more long-term debt. This is a question of the relationship between stocks and flows. Stocks in this sense are not like IBM or AAPL; these stocks are the levels of assets or liabilities at some point in time. Flows are movements over a period of time. Changes in the stock of Treasury debt should reveal the nature of the flows.
But is the Treasury really ramping up sales of long-term debt? Here is the debt issued in the last federal fiscal year, which closed on September 30, 2012. We consider the securities issued during the year, but exclude those that had already matured by the end of the year.
This is a heavily front-loaded basket of securities. Their weighted average maturity is under four years; 77% of them mature within five years; and less than 5% mature in thirty years.
Let's see how the average maturity of the FY 2012 issuance compares with earlier years and with the overall average of the marketable debt:
The average maturities for the new issuance were especially low in FY 2009. This was when the Treasury sold extraordinary levels of T-Bills in response to the financial crisis. The new issues lengthened after that, but shortened again from FY 2011 to FY 2012. The new issuance was always shorter than the overall debt, and yet somehow the overall average maturity continued to lengthen.
Why is the Average Maturity Lengthening?
We have a puzzle on our hands. The average maturity of the marketable debt continues to increase while the new issuance is shorter than average. How can this happen?
To solve the puzzle, let’s focus on the change from FY 2011 to FY 2012. Here is the debt as of September 30, 2011:
The securities are color-coded by remaining maturity. A gray bar represents the securities that were scheduled to mature within a year. The gray securities include recent bills as well as older notes and bonds. Their remaining average maturity was 0.39 years. The securities with more than a year remaining are indicated by blue bars. These securities totaled nearly $7 trillion and their remaining average maturity was 7.06 years. The combined average for all of the securities was 5.21 years.
Now let’s look at the portfolio a year later, at the end of FY 2012:
The blue bars, representing the same $7 trillion in securities, are still here. The remaining average maturity for this group stepped down from 7.06 to 6.06 years since each blue security is now one year closer to maturity. The gray securities are gone because they all matured. The new securities issued in FY 2012 are shown in yellow, as in one of the charts above. These have an average maturity of 3.94 years. The overall average maturity is 5.39 years, an increase of .18 years over FY 2011.
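The overall figure is just a principal-weighted blend of the two groups. A quick Python check (the roughly $3.2 trillion size for the FY 2012 issuance is back-solved for this illustration, not a figure taken from the Treasury data):

```python
def blended_wam(groups):
    """Overall weighted average maturity from (principal, wam) pairs."""
    total = sum(principal for principal, _ in groups)
    return sum(principal * wam for principal, wam in groups) / total

# ~$7.0tn of surviving debt at 6.06 years plus ~$3.2tn of new issuance
# at 3.94 years blends to roughly the reported 5.39-year overall average.
overall = blended_wam([(7.0, 6.06), (3.2, 3.94)])
```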
We can understand what happened if we break down each year’s average into its components. I started with the largest group first and then found the incremental changes when smaller groups were added one at a time:
Components of the Average Life of the Marketable Debt
[Table: incremental contributions to the average from each group, for FY 2011 and FY 2012 — "FY13-41, Issued by FY11" (blue), "FY12, Issued by FY11" (gray), and "FY12 New Issues" (yellow).]
In FY 2011, we start with the blue group and its average of 7.06 years. The very short gray group pulled down the average by 1.85 years. The gray group had a significant impact because it constituted about a quarter of the debt.
In FY 2012, the blue group was down to 6.06 years and the gray group was gone. The new issuance, in yellow, reduced the average by .73 years (recall that the FY 2012 issuance was more than a year shorter than the overall average). Finally, adjustments related to inflation compensation and additional sales of older securities had a small effect.
The last column explains the change from FY 2011 to FY 2012. The blue group declined by one year. The departure of the gray securities was a double negative, adding 1.85 years. The new securities subtracted .73 years.
The average maturity rose for one reason: the short-term securities from the prior year matured on schedule. These securities mattered because they took up a large share of the portfolio, which was a consequence of the front-loading of the debt (if the maturities were spread out more evenly, no year of maturities would have that much influence). The increase in average maturity was already baked into the portfolio by FY 2011. We could say that the portfolio lengthened itself. A puzzle on my blog illustrated the same strange process, by which the decay of a population can extend its average remaining life.
If you consider only one kind of flow - additions to debt through new issuance - you are likely to misinterpret changes in average life. The other kind of flow - subtractions of debt by maturing securities - is critical to understanding the evolution of average life.
The Treasury could have accelerated the increase by selling longer debt. As it was, the issuance in FY 2012 was so short that it nearly stopped the lengthening of the average maturity.
Note: a month-to-month version of this analysis was included in a presentation I gave last year.
Copyright 2013 by Win Analytics LLC. All rights reserved.
“Exponentials can’t go on forever, because they will gobble up everything.” - Carl Sagan
Over the holidays I read a book called The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, which compellingly argues technology is fast moving into the second half of the proverbial chessboard. Recall the brain teaser: start with one grain of rice on the first square of a chessboard, then double it for every successive square. By the final square you'd be placing 2^63, or 9.2 quintillion (about 10^18), grains of rice, roughly equivalent to a rice mountain the size of Everest. It's a striking example of the power of exponential growth and one that many find unintuitive.
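A couple of lines of Python confirm the chessboard arithmetic (square 1 holds 2^0 grains, square 64 holds 2^63):

```python
# Grains on the final square alone, and on the whole board.
last_square = 2 ** 63                         # 9,223,372,036,854,775,808
whole_board = sum(2 ** k for k in range(64))  # 2**64 - 1, nearly double the last square
```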
But this story has an analogy we're living through with technology today. Let's generalize Moore's 1965 “Law” a bit and say computing power has been doubling roughly every 2 years, and still is. With this happening since the early 70s, that means we've “doubled” our compute power around 20 times or so. These advances have led to some pretty significant inventions, including personal computers, the internet, and smartphones. But the first half of the chessboard isn't really where the action is. Getting to the halfway point produces some meaningful growth, but nothing compared to what happens in the second half. And obviously going from square 4 to 5 is very different than going from square 34 to 35.
The popular WSJ article, Why Software Is Eating the World, by Netscape co-founder Marc Andreessen, eerily echoes Sagan's quote above. Occupations and industries are indeed getting eaten, and though I risk departing from the relative safety of public finance topics, our current trajectory has profound ramifications for our society, economy and certainly public infrastructure. Below is a quick survey of a number of industries or occupations and what's heading their way.
It’s been said that traditional classrooms are set up to transfer the notes of the teacher to the notes of the student without going through the brain of either. Khan Academy, edX (MIT and Harvard) and the new public education competitor to edX, MOOC2DEGREE, are all moving towards radically changing how we assimilate information, and what we pay for the privilege. Paying $200,000 for four years of college (an extended adult childhood) to get a degree is not feasible or prudent for a lot of people today. It will change.
Virtual doctors providing a probability ranked list of your likely illnesses in response to symptoms plus a bevy of remote monitors, diagnostic devices, and medical mini-apps are among a long list of technologies that will make every hypochondriac rejoice. But more importantly these will fundamentally change healthcare delivery and its cost structure.
A portfolio manager friend of mine said about 15 years ago, “The next job after trader is lawn mower.” Guess he didn’t see automated lawnmowers coming. In all seriousness, software (in high frequency and algorithmic trading, electronic exchanges, etc.) is eating finance jobs at an alarming rate. I also guarantee we don’t need a lot of people getting paid $500k/year or more visiting Manhattan office space daily in order for our economy to efficiently allocate capital. This sector is now largely a vestigial tail which still exists due only to history’s profound path dependence.
An online Economist article last month called Rise of the Software Machines details developments at a company called IPsoft, which has a virtual service-desk employee called Eliza who’s handling 62,000 calls a month for a US media giant, two thirds of which require no human handling. Outsourcing is one step removed from automation and unfortunately for China and India, that will become very evident over the next few years.
E-discovery software has been around for a few years now, putting lots of budding legal associates out in the cold, but technology is moving way beyond that. Software now predicts case outcomes based upon databases of case law. Lex Machina is one such project. Handling language is hard, but the legal profession is one large, well-paid target. Slate has a whole series on automation replacing jobs but one article is focused on law.
For many in the economy, the fallback job, however unappetizing, has been food service. A company called Momentum Machines now has a robot burger machine that can make up to 360 burgers an hour in just 24 square feet of space, complete with freshly sliced tomato, pickle, and the customer’s selected mix of meat (1/3 pork and 2/3 bison? No problem). Momentum Machines estimates the technology could save around $9 billion a year industry-wide, savings which could be spent on better ingredients. Listening, MickeyD?
Physical goods manufacturing
3D Printers for $1,300; enough said.
Truckers, Cab, Limo Drivers
Recently there’s been a spate of articles on self-driving vehicles. Google’s been testing one since 2009 and Toyota and Audi are adding enhanced safety features that are increasingly taking over or augmenting the role of driver in emergency situations. No regulations exist that provide for autonomous vehicle testing or driving but a number of states are enacting such laws now. One of the bumps in the road is determining liability in a crash when there’s no human driver. Fully automated driving is in our near future and the repercussions are massive. My question is, what does the software do when faced with hitting a child or swerving into a telephone pole? Tough code to write and a question the developers probably didn’t cover in their computer science training…
Ever since a public finance class in college I’ve been intrigued by the concept of an externality. What occurs to me, as we move into the 2nd half of the technology chessboard, is that unemployment or underemployment may be a new type of externality which is now arising as a byproduct of firms simply following Smith's invisible hand. Technological unemployment could be considered a type of pollution that needs to be monitored, measured, and managed, perhaps even taxed.
What is certain is that our economy doesn’t function without the participation of a solid majority of consumers. If huge chunks of average work done by a large portion of average workers are best done by software/machine/robot in the very near future, many of those essential consumers won’t have income, which means they can’t function as consumers. And their demand will fall long before actual employment drops. Through the magic of expectations demand will contract as soon as prospects for the future look sufficiently bleak. So how are you feeling today?