Monday, February 25, 2013

Checkerboard Puzzle, Moore's Law, and Growth Prospects

My father the mathematician first posed the checkerboard puzzle to me back in grade-school, perhaps on some rainy Saturday. His version of the story went something like this:

The jester performs a great deed, and the king asks him how he would like to be rewarded. The jester is aware that the king is a highly volatile individual, and if the jester asks for too much, the king might just kill him then and there. The jester also knows that the king views his promise as sacred, so if the king says "yes" to the jester's proposal, then the king will honor that promise. So in a way, the jester's problem is how to ask for a lot, but have the king at least initially think it's not very much, so that the king will give his consent.

So the jester clowns around a bit and then says: "Here's all I want. Take this checkerboard. On the first square, put one piece of gold. On the second square, two pieces. On the third square, four pieces, and on the fourth square, eight pieces. Double the amount on each square until you reach the end of the checkerboard."

In the story, the king laughs at this comic proposal and says, "Your great deed was so wonderful, I would have happily done much more than this! I grant your request!"

But of course, when the king starts hauling up gold pieces from the treasury, he will discover that the final square of the checkerboard requires 2 raised to the 63rd power, or about 9 quintillion gold pieces (that is, 9 followed by 18 zeros).
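The arithmetic is easy to check. Here's a minimal sketch in Python:

```python
# The amount on square n is 2**(n-1), so the 64th (final) square
# holds 2**63 pieces of gold.
last_square = 2 ** 63
total = 2 ** 64 - 1  # sum of the geometric series 1 + 2 + 4 + ... + 2**63

print(f"{last_square:,}")  # 9,223,372,036,854,775,808 -- about 9 quintillion
print(f"{total:,}")        # the whole board needs nearly twice that again
```

Note the classic result that the full board holds one less than double the final square: the sum of all previous squares is always one piece short of the next square.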

I've had some sense of the power of exponential growth ever since. But what I hadn't thought about is the interaction of Moore's Law and economic growth. Moore's Law is of course named for Gordon Moore, one of the founders of Intel, who noticed this pattern back in 1965. His 1965 paper contained the following graph showing how much it cost to produce a computer chip with a certain number of components. Here's his figure. Notice that the number of components on the horizontal axis and the cost figures on the vertical axis are both graphed on logarithmic scales (specifically, each step up the axis is a change by a factor of 10). The key takeaway was that the number of transistors ("components") on an integrated circuit was doubling about every two years, making computing power much cheaper and faster.
This chart from Intel co-founder Gordon Moore's seminal 1965 paper showed the cost of transistors decreased with new manufacturing processes even as the number of transistors on a chip increased.

Ever since I started reading up on Moore's law in the early 1980s, there have been predictions in the trade press that it will soon reach technological limits and come to an end. But Moore's law marches on: indeed, the research and innovation targets at Intel and other chip-makers are defined in terms of making sure that Moore's law continues to hold for at least a while longer. Stephen Shankland offers a nice, accessible overview of the current situation in an October 15, 2012, essay on CNET: "Moore's Law: The rule that really matters in tech." (The Gordon Moore graph above is copied from Shankland's essay.)

As Shankland writes: "To keep up with Moore's Law, engineers must keep shrinking the size of transistors. Intel, the leader in the race, currently uses a manufacturing process with 22-nanometer features. That's 22 billionths of a meter, or roughly a 4,000th the width of a human hair." He cites a variety of industry and research experts to the effect that Moore's law has at least another decade to run--and remember, a decade of doubling every two years means five more doublings!
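The "five more doublings" arithmetic is worth making concrete:

```python
# Five doublings over a decade (one every two years) multiply capacity 32-fold.
years, doubling_period = 10, 2
doublings = years / doubling_period   # 5 doublings
growth_factor = 2 ** doublings
print(growth_factor)                  # 32.0

# At the faster 18-month pace that is sometimes quoted for Moore's law,
# the ten-year factor is larger still:
print(round(2 ** (years / 1.5), 1))   # roughly 100x
```

The gap between the two answers is a reminder of how sensitive long-run projections are to the assumed doubling period.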

It's hard to wrap one's mind around what it means to say that the power of microchip technology will increase by a factor of 32 (doubling five times) in the next 10 years. A characteristically intriguing survey essay from the January 10 issue of the Economist on the future of innovation uses the checkerboard analogy to think about the potential effects of Moore's law. Here's a comment from the Economist essay:



Ray Kurzweil, a pioneer of computer science and a devotee of exponential technological extrapolation, likes to talk of “the second half of the chess board”. There is an old fable in which a gullible king is tricked into paying an obligation in grains of rice, one on the first square of a chessboard, two on the second, four on the third, the payment doubling with every square. Along the first row, the obligation is minuscule. With half the chessboard covered, the king is out only about 100 tonnes of rice. But a square before reaching the end of the seventh row he has laid out 500m tonnes in total—the whole world’s annual rice production. He will have to put more or less the same amount again on the next square. And there will still be a row to go.

Erik Brynjolfsson and Andrew McAfee of MIT make use of this image in their e-book “Race Against the Machine”. By the measure known as Moore’s law, the ability to get calculations out of a piece of silicon doubles every 18 months. That growth rate will not last for ever; but other aspects of computation, such as the capacity of algorithms to handle data, are also growing exponentially. When such a capacity is low, that doubling does not matter. As soon as it matters at all, though, it can quickly start to matter a lot. On the second half of the chessboard not only has the cumulative effect of innovations become large, but each new iteration of innovation delivers a technological jolt as powerful as all previous rounds combined.
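The rice figures in the passage above can be checked with a quick sketch, assuming a grain of rice weighs about 25 milligrams (the grain mass is my assumption, not the Economist's):

```python
# Cumulative rice on the chessboard, doubling each square from one grain.
GRAIN_KG = 25e-6  # assumed mass of one grain of rice: ~25 mg

def cumulative_tonnes(squares):
    grains = 2 ** squares - 1        # geometric series: 1 + 2 + 4 + ...
    return grains * GRAIN_KG / 1000  # kilograms -> metric tonnes

print(round(cumulative_tonnes(32)))    # half the board: on the order of 100 tonnes
print(f"{cumulative_tonnes(54):.2e}")  # deep in the seventh row: hundreds of millions of tonnes
```

With that grain mass, the half-board total comes out near 100 tonnes and the seventh-row total in the hundreds of millions of tonnes, consistent with the essay's figures.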

Now, it's of course true that doubling the capacity of computer chips doesn't translate in a direct way into a higher standard of living: there are many steps from one to the other. But my point here is to note that many of us (myself included) have been thinking about the changes in electronics technology a little too much like the king in the checkerboard story: that is, we think of something doubling a few times, even 10 or 20 times, and we know it's a big change, but it somehow seems within our range of comprehension.

But when something has already been doubling every 18 months or two years for a half-century--and it is continuing to double!--the absolute size of each additional doubling is starting to get very large. I lack the imagination to conceive of what will be done with all this cheap computing power in terms of health care, education, industrial processes, communication, transportation, entertainment, food, travel, design, and more. But I suspect that these enormous repeated doublings, as Moore's law marches forward in the next decade and drives computing speeds up and prices down, will transform lives and industries in ways that we are only just starting to imagine.


Trends in End-Of-Life Care

When talking about ways of curbing health care spending, someone always brings up the costs of acute care at the very end of life. Could we save significant money by not spending so much on people who are on the verge of dying? To what extent are we already changing the patterns of end-of-life care?

About 25-30% of Medicare spending goes to patients who are in their last year of life, according to Gerald F. Riley and James D. Lubitz in their 2010 study, "Long-term trends in Medicare payments in the last year of life" (Health Services Research, April 2010, 45(2): 565-76). They also find that this number hasn't changed much over the last 30 years--that is, health care spending during the last year of life is rising at about the same pace as other Medicare spending--and that the percentage isn't much affected by adjusting for changes in the age or gender of the elderly. On their estimates, Medicare spending on those who die in a given year is much higher than on those who survive the year: in 2006, average Medicare spending on those who died that year was $38,975 per person, while average spending on those who survived the year was $5,993 per person.


Total Medicare spending in 2012 was about $560 billion. Thus, 25% of that amount would be $140 billion spent during the last year of life. It's often unclear at the time whether someone is actually in their last year of life, but say for the sake of argument that such cases could be identified, and spending in this area could be reduced by half. If attainable, cuts of this size would be $70 billion in annual savings, which is certainly a substantial sum. But to keep it in perspective, total U.S. health care spending is in the neighborhood of $2.6 trillion. Thus, the potential gains from even fairly aggressive limits on end-of-life health care spending through Medicare are a little under 3% of total U.S. health care spending.
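The back-of-the-envelope arithmetic above, laid out as a sketch:

```python
# Rough figures from the text; these are approximations, not precise data.
medicare_total = 560e9     # total Medicare spending, 2012
last_year_share = 0.25     # share of Medicare spent in patients' last year of life
us_health_total = 2.6e12   # total U.S. health care spending

last_year_spending = last_year_share * medicare_total  # end-of-life Medicare spending
possible_savings = last_year_spending / 2              # hypothetical 50% reduction
share_of_total = possible_savings / us_health_total

print(f"${last_year_spending / 1e9:.0f} billion")  # $140 billion
print(f"${possible_savings / 1e9:.0f} billion")    # $70 billion
print(f"{share_of_total:.1%}")                     # 2.7%
```

Even under the generous assumption that half of end-of-life spending could be identified and eliminated, the savings are under 3% of national health spending.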

To what extent is the U.S. health care system changing its practices in end-of-life care? In the February 6, 2013 issue of JAMA, a team of researchers led by Joan Teno addresses this question in an article called "Change in End-of-Life Care for Medicare Beneficiaries" (vol. 309, no. 5, pp. 470-477). They find an intriguingly mixed set of patterns.

One common measure of end-of-life care is to see what share of patients died in a hospice or at home, compared to dying in the acute-care ward of a hospital. Their results show that from 2000 to 2009, the share of patients who died in the acute care section of a hospital declined from 32.6% to 24.6%; the share of patients who died at home rose from 30.7% to 33.5%; and the share of patients who died in a hospice rose dramatically from 21.6% to 42.2%. On the surface, these kinds of numbers certainly suggest a pattern of less aggressive end-of-life care.

But when Teno et al. dug just a bit deeper, they found that many of the hospice stays were extremely short--just a few days. Looking at the use of intensive care units in the last month of life, they found that it has risen from 24.3% in 2000 to 29.2% in 2009. In addition, the number of health care "transitions" from one care setting to another has risen both in the last 90 days of life and the last three days of life. For example, 10.3% of patients had a transition in the last three days of life in 2000, compared with 14.2% in 2009.

As Teno et al. put it: "Although a hospice stay of 1 day may be viewed as beneficial by a dying patient and family, an important yet unanswered research question is whether this pattern of care is consistent with patient preferences and improved quality of life. ... Our findings of an increase in the number of short hospice stays following a hospitalization, often involving an ICU stay, suggest that increasing hospice use may not lead to a reduction in resource utilization. Short hospice lengths of stay raise concerns that hospice is an 'add-on' to a growing pattern of more utilization of intensive care services at the end of life."

Few questions in health care policy are harder than what should be spent on end-of-life care. It's fairly common for the elderly, when healthy, to say that they don't want extreme end-of-life measures. But when those same people become very ill, both they and their families often start thinking that extreme care makes a lot of sense. In addition, while the diagnostic and statistical techniques for figuring out a few months or a year in advance who is likely to die will perhaps improve over time, right now they are not very accurate. Thus, the common-sense policies in this area tend to revolve around earlier counseling for the elderly, so that patients (and their families) can have a clearer sense of what they want in terms of end-of-life care, and around improving hospice and end-of-life home care--after all, basic palliative services like intravenous fluids and antibiotics don't need to happen in a hospital setting.

Friday, February 22, 2013


The Financial Cycle: Theory and Implications

In the aftermath of the Great Recession, mainstream macroeconomists have been seeking in various ways to bring the financial sector into their models. As that activity implies, the financial sector had not previously played much of a role in mainstream models. Claudio Borio lays out a perspective on treating cycles in the financial sector as having a life of their own in "The financial cycle and macroeconomics: What have we learnt?", published in December 2012 as working paper #395 for the Bank for International Settlements. I should note that Borio's view of how the financial sector interrelates with the real economy is not conventional macroeconomic wisdom, but I should also note that conventional macroeconomics hasn't exactly covered itself with glory in the last few years.


Borio begins by pointing out that conventional macroeconomics was paying little attention to the financial sector in the years before the Great Recession, and argues that the current strategies for trying to add a financial sector to existing models don't go nearly far enough. (Citations and footnotes are omitted from quotations throughout.) Here's Borio: "The financial crisis that engulfed mature economies in the late 2000s has prompted much soul searching. Economists are now trying hard to incorporate financial factors into standard macroeconomic models. However, the prevailing, in fact almost exclusive, strategy is a conservative one. It is to graft additional so-called financial “frictions” on otherwise fully well behaved equilibrium macroeconomic models ... The main thesis is that macroeconomics without the financial cycle is like Hamlet without the Prince. In the environment that has prevailed for at least three decades now, just as in the one that prevailed in the pre-WW2 years, it is simply not possible to understand business fluctuations and their policy challenges without understanding the financial cycle."

Borio argues that there is a "financial cycle" with its own dynamics. Here's a figure with U.S. data showing the regular business cycle, measured by variations in GDP, compared with the "financial cycle," based on estimates of credit, the credit/GDP ratio, and property prices. He argues that while business cycles are usually in the range of 1-8 years, "the average length of the financial cycle in a sample of seven industrialised countries since the 1960s has been around 16 years."

Borio argues that the peaks of the financial cycle are associated with financial crises. When a business cycle recession happens at the same time as the contraction part of a financial cycle, the recession is about 50% deeper.

This perspective on the financial cycle also offers some policy advice. Central banks and financial regulators should pay attention to credit/GDP ratios and to property prices. Borio writes: "The idea is to build up buffers in good times, as financial vulnerabilities grow, so as to be able to draw them down in bad times, as financial stress materialises. There are many ways of doing so, through the appropriate design of tools such as capital and liquidity standards, provisioning, collateral and margining practices, and so on. ... In the case of monetary policy, it is necessary to adopt strategies that allow central banks to tighten so as to lean against the build-up of financial imbalances even if near-term inflation remains subdued – what might be called the “lean option”. Operationally, this calls for extending policy horizons beyond the roughly 2-year ones typical of inflation targeting regimes and for giving greater prominence to the balance of risks in the outlook, fully taking into account the slow build-up of vulnerabilities associated with the financial cycle. ... In the case of fiscal policy, there is a need for extra prudence during economic expansions associated with financial booms. ... Financial booms are especially generous for the public coffers, because of the structure of revenues. And the sovereign inadvertently accumulates contingent liabilities, which crystallise as the boom turns to bust and balance sheet problems emerge, especially in the financial sector."
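As a toy illustration of the kind of real-time signal such buffers could key off (my own simplified sketch, using a crude trailing average rather than the more elaborate statistical filters Borio works with, and invented numbers): track how far the credit/GDP ratio has drifted above its own slow-moving trend.

```python
def credit_gap(ratios, window=8):
    """Gap between the credit/GDP ratio and a trailing-mean trend.

    One-sided on purpose: the trend at each date uses only past data,
    mimicking what a regulator could have computed in real time.
    """
    gaps = []
    for i, r in enumerate(ratios):
        past = ratios[max(0, i - window + 1): i + 1]
        gaps.append(r - sum(past) / len(past))
    return gaps

# Hypothetical annual credit/GDP ratios (percent) through a boom
ratios = [140, 142, 145, 150, 158, 168, 175]
print([round(g, 1) for g in credit_gap(ratios)])
# -> [0.0, 1.0, 2.7, 5.8, 11.0, 17.5, 21.0]
```

The gap widens steadily as the boom builds, which is exactly the sort of cue that could trigger raising capital buffers while times are still good.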

However, once the double-whammy of a financial crisis and a business cycle recession has hit simultaneously, Borio also argues that conventional policy responses may not work well. In a "balance sheet recession," fiscal and monetary policy may not be very capable of stimulating demand, and instead may encourage financial firms and businesses to put off the necessary hard steps they need to take, leaving the economy too dependent on government stimulation rather than on the private sector moving forward. As he writes:  "On reflection, the basic reason for the limitations of monetary policy in a financial bust is not hard to find. Monetary policy typically operates by encouraging borrowing, boosting asset prices and risk-taking. But initial conditions already include too much debt, too-high asset prices (property) and too much risk-taking. There is an inevitable tension between how policy works and the direction the economy needs to take."

The concept of a "financial cycle" has a plausible back-story. When times are good, borrowers and investors of all kinds tend to let down their guard, worry less about risks, and gradually become overextended--which can then bring on a counterreaction, or even in some cases a financial crisis. It's easy to point to financial crises, but it's harder to show convincingly that an earlier financial boom is the cause of the crisis. The nice smooth curve of financial cycles above is created by using statistical tools ("filtering") to blend together the underlying measures of credit, the credit/GDP ratio, and property prices. Borio is up front about this difficulty and others, and has some suggestions for how the appropriate modeling might proceed. But even if one doesn't buy into the notion of a self-perpetuating financial cycle, standing apart from the regular business cycle, one lesson that everyone seems to have learned from the Great Recession is that rapid expansions of credit and rapid rises in property values have real macroeconomic risks--and thus are an appropriate target for policy.


 

Thursday, 21 February 2013

Checkerboard Puzzle, Moore's Law, and Growth Prospects

My father the mathematician first posed the checkerboard puzzle to me back in grade-school, perhaps on some rainy Saturday. His version of the story went something like this:

The jester performs a great deed, and the king asks him how he would like to be rewarded. The jester is aware that the king is a highly volatile individual, and if the jester asks for too much, the king might just kill him then and there. The jester also knows that the king views his promise as sacred, so if the king says "yes" to the jester's proposal, then the king will honor that promise. So in a way, the jester's problem is how to ask for a lot, but have the king at least initially think it's not very much, so that the king will give his consent.

So the jester clowns around a bit and then says: "Here's all I want. Take this checkerboard. On the first square, but one piece of gold. On the second square, two pieces. On the third square, four pieces, and on the fourth square, eight pieces. Double the amount on each square until you reach the end of the checkerboard."

In the story, the king laughs at this comic proposal and says, "Your great deed was so wonderful, I would have happily done much more than this! I grant your request!"

But of course, when the king starts hauling up gold pieces from the treasury, he will discover that the final square of the checkerboard requires 2 raised to the 63rd power, or about 9 quintillion gold pieces (that is, 9 followed by 18 zeros).
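The arithmetic is easy to check with a couple of lines (a quick sketch of the doubling series; square n of the board holds 2 to the power n-1 pieces):

```python
# Square n of the board holds 2**(n-1) gold pieces.
on_final_square = 2 ** 63        # the 64th square
total_on_board = 2 ** 64 - 1     # sum of the whole doubling series

print(f"{on_final_square:,}")    # 9,223,372,036,854,775,808 -- about 9.2 quintillion
print(f"{total_on_board:,}")     # 18,446,744,073,709,551,615
```

Note that the whole board holds almost exactly twice the final square alone: each square carries one piece more than all the previous squares combined.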

I've had some sense of the power of exponential growth ever since. But what I hadn't thought about is the interaction of Moore's Law and economic growth. Moore's Law is of course named for Gordon Moore, one of the founders of Intel, who noticed this pattern back in 1965. Back in the 1970s, he wrote a paper that contained the following graph showing how much it cost to produce a computer chip with a certain number of components. Here's his figure. Notice that the number of components on the horizontal axis and the cost figures on the vertical axis are both graphed on logarithmic scales (specifically, each step along the axis is a change by a factor of 10). The key takeaway was that the number of transistors ("components") on an integrated circuit was doubling about every two years, making computing power much cheaper and faster.
This chart from Intel co-founder Gordon Moore's seminal 1965 paper showed the cost of transistors decreased with new manufacturing processes even as the number of transistors on a chip increased.

Ever since I started reading up on Moore's law in the early 1980s, there have been predictions in the trade press that it will soon reach technological limits and come to an end. But Moore's law marches on: indeed, the research and innovation targets at Intel and other chip-makers are defined in terms of making sure that Moore's law continues to hold for at least a while longer. Stephen Shankland offers a nice, accessible overview of the current situation in an October 15, 2012, essay on CNET: "Moore's Law: The rule that really matters in tech". (The Gordon Moore graph above is copied from Shankland's essay.)

As Shankland writes: "To keep up with Moore's Law, engineers must keep shrinking the size of transistors. Intel, the leader in the race, currently uses a manufacturing process with 22-nanometer features. That's 22 billionths of a meter, or roughly a 4,000th the width of a human hair." He cites a variety of industry and research experts to the effect that Moore's law has at least another decade to run--and remember, a decade of doubling every two years means five more doublings!
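A rough way to see what another decade of doublings implies for feature size (my own toy projection, not Shankland's): if transistor density doubles every two years, the linear dimensions of each transistor must shrink by a factor of the square root of 2 per doubling.

```python
import math

def projected_feature_nm(start_nm, years, doubling_period=2):
    """Toy projection: halving transistor area every doubling_period years
    means dividing linear feature size by sqrt(2) each period."""
    doublings = years / doubling_period
    return start_nm / math.sqrt(2) ** doublings

# Starting from Intel's 22 nm process of 2012 (per Shankland):
for years_ahead in (2, 4, 6, 8, 10):
    print(years_ahead, round(projected_feature_nm(22, years_ahead), 1))
# ten years out, the projection lands at about 3.9 nm
```

This is only a back-of-the-envelope scaling argument; actual process nodes depend on many engineering choices beyond simple geometric shrinking.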

It's hard to wrap one's mind around what it means to say that the power of microchip technology will increase by a factor of 32 (doubling five times) in the next 10 years. A characteristically intriguing survey essay from the January 10 issue of the Economist on the future of innovation uses the checkerboard analogy to think about the potential effects of Moore's law. Here's a comment from the Economist essay:



Ray Kurzweil, a pioneer of computer science and a devotee of exponential technological extrapolation, likes to talk of “the second half of the chess board”. There is an old fable in which a gullible king is tricked into paying an obligation in grains of rice, one on the first square of a chessboard, two on the second, four on the third, the payment doubling with every square. Along the first row, the obligation is minuscule. With half the chessboard covered, the king is out only about 100 tonnes of rice. But a square before reaching the end of the seventh row he has laid out 500m tonnes in total—the whole world’s annual rice production. He will have to put more or less the same amount again on the next square. And there will still be a row to go.

Erik Brynjolfsson and Andrew McAfee of MIT make use of this image in their e-book “Race Against the Machine”. By the measure known as Moore’s law, the ability to get calculations out of a piece of silicon doubles every 18 months. That growth rate will not last for ever; but other aspects of computation, such as the capacity of algorithms to handle data, are also growing exponentially. When such a capacity is low, that doubling does not matter. As soon as it matters at all, though, it can quickly start to matter a lot. On the second half of the chessboard not only has the cumulative effect of innovations become large, but each new iteration of innovation delivers a technological jolt as powerful as all previous rounds combined.
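The rice figures in the fable are easy to reproduce (a rough sketch; the 25 milligrams per grain is my assumed grain weight, not a figure from the Economist, and it is the main source of uncertainty in the totals):

```python
MG_PER_GRAIN = 25  # assumption: a grain of rice weighs roughly 25 mg

def cumulative_tonnes(square):
    """Total rice laid out through the given square (1-64) of the board."""
    grains = 2 ** square - 1            # 1 + 2 + 4 + ... doubles each square
    return grains * MG_PER_GRAIN / 1e9  # mg -> tonnes

print(round(cumulative_tonnes(32)))    # ~107 tonnes covers the first half
print(f"{cumulative_tonnes(55):.2e}")  # hundreds of millions of tonnes near the end of row 7
```

With this grain weight the total near the end of the seventh row lands in the hundreds of millions of tonnes, the same order of magnitude as the world's annual rice crop cited in the quote; the exact square at which it matches depends on the assumed grain weight.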

Now, it's of course true that doubling the capacity of computer chips doesn't translate in a direct way into a higher standard of living: there are many steps from one to the other. But my point here is to note that many of us (myself included) have been thinking about the changes in electronics technology a little too much like the king in the checkerboard story: that is, we think of something doubling a few times, even 10 or 20 times, and we know it's a big change, but it somehow seems within our range of comprehension.

But when something has already been doubling every 18 months or two years for a half-century--it is continuing to double!--the absolute size of each additional doubling is starting to get very large. I lack the imagination to conceive of what will be done with all this cheap computing power in terms of health care, education, industrial processes, communication, transportation, entertainment, food, travel, design, and more. But I suspect that these enormous repeated doublings, as Moore's law marches forward in the next decade and drives computing speeds up and prices down, will transform lives and industries in ways that we are only just starting to imagine.