25 Şubat 2013 Pazartesi

Checkerboard Puzzle, Moore's Law, and Growth Prospects

To contact us Click HERE
My father the mathematician first posed the checkerboard puzzle to me back in grade-school, perhaps on some rainy Saturday. His version of the story went something like this:

The jester performs a great deed, and the king asks him how he would like to be rewarded. The jester is aware that the king is a highly volatile individual, and if the jester asks for too much, the king might just kill him then and there. The jester also knows that the king views his promise as sacred, so if the king says "yes" to the jester's proposal, then the king will honor that promise. So in a way, the jester's problem is how to ask for a lot, but have the king at least initially think it's not very much, so that the king will give his consent.

So the jester clowns around a bit and then says: "Here's all I want. Take this checkerboard. On the first square, but one piece of gold. On the second square, two pieces. On the third square, four pieces, and on the fourth square, 8 pieces. Double the amount on each square until you reach the end of the checkerboard."

In the story, the king laughs at this comic proposal and says,  "Your great deed was so wonderful, I would have happily done much more than this! I grant your request!"

But of course, when the king starts hauling up gold pieces from the treasury, he will discover that 2 raised to the 63rd power, the final spot on the checkerboard requires about 9 quintillion gold pieces (that is, 9 followed by 18 zeros). 

I've had some sense of the power of exponential growth ever since.  But what I hadn't thought about is the interaction of Moore's Law and economic growth. Moore's Law is of course named for Gordon Moore, one of the founders of Intel, who noticed this pattern back in 1965. Back in the 1970s, he wrote a paper that contained the following graph showing how much it cost to produce a computer chip with a certain number of components. Here's his figure. Notice that the numbers of component on the horizontal axis and the cost figures on the vertical axis are both graphed as logarithm (specifically, each step up the axis is a change by a factor of 10). The key takeaway was that the number of transistors ("components") on an integrated circuit was doubling about every two years, making computing power much cheaper and faster.
This chart from Intel co-founder Gordon Moore's seminal 1965 paper showed the cost of transistors decreased with new manufacturing processes even as the number of transistors on a chip increased.

Ever since I started reading up on Moore's law in the early 1980s, there have been predictions in the trade press that it will soon reach technological limits and come to and end. But Moore's law marches on: indeed, the research and innovation targets at Intel and other chip-makers are defined in terms of making sure that Moore's law continues to hold for at least awhile longer. Stephen Shankland offers a nice accessible overview of the current situation in an October 15, 2012, essay on CNET: ""Moore's Law: The rule that really matters in tech"  (The Gordon Moore graph above is copied from Shankland's essay.)

As Shankland writes: "To keep up with Moore's Law, engineers must keep shrinking the size of transistors. Intel, the leader in the race, currently uses a manufacturing process with 22-nanometer features. That's 22 billionths of a meter, or roughly a 4,000th the width of a human hair." He cites a variety of industry and research experts to the effect that Moore's law has at least another decade to run--and remember, a decade of doubling every two years means five more doublings!

It's hard to wrap one's mind around what it means to say that the power of microchipo technology will increase by a factor of 32 (doubling five times) in the next 10 years. A characteristically intriguing survey essay  from the January 10 issue of the Economist on the future of innovation uses the checkerboard analogy to think about the potential effects of Moore's law. Here's a comment from the Economist essay:



Ray Kurzweil, a pioneer of computer science and a devotee of exponential technological extrapolation, likes to talk of “the second half of the chess board”. There is an old fable in which a gullible king is tricked into paying an obligation in grains of rice, one on the first square of a chessboard, two on the second, four on the third, the payment doubling with every square. Along the first row, the obligation is minuscule. With half the chessboard covered, the king is out only about 100 tonnes of rice. But a square before reaching the end of the seventh row he has laid out 500m tonnes in total—the whole world’s annual rice production. He will have to put more or less the same amount again on the next square. And there will still be a row to go.

Erik Brynjolfsson and Andrew McAfee of MIT make use of this image in their e-book “Race Against the Machine”. By the measure known as Moore’s law, the ability to get calculations out of a piece of silicon doubles every 18 months. That growth rate will not last for ever; but other aspects of computation, such as the capacity of algorithms to handle data, are also growing exponentially. When such a capacity is low, that doubling does not matter. As soon as it matters at all, though, it can quickly start to matter a lot. On the second half of the chessboard not only has the cumulative effect of innovations become large, but each new iteration of innovation delivers a technological jolt as powerful as all previous rounds combined."

Now, it's of course true that doubling the capacity of computer chips doesn't translate in a direct way into a higher standard of living: there are many steps from one to the other. But my point here is to note that many of us (myself included) have been thinking about the changes in electronics technology a little too much like the king in the checkerboard story: that is, we think of something doubling a few times, even 10 or 20 times, and we know it's a big change, but it somehow seems within our range of comprehension.

But when something has already been doubling every 18 months or two years for a half-century--it is continuing to double!--the absolute size of each additional doubling is starting to get very large. I lack the imagination to conceive of what will be done with all this cheap computing power in terms of health care, education, industrial process, communication, transportation, entertainment, food, travel, design, and more. But I suspect that these enormous repeated doublings, as Moore's law marches forward in the next decade and drives computing speeds up and prices down, will transform lives and industries in ways that we are only just starting to imagine.


Trends in End-Of-Life Care

To contact us Click HERE
When talking about ways of curbing health care spending, someone always brings up the costs of acute care at the very end of life. Could we save significant money by not spending so much on people who are the verge of dying? To what extent are we already changing the patterns of end-of-life care?

We spend about 25-30% of Medicare spending on patients who are in their last year of life, according to Gerald F. Riley and James D. Lubitz in their 2010 study, "Long-term trends in Medicare payments in the last year of life" (Health Services Research, April 2010, 45(2):565-76). They also find that this number hasn't changed much over the last 30 years--that is, health care spending during the last year of life is rising at about the same pace as other Medicare spending-- and that the percentage isn't much affected by adjusting for changes in age or gender of the elderly.  On their estimates, Medicare spending on those who die in a given year is much higher than on those who survive the year: in 2006, Medicare spending in 2006 on those who died in that year was $38,975, while Medicare spending in 2006 on those who survived the year was $5,993.


Total Medicare spending in 2012 was about $560 billion. Thus, 25% of that amount would be $140 billion spent during the last year of life. It's often unclear at the time whether someone is actually in their last year of life, but say for the sake of argument that such cases could be identified, and spending in this area could be reduced by half. If attainable, cuts of this size would be $70 billion in annual savings, which is certainly a substantial sum. But to keep it in perspective, total U.S. health care spending is in the neighborhood of $2.6 trillion. Thus, the potential gains from even fairly aggressive limits on end-of-life health care spending through Medicare is a little under 3% of total U.S. healthcare spending.

To what extent is the U.S. health care system changing its practices in end-of-life care? In the February 6, 2013 issue of JAMA, a team of writers led by Joan Temo address this question in an article called: "Change in End-of-Life Care for Medicare Beneficiaries" (vol. 309, #5, pp. 470-477). They find an intriguingly mixed set of patterns.

One common measure of end-of-life care is to see what share of patients died in a hospice or at home, compared to dying in the acute-care ward of a a hospital. Their results show that from 2000 to 2009, the share of patients who died in the acute care section of a hospital declined from 32.6% to 24.6%; the share of patients who died at home rose from  30-7% to 33.5%; and the share of patients who died in a hospice rose dramatically from 21.6% to 42.2%. On the surface, these kinds of numbers certainly suggest a pattern of less aggressive end-of-life care.

But when Temo et al. dug just a bit deeper, they found that many of the hospice stays were extremely short--just a few days. Looking at the use of intensive care units in the last month of life, they found that it has risen from 24.3% in 2000 to 29.2% in 2009. In addition, the number of health care "transitions" from one care setting to another has risen both in the last 90 days of life and the last three days of life. For example, 10.3% of patients had a "transition" in the last three days of life in 2000, while 14.2% of patients had a transition in the last three days of life in 2009.

As Temo et al. put it: "Although a hospice stay of 1 day may be viewed as beneficial by a dying patient and family, an important yet unanswered research question is whether this pattern of care is consistent with patient preferences and improved quality of life. ... Our findings of an increase in the number of short hospice stays following a hospitalization, often involving an ICU stay, suggest that increasing hospice use may not lead to a reduction in resource utilization. Short hospice lengths of stay raise concerns that hospice is an "add-on" to a growing pattern of more utilization of intensive care services at the end of life."

Few questions in health care policy are harder than what should be spent on end-of-life care. It's fairly common for the elderly, when healthy, to say that they don't want extreme end-of-life measures. But when those same people become very ill, both they and their families often start thinking that extreme care makes a lot of sense. In addition, while perhaps the diagnostic and statistical techniques for figuring out a few months or a year in advance who is likely to die will improve over over time, right now they are not very accurate. Thus, the common sense policies in this area tend to revolve around earlier counseling for the elderly, so that patients (and their families) can have a more clear sense of what they want in terms of end of life care, and improving hospice and end-of-life home care--after all, basic palliative services like intravenous fluids and antibiotics don't need to happen in a hospital setting.

24 Şubat 2013 Pazar

Checkerboard Puzzle, Moore's Law, and Growth Prospects

To contact us Click HERE
My father the mathematician first posed the checkerboard puzzle to me back in grade-school, perhaps on some rainy Saturday. His version of the story went something like this:

The jester performs a great deed, and the king asks him how he would like to be rewarded. The jester is aware that the king is a highly volatile individual, and if the jester asks for too much, the king might just kill him then and there. The jester also knows that the king views his promise as sacred, so if the king says "yes" to the jester's proposal, then the king will honor that promise. So in a way, the jester's problem is how to ask for a lot, but have the king at least initially think it's not very much, so that the king will give his consent.

So the jester clowns around a bit and then says: "Here's all I want. Take this checkerboard. On the first square, but one piece of gold. On the second square, two pieces. On the third square, four pieces, and on the fourth square, 8 pieces. Double the amount on each square until you reach the end of the checkerboard."

In the story, the king laughs at this comic proposal and says,  "Your great deed was so wonderful, I would have happily done much more than this! I grant your request!"

But of course, when the king starts hauling up gold pieces from the treasury, he will discover that 2 raised to the 63rd power, the final spot on the checkerboard requires about 9 quintillion gold pieces (that is, 9 followed by 18 zeros). 

I've had some sense of the power of exponential growth ever since.  But what I hadn't thought about is the interaction of Moore's Law and economic growth. Moore's Law is of course named for Gordon Moore, one of the founders of Intel, who noticed this pattern back in 1965. Back in the 1970s, he wrote a paper that contained the following graph showing how much it cost to produce a computer chip with a certain number of components. Here's his figure. Notice that the numbers of component on the horizontal axis and the cost figures on the vertical axis are both graphed as logarithm (specifically, each step up the axis is a change by a factor of 10). The key takeaway was that the number of transistors ("components") on an integrated circuit was doubling about every two years, making computing power much cheaper and faster.
This chart from Intel co-founder Gordon Moore's seminal 1965 paper showed the cost of transistors decreased with new manufacturing processes even as the number of transistors on a chip increased.

Ever since I started reading up on Moore's law in the early 1980s, there have been predictions in the trade press that it will soon reach technological limits and come to and end. But Moore's law marches on: indeed, the research and innovation targets at Intel and other chip-makers are defined in terms of making sure that Moore's law continues to hold for at least awhile longer. Stephen Shankland offers a nice accessible overview of the current situation in an October 15, 2012, essay on CNET: ""Moore's Law: The rule that really matters in tech"  (The Gordon Moore graph above is copied from Shankland's essay.)

As Shankland writes: "To keep up with Moore's Law, engineers must keep shrinking the size of transistors. Intel, the leader in the race, currently uses a manufacturing process with 22-nanometer features. That's 22 billionths of a meter, or roughly a 4,000th the width of a human hair." He cites a variety of industry and research experts to the effect that Moore's law has at least another decade to run--and remember, a decade of doubling every two years means five more doublings!

It's hard to wrap one's mind around what it means to say that the power of microchipo technology will increase by a factor of 32 (doubling five times) in the next 10 years. A characteristically intriguing survey essay  from the January 10 issue of the Economist on the future of innovation uses the checkerboard analogy to think about the potential effects of Moore's law. Here's a comment from the Economist essay:



Ray Kurzweil, a pioneer of computer science and a devotee of exponential technological extrapolation, likes to talk of “the second half of the chess board”. There is an old fable in which a gullible king is tricked into paying an obligation in grains of rice, one on the first square of a chessboard, two on the second, four on the third, the payment doubling with every square. Along the first row, the obligation is minuscule. With half the chessboard covered, the king is out only about 100 tonnes of rice. But a square before reaching the end of the seventh row he has laid out 500m tonnes in total—the whole world’s annual rice production. He will have to put more or less the same amount again on the next square. And there will still be a row to go.

Erik Brynjolfsson and Andrew McAfee of MIT make use of this image in their e-book “Race Against the Machine”. By the measure known as Moore’s law, the ability to get calculations out of a piece of silicon doubles every 18 months. That growth rate will not last for ever; but other aspects of computation, such as the capacity of algorithms to handle data, are also growing exponentially. When such a capacity is low, that doubling does not matter. As soon as it matters at all, though, it can quickly start to matter a lot. On the second half of the chessboard not only has the cumulative effect of innovations become large, but each new iteration of innovation delivers a technological jolt as powerful as all previous rounds combined."

Now, it's of course true that doubling the capacity of computer chips doesn't translate in a direct way into a higher standard of living: there are many steps from one to the other. But my point here is to note that many of us (myself included) have been thinking about the changes in electronics technology a little too much like the king in the checkerboard story: that is, we think of something doubling a few times, even 10 or 20 times, and we know it's a big change, but it somehow seems within our range of comprehension.

But when something has already been doubling every 18 months or two years for a half-century--it is continuing to double!--the absolute size of each additional doubling is starting to get very large. I lack the imagination to conceive of what will be done with all this cheap computing power in terms of health care, education, industrial process, communication, transportation, entertainment, food, travel, design, and more. But I suspect that these enormous repeated doublings, as Moore's law marches forward in the next decade and drives computing speeds up and prices down, will transform lives and industries in ways that we are only just starting to imagine.


23 Şubat 2013 Cumartesi

Checkerboard Puzzle, Moore's Law, and Growth Prospects

To contact us Click HERE
My father the mathematician first posed the checkerboard puzzle to me back in grade-school, perhaps on some rainy Saturday. His version of the story went something like this:

The jester performs a great deed, and the king asks him how he would like to be rewarded. The jester is aware that the king is a highly volatile individual, and if the jester asks for too much, the king might just kill him then and there. The jester also knows that the king views his promise as sacred, so if the king says "yes" to the jester's proposal, then the king will honor that promise. So in a way, the jester's problem is how to ask for a lot, but have the king at least initially think it's not very much, so that the king will give his consent.

So the jester clowns around a bit and then says: "Here's all I want. Take this checkerboard. On the first square, but one piece of gold. On the second square, two pieces. On the third square, four pieces, and on the fourth square, 8 pieces. Double the amount on each square until you reach the end of the checkerboard."

In the story, the king laughs at this comic proposal and says,  "Your great deed was so wonderful, I would have happily done much more than this! I grant your request!"

But of course, when the king starts hauling up gold pieces from the treasury, he will discover that 2 raised to the 63rd power, the final spot on the checkerboard requires about 9 quintillion gold pieces (that is, 9 followed by 18 zeros). 

I've had some sense of the power of exponential growth ever since.  But what I hadn't thought about is the interaction of Moore's Law and economic growth. Moore's Law is of course named for Gordon Moore, one of the founders of Intel, who noticed this pattern back in 1965. Back in the 1970s, he wrote a paper that contained the following graph showing how much it cost to produce a computer chip with a certain number of components. Here's his figure. Notice that the numbers of component on the horizontal axis and the cost figures on the vertical axis are both graphed as logarithm (specifically, each step up the axis is a change by a factor of 10). The key takeaway was that the number of transistors ("components") on an integrated circuit was doubling about every two years, making computing power much cheaper and faster.
This chart from Intel co-founder Gordon Moore's seminal 1965 paper showed the cost of transistors decreased with new manufacturing processes even as the number of transistors on a chip increased.

Ever since I started reading up on Moore's law in the early 1980s, there have been predictions in the trade press that it will soon reach technological limits and come to and end. But Moore's law marches on: indeed, the research and innovation targets at Intel and other chip-makers are defined in terms of making sure that Moore's law continues to hold for at least awhile longer. Stephen Shankland offers a nice accessible overview of the current situation in an October 15, 2012, essay on CNET: ""Moore's Law: The rule that really matters in tech"  (The Gordon Moore graph above is copied from Shankland's essay.)

As Shankland writes: "To keep up with Moore's Law, engineers must keep shrinking the size of transistors. Intel, the leader in the race, currently uses a manufacturing process with 22-nanometer features. That's 22 billionths of a meter, or roughly a 4,000th the width of a human hair." He cites a variety of industry and research experts to the effect that Moore's law has at least another decade to run--and remember, a decade of doubling every two years means five more doublings!

It's hard to wrap one's mind around what it means to say that the power of microchipo technology will increase by a factor of 32 (doubling five times) in the next 10 years. A characteristically intriguing survey essay  from the January 10 issue of the Economist on the future of innovation uses the checkerboard analogy to think about the potential effects of Moore's law. Here's a comment from the Economist essay:



Ray Kurzweil, a pioneer of computer science and a devotee of exponential technological extrapolation, likes to talk of “the second half of the chess board”. There is an old fable in which a gullible king is tricked into paying an obligation in grains of rice, one on the first square of a chessboard, two on the second, four on the third, the payment doubling with every square. Along the first row, the obligation is minuscule. With half the chessboard covered, the king is out only about 100 tonnes of rice. But a square before reaching the end of the seventh row he has laid out 500m tonnes in total—the whole world’s annual rice production. He will have to put more or less the same amount again on the next square. And there will still be a row to go.

Erik Brynjolfsson and Andrew McAfee of MIT make use of this image in their e-book “Race Against the Machine”. By the measure known as Moore’s law, the ability to get calculations out of a piece of silicon doubles every 18 months. That growth rate will not last for ever; but other aspects of computation, such as the capacity of algorithms to handle data, are also growing exponentially. When such a capacity is low, that doubling does not matter. As soon as it matters at all, though, it can quickly start to matter a lot. On the second half of the chessboard not only has the cumulative effect of innovations become large, but each new iteration of innovation delivers a technological jolt as powerful as all previous rounds combined."

Now, it's of course true that doubling the capacity of computer chips doesn't translate in a direct way into a higher standard of living: there are many steps from one to the other. But my point here is to note that many of us (myself included) have been thinking about the changes in electronics technology a little too much like the king in the checkerboard story: that is, we think of something doubling a few times, even 10 or 20 times, and we know it's a big change, but it somehow seems within our range of comprehension.

But when something has already been doubling every 18 months or two years for a half-century--it is continuing to double!--the absolute size of each additional doubling is starting to get very large. I lack the imagination to conceive of what will be done with all this cheap computing power in terms of health care, education, industrial process, communication, transportation, entertainment, food, travel, design, and more. But I suspect that these enormous repeated doublings, as Moore's law marches forward in the next decade and drives computing speeds up and prices down, will transform lives and industries in ways that we are only just starting to imagine.


22 Şubat 2013 Cuma

Checkerboard Puzzle, Moore's Law, and Growth Prospects

To contact us Click HERE
My father the mathematician first posed the checkerboard puzzle to me back in grade-school, perhaps on some rainy Saturday. His version of the story went something like this:

The jester performs a great deed, and the king asks him how he would like to be rewarded. The jester is aware that the king is a highly volatile individual, and if the jester asks for too much, the king might just kill him then and there. The jester also knows that the king views his promise as sacred, so if the king says "yes" to the jester's proposal, then the king will honor that promise. So in a way, the jester's problem is how to ask for a lot, but have the king at least initially think it's not very much, so that the king will give his consent.

So the jester clowns around a bit and then says: "Here's all I want. Take this checkerboard. On the first square, but one piece of gold. On the second square, two pieces. On the third square, four pieces, and on the fourth square, 8 pieces. Double the amount on each square until you reach the end of the checkerboard."

In the story, the king laughs at this comic proposal and says,  "Your great deed was so wonderful, I would have happily done much more than this! I grant your request!"

But of course, when the king starts hauling up gold pieces from the treasury, he will discover that 2 raised to the 63rd power, the final spot on the checkerboard requires about 9 quintillion gold pieces (that is, 9 followed by 18 zeros). 

I've had some sense of the power of exponential growth ever since.  But what I hadn't thought about is the interaction of Moore's Law and economic growth. Moore's Law is of course named for Gordon Moore, one of the founders of Intel, who noticed this pattern back in 1965. Back in the 1970s, he wrote a paper that contained the following graph showing how much it cost to produce a computer chip with a certain number of components. Here's his figure. Notice that the numbers of component on the horizontal axis and the cost figures on the vertical axis are both graphed as logarithm (specifically, each step up the axis is a change by a factor of 10). The key takeaway was that the number of transistors ("components") on an integrated circuit was doubling about every two years, making computing power much cheaper and faster.
This chart from Intel co-founder Gordon Moore's seminal 1965 paper showed the cost of transistors decreased with new manufacturing processes even as the number of transistors on a chip increased.

Ever since I started reading up on Moore's law in the early 1980s, there have been predictions in the trade press that it will soon reach technological limits and come to and end. But Moore's law marches on: indeed, the research and innovation targets at Intel and other chip-makers are defined in terms of making sure that Moore's law continues to hold for at least awhile longer. Stephen Shankland offers a nice accessible overview of the current situation in an October 15, 2012, essay on CNET: ""Moore's Law: The rule that really matters in tech"  (The Gordon Moore graph above is copied from Shankland's essay.)

As Shankland writes: "To keep up with Moore's Law, engineers must keep shrinking the size of transistors. Intel, the leader in the race, currently uses a manufacturing process with 22-nanometer features. That's 22 billionths of a meter, or roughly a 4,000th the width of a human hair." He cites a variety of industry and research experts to the effect that Moore's law has at least another decade to run--and remember, a decade of doubling every two years means five more doublings!

It's hard to wrap one's mind around what it means to say that the power of microchipo technology will increase by a factor of 32 (doubling five times) in the next 10 years. A characteristically intriguing survey essay  from the January 10 issue of the Economist on the future of innovation uses the checkerboard analogy to think about the potential effects of Moore's law. Here's a comment from the Economist essay:



Ray Kurzweil, a pioneer of computer science and a devotee of exponential technological extrapolation, likes to talk of “the second half of the chess board”. There is an old fable in which a gullible king is tricked into paying an obligation in grains of rice, one on the first square of a chessboard, two on the second, four on the third, the payment doubling with every square. Along the first row, the obligation is minuscule. With half the chessboard covered, the king is out only about 100 tonnes of rice. But a square before reaching the end of the seventh row he has laid out 500m tonnes in total—the whole world’s annual rice production. He will have to put more or less the same amount again on the next square. And there will still be a row to go.

Erik Brynjolfsson and Andrew McAfee of MIT make use of this image in their e-book “Race Against the Machine”. By the measure known as Moore’s law, the ability to get calculations out of a piece of silicon doubles every 18 months. That growth rate will not last for ever; but other aspects of computation, such as the capacity of algorithms to handle data, are also growing exponentially. When such a capacity is low, that doubling does not matter. As soon as it matters at all, though, it can quickly start to matter a lot. On the second half of the chessboard not only has the cumulative effect of innovations become large, but each new iteration of innovation delivers a technological jolt as powerful as all previous rounds combined."

Now, it's of course true that doubling the capacity of computer chips doesn't translate in a direct way into a higher standard of living: there are many steps from one to the other. But my point here is to note that many of us (myself included) have been thinking about the changes in electronics technology a little too much like the king in the checkerboard story: that is, we think of something doubling a few times, even 10 or 20 times, and we know it's a big change, but it somehow seems within our range of comprehension.

But when something has already been doubling every 18 months or two years for a half-century--it is continuing to double!--the absolute size of each additional doubling is starting to get very large. I lack the imagination to conceive of what will be done with all this cheap computing power in terms of health care, education, industrial process, communication, transportation, entertainment, food, travel, design, and more. But I suspect that these enormous repeated doublings, as Moore's law marches forward in the next decade and drives computing speeds up and prices down, will transform lives and industries in ways that we are only just starting to imagine.


The Financial Cycle: Theory and Implications

In the aftermath of the Great Recession, mainstream macroeconomists have been seeking in various ways to bring the financial sector into their models. As that activity implies, the financial sector had not previously been playing much of a role in mainstream models. Claudio Borio lays out a perspective on treating cycles in the financial sector as having a life of their own in "The financial cycle and macroeconomics: What have we learnt?", published in December 2012 as working paper #395 for the Bank for International Settlements. I should note that Borio's view of how the financial sector interrelates with the real economy is not conventional macroeconomic wisdom, but I should also note that conventional macroeconomics hasn't exactly covered itself with glory in the last few years.


Borio begins by pointing out that conventional macroeconomics was paying little attention to the financial sector in the years before the Great Recession, and argues that the strategies for trying to add a financial sector to existing models don't go nearly far enough. (Citations and footnotes are omitted from quotations throughout.) Here's Borio: "The financial crisis that engulfed mature economies in the late 2000s has prompted much soul searching. Economists are now trying hard to incorporate financial factors into standard macroeconomic models. However, the prevailing, in fact almost exclusive, strategy is a conservative one. It is to graft additional so-called financial “frictions” on otherwise fully well behaved equilibrium macroeconomic models ... The main thesis is that macroeconomics without the financial cycle is like Hamlet without the Prince. In the environment that has prevailed for at least three decades now, just as in the one that prevailed in the pre-WW2 years, it is simply not possible to understand business fluctuations and their policy challenges without understanding the financial cycle."

Borio argues that there is a "financial cycle" with its own dynamics. Here's a figure with U.S. data showing the regular business cycle, measured by variations in GDP, compared with the "financial cycle," based on estimates of credit, the credit/GDP ratio, and property prices. He argues that while business cycles are usually in the range of 1-8 years, "the average length of the financial cycle in a sample of seven industrialised countries since the 1960s has been around 16 years."

Borio argues that the peaks of the financial cycle are associated with financial crises. When a business cycle recession happens at the same time as the contraction part of a financial cycle, the recession is about 50% deeper.

This perspective on the financial cycle also offers some policy advice. Central banks and financial regulators should pay attention to credit/GDP ratios and to property prices. Borio writes: "The idea is to build up buffers in good times, as financial vulnerabilities grow, so as to be able to draw them down in bad times, as financial stress materialises. There are many ways of doing so, through the appropriate design of tools such as capital and liquidity standards, provisioning, collateral and margining practices, and so on. ... In the case of monetary policy, it is necessary to adopt strategies that allow central banks to tighten so as to lean against the build-up of financial imbalances even if near-term inflation remains subdued – what might be called the “lean option”. Operationally, this calls for extending policy horizons beyond the roughly 2-year ones typical of inflation targeting regimes and for giving greater prominence to the balance of risks in the outlook, fully taking into account the slow build-up of vulnerabilities associated with the financial cycle. ... In the case of fiscal policy, there is a need for extra prudence during economic expansions associated with financial booms. ...  Financial booms are especially generous for the public coffers, because of the structure of revenues. And the sovereign inadvertently accumulates contingent liabilities, which crystallise as the boom turns to bust and balance sheet problems emerge, especially in the financial sector."

However, once the double-whammy of a financial crisis and a business cycle recession has hit simultaneously, Borio also argues that conventional policy responses may not work well. In a "balance sheet recession," fiscal and monetary policy may not be very capable of stimulating demand, and instead may encourage financial firms and businesses to put off the necessary hard steps they need to take, leaving the economy too dependent on government stimulation rather than on the private sector moving forward. As he writes:  "On reflection, the basic reason for the limitations of monetary policy in a financial bust is not hard to find. Monetary policy typically operates by encouraging borrowing, boosting asset prices and risk-taking. But initial conditions already include too much debt, too-high asset prices (property) and too much risk-taking. There is an inevitable tension between how policy works and the direction the economy needs to take."

The concept of a "financial cycle" has a plausible back-story. When times are good, borrowers and investors of all kinds tend to let down their guard, worry less about risks, and gradually become overextended--which can then bring on a counterreaction, or even in some cases a financial crisis. It's easy to point to financial crises, but it's harder to show convincingly that an earlier financial boom is the cause of the crisis. The nice smooth curve of financial cycles above is created by using statistical tools ("filtering") to blend together the underlying measures of credit and property prices. Borio is up front about this difficulty and others, and has some suggestions for how the appropriate modeling might proceed. But even if one doesn't buy into the notion of a self-perpetuating financial cycle, standing apart from the regular business cycle, one lesson that everyone seems to have learned from the Great Recession is that rapid expansions of credit and rapid rises in property values have real macroeconomic risks--and thus are an appropriate target for policy.
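For readers curious what such "filtering" involves: the BIS papers use formal frequency-based (bandpass) filters tuned to cycles of roughly 8 to 30 years, but the idea can be illustrated with a cruder stand-in that differences two centered moving averages, passing slow swings while blocking fast noise. The window lengths below are illustrative assumptions, not the BIS's actual settings:

```python
# Illustrative sketch of extracting a "medium-term" cycle from a series:
# subtract a long moving average (the slow trend) from a short moving
# average (the series with high-frequency noise smoothed away).

def moving_average(x, window):
    half = window // 2
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))  # average over a centered window
    return out

def medium_term_cycle(series, short_window=9, long_window=33):
    # Keep fluctuations slower than short_window but faster than long_window.
    smooth = moving_average(series, short_window)
    trend = moving_average(series, long_window)
    return [s - t for s, t in zip(smooth, trend)]
```

Applied to quarterly credit/GDP or property-price data, a filter in this spirit would leave the slow 16-year swings Borio describes while discarding quarter-to-quarter wiggles and the very long trend.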


 



Rebuilding Unemployment Insurance

In theory, the federal government sets minimum guidelines for each state's unemployment insurance system, and then each state sets its own rules for what is paid in and what benefits are offered. Each state has its own unemployment trust fund. The idea is that the trust fund will build up in good economic times, and then be drawn down in recessions. But it hasn't actually worked that way for a long time, and the problem is getting worse. Christopher J. O’Leary lays out the issue and possible solutions in "A Changing Federal-State Balance in Unemployment Insurance?" written for the January 2013 Employment Research Newsletter published by the Upjohn Institute.

When a recession hits, the federal government has developed a habit of stepping in with extra unemployment insurance funds. For example, the feds stepped in with additional funding for extending unemployment benefits in 1958, 1961, 1971, 1974, 1982, 1991 and 2002--as well as during the most recent recession. With the feds stepping up, it has been easier and easier for the states to keep their unemployment taxes as low as possible. For example, average unemployment insurance taxes (adjusted for inflation) were $274/employee in 2008, lower than the $350/employee in 1994 and the $515/employee in 1984, according to Ronald Wilus of the U.S. Department of Labor.

As a result, over time the feds are paying for a larger share of unemployment insurance during recessions. Here's an illustrative figure from O'Leary.


For some perspective on the revenues coming into the unemployment trust funds from the regular unemployment tax, as opposed to how much money is going out, here's a table from a Congressional Research Service report on "Unemployment Insurance: Programs and Benefits," by Julie M. Whittaker and Katelin P. Isaacs, dated December 31, 2012. Notice that when unemployment rates were fairly low from 2005-2007, revenue exceeded outlays by about $10 billion per year. Then in 2009, 2010, and 2011, outlays exceeded revenue by something like $100 billion per year. The difference was made up from general federal revenues.

The intergovernmental incentives in the unemployment insurance system are clearly messed up. States have an incentive to keep unemployment insurance premiums fairly low, promise significant benefits, and then let the federal government pick up the tab when a recession occurs. What would be needed to get back to a system where states save up funds for unemployment insurance money in trust funds--even if some federal help might occasionally be needed?

One step suggested by O'Leary is to raise the "tax base." At present, the minimum federal standard requires that states collect unemployment insurance taxes on the first $7000 of taxable wages--a level that was established back in 1983. Just adjusting that $7,000 base for inflation would mean increasing it to about $16,000. O'Leary notes that 35 states currently have a taxable wage base at or below $15,000.
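The inflation adjustment is straightforward to reproduce. A sketch follows; the CPI-U annual-average index values are my assumption (close to published BLS figures), and O'Leary may have used slightly different endpoints:

```python
# Adjust the 1983 taxable wage base of $7,000 to roughly 2012 dollars.
CPI_1983 = 99.6    # assumed CPI-U annual average for 1983
CPI_2012 = 229.6   # assumed CPI-U annual average for 2012

base_1983 = 7_000
base_2012 = base_1983 * CPI_2012 / CPI_1983

print(round(base_2012))  # roughly $16,000
```

So simply holding the 1983 base constant in real terms would more than double the taxable wage base in most states.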

A second step would be to have a rule that unemployment insurance benefits would not kick in until after a waiting period. O'Leary writes: "A much neglected potential reform on the benefit side would be to institute waiting periods of 2–4 weeks, with the duration of the wait depending inversely on the aggregate level of unemployment. ... A somewhat longer waiting period will reduce program entry by those with ready reemployment options, and help to preserve the income security strength of the system for those who are involuntarily jobless for 4, 5, or 6 months."

Yet another step would be to use federal rules to discourage states from lowballing the funding of their unemployment insurance and relying on an influx of federal funding. Here's O'Leary: "[T]he federal partner should institute minimum standards on weekly benefit levels and durations, and also tie potential durations of any future federal emergency benefits to the existing state maximum durations. For example, a state providing up to 26 weeks would get 13 weeks of federal temporary benefits, but if the state maximum were 20 weeks the federal supplement would be 10 weeks."
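In O'Leary's example the federal supplement is simply half the state's own regular maximum, so a state that cuts its maximum duration also cuts the federal top-up it can expect. A one-line sketch of that rule (the function name and the half ratio generalize his two data points, so treat it as illustrative):

```python
def federal_supplement_weeks(state_max_weeks: int) -> int:
    # O'Leary's example ties federal temporary benefits to half the
    # state's regular maximum duration: 26 -> 13 weeks, 20 -> 10 weeks.
    return state_max_weeks // 2

print(federal_supplement_weeks(26))  # 13
print(federal_supplement_weeks(20))  # 10
```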

It's worth pointing out that unemployment insurance has a number of problems other than whether it is pre-funded. You need to meet certain qualification tests for unemployment insurance, typically based on earnings in the previous year or so, and as a result, many of the unemployed do not receive unemployment insurance. In January 2013, about 3.5 million people were receiving unemployment insurance benefits, but about 12.3 million people were unemployed.


There are also a number of proposals that seek to adjust the incentives so that unemployment insurance can better co-exist with incentives to find a new job. Some proposals are that unemployment benefits should be larger, so as to soften the economic blow of unemployment, but for a shorter time, to hasten the incentive to find a new job. Some proposals would require or allow people to set up individual unemployment accounts, which they could keep at retirement, so that people would tap their own money before turning to the government fund. One proposal would offer a bonus to those receiving unemployment insurance if they found a job quickly, because it could be less costly for the unemployment insurance trust fund if they find a job faster rather than linger on receiving benefits.

The Great Recession and its aftermath have wrecked the premises of the existing unemployment insurance system. It's time to rebuild.




Big Data and Development Applications

"Big data" has become a buzzword. It conveys the notion that our interconnected world is generating a vast array of data--and asks how that data can be used for analysis, social problem-solving, and private profit. However, I had not known that the United Nations has an organization called Global Pulse, which focuses on issues of Big Data from a development perspective. The Global Observatory, a publication of the International Peace Institute, had an interview last November with Robert Kirkpatrick, Director of UN Global Pulse.  Here, I'll quote from the interview with Kirkpatrick, and will also refer to a May 2012 white paper from Global Pulse called "Big Data for Development: Challenges and Opportunities." 

As a starting point, here's Kirkpatrick defining Big Data: "[B]ig data is a term that has come into vogue only in the last couple of years, and it refers to the tremendous explosion in volume and velocity and variety of digital data that is being produced around the world. The statistics are somewhat astonishing: there was more data produced in 2011 alone than in all of the rest of human history combined back to the invention of the alphabet."

The May 2012 report offers this comment (footnotes and references to figures omitted): "The world is experiencing a data revolution, or “data deluge”. Whereas in previous generations, a relatively small volume of analog data was produced and made available through a limited number of channels, today a massive amount of data is regularly being generated and flowing from various sources, through different channels, every minute in today’s Digital Age. It is the speed and frequency with which data is emitted and transmitted on the one hand, and the rise in the number and variety of sources from which it emanates on the other hand, that jointly constitute the data deluge. The amount of available digital data at the global level grew from 150 exabytes in 2005 to 1200 exabytes in 2010. It is projected to increase by 40% annually in the next few years ... This rate of growth means that the stock of digital data is expected to increase 44 times between 2007 and 2020, doubling every 20 months."

The flood of data relevant for development issues includes four categories, according to Global Pulse: 1) "Data exhaust" created by people's transactions with digital services, including web searches, purchases, and mobile phone use; 2) "Online information" available in news media and social media, as well as job postings and e-commerce sites; 3) Physical sensors that look at landscapes, traffic patterns, weather, earthquakes, light emissions, and much else; 4) Citizen reporting, when information is submitted by citizens through surveys, hotlines, updating of maps, and the like.

Of course, there are enormous challenges in dealing with Big Data, including privacy concerns, the sheer size of the datasets, how quickly they are expanding, and how to digest and interpret it. But the potential for understanding what is happening much more quickly is becoming apparent. As Kirkpatrick says: "[W]e now live in this hyper-connected world where information moves at the speed of light, and a crisis can be all around the world very, very quickly, but we’re still using two- to three-year-old statistics to make most policy decisions. The irony is, we’re swimming in this ocean of digital data, which is being produced for free all around us."

Private sector firms like Google are already using Big Data. Some of the public sector and research studies include:
  • A country's GDP can be estimated based on light emissions at night, as perceived by satellites. 
  • Outbreaks of flu or cholera or dengue fever can be identified much more quickly by looking at web searches. Another study used Twitter mentions of earthquakes as a way to get a faster response to quakes.
  • One study was able to predict where people were at any time with greater than 90% accuracy based on cell-phone records showing past movements. Another study in developing countries could predict income with 90% accuracy based on how often you top off the air time on your mobile phone. Kirkpatrick says: "Even if you are looking at purely anonymized data on the use of mobile phones, carriers could predict your age to within in some cases plus or minus one year with over 70 percent accuracy. They can predict your gender with between 70 and 80 percent accuracy."
  • A study in Indonesia was able to approximate a consumer price index for basic foods by looking at comments on social media. (Apparently, Jakarta produces more tweets than any other city in the world.) Other studies have sought evidence on food shortages or food price volatility by looking at social media.

I confess that the social scientist within me finds the research possibilities here to be fascinating. Kirkpatrick says: "Now think about this, this is astonishing: the ability to see in real time where beneficiaries are can allow us to understand exactly where the population is that we need to reach, and if you combine that with information on the size of air-time purchases, you can tell how much money these people have. You start to be able to extract basic demographic information, population movement, and behavior data from this information while fully protecting privacy in the process.

What we’re focused on now is working with mobile carriers around the world, including in Indonesia, to get access to archives of anonymized call records and purchase records, because what we do is essentially correlate that data with official statistics. You look at the movement patterns, the mobile service consumption patterns, the social-network patterns that you can derive from how people interact and compare that to food prices, fuel prices, unemployment rates, disease outbreaks, earthquakes, and look at how a population was affected. Or, you compare it to when a program was initiated in the field or when a policy initiative got off the ground: did it actually work? The potential for monitoring and evaluation here as well is quite remarkable."

Moreover, Kirkpatrick describes the effort by Global Pulse to find a middle ground in concerns about privacy and access to Big Data: "Right now, the conversation around big data is very polarized. You might call it "Germany vs. Mark Zuckerberg." You have the very conservative prohibition against reuse without explicit permission that has become pervasive in the European Union; it’s a very guarded approach. At the opposite end of the spectrum, you have companies that live on big data, which are saying privacy is dead, profit is king. We’re trying to insert a third pole into this debate, which is to say, big data is a raw public good. But to do that we have to create a kind of R & D sandbox where we can experiment with it and learn how to use it safely."

At least to me, many of the existing efforts to use Big Data seem interesting--but relatively small potatoes. As the existing data increases 40-fold in the next few years, along with the techniques and capabilities to digest and analyze that data, challenges and possibilities will probably emerge that I can't even imagine now. The May 2012 report quotes the comment from social technology guru Andreas Weigend, who said: "[D]ata is the new oil; like oil, it must be refined before it can be used."


19 Şubat 2013 Salı

Checkerboard Puzzle, Moore's Law, and Growth Prospects

My father the mathematician first posed the checkerboard puzzle to me back in grade-school, perhaps on some rainy Saturday. His version of the story went something like this:

The jester performs a great deed, and the king asks him how he would like to be rewarded. The jester is aware that the king is a highly volatile individual, and if the jester asks for too much, the king might just kill him then and there. The jester also knows that the king views his promise as sacred, so if the king says "yes" to the jester's proposal, then the king will honor that promise. So in a way, the jester's problem is how to ask for a lot, but have the king at least initially think it's not very much, so that the king will give his consent.

So the jester clowns around a bit and then says: "Here's all I want. Take this checkerboard. On the first square, put one piece of gold. On the second square, two pieces. On the third square, four pieces, and on the fourth square, eight pieces. Double the amount on each square until you reach the end of the checkerboard."

In the story, the king laughs at this comic proposal and says, "Your great deed was so wonderful, I would have happily done much more than this! I grant your request!"

But of course, when the king starts hauling up gold pieces from the treasury, he will discover that the final square of the checkerboard requires 2 raised to the 63rd power, or about 9 quintillion gold pieces (that is, 9 followed by 18 zeros).
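The arithmetic is easy to check directly. A short Python sketch (the 64th square holds 2 to the 63rd power because the first square holds 2 to the 0th, which is one piece):

```python
# The first square holds 2**0 = 1 piece, so the 64th holds 2**63,
# and the whole board sums to 2**64 - 1.
final_square = 2 ** 63
whole_board = sum(2 ** k for k in range(64))

print(f"{final_square:,}")  # 9,223,372,036,854,775,808 -- about 9 quintillion
print(f"{whole_board:,}")   # 18,446,744,073,709,551,615 -- nearly twice as much again
```

The cumulative total over the whole board, 2 to the 64th minus one, is almost exactly twice the amount on the last square alone: each square holds one more piece than all the previous squares combined.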

I've had some sense of the power of exponential growth ever since. But what I hadn't thought about is the interaction of Moore's Law and economic growth. Moore's Law is of course named for Gordon Moore, one of the founders of Intel, who noticed this pattern back in 1965. Back in the 1970s, he wrote a paper that contained the following graph showing how much it cost to produce a computer chip with a certain number of components. Here's his figure. Notice that the number of components on the horizontal axis and the cost figures on the vertical axis are both graphed on logarithmic scales (specifically, each step up the axis is a change by a factor of 10). The key takeaway was that the number of transistors ("components") on an integrated circuit was doubling about every two years, making computing power much cheaper and faster.
This chart from Intel co-founder Gordon Moore's seminal 1965 paper showed the cost of transistors decreased with new manufacturing processes even as the number of transistors on a chip increased.

Ever since I started reading up on Moore's law in the early 1980s, there have been predictions in the trade press that it will soon reach technological limits and come to an end. But Moore's law marches on: indeed, the research and innovation targets at Intel and other chip-makers are defined in terms of making sure that Moore's law continues to hold for at least a while longer. Stephen Shankland offers a nice, accessible overview of the current situation in an October 15, 2012, essay on CNET: "Moore's Law: The rule that really matters in tech." (The Gordon Moore graph above is copied from Shankland's essay.)

As Shankland writes: "To keep up with Moore's Law, engineers must keep shrinking the size of transistors. Intel, the leader in the race, currently uses a manufacturing process with 22-nanometer features. That's 22 billionths of a meter, or roughly a 4,000th the width of a human hair." He cites a variety of industry and research experts to the effect that Moore's law has at least another decade to run--and remember, a decade of doubling every two years means five more doublings!

It's hard to wrap one's mind around what it means to say that the power of microchip technology will increase by a factor of 32 (doubling five times) in the next 10 years. A characteristically intriguing survey essay from the January 10 issue of the Economist on the future of innovation uses the checkerboard analogy to think about the potential effects of Moore's law. Here's a comment from the Economist essay:



"Ray Kurzweil, a pioneer of computer science and a devotee of exponential technological extrapolation, likes to talk of “the second half of the chess board”. There is an old fable in which a gullible king is tricked into paying an obligation in grains of rice, one on the first square of a chessboard, two on the second, four on the third, the payment doubling with every square. Along the first row, the obligation is minuscule. With half the chessboard covered, the king is out only about 100 tonnes of rice. But a square before reaching the end of the seventh row he has laid out 500m tonnes in total—the whole world’s annual rice production. He will have to put more or less the same amount again on the next square. And there will still be a row to go.

"Erik Brynjolfsson and Andrew McAfee of MIT make use of this image in their e-book “Race Against the Machine”. By the measure known as Moore’s law, the ability to get calculations out of a piece of silicon doubles every 18 months. That growth rate will not last for ever; but other aspects of computation, such as the capacity of algorithms to handle data, are also growing exponentially. When such a capacity is low, that doubling does not matter. As soon as it matters at all, though, it can quickly start to matter a lot. On the second half of the chessboard not only has the cumulative effect of innovations become large, but each new iteration of innovation delivers a technological jolt as powerful as all previous rounds combined."
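The Economist's rice numbers hold up to a quick check, if we assume roughly 25 milligrams per grain of rice (a typical figure for long grains; the exact weight varies by variety, so treat the constant below as illustrative):

```python
# Cumulative rice after the first n squares: 1 + 2 + ... + 2**(n-1)
# grains, which sums to 2**n - 1. The per-grain weight is an assumption.
GRAMS_PER_GRAIN = 0.025  # ~25 mg per grain, illustrative

def cumulative_tonnes(n_squares):
    grains = 2 ** n_squares - 1
    return grains * GRAMS_PER_GRAIN / 1e6  # grams -> metric tonnes

print(round(cumulative_tonnes(32)))    # ~107 tonnes after half the board
print(f"{cumulative_tonnes(64):.2e}")  # ~4.6e+11 tonnes for the full board
```

Half the board does indeed come to about 100 tonnes, while the full board runs to hundreds of billions of tonnes--several hundred times current world annual rice production.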

Now, it's of course true that doubling the capacity of computer chips doesn't translate in a direct way into a higher standard of living: there are many steps from one to the other. But my point here is to note that many of us (myself included) have been thinking about the changes in electronics technology a little too much like the king in the checkerboard story: that is, we think of something doubling a few times, even 10 or 20 times, and we know it's a big change, but it somehow seems within our range of comprehension.

But when something has already been doubling every 18 months or two years for a half-century--and it is continuing to double!--the absolute size of each additional doubling is getting very large. I lack the imagination to conceive of what will be done with all this cheap computing power in terms of health care, education, industrial processes, communication, transportation, entertainment, food, travel, design, and more. But I suspect that these enormous repeated doublings, as Moore's law marches forward in the next decade and drives computing speeds up and prices down, will transform lives and industries in ways that we are only just starting to imagine.


Social Welfare Programs and Incentives to Work

There's a fundamental conflict between helping those in need and encouraging self-support. I sometimes say that if you give a person a fish every day, then you remove that person's incentive to learn to fish. But if you vow not to give them a fish, they may starve to death while learning to fish.

C. Eugene Steuerle explores the current state of this conflict in "Labor Force Participation, Taxes, and the Nation’s Social Welfare System," which is testimony given to the Committee on Oversight and Government Reform of the U.S. House of Representatives on February 14, 2013. As a starting point, focus first on the support that we give to those in need. Steuerle writes:

"Figures 1 and 2 display the benefits available to a single mother with two children in 2011 under these two cases. The first case, what I call the “universal” case, shows the benefits available to anyone whose income was low enough to qualify for them, namely nutrition assistance and tax benefits. The second case adds to those benefits narrower assistance—TANF and housing subsidies and supplements to nutrition assistance—that is available to some households but not to others based on availability, time limits, and other criteria. Because health reform will soon alter the delivery of health benefits in an important way, in both cases I assume that the provisions of the Affordable Care Act are in effect."



It's useful to remember that these graphs do not refer to cash benefits, and they represent averages that will vary across families. For example, the amount that families receive in Medicaid benefits is not received in cash, but in the form of access to health care services, and the amount will vary from year to year, depending on health. I find the details of these figures interesting for what they reveal about the size of spending and support from different programs and the income range over which programs operate. For example, the figures highlight that SNAP, more commonly known as "food stamps," is a substantially larger program than TANF, more commonly known as "welfare." The figures also show the relatively large size of health care benefits like Medicaid, CHIP, and the "exchange subsidy" compared with other forms of benefits, reflecting a pattern: as a society, we are willing to pay large health care bills for those with low incomes, or to give them food stamps, but we are less willing to give them cash benefits.

But the main point that Steuerle emphasizes is the overall hump shape of the curves: that is, more support for those at lower incomes, and then declining support as income rises. This pattern makes perfect sense: more fish for those with very low incomes, less fish as people learn to fish and bring in their own income. But it also means that those with low incomes face what economists call a "negative income tax."

A "positive" income tax is the usual tax in which, as you earn additional income, the government taxes away a percentage. A "negative" income tax arises when, as you earn additional income, the government phases out benefits it would otherwise have provided. Both kinds of taxes have the same effect on incentives: when you earn an additional marginal dollar of income, you take home less than a dollar after taxes. When social programs phase out quickly as income rises, a situation can arise where earning an additional dollar of income means losing 50 cents or more in benefits--thus greatly reducing the incentives to work.
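The way the two rates stack up can be sketched with made-up round numbers (these are illustrative, not Steuerle's actual figures):

```python
# Illustrative only: a statutory tax rate plus a benefit phase-out
# rate add up to a single effective marginal tax rate on the next
# dollar earned. The rates below are invented round numbers.
def effective_marginal_rate(statutory_rate, phaseout_rate):
    """Share of each additional dollar lost to taxes plus foregone benefits."""
    return statutory_rate + phaseout_rate

# e.g., 15% in payroll and income taxes, plus benefits that shrink by
# 50 cents for every extra dollar earned:
rate = effective_marginal_rate(0.15, 0.50)
print(f"effective marginal rate: {rate:.0%}")             # 65%
print(f"take-home from an extra dollar: {1 - rate:.2f}")  # 0.35
```

A worker in that hypothetical position keeps only 35 cents of each additional dollar earned, even though no single program or tax looks confiscatory on its own.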

Here are the effective marginal tax rates as Steuerle calculates them: that is, adding together the "positive" tax rates of federal income taxes, state taxes, and payroll taxes for Social Security and Medicare, and the implicit "negative" tax rates from the phase-out of social programs, to find the effective tax rate on a marginal dollar of income as income rises. Notice how the phase-out of social programs--that is, how their support declines as earned income rises--leads to a spike in the overall "effective" marginal tax rates that people experience at around $10,000-$15,000 in earned income.


Of course, if you're someone who doesn't believe that marginal tax rates affect work effort, then this sort of chart won't bother you. Personally, I'm concerned about the effects of marginal tax rates on incentives not just at the top of the income scale, or just at the bottom, but at all income levels.


Those interested in this subject might also see my post of November 16, 2012, based on a report from the Congressional Budget Office, about "Marginal Tax Rates on the Poor and Lower Middle Class."




Taking Apprenticeships Seriously

The United States puts a heavy emphasis on a college degree as the path to economic and social success, and thus it's a familiar pledge of politicians that a higher share of the population will attend college. For example, in a speech to Congress on February 24, 2009, President Obama set a goal that "by 2020, America will once again have the highest proportion of college graduates in the world."

But this emphasis on college has two difficulties: 1) as a society, we don't actually mean it; and 2) it probably isn't an appropriate goal, anyway. After all, if we really supported a widespread expansion of college education, we would do considerably more than pump up the loans available to students. We would be figuring out how current colleges can expand their enrollments, and starting a new wave of colleges and universities--and figuring out how to keep these options affordable to students. As it is, the U.S. has lost its lead as the country with the highest proportion of college graduates.

Moreover, a four-year college degree just isn't going to be right for everyone. Think about those students who managed to finish a high school degree, but were in the bottom third or bottom quarter of the class. For many of these students, their interactions with the educational system have not been happy ones, and the notion that their life plan should start off with yet another four years of education is likely to be met with hard-earned dislike and disbelief.

So what's the alternative for these students, in a U.S. economy that places considerable value on skilled labor? Betty Joyce Nash offers one angle on these issues in "Journey to Work: European Model Combines Education with Vocation" in the Fourth Quarter issue of Region Focus, which is published by the Federal Reserve Bank of Richmond. She writes:

"In the United States, vocational education has been disparaged by some as a place for students perceived as unwilling or unable. The United States still largely champions college as the route to higher lifetime wages and the flexibility to retool skills in times of economic change. Yet just 58 percent of the 53 percent of college-goers in 2004 who started at four-year institutions finished within six years. Moreover, 25 percent of those who enter two-year community colleges don’t finish. Only about 28 percent of U.S. adults over age 25 actually have a bachelor’s. What about the rest? What’s their path to the workplace? It may be unrealistic to expect everyone to finish college, but most students will need more than a high school education as jobs become more complex."
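One plausible reading of Nash's figures (taking the 53 percent as the share of the cohort who started at four-year institutions) yields a quick back-of-the-envelope calculation:

```python
# If 53% of a cohort start at four-year institutions and 58% of those
# finish within six years, then roughly 31% of the cohort earn a
# four-year degree by that route -- broadly in line with the 28% of
# adults over 25 who hold a bachelor's.
started_four_year = 0.53
finished_within_six_years = 0.58
share_of_cohort = started_four_year * finished_within_six_years
print(f"{share_of_cohort:.0%}")  # 31%
```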
Nash focuses her discussion on apprenticeships and vocational education, and as is common in these kinds of arguments, she devotes some attention to practices in Germany and Switzerland. Thus:

"Germany and Switzerland educate roughly 53 percent and 66 percent of students, respectively, in a system that combines apprenticeships with classroom education — the dual system. This approach brings young people into the labor force more quickly and easily. Unemployment for those in Switzerland between the ages of 15 and 24 in 2011 was 7.7 percent; in Germany, 8.5 percent. In the United States that year, the rate was 17.3 percent, down from 18.4 percent the previous year. (A 10 percent higher rate of participation in vocational education in selected Organization for Economic Cooperation and Development countries led to a 2 percent lower youth unemployment rate in 2011, according to economist Eric Hanushek of Stanford University.)"

Here's a bit more detail on Switzerland and on Germany:

"At ages 15 to 16, in Switzerland, about two-thirds of every cohort enter apprenticeships, [Stefan C.] Wolter notes. Apprentices in fields from health care to hairdressing to engineering attend vocational school at least one day a week for general education and theoretical grounding for roughly three years. On other days, they apprentice under the supervision of a seasoned employee. What makes the system work so well is firm participation, which is relatively strong. “If you exclude the one-person companies and the businesses that cannot train, about 40 percent of companies that could train do train,” Wolter says. ..."

"In Germany, about 25 percent of students go to university, and apprenticeships employ another 53 percent. At 16, they sign on for a three-year stint in one of 350 occupations. Another 15 percent may attend vocational schools. Those who are less qualified take a full-time vocational course or temporary job until they land an apprenticeship. About one-quarter of German employers participate. ..."

"Other western European countries use variations of the Swiss and German model. Belgium, Finland, Sweden, and the Netherlands train most vocational students in school programs, while Germany, Switzerland, Austria, and Denmark have large school-and-work programs. The United States is an outlier: By international standards and official definitions, it has virtually no vocational education and training program."
From a U.S. perspective, it's hard to think clearly about how this kind of widespread use of apprenticeships and vocational school would even work. Half or two-thirds of 16-year-old students involved in paid apprenticeships? A quarter or a third of all employers providing a large number of such positions as part of their regular business model? Apprenticeships across a wide array of professions, both blue- and white-collar? My American mind boggles. But given that a four-year college degree is demonstrably not a good fit for many young Americans, it's past time to take some of the alternatives more seriously.


I've posted from time to time about the merits of apprenticeships and various alternative credentials.

For example, see this post from October 18, 2011, on "Apprenticeships for the U.S. Economy"; this post from November 3, 2011, on "Recognizing Non-formal and Informal Learning"; and this post from January 16, 2012, on "Certificate Programs for Labor Market Skills."