Outsourcing and IT
Winterspeak began as a trade blog, but then focused on the technology industry, with large doses of economics dolloped on as I moved to U Chicago. The current furor over outsourcing IT jobs to India sadly encapsulates both the economic illiteracy at the heart of free-trade opposition and the very real pain felt by out-of-work techies. Arnold Kling covers it well and links to J. DeLong for more economic details. But it seems that just as there are no atheists in foxholes, there are no classical liberals in a down economy.
"Comparative advantage" is one of the least intuitive but most important ideas in economics. The upshot is that since opportunity costs (gains you pass up) are as real as out-of-pocket costs (although they don't seem so) people are better off specializing and trading even if one person is absolutely better at everything. If US companies outsource some IT functions to India for lower cost, it means they can either increase their profits (which then get spent on stuff) or pass on savings to their customers (who then spend what they've just saved on other stuff). Either way, the economy as a whole grows even though a certain small sector (IT in this case) suffers. Krugman, before he lost his mind and consequently his credibility, wrote a great piece on it in the context of industrial jobs moving to China. And apart from generous unemployment benefits and retraining grants, I'm not sure what else can be done to help unemployed IT workers to change industries. In economic models, capital (ie. a factory) is treated as a fixed asset and labor as flexible, but in real-life labor flexibility comes with a lot of pain and dislocation.
On a similar note, I was chatting with a very technical MIT cryptographer buddy of mine who was pessimistic about the state of the IT industry and was wondering if it was "done". His concern seems similar to this intemperate rant on the Register, railing against an uncaring market that likes Dell (which does not invest in R&D) but hates Sun and Apple (which do).
My take on this is to think of technology as a stack, with hardware at the bottom, then operating systems, middleware, and software in the middle, and finally service and distribution at the top. End customers (you and I) pay for the whole thing; we don't care how much of the purchase price goes to which bit, but we do care how well it works as a whole. Dell, having lowered prices at the bottom of the stack, has freed up money that can now be spent on better software and services. And given that software generally sucks, I see huge room for improvement here.
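Here's the same budget argument as a back-of-the-envelope sketch. All the prices are invented; what matters is that the customer's willingness to pay is for the stack as a whole, so a cheaper bottom layer is found money for the layers above it.

```python
# Back-of-the-envelope model of the stack argument (all prices invented).
# The customer pays one total price and doesn't care how it splits across
# layers, only what the whole system does for them.

stack_before = {
    "hardware": 1000,   # pre-commodity pricing
    "os": 100,
    "middleware": 150,
    "software": 300,
    "service_and_distribution": 250,
}

budget = sum(stack_before.values())  # what the customer was willing to pay: 1800

# Dell-style competition cuts the bottom of the stack...
stack_after = dict(stack_before, hardware=500)

# ...and the freed-up money can flow to the layers customers still find lacking.
freed = budget - sum(stack_after.values())
print(f"Freed up ${freed} of a ${budget} budget for better software and services")
```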
As for R&D, Apple's investment in creating a better, networked thin client experiences (iTunes, iPhoto, iMovie, hell, OS X) seems to be working well enough, and who knows if Sun's N1 investment will prove a similar boon on the enterprise side. But R&D as a public good should be left to universities and government, companies need to make it payoff, not just do it. The fact that CPUs are fast enough does not mean the end of IT, it just means that IT needs to focus on better customer experiences and not hotter boxes. I think this is a Good Thing, and long overdue.
The economic way to think about this is to ask how elastic demand for IT is at this point. Will people take their savings from lower IT prices and spend them on non-IT things, or will they use those savings to buy even more IT? Historically, the answer has been overwhelmingly the latter: CPU cycles have dropped several orders of magnitude in price over the past 30-40 years, yet total sales of computers have kept increasing. I think this is still true, but people now feel that better software will improve their computers more than faster hardware, so that's where the industry will move. About time.
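For the curious, here's the elasticity logic as a rough sketch. The elasticity figures and the simple percentage-change approximation are assumptions for illustration, not measurements of the actual IT market.

```python
# Sketch of the elasticity question (all numbers invented for illustration).
# Price elasticity of demand = % change in quantity / % change in price.
# If demand is elastic (|e| > 1), a price cut raises total spending on the
# category -- which is what decades of cheaper CPU cycles looked like for IT.

def relative_spend(price_change_pct: float, elasticity: float) -> float:
    """Total spending after a price change, as a multiple of spending before."""
    quantity_change_pct = elasticity * price_change_pct
    return (1 + price_change_pct / 100) * (1 + quantity_change_pct / 100)

# Elastic demand: a 10% price cut grows total IT spending by about 8%.
print(relative_spend(-10, elasticity=-2.0))   # 1.08

# Inelastic demand: the same cut shrinks total IT spending, and the savings
# leak out of IT into other goods instead.
print(relative_spend(-10, elasticity=-0.5))   # 0.945
```

My bet is that demand stays elastic, but that the quantity response now shows up in the software and services layers rather than in hotter hardware.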
"Comparative advantage" is one of the least intuitive but most important ideas in economics. The upshot is that since opportunity costs (gains you pass up) are as real as out-of-pocket costs (although they don't seem so) people are better off specializing and trading even if one person is absolutely better at everything. If US companies outsource some IT functions to India for lower cost, it means they can either increase their profits (which then get spent on stuff) or pass on savings to their customers (who then spend what they've just saved on other stuff). Either way, the economy as a whole grows even though a certain small sector (IT in this case) suffers. Krugman, before he lost his mind and consequently his credibility, wrote a great piece on it in the context of industrial jobs moving to China. And apart from generous unemployment benefits and retraining grants, I'm not sure what else can be done to help unemployed IT workers to change industries. In economic models, capital (ie. a factory) is treated as a fixed asset and labor as flexible, but in real-life labor flexibility comes with a lot of pain and dislocation.
On a similar note, I was chatting with a very technical MIT crytographer buddy of mine who was pessimistic about the state of the IT industry and was wondering if it was "done". His concern seems similar to this intemperate rant on the Register railing against an uncaring market that likes Dell (which does not invest in R&D) but hates Sun and Apple (which do).
My take on this is to think of technology as a stack, with hardware at the bottom, then operating systems, middle ware, software in the middle, and finally service and distribution at the top. The end customer (you and I) need to pay for the whole thing, and we don't care about how much of the purchase price goes to which bit, but we do care about how well it works as a whole. Dell, having lowered prices on the lower parts of the stack, has freed up more money that can now be spent on better software and services. And given the fact that software generally sucks, I see huge room for improvement here.
As for R&D, Apple's investment in creating a better, networked thin client experiences (iTunes, iPhoto, iMovie, hell, OS X) seems to be working well enough, and who knows if Sun's N1 investment will prove a similar boon on the enterprise side. But R&D as a public good should be left to universities and government, companies need to make it payoff, not just do it. The fact that CPUs are fast enough does not mean the end of IT, it just means that IT needs to focus on better customer experiences and not hotter boxes. I think this is a Good Thing, and long overdue.
The economic way to think about it is to consider how elastic demand is for IT at this point. Will people take their savings from lower IT prices and spend them on non-IT things, or will they use their savings to buy even more IT? Historically, the answer has been overwhelmingly the latter as CPU cycles have dropped several orders of magnitude in price over the past 30-40 years, but total sales of computers have kept increasing. I think that this is still true, but that people feel better software will improve their computers more than faster hardware, and so that's where the industry will move to. About time.