COLUMN: Do We Need The CHIPS Act?
The Channel Company’s Founding Partner Robert Faletra believes there hasn’t been enough debate in Washington, D.C. around what could be the unintended consequences of the CHIPS Act.
Here we are, sitting in the opening act of a recession. Washington has spent $6 trillion it didn’t have on COVID relief. Inflation is running at levels not seen in four decades and is nearing double digits. The Federal Reserve has raised interest rates again, and the beat goes on in Washington, D.C.
The semiconductor supply shortage continues, as does the push to bring more manufacturing to the states.
Are there strategic reasons to do so? You bet. Would it be best to have America once again be the location where the majority of chips are made? It’s hard to argue against that. At one point, we manufactured the majority of semiconductors here, and today we manufacture only about 12 percent.
There is support on both sides of the political aisle for subsidizing the U.S. semiconductor industry with the CHIPS Act, which as of press time has passed the Senate.
[RELATED: $52B CHIPS Act Clears Senate Hurdle, Expected To Pass House Vote]
But the issues surrounding all this are complex. There are national security issues. Inflationary spending issues. Government involvement in private industry issues. Supply chain issues. And the list goes on.
Would dropping more money the government doesn’t have add to inflation? I’m not an economist, but many argue the biggest contributor to the current inflationary problem is government spending.
The famed economist Milton Friedman argued “inflation is always and everywhere a monetary phenomenon.” Well, the $6 trillion in stimulus contributing to a national debt nearing $31 trillion—or nearly $92,000 for every person in the country—has something to do with the money supply.
Given the size of that debt, doling out billions more in subsidies almost seems de minimis. And might it be worth it for national security reasons? I’m not arguing for or against this, but I do know the semiconductor industry isn’t on its deathbed.
Intel earlier this year opened a $3 billion expansion of its facility in Hillsboro, Ore., which sits on a 500-acre campus it named Gordon Moore Park. The company is working on shrinking chip features to the scale of atoms. It has been investing in its foundry business, including a recently announced $20 billion fab in Ohio, one of more than 20 construction projects the company has underway.
I’m not picking on Intel, as Nvidia, Broadcom, Micron Technology and others based here are also investing in their businesses, as they should. The question that needs answering is whether it would be better for the industry to be subsidized by taxpayers or to stand on its own. If subsidies are the answer, how should they be implemented, and what type of government involvement will come with them?
I don’t know the answer to these questions, but it seems as though there isn’t enough debate happening around what could be the unintended consequences of the government’s involvement.
College tuition has skyrocketed because of the student loan program. Created in an effort to make it easier to pay for college, the program instead injected enormous amounts of capital into the market, and universities across the board raised tuition as a result. Decades later, it’s actually more difficult to pay for an education than before the program existed—an unintended but real consequence of increasing the available money supply.
Will we get the desired results of government involvement in the semiconductor industry?
I don’t know, but I also don’t think the folks designing these subsidies do either—and that’s what’s scary. What might be the unintended consequences?