
This is way bigger than trade policies. Intel was the world leader in chip-making until fairly recently. Steve Jobs wanted them to build the iPhone chips because they were a generation or two ahead of any other fab in the world. Today, Intel is no longer leading the world in chip fab process.

Intel at the time wasn't interested in fabbing ARM chips, and the rest is history.

Intel and American fabs fell behind for a number of reasons. Some of it is tied to our education system and the kinds of people we are training. Some of it is probably related to Intel turning its nose up at making the iPhone chips, losing billions upon billions of dollars it could have put into R&D. Some of it is a story of American corporate culture not innovating enough.



> Intel at the time wasn't interested in fabbing ARM chips, and the rest is history.

Fun fact: in the late '90s, Intel was the biggest manufacturer of ARM chips. I was there at the time, and I wrote lots of software to aid in their design.

What happened to Intel was twofold. The first problem was that Robert Noyce, Andy Grove, and Gordon Moore were gone. Each was a once-in-a-century intellect, and it took Intel a while to find somebody who could remotely fill those shoes.

The second problem was that Intel was making so much money on x86 chips that dedicating any fab capacity to anything else--including chips for Apple's iPhone--would for years have had such a huge opportunity cost that the shareholders would have sued, and with cause. Literally (not metaphorically) anything they manufactured instead would have had a drastically lower profit margin, and the stock would have tanked.

I suspect that the only reason Intel made a foray into ARM chips in the first place was to head off anti-trust accusations, and once the political heat was off, they dropped them like they were hot.

By the time the market shifted, almost a generation of mediocre management had left Intel less paranoid, and therefore, less likely to survive.


> would for years have had such a huge opportunity cost that the shareholders would have sued, and with cause.

I've been considering the Intel issue (I had family there), and another post in the thread mentions the decade of misses.

This actually seems like one of the most reasonable rationales, and I don't hear it mentioned very often. You might want to do something, yet your hands are tied, because the investors will sue for poor leadership.


It is the classic innovator's dilemma. There are a ton of incentives to keep the focus on today's cash cow in any business.


I don't think it was that Intel lacked the capacity to manufacture more. Intel was a manufacturing-first company that beat out "high end" and specialized competitors through economies of scale. Over the long term, the strategy was to build as much high-performance silicon as possible.

The problem looks like they went down the typical dinosaurification route of their vanquished high-end competitors--DEC, Sun, HP, SGI, IBM, and the like--which was to switch from manufacturing as much as possible to protecting high-margin x86 parts. Their x86 PC and server organization out-muscled their manufacturing organization (either that or finance did), and so they ended up being very reluctant to adopt GPGPUs or ARM mobile, because those were a threat to x86. By the time they realized they'd missed the boat in those markets, they put out these ridiculous x86 mobile cores and x86 GPGPUs that were costly failures.

This pattern--protecting profit margins on high-end products while fixed costs rise every generation and performance gains let the low end grow up and erode the high end, prompting further retreat and protectionism--is exactly what happened with those aforementioned big-iron companies. When the bean counters look at the state of things without an understanding of the technology trends, that's the logical thing to do, but they paint themselves into a corner.

Intel might have just been saved by the bell: Dennard scaling broke down around the time they were losing their manufacturing advantage. That meant that instead of competitors with a generation or two of process lead wiping the floor with them, as they did with the minicomputers, it was a much smaller disadvantage, and they've been able to somewhat hold on.

That's my non-insider take on it anyway.


It seems that what happened to Intel was a lot worse than just missing Andy Grove's paranoia...

- They missed the mobile market

- They missed the gaming market

- They missed the AI/ML market

- They got multiple generations behind TSMC and Samsung in the fab business

- They lost a whole bunch of market share to AMD, including datacenter

The NY Times article is paywalled, so I can't read it, but I have to wonder what kind of fabs they are meant to be building for our (taxpayers') $8.5B. I wouldn't have any faith in them building a SOTA fab at this point.


Actually, Intel has been consistent with their specialized CPU DNN libraries and whatnot for years. MKL and the like are pervasive in modern deep learning stacks. Pretty sure they've captured as much of the AI/ML business as CPUs possibly can.
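To make the MKL point concrete, here's a minimal sketch (assuming a NumPy install; which backend you actually see depends on how your NumPy was built) showing how NumPy's dense linear algebra rides on whatever BLAS library it was compiled against--often Intel's MKL on x86 machines:

```python
import numpy as np

# A large float32 matrix multiply dispatches to the underlying BLAS
# sgemm routine -- on many x86 Python stacks, that's Intel's MKL.
a = np.ones((256, 256), dtype=np.float32)
b = np.ones((256, 256), dtype=np.float32)
c = a @ b

# Every entry is the dot product of two all-ones vectors of length 256.
assert float(c[0, 0]) == 256.0

# Prints which BLAS/LAPACK backend this NumPy build uses (MKL, OpenBLAS, ...).
np.show_config()
```

So even code that never mentions Intel by name can be running Intel's math libraries under the hood.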

Also, by gaming do you mean gaming consoles or gaming PCs? Because Intel is popular in gaming PCs also.


Sure, but those are CPU libraries. What has grown NVIDIA into a $2T company wasn't building CPUs--it was building processors specific to this new and exploding field of ML/AI, which grew out of building processors (GPUs) specific to the growing field of realtime PC-based gaming...

Is the problem that Intel saw themselves too narrowly as in the CPU business rather than as in the compute business? I've no idea - it'd be interesting to hear the inside story of how all these missed opportunities went down.


I mean, if your benchmark is "not being as successful as NVIDIA", then yes, a lot of companies are failures.


You're missing the point - it's not about comparing Intel to anyone else, but rather about Intel's missed market opportunities. They make processors but missed all the new market opportunities for processors.


> Pretty sure they've captured as much of the AI/ML business as CPUs possibly can.

Which is exactly how they missed out on AI/ML. CPU is irrelevant in that market.

> Because Intel is popular in gaming PCs

All gaming consoles use AMD or Nvidia hardware.

Gaming PCs use AMD or Nvidia GPUs.


>Gaming PCs use AMD or Nvidia GPUs.

Any Steam hardware survey (the most relevant for gaming PC contexts) will show Intel handily beating AMD at around 66% to 33% for CPU[1], so your claim that Intel is somehow not popular is bunk.

[1]: https://store.steampowered.com/hwsurvey/processormfg/


GPU

Note the "G" followed by PU

Graphics Processing Unit

None of those are GPUs

Steam survey PC video card stats show 76.92% Nvidia, 15.09% AMD, and 7.59% Intel (graphics unit integrated with CPU)

I never said that Intel CPUs are not popular in gaming PCs. But Intel used to have 99% of that market.


Intel doesn't have and has never had a GPU line worth writing home about, so I don't know why you would bring up something Intel never had to say Intel is unpopular in gaming PCs.


> Intel doesn't have and has never had a GPU line worth writing home about

That is my point. Emphasizing the earlier statement: "They missed the gaming market".


Your point is bad. Ignoring the fact that Intel's bread and butter is the CPU, of which they have a dominant gaming marketshare, is obtuse.


They missed the GPU market (AI/ML/Gaming) and the console market. Hence they lost all of that to Nvidia and AMD. Both of those companies now have a bigger market cap than Intel.

How is this so difficult to understand? It is very simple, not obtuse, unless you are autistic.

Nvidia market cap is $2.26 Trillion.

AMD market cap is $290 Billion.

Intel market cap is $178 Billion.

AMD and Nvidia used to be a tiny fraction of Intel's size.

You guys are getting autistically stuck on this irrelevant detail about CPU popularity in gaming PCs and missing the bigger topic of the discussion. But even in that niche, Intel has lost significant market share to AMD.


"Autistic". You keep using that word, but I don't think it means what you think it means.

Are we back in kindergarten, with name calling?


You are forgetting two things:

1. All those happened after Andy Grove's time.

2. And when Andy Grove was at the helm, Intel did not miss opportunities like that. E.g., when Intel's memory-chip business started losing market share to Japanese manufacturers (Japan was at the time a low-wage country), Intel was able to transition to CPUs in time.


Yeah, but what I meant is that these can't all be explained by a lack of paranoia (letting the competition creep up on you), or even by not having an exceptional CEO.

If you are in the business of making CPUs, and obviously aware of how fast things move in the tech world, then how can you not be aware of industry trends like smartphones, gaming, and AI? If you are somehow late to see a trend, then double down and catch up. And what about getting behind on process nodes/generations? This used to be one of Intel's core strengths, and you'd have thought they'd have institutionalized the things you do and don't take on in a new process to keep the change manageable.

So I wouldn't say it's just a matter of not having Andy Grove--it seems like a matter of incompetence at many levels.


When you connect those dots, this feels more like a pseudo-bailout in disguise.


This is exactly what it is. With the added issue that, in all likelihood, $8B is not even enough to start making a change. The US government would do better giving that $8B to startups in the chip space. Intel will just squander the money and hand some of it to shareholders.


Oh, it's not a pseudo-bailout. It's a bailout, full stop. That doesn't mean it's a bad idea.


$8 billion can't build half a fab, though.


It's also interesting that Intel made some specific bad technical choices.

(i.e., not investing in EUV, assuming SAQP would work forever, ignoring the consequences of everything becoming power-limited...)

This kind of decision is hard, because it's a technical-economic tradeoff, and the latter is more voodoo than math. And that's not even addressing whether you get surprised by things like the LLM boom...


You say:

> It's also interesting that Intel made some specific bad technical choices.

That's like saying Saudi Arabia made some bad policy decisions. If you are in Saudi Arabia, nothing has a higher return on investment than petroleum. If you invest in anything else, you are leaving money--a LOT of money--on the table.

The result is that capital is drained out of any other business, and the country gets so dependent on just one industry that when the oil runs out, it is a major crisis.

Put yourself in the place of an Intel CEO. You've just invested $4 billion in a fab. People will buy absolutely every single x86 chip that fab can make--roughly $40 billion over the course of the technology node's lifetime.

Or, you could make chips for the iPhone, which maybe, perhaps, would be a hit? And even if Apple's wildest projections come true, you'll make $8 billion instead of $40 billion, because Apple is not going to let Intel pocket a 90% profit margin when it could just go to TSMC.

So you are a CEO, wondering how, on the next earnings call, you are going to justify turning an asset worth $40 billion into one worth $8 billion...
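The opportunity-cost arithmetic behind that dilemma can be sketched with the (hypothetical, illustrative) dollar figures from the comment above--these are not actual Intel financials:

```python
# Back-of-envelope opportunity-cost sketch, using the illustrative
# numbers from the comment (not real Intel financials).
fab_cost = 4_000_000_000                 # building the fab
x86_lifetime_revenue = 40_000_000_000    # selling x86 chips at ~90% margin
iphone_lifetime_revenue = 8_000_000_000  # fabbing Apple's chips at foundry margins

# Revenue forgone by dedicating the fab's capacity to Apple instead of x86.
opportunity_cost = x86_lifetime_revenue - iphone_lifetime_revenue
print(f"Revenue forgone by fabbing iPhone chips: ${opportunity_cost / 1e9:.0f}B")
# -> Revenue forgone by fabbing iPhone chips: $32B
```

On those assumptions, every node's worth of capacity pointed at Apple costs you $32B in forgone revenue--which is the number the shareholders would wave at you in court.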


Was it not possible to build two fabs and make $48 billion?


What would you do? Build two fabs and make $48 billion, or build two fabs and make $80 billion?

It might be hard to remember now, but Intel was in the position NVIDIA is today: they could sell absolutely every x86 processor they could make, at pretty much any price they cared to name.

So even if Intel could have built another fab (which, btw, it couldn't; the world only graduates so many solid-state physics Ph.D.s per year), they would STILL have used it to make x86 chips. Literally (not metaphorically or hyperbolically) nothing else they could manufacture would have had as high a return on investment as x86 chips did.


I don’t buy this completely. AMD’s 64-bit humiliation of Intel wasn’t that far in the past, and they knew GPUs were important. They could have hedged their bets a bit more on the basis of sound business management without incurring shareholder wrath.


Google/Alphabet and Facebook/Meta have sunk billions into ventures that aren't half as profitable as your hypothetical $8B deal, and shareholders haven't sued yet.

Of course the founders have majority voting shares, but that doesn’t prevent them from being sued. It only prevents them/their designated CEO from being fired.


There are a few differences between Intel and Google here. The first two points are from Peter Thiel:

1. Google really didn't know what to do with its excess money, which was why it was just stockpiling it and not using it to invest in its own business.

Intel, by contrast, always knew (at least in the Noyce/Grove era) what to do with its money.

2. Google's "side projects" were false flags and smokescreens to make it look like it wasn't a monopoly. (Thiel says it much better himself in his article and speech titled "Competition Is for Losers"--highly recommended.)

Intel did a similar thing around the turn of the century, when it started making ARM chips--not because they were the most profitable thing it could be doing, but because it didn't want to look like a monopoly.

And, of course, you really hit the nail on the head here:

> Of course the founders have majority voting shares

Yeah. The executives of Google and Facebook just don't face the same accountability from shareholders that the execs of any honestly-governed public company do.

It's a double-edged sword: if Google or Facebook needs to spend a lot of money to pivot, they have the flexibility to do so in a way other companies don't. But the flip side is that, e.g., Zuckerberg can spend $10 billion on a huge project without being at all held accountable if it fails.


Right, and the ironic thing is that the founders' having majority voting shares was allegedly justified by the idea that it would solve the innovator's dilemma: the founders ("proven" capable of building successful products) would not be accountable to the short-term whims of shareholders.

That said... my original point was that shareholders don't sue in such situations. Rather, they just vote out the board (and hence the CEO). If they could sue the board/CEO for not making as much profit as theoretically possible, they would already have sued Meta or Alphabet, despite having only minority shares. Proving to a court that the board/CEO isn't acting in the shareholders' best interest isn't as easy as comparing the projected returns of two mutually exclusive deals--the board can argue that the ostensibly less profitable deal actually has great long-term benefits.


> don't sue in such situations. Rather, they just vote out the board

cf. the shareholder suit against Tesla. Why did they sue instead of voting out the board of directors?

To vote out the board, you need >50% of the shareholders behind you. To sue the board, all you need is one disgruntled investor.


> and the country gets so dependent on just one industry that when the oil runs out, it is a major crisis.

Good thing we aren’t even slightly close to ever running out.


> the kinds of people we are training

I was a physics student who took a nanofab class, fell in love, and desperately wanted to go down this path... until I learned about how complete and utter dogshit the wages were. I kept in touch with two classmates who stuck with it a bit longer. The free market had to beat them over the head a bit longer to get them to let go, but it eventually succeeded. Intel was absolutely printing money for investors the whole time, of course, and now that the material consequences of their poor management have come home to roost they are getting bailouts. Yay, capitalism!

Our education system over-produces qualified scientists, but if semiconductors are a road to not having a house or a family, and it pays 5x better to sell ads or stocks... you get what you pay for.

I hope this has changed but I wouldn't bet on it.


The same happened with auto manufacturing: lack of real competition bred complacency, and then real competition came along.


To your point, this happened right around the time of the iPhone (roughly):

https://en.wikipedia.org/wiki/XScale#Sale_of_PXA_processor_l...



