Calling Nvidia niche feels a bit wild given their status quo right now, but from a foundry perspective, it seems true. Apple is the anchor tenant that keeps the lights on across 12 different mature and leading-edge fabs.
Nvidia is the high-frequency trader hammering the newest node until the arb closes. Stability usually trades at a discount during a boom, but Wei knows the smartphone replacement cycle is the only predictable cash flow. Apple is smart. If the AI capex cycle flattens in late '27 as models hit diminishing returns, does Apple regain pricing power simply by being the only customer that can guarantee wafer commits five years out?
So let's say TSMC reciprocated Apple's consistency as a customer by giving them preferential treatment for capacity. It's good business, after all.
However, everyone knows that good faith reciprocity at that scale is not rewarded. Apple is ruthless. There are probably thousands of untold stories of how hard Apple has hammered its suppliers over the years.
While Apple has good consumer brand loyalty, they arguably treat their suppliers relatively poorly compared to the gold standard, like Costco.
At this scale and volume, it's not really about good faith.
Changing fabs is non-trivial. If TSMC pushed Apple to the point where Apple had to find an alternative (which is another story) and Apple did switch, TSMC would have to work extra hard to win them back in the future. Apple wouldn't want to invest twice in changing back and forth.
On the other hand, TSMC knows that changing fabs is not really an option and that Apple doesn't want to do it anyway, so they have leverage to squeeze.
At this level, everyone knows it's just business and it comes down to optimizing long-term risk/reward for each party.
Counterargument: is NVIDIA friendly to their supply chain? I have to think that maybe they are, given their massive margins, because they can afford to be; their end buyers are currently willing to absorb costs no matter what. But I don't know, and that will change as their business changes.
Your underlying statement implies that whoever is replacing Apple is a better buyer, which I don't think is necessarily true.
> EVGA Terminates Relationship With Nvidia, Leaves GPU Business
> According to Han, Nvidia has been difficult to work with for some time now. Like all other GPU board partners, EVGA is only told the price of new products when they're revealed to everyone on stage, making planning difficult when launches occur soon after. Nvidia also has tight control over the pricing of GPUs, limiting what partners can do to differentiate themselves in a competitive market.
If your customers are known to be antagonistic to their business partners, the correct answer is to diversify them as much as you can, even if it costs you something elsewhere.
No public company will be loyal or nice to their suppliers. That is just not in the playbook for public companies. They have "fiduciary duty", not human duty.
Private companies can be nice to their suppliers. Owners can choose to stay loyal to suppliers they went to high school with, even if it isn't the most cost-efficient.
> they arguably treat their suppliers relatively poorly compared to the Gold standard like Costco.
I'm not saying you're wrong, but your previous paragraph sounded like you were wondering whether it was the case, whereas here you're saying it's known. Is this all true? Do they have a reputation for hammering their suppliers?
Agreed, TSMC can do whatever they want. In 2027 no other fab will match what TSMC has today, and anything that requires the latest process node is going to get more expensive, so your Apple silicon and your AMD chips.
I tend to agree with you, feels to me like the root of this is essentially whether foundries will "go all in" on AI like the rest of the S&P 500. But why trade away one trillion-dollar customer for another trillion-dollar customer if the first one is never going away, and the second one might?
I think it is less of a trade and more of a symbiotic capital cycle, if I can call it that?
Nvidia's willingness to pay exorbitant prices for early 2nm wafers subsidizes the R&D and the brutal yield-learning curve for the entire node. But you can't run a sustainable gigafab solely on GPUs...the defect density math is too punishing. You need a high-volume, smaller-die customer (Apple) to come in 18 months later, soak up the remaining 90% of capacity and amortize that depreciation schedule over a decade.
That is the traditional textbook yield curve logic, if I'm not wrong? Smaller area = higher probability of a surviving die on a dirty wafer.
But I wonder if the sheer margin on AI silicon basically breaks that rule? If Nvidia can sell a reticle-sized package for 25k-30k USD, they might be perfectly happy paying for a wafer that only yields 30-40% good dies.
Apple OTOH operates at consumer electronics price points. They need mature yields (>90%) to make the unit economics of an iPhone work. There's also the binning factor I am curious about. Nvidia can disable 10% of the cores on a defective GPU and sell it as a lower SKU. Does Apple have that same flexibility with a mobile SoC where the thermal or power envelope is so tightly coupled to the battery size?
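The yield-curve logic above can be sketched with the classic Poisson die-yield model. The defect density and die areas below are hypothetical round numbers for illustration, not actual node data:

```python
import math

def die_yield(defect_density_per_cm2: float, die_area_cm2: float) -> float:
    """Poisson model: P(zero defects on a die) = exp(-D0 * A)."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

d0 = 0.2            # assumed early-node defect density, defects/cm^2
reticle_gpu = 8.0   # ~800 mm^2 die, near the reticle limit
mobile_soc = 1.0    # ~100 mm^2 smartphone SoC

print(f"GPU die yield: {die_yield(d0, reticle_gpu):.0%}")  # ~20%
print(f"SoC die yield: {die_yield(d0, mobile_soc):.0%}")   # ~82%
```

With these assumed numbers the small die yields roughly four times as often as the reticle-sized one on the same dirty wafer, which is exactly why a huge-margin GPU vendor can tolerate an early node that a consumer SoC vendor can't.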
I am curious about the binning factor too since in the past, AMD and Intel have both made use of defect binning to still sell usable chips by disabling cores. Perhaps Apple is able to do the same with their SoCs? It's not likely to be as granular as Nvidia who can disable much smaller areas of the silicon for each of their cores. On the other hand, the specifics of the silicon and the layout of the individual cores, not to mention the spread of defects over the die might mitigate that advantage.
They do bin their chips. Across the range (A- and M-series) they have the same chip with fewer / disabled CPU and GPU cores. You pay a premium for ones with more cores. Unsure about the chip frequencies; Apple doesn't disclose those openly from what I know.
With current AI pricing for silicon, I think the math’s gone out the window.
For Apple, they have binning flexibility, with Pro/Max/Ultra, all the way down to iPads - and that’s after the node yields have been improved via the gazillion iPhone SoC dies.
NVIDIA's flexibility came from using some of those binned dies for GeForce cards, but the VRAM situation is clearly making that less important, as they're cutting some of those SKUs for being too VRAM-heavy relative to MSRP.
Datacenter GPU dies cannot be binned for GeForce because they lack fixed-function graphics features. Raytracing acceleration in particular must take non-trivial area that you wouldn't want to spend on a datacenter die. Not to mention the data fabric is probably pretty different.
> For Apple, they have binning flexibility, with Pro/Max/Ultra, all the way down to iPads
The Pro and Max chips are different dies, and the Ultra currently isn't even the same generation as the Max. And the iPads have never used any of those larger dies.
> NVIDIAs flexibility came from using some of those binned dies for GeForce cards
NVIDIA's datacenter chips don't even have display outputs, and have little to no fixed-function graphics hardware (raster and raytracing units), and entirely different memory PHYs (none of NVIDIA's consumer cards have ever used HBM).
The desktop 4090 uses the AD102 die, the laptop 4090 and desktop 4080 use the AD103 die, and the laptop 4080 uses the AD104 die. I'm not at all denying that binning is a thing, but you and other commenters are exaggerating the extent of it and underestimating how many separate dies are designed to span a wide product line like GPUs or Apple's computers/tablets/phones.
"Ultra" isn't even binned - it's just 2x "Max" chips connected together.
Otherwise, yes, if a chip doesn't make M4 Max, it can make M4 Pro. If not, M4. If not, A18 Pro. If not that, A18.
And even all of the above mentioned marketing names come in different core configurations. M4 Max can be 14 CPU Cores / 32 GPU cores, and it can also be 16 CPU cores and 40 GPU cores.
So yeah, I'd agree that Apple has _extreme_ binning flexibility. It's likely also the reason why we got A19 / A19 Pro / M5 first, and we still don't have M5 Pro or M5 Max yet. Yields not high enough for M5 Max yet.
Unfortunately I don't think they bin down even lower (say, to S chips used in Apple Watches), but maybe in the future they will.
In retrospect, Apple ditching Intel was truly a gamechanging move. They didn't even have to troll everyone by putting an Intel i9 into a chassis that couldn't even cool an i7 to boost the comparison figures, but I guess they had to hedge their bet.
> yes, if a chip doesn't make M4 Max, it can make M4 Pro. If not, M4. If not, A18 Pro. If not that, A18.
No, that's entirely wrong. All of those are different dies. The larger chips wouldn't even fit in phones, or most iPad motherboards, and I'm not sure a M4 Max or M4 Pro SoC package could even fit in a MacBook Air.
As a general rule, if you think a company might ever be selling a piece of silicon with more than half of it disabled, you're probably wrong and need to re-check your facts and assumptions.
There are two levels of Max chip, but think of a Max as two Pros on one die (this is a simplification; you can also think of a Pro as two base chips tied together), so a bad Max can't be binned into a Pro. But a high-spec Max can be binned into a low-spec Max.
Why are foundries going 'all in' on AI? They fab chips for customers; it doesn't matter what chips they are or who the customer is. 'Who will pay the most for us to make their chips first' is the only question TSMC will be asking. The market of the customer is irrelevant.
Is the argument that Apple will go out of business? AAPL?
Wait,
> one player has a short-term ability to vastly outspending all the rest.
I assure you, Apple has the long-term and short-term ability to spend like a drunken sailor all day and all night, indefinitely, and still not go out of business. Of course they’d prefer not to. But there is no ‘ability to pay’ gap here between these multi-trillion-dollar companies.
Apple will be forced to match or beat the offer coming from whoever is paying more. It will cost them a little bit of their hilariously-high margins. If they don’t, they’ll have to build less advanced chips or something. But their survival is not in doubt and TSMC knows that.
> Not a system that necessarily works all that well if one player has a short-term ability to vastly outspending all the rest.
Well yeah, people were identifying that back when Apple bought out the entirety of the 5nm node for iPhones and e-tchotchkes. It was sorta implicitly assumed that any business that builds better hardware than Apple would boot them out overnight.
"Apple is smart. If the AI capex cycle flattens in late '27 as models hit diminishing returns, does Apple regain pricing power simply by being the only customer that can guarantee wafer commits five years out?"
That's the take I would pursue if I were Apple.
A quiet threat of "We buy wafers on consumer demand curves. You’re selling them on venture capital and hype"
Nvidia is not a venture capital outlet. They are a self-sustaining business with several high-margin customers that will buy out their whole product line faster than any iPhone or Mac.
From TSMC's perspective, Apple is the one that needs financial assistance. If they wanted the wafers more than Nvidia, they'd be paying more. But they don't.
This article repeatedly cites revenue growth numbers as an indicator of Nvidia and Apple’s relative health, which is a very particular way of looking at things. By way of another one, Apple had $416Bn in revenue, which was a 6% increase from the prior year, or about $25Bn, or about all of Nvidia’s revenue in 2023. Apple’s had slow growth in the last 4 years following a big bump during the early pandemic; their 5 year revenue growth, though, is still $140Bn, or about $10Bn more than Nvidia’s 2025 revenues. Nvidia has indeed grown like a monster in the last couple years - 35Bn increase from 23-24 and 70Bn increase from 24-25. Those numbers would be 8% and 16% increases for Apple respectively, which I’m sure would make the company a deeply uninteresting slow-growth story compared to new upstarts.
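The percentage arithmetic above can be checked quickly (figures in $bn, rounded as quoted in the comment):

```python
# Nvidia's recent YoY revenue increases, expressed as a share of
# Apple's ~$416bn annual revenue.
apple_revenue = 416
nvidia_increase_23_24 = 35
nvidia_increase_24_25 = 70

print(f"{nvidia_increase_23_24 / apple_revenue:.1%}")  # 8.4%
print(f"{nvidia_increase_24_25 / apple_revenue:.1%}")  # 16.8%
```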
I get why the numbers are presented the way they are, but it always gets weird when talking about companies of Apple’s size - percent increases that underwhelm Wall Street correspond to raw numbers that most companies would sacrifice their CEO to a volcano to attain, and sales flops in Apple’s portfolio mean they only sold enough product to supply double-digit percentages of the US population.
I think there's something about both the myth of the unicorn and of the hero founder/CEO in tech that forces a push towards legibility and easy narratives for a company - it means that, to a greater degree than other industries, large tech companies are a storytelling exercise, and "giant corporate blob that sprawls into everything" isn't a sexy story, nor is "consistent 3% YoY gains," even when that's translating into "we added the GDP of a medium-sized country to our cash pile again this year."
Every time a CEO or company board says "focus," an interesting product line loses its wings.
It's because of the storytelling needed for Wall Street. It's the only way to get sky-high revenue multiples: selling a dream. If you're a conglomerate, all you can do is sell the P&L; it's like selling an index.
If you have a business division that does exceedingly well compared to the rest, you make more money by spinning it off.
I think Asian companies are much less dependent on public markets and have strong private control (chaebols in South Korea, for example: Samsung, LG, Hyundai, etc.).
If you look at US companies that are under "family control" you might see a similar sprawl, like Cargill, Koch, I'd even put Berkshire in this class even though it's not "family controlled" in the literal sense, it's still associated with two men and not a professional CEO.
It might matter that Nvidia sells graphics cards and Apple sells computers and computer-like devices with cases and peripherals and displays and software and services. TSMC is responsible for a much larger proportion of Nvidia's product than Apple's.
I dislike this dramatization in reporting of mundane facts.
So report the facts but sentences like "What Wei probably didn’t tell Cook is that Apple may no longer be his largest client" make it personal, they make you take sides, feel sorry for somebody, feel schadenfreude... (as you can observe in the comments)
Doesn't seem like LLM generated text to me. Even prior to ChatGPT some journalists preferred to write in a novel-style with extraneous fluff like that.
A lot of people don't actually learn good writing at their fancy schools, but they do learn the stylistic quirks that signal one went to the fancy school.
How do you think it got in the LLM training set in the first place?
It seems a bit odd that data center operators aren’t willing to put their money where their mouth is.
Data center operators say: expand more quickly.
TSMC says: we need long term demand to justify that.
And all the data center guys say is: don’t worry that won’t be an issue, trust us.
I would think that if they were serious they would commit to cofinancing new foundries or signing long term minimum purchasing agreements.
Semiconductors are extremely cyclical. One of the reasons TSMC survived the previous boom-bust cycles is their caution. If you overexpand, you risk going out of business in the next downturn.
AFAIK only Apple has been committing to wafers up to 3 years in the future. It would be a crazy bet for Nvidia to do the same, as they don't know how big the business will be.
If the long-term demand disappears, there may not be anyone left for TSMC to collect from on those minimum purchase agreements. That somewhat undermines their utility as security.
> I would think that if they were serious they would commit to cofinancing new foundries or signing long term minimum purchasing agreements.
That would ruin TSMC and others' independence.
Nvidia already did buy Intel shares so it is a thing.
Nvidia did discuss with TSMC for more capacity many times. It's not about financing or minimum purchasing agreements. TSMC played along during COVID and got hit.
How do you figure? Demand for electronics skyrocketed when everyone working from home bought new laptops, webcams, and tablets.
There was a fire on a TSMC manufacturing line that caused a shortage early on but capacity recovered, demand stayed strong throughout and there was a massive spike at the end when car manufacturers needed to ramp back up to handle all the paused orders.
As far as I know there was never a demand dip at any point in there.
And what of the natural resources sustaining all of this? This conglomeration of data centers, GPUs and other chips will surely push manufacturers in other industries to the maximum. I don't think sustainable energy, recycling and carbon credits will be enough to cover it.
Explains why Apple is looking to diversify their fabs with Intel. If Intel can stay on their current trajectory and become a legitimate alternative they will do very well as a fab with additional available capacity.
The key here is that Intel is expanding into operating their fabs for external customers (foundry services). What they're doing with specific fabs or processes is less important than their bigger emphasis on working for a client like Apple.
In some areas they may be shifting resources. But a lot has happened since last summer. They have received some cash infusions and 18a is in full production with yields, apparently, at acceptable levels. Rumors are Apple has already signed on.
New CEO said he'll continue with Foundry provided he gets significant customers to justify the cost. In a recent comment/press release, Intel said they are continuing production on 14A. Ergo, they have external customers (or Trump is bullying him into it, but I suspect it's mostly the former).
> Apple-TSMC: The Partnership That Built Modern Semiconductors
In 2013, TSMC made a $10 billion bet on a single customer. Morris Chang committed to building 20nm capacity with uncertain economics on the promise that Apple would fill those fabs. “I bet the company, but I didn’t think I would lose,” Chang later said. He was right. Apple’s A8 chip launched in 2014, and TSMC never looked back.
That's great! Apple has the resources to incentivize and invest in alternate production capacity (Intel, Samsung, or others). Sure, it will take years, but a thousand-mile journey begins with one step...
TSMC is already producing at their first one in Arizona (N4 process), second one comes online for N3 in 2028, and third one (N2) broke ground in April 2025 (online date 2029-30)
The projects seem to go well and then union bosses threaten to shut the whole thing down.
Then the essential skilled personnel can’t come train people because the visa process was created by and is operated by the equivalent of four year olds with learning disabilities. Sometimes companies say fuck it we’re doing it anyway and then ice raids their facility and shuts it down.
I'd post the news articles about the above, but your googling thumbs work as well as mine.
As a heavy user of OpenAI, Anthropic, and Google AI APIs, I’m increasingly tempted to buy a Mac Studio (M3 Ultra or M4 Pro) as a contingency in case the economics of hosted inference change significantly.
The 8-10kW isn't a big deal anymore given the prevalence of electric vehicles and charging them at home. A decade ago very few homes had this kind of hookup. Now it's reasonably common, and if not, electricians wouldn't bat an eye at installing it.
M3 Ultra with DGX Spark is right now what M5 Ultra will be in who knows when. You can just buy those two, connect them together using Exo and have M5 Ultra performance/memory right away. Who knows what M5 Ultra will cost given RAM/SSD price explosion?
yes, I'm using smaller models on a Mac M2 Ultra 32GB and they work well, but larger models and coding use might be not a good fit for the architecture, after all.
FWIW the M5 appears to be an actual large leap for LLM inference with the new GPU and Neural Accelerator. So I'd wait for the Pro/Max before jumping on an M3 Ultra.
You'd want to get something like a RTX Pro 6000 (~ $8,500 - $10,000) or at least a RTX 5090 (~$3,000). That's the easiest thing to do or cluster of some lower-end GPUs. Or a DGX Spark (there are some better options by other manufacturers than just Nvidia) (~$3000).
Yes, I also considered the RTX 6000 Pro Max-Q, but it’s quite expensive and probably only makes sense if I can use it for other workloads as well. Interestingly, its price hasn’t gone up since last summer, here in Germany.
I have a Mac Studio with 512GB RAM, 2x DGX Spark and an RTX 6000 Pro WS (planning to buy a few of those in the Max-Q version next). I am wondering if we will ever see local inference as "cheap" as we see it right now, given RAM/SSD price trends.
Good grief. I'm here cautiously telling my workplace to buy a couple of dgx sparks for dev/prototyping and you have better hardware in hand than my entire org.
What kind of experiments are you doing? Did you try out exo with a dgx doing prefill and the mac doing decode?
I'm also totally interested in hearing what you have learned working with all this gear. Did you buy all this stuff out of pocket to work with?
the thing is GLM 4.7 is easily doing the work Opus was doing for me, but to run it fully you'll need much bigger hardware than a Mac Studio. $10k buys you a lot of API calls from z.ai or Anthropic. It's just not economically viable to run a good model at home.
You can cluster Mac Studios using Thunderbolt connections and enable RDMA for distributed inference. This will be slower than a single node but is still the best bang-for-the-buck wrt. doing inference on very-large-sized models.
True — I think local inference is still far more expensive for my use case due to batching effects and my relatively sporadic, hourly usage. That said, I also didn’t expect hardware prices (RTX 5090, RAM) to rise this quickly.
Alternatively, China could make progress fabricating and exporting its own chips and designing its own GPUs. The entire chip sector could go the way of solar panels and EVs with prices dropping and margins collapsing to near zero.
Yup, and they're like 5-10 years out from their own lithography machines as well. China wanted Taiwan before TSMC was a thing; by the time they take Taiwan back they won't need TSMC.
Buy in-demand fab output today, even at a premium price and even if you can't install or power it all, expecting shortages tomorrow. Which is pretty much the way the tech economy is already working.
So no, no hedge. NVIDIA's customers already beat you to it.
I'm surprised that Apple is not considering opening up its own fabs. Tim Cook is all about vertical-integration and they have a mountain of cash that they could use to fund the initial startup capex.
Semiconductor manufacturing is not an incremental step for Apple. It's an entirely new kind of vertical. They do not have the resources to do this. If they could they would have by now.
Designing CPUs also wasn't their core business and they did it anyway. Apple probably won't care that much about price hikes but if they ever feel TSMC can't guarantee steady supply then all bets are off.
I wonder what will happen in the future when we get closer to the physical "wall". Will it allow other fabs to catch up, or will the opposite happen, with customers valuing even small improvements?
Apple has very much wanted absolute flexibility to adopt major technology changes, so much so that they've tried hard not to be the sole customer of a supplier and deal with the political ramifications (source: Apple in China by Patrick McGee).
Closer to $40B for a new fab, and that's for an established company doing it all correctly. It's a much bigger undertaking to open a fab without ever having done it before, then continually use the brain power/institutional knowledge you've built up to stay near the forefront of fab tech, and then basically have weird incentives to build a foundry for only your own products rather than the world at large.
You're setting yourself up for making a huge part of your future revenue stream being set aside for ongoing chipfab capex and research engineering. And that's a huge gamble, since getting this all setup is not guaranteed to succeed.
I think you are referring to thunderbolt cables with their signal conditioning chips, and if that’s the case then I would like to say that TSMC isn’t making those chips. Afaik Intel and maybe some others make the chips that go into thunderbolt cables.
oh, darn. my least favorite walled garden / vertical monopoly / rentseeker will have to raise prices. I'm sure they can spin this as a quality improvement.
How much new capacity is under construction? Seems like it should be a lot, but other than Arizona and Ohio and a few other places I'm not reading about a ton of cutting-edge node fab construction happening.
I find that my cell phone which is 4 generations old and my desktop computer which is 2 generations old are totally adequate for everything I need to do, and I do not need faster processing
I really don't care about most new phone features and for my laptop the M1 Max is still a really decent chip.
I do want to run local LLM agents though and I think a Mac Studio with an M5 Ultra (when it comes out) is probably how I'm going to do that. I need more RAM.
I bet I'm not the only one looking at that kind of setup now that was previously happy with what they had..
Apple has made some good progress on memory sharing over thunderbolt. If they could get that ironed out you maybe could run a good LLM on a cluster of Mac minis.
Again you cannot today but people are working on it. One guy might have gotten it to work but it’s not ready for prime time yet.
But do you use any ai services like chat gpt, Claude, Gemini? If so you’re offloading your compute from a local stack to a high performance nvidia gpu stack operated by one of the big five.
It’s not that you aren’t using new hardware, it’s that you shifted the load from local to centralized.
I’m not saying this is bad or anything, it’s just another iteration of the centralized vs decentralized pendulum swing that has been happening in tech since the beginning (mainframes with dumb terminals, desktops, the cloud, mobile) etc.
Apple might experience a slowdown in hardware sales because of it. Nvidia might experience a sales boom because of it. The future could very well bring a swing back. Imagine you could run a stack of Mac minis that replaced your monthly Claude code bill. Might pay for itself in 6mo (this doesn’t exist yet but it theoretically could happen)
> Imagine you could run a stack of Mac minis that replaced your monthly Claude code bill. Might pay for itself in 6mo (this doesn’t exist yet but it theoretically could happen)
You don't have to imagine. You can, today, with a few (major) caveats: you'll only match Claude from roughly ~6 months ago (open-weight models roughly lag behind the frontier by ~half a year), and you'd need to buy a couple of RTX 6000 Pros (each one is ~$10k).
Technically you could also do this with Macs (due to their unified RAM), but the speed won't be great so it'd be unusable.
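The break-even math is easy to sketch. Both figures below are assumptions for illustration (the ~$10k card price comes from the comment above; the monthly bill is a hypothetical heavy-usage number, not a real quote):

```python
# Back-of-envelope break-even: local hardware vs. an API subscription.
hardware_cost = 2 * 10_000   # two RTX 6000 Pro class cards at ~$10k each
monthly_api_bill = 1_000     # hypothetical heavy Claude Code bill, $/month

months_to_break_even = hardware_cost / monthly_api_bill
print(months_to_break_even)  # 20.0 months, ignoring power and depreciation
```

At a lighter $200/month subscription the same hardware takes over eight years to pay off, which is why the economics only work for genuinely heavy, sustained usage.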
Taiwan's TSMC foundries are its nuclear currency: Taiwan must keep them to remain protected by others, and yet those others never built interchangeable, resilient capacity elsewhere for something essential to them, even though they had the money to do it.
So now Apple, Nvidia, AMD (possibly), and most car manufacturers will be up a creek without a paddle when China invades in 1-2 years. That is unless China's Xi is bluffing to mollify domestic war hawks and reunification zealots by going through the motions of building an army of war machines without intent to use them, but I don't think that's probable. It's possible that Trump already made agreements with Xi to cede "Oceania" if they allow the US to take Greenland and South America for empire-building neocolonialism.
I feel like China invading Taiwan isn't happening in our lifetimes. Yes, they stand to benefit from it, but I doubt any of the people in charge of decision making are that interested in rocking the boat. There's nobody forcing their hand and the country is doing great without needing to invade anyone.
The leader of China literally publicly told his military to have “all options for reunification of Taiwan ready by 2027”
What options do you suppose the military might be working on?
Training to surround, and blockade? (Check)
Information warfare? (Check)
Building high numbers of landing craft? (Check)
Building high numbers of modular weapon systems that can rapidly increase the number of offensive ships? (Check)
Building numerous high volume drone warfare ships and airborne launchers? (Check)
Keep in mind that there are public language cues that preceded invasion such as declarations of the invalidity of the other country’s sovereignty, declarations that the other country is already part of the invading country.
Have you seen any signs of that?
Your persistent doubts require ignorance of strong evidence.
Let's hope China doesn't get a leader like Donald Trump in our lifetimes, then I think your prediction will apply. Despite the political tensions, China and Taiwan are so deeply integrated economically that an invasion would hurt not only Taiwan and the global economy, but also China (directly and indirectly). The EU and the US are making efforts to re-shore some semiconductor manufacturing, but TSMC and others will probably still keep a sizable amount of manufacturing in Taiwan, so I don't think this interconnectedness will change anytime soon...
It seems that their leaders are and have been planning to take over Taiwan for decades. At least according to most of what I’ve read on the topic from all the various sources.
If or when China’s economic and/or demographics issues become problematic is exactly when the CCP likely would want to strike. At least seems to me like it’d be a good time to foment national pride.
Of course hopefully I’m wrong and you’re right.
Many of these larger geopolitical things are decades in the making. Even Trump’s Venezuela action has been a long time brewing. So much so that “US troops in Venezuela” has become a trope in military sci-fi. The primary change with Trump is how he presents and/or justifies it, or rather doesn’t.
There's some intersection point between long term decreasing in China's ability (demographic collapse) and long term increase in China's ability (their current build up of military hardware in air, land, and sea that is currently outpacing America's). Maybe somewhere in 10-20 years where their regional military power is much higher than America can project across the Atlantic but they still have a lot of military aged men.
Atlantic? IDK if China even has aspirations to play World Police like the US. Military protection of things like their interests and the stability of Belt and Road, sure, but I don’t see China trying something like the Gulf War or OEF.
It’s very possible that they will be able to dominate South China Sea and their zone of the Pacific, even now, given the proximity advantages and ship/missile production; and I think that would be satisfactory to them.
20 years from now, China’s sphere and America’s sphere are separate, with China having a lead in competing for Africa, and Europe in a very weird place socially, economically, demographically, and WRT Russia/US competition.
My point is that China can sustain a naval blockade of Taiwan nearly indefinitely, and at some point Taiwan will have to decide whether they want to live under siege forever (poor, cold, getting everything via scarce and expensive air freight), or give up and come to a political solution.
I'm not like, rooting for this, I'm just trying to be realistic.
The US has its own TSMC supply (insert comments about it not being cutting edge). And the US will stand-down and let China take Taiwan with no serious conflict in exchange for supply agreements. Not more than 5-10 years out at this point.
The US can't even remotely come close to stopping China in its own backyard today, in another 5-10 years they'll just have that much larger of a Navy. The US knows that's the situation. The US can supply a large one week bombing campaign against China and that's it, based on inventory levels. The US will exhaust its cruise missile supply instantly and the US has almost no meaningful drone-bomb supply. China can build cheap missiles by the tens of thousands perpetually, train them to the coast, and flatten Taiwan and any opponents as necessary. China is the only country that can sustain a multi-year WW2 style bombing campaign today, thanks to its manufacturing capabilities. Imagine them on a full war footing.
Yeah, I just don’t know that there’s the will to blow up the world economy for which flag flies over Taiwan.
China absorbing Taiwan (especially to Americans) just doesn’t seem like a radical, terrifying concept.
A Hong Kong style negotiated transfer might be best for the world - Taiwanese that want to leave can, the US can build up a parallel source of semiconductors, China gets Taiwan without firing a shot.
Is it better than the alternative? Do you think TSMC wants to see a Dongfeng or ATACMS headed for their fab, if the alternative is a negotiated handover?
> The US has its own TSMC supply (insert comments about it not being cutting edge)
USA has been strategically re-homing TSMC to the US mainland for a long time now. 30% of all 2nm and better technologies are slated to be produced in Arizona by 2030.
The real loser in all of this will be the EU which will be completely without the ability to produce or acquire chips. They'll just end up buying from China and USA, which will only further deepen their dependence on those countries.
That's announcing 40k WSPMs of eventual capacity spread across 28nm and 16nm nodes. I mean, it's a start, and I'm sure automakers are totally stoked given the Nexperia debacle, but the EU will remain completely dependent on foreign advanced node semiconductors.
Compare to TSMC's Arizona project, which will supply 30% of TSMC's 2nm and smaller process output. Already just one of the six planned TSMC fabs in Arizona is pumping out ~30k WSPMs at 5nm or smaller.
And that doesn't even get into CoWoS packaging, which is essential for all the highest-performance and highest-margin parts.
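For a sense of scale, here's a back-of-the-envelope sketch of what a wafer-start figure like that translates to in units. The die sizes are hypothetical round numbers, and the classic gross-die formula ignores scribe lines and edge exclusion, so treat the outputs as rough order-of-magnitude estimates:

```python
import math

def gross_dies_per_wafer(die_w_mm: float, die_h_mm: float,
                         wafer_d_mm: float = 300.0) -> float:
    """Classic gross-die estimate for a circular wafer. The second term
    approximates dies lost along the curved edge; real numbers come from
    an actual die-placement map."""
    die_area = die_w_mm * die_h_mm
    wafer_area = math.pi * (wafer_d_mm / 2) ** 2
    return wafer_area / die_area - (math.pi * wafer_d_mm) / math.sqrt(2 * die_area)

# Hypothetical die sizes: ~100 mm^2 phone SoC vs ~800 mm^2 near-reticle-limit GPU.
soc_dies = gross_dies_per_wafer(10, 10)   # ~640 gross dies/wafer
gpu_dies = gross_dies_per_wafer(26, 31)   # ~64 gross dies/wafer

wspm = 30_000  # ~30k wafer starts per month, per the figure above
print(f"phone SoCs/month: {soc_dies * wspm:,.0f}")
print(f"big GPUs/month:   {gpu_dies * wspm:,.0f}")
```

Roughly a 10x gap in units per wafer before yield even enters the picture, which is part of why one fab can feed a whole phone lineup.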
The fact is: In semiconductors, Europe is getting left in the dust. Sure they can fab some mature node chips for industrial uses--and that's not nothing--but Smartphone SoCs, "AI" accelerators, DRAM, even boring CPUs simply cannot be made any more in Europe, and to the limited extent that they can, they will be horrendously uncompetitive on the market and outclassed in every performance metric by Chinese and American chips.
EU is on a big sovereignty kick right now, which makes sense given that their foreign dependencies keep blowing up in their faces. So it's strange that EU is so complacent about their foreign dependency on advanced node semiconductors.
Has the Ukraine situation not shown that the EU has relegated itself to second fiddle?
It’s too old, too complacent, and too broke. Even compared to the US and our level of discord, there’s no unity across divisions.
The US absurdly threatens Greenland, but Denmark/EU’s response is “Sanction US tech or kick out US military bases on Europe”, rather than be able to rattle a saber back and show some credible backbone.
Without San Diego-based Cymer they can't move forward on their latest and greatest. As far as I know they still do R&D in San Diego even after the purchase.
"Our system produces 4X more power that enables better lithographic patterning, which is necessary to manufacture chips with smaller and more efficient feature sizes. In addition to being more powerful, our FEL system has programmable light characteristics that improve current capabilities and enable next-generation lithography (e.g., shorter wavelengths) - uniquely enabling the extension of Moore’s Law for decades. Connecting existing ASML scanners to an xLight FEL significantly improves the tool’s capabilities, delivering next-version scanner performance without the cost and complexities."
Is it supposed to work independently of other technology at some point?
Then again: multilateral cooperation is at the heart of scientific progress. It's fitting that ASML is in a country that is culturally strongly influenced by its history of seafaring and trade.
We'll see how the brain drain caused by people not wanting to live their lives in a society that doesn't share values like these will influence that whole technological arms race.
Some people in Japan are coming up with a successor to EUV as far as I remember, what was their name again?
Is it karma or is it just normal business activities? When you're a large player like this you get pricing power. If another large player moves in and also has pricing power then negotiations and things like that take place. Business deals, profits, &c. all ebb and flow and this is no different.
Weird take. If you want to undertake approximately a bajillion dollars in capex to prove out and scale up a new node, it is extremely valuable to have one massive anchor customer who will promise well in advance to offtake basically the entire thing for a bit, and who has creditworthiness exceeded by few non-sovereign entities, and thus is able to write contracts against which it is easy to lend. Also this customer makes little chips (when your defect rate is higher) and bigger chips (when your defect rate is lower). Of course you don't try to synthesize this profile out of a bajillion tiny customers.
Tim Cook failing on the Cook doctrine ("We believe that we need to own and control the primary technologies behind the products that we make") is ironic.
I'm sure if Apple could manage to run a fab with the quality and costs they get with TSMC, they would. I have little doubt they've been pushing forward on that mission.
Owning a leading edge fab is not practical for most companies, even some huge ones like Apple.
Intel has even struggled with it since they traditionally didn’t sell capacity to other buyers. It worked for Intel because they traditionally had a near-monopoly over the laptop, desktop, and server chip market.
Apple certainly has the money to spin up their own chip fabricator, but there’s no guarantee it would be as good as TSMC, it would cost billions, and they would have less of an ability to sell capacity to other customers.
At the end of that effort they could be left with a chip fab that produces chips that still cost the same or more than what TSMC manufactures them for. It might just be cheaper to try and outbid Nvidia for priority.
Per that very article, Sherman will be for support chips for power and peripherals, on legacy 45nm+ nodes.
Apple's investing heavily in the TSMC fab in Arizona, due to open in 2027, to have 3nm capabilities for its flagship chips, but it's unlikely that would ever cover a majority of that chipmaking.
I am very unhappy with the increased RAM prices - and now general increase in prices for hardware. To me this is collusion, a de-facto monopoly. Governments that don't stop this practice are also part of the mafia.
We really need many more smaller, more independent manufacturers. All the big guns, from NVIDIA, Apple, Intel, AMD, etc... have massively disappointed about 99.9% of us here now.
Here's what Google's AI estimated when asked: "based on public data, estimate how many mm^2 of Apple/Nvidia silicon were produced at TSMC over the past 3 years."
Your underlying statement implies that whoever replaces Apple is a better buyer, which I don't think is necessarily true.
> According to Han, Nvidia has been difficult to work with for some time now. Like all other GPU board partners, EVGA is only told the price of new products when they're revealed to everyone on stage, making planning difficult when launches occur soon after. Nvidia also has tight control over the pricing of GPUs, limiting what partners can do to differentiate themselves in a competitive market.
https://www.gamespot.com/articles/evga-terminates-relationsh...
That means deprioritizing your largest customer.
Also there's the devil you know and the devil you don't know.
Private companies can be nice to their suppliers. Owners can choose to stay loyal to suppliers they went to high school with, even if it isn't the most cost-efficient.
I’m not saying you’re wrong, but your previous paragraph sounded like you were wondering whether it was the case, whereas here you’re saying it’s known. Is this all true? Do they have a reputation for hammering their suppliers?
Apple has responded and has started moving a lot of manufacturing out of China. It just makes sense for risk management.
Nvidia's willingness to pay exorbitant prices for early 2nm wafers subsidizes the R&D and the brutal yield-learning curve for the entire node. But you can't run a sustainable gigafab solely on GPUs...the defect density math is too punishing. You need a high-volume, smaller-die customer (Apple) to come in 18 months later, soak up the remaining 90% of capacity and amortize that depreciation schedule over a decade.
Apple OTOH operates at consumer electronics price points. They need mature yields (>90%) to make the unit economics of an iPhone work. There's also the binning factor I am curious about. Nvidia can disable 10% of the cores on a defective GPU and sell it as a lower SKU. Does Apple have that same flexibility with a mobile SoC where the thermal or power envelope is so tightly coupled to the battery size?
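The defect-density point can be made concrete with the standard Poisson yield model, Y = exp(-D0·A): the fraction of fully defect-free dies falls exponentially with die area. The defect density below is a made-up illustrative number, not an actual TSMC figure:

```python
import math

def poisson_yield(die_area_mm2: float, d0_per_cm2: float) -> float:
    """Poisson yield model: fraction of dies with zero defects,
    Y = exp(-D0 * A), with D0 in defects/cm^2 and A converted to cm^2."""
    return math.exp(-d0_per_cm2 * die_area_mm2 / 100.0)

# Hypothetical early-node defect density of 0.3 defects/cm^2.
d0 = 0.3
print(f"~100 mm^2 phone SoC yield: {poisson_yield(100, d0):.1%}")  # ~74%
print(f"~800 mm^2 GPU die yield:   {poisson_yield(800, d0):.1%}")  # ~9%
```

Under these toy numbers the big GPU die is almost a write-off without binning, while the phone SoC is already in shippable territory, which is the "punishing defect density math" in a nutshell.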
For Apple, they have binning flexibility, with Pro/Max/Ultra, all the way down to iPads - and that’s after the node yields have been improved via the gazillion iPhone SoC dies.
NVIDIA's flexibility came from using some of those binned dies for GeForce cards, but the VRAM situation is clearly making that less important, as they're cutting some of those SKUs for being too VRAM-heavy relative to MSRP.
As you cut SMs from a die you move from the 3090 down the stack, for instance. That’s yield management right there.
The Pro and Max chips are different dies, and the Ultra currently isn't even the same generation as the Max. And the iPads have never used any of those larger dies.
> NVIDIAs flexibility came from using some of those binned dies for GeForce cards
NVIDIA's datacenter chips don't even have display outputs, and have little to no fixed-function graphics hardware (raster and raytracing units), and entirely different memory PHYs (none of NVIDIA's consumer cards have ever used HBM).
Not binning an M4 Max for an iPhone, but an M4 Pro with a few GPU or CPU cores disabled is clearly a thing.
Same for NVIDIA. The 4080 is a 4090 die with some SMs disabled.
The desktop 4090 uses the AD102 die, the laptop 4090 and desktop 4080 use the AD103 die, and the laptop 4080 uses the AD104 die. I'm not at all denying that binning is a thing, but you and other commenters are exaggerating the extent of it and underestimating how many separate dies are designed to span a wide product line like GPUs or Apple's computers/tablets/phones.
Otherwise, yes, if a chip doesn't make M4 Max, it can make M4 Pro. If not, M4. If not, A18 Pro. If not that, A18.
And even all of the above mentioned marketing names come in different core configurations. M4 Max can be 14 CPU Cores / 32 GPU cores, and it can also be 16 CPU cores and 40 GPU cores.
So yeah, I'd agree that Apple has _extreme_ binning flexibility. It's likely also the reason why we got A19 / A19 Pro / M5 first, and we still don't have M5 Pro or M5 Max yet. Yields not high enough for M5 Max yet.
Unfortunately I don't think they bin down even lower (say, to S chips used in Apple Watches), but maybe in the future they will.
In retrospect, Apple ditching Intel was truly a gamechanging move. They didn't even have to troll everyone by putting an Intel i9 into a chassis that couldn't even cool an i7 to boost the comparison figures, but I guess they had to hedge their bet.
No, that's entirely wrong. All of those are different dies. The larger chips wouldn't even fit in phones, or most iPad motherboards, and I'm not sure a M4 Max or M4 Pro SoC package could even fit in a MacBook Air.
As a general rule, if you think a company might ever be selling a piece of silicon with more than half of it disabled, you're probably wrong and need to re-check your facts and assumptions.
There are two levels of Max chip, but think of a Max as two Pros on one die (this is a simplification; you can also think of a Pro as two regular dies tied together), so a bad Max can't be binned into a Pro. But a high-spec Max can be binned into a low-spec Max.
For example the regular M4 can have 4 P-cores / 6 E-cores / 10 GPU cores, or 3/6/10 cores, or 4/4/8 cores, depending on the device.
They even do it on the smaller A-series chips - the A15 could be 2/4/5, 2/4/4, or 2/3/5.
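As a toy illustration of how much sellable silicon a lower-spec bin recovers, here's a sketch assuming (unrealistically) independent per-core defects, with made-up probabilities rather than any real yield data:

```python
from math import comb

def p_at_least_k_good(n: int, k: int, p_core: float) -> float:
    """P(at least k of n cores are defect-free), binomial model with
    independent cores — a crude stand-in for real defect clustering."""
    return sum(comb(n, i) * p_core**i * (1 - p_core)**(n - i)
               for i in range(k, n + 1))

# Hypothetical: 10 GPU cores on a die, each 95% likely to be defect-free.
full   = p_at_least_k_good(10, 10, 0.95)  # sellable only as the full SKU
binned = p_at_least_k_good(10, 8, 0.95)   # sellable as full OR 8-core SKU
print(f"full-spec dies:     {full:.1%}")
print(f"with an 8-core bin: {binned:.1%}")
```

With these toy numbers, adding one cut-down SKU turns roughly 60% sellable dies into nearly 99%, which is why every vendor ships those 3/6/10-style configurations.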
The flat line prediction is now 2 years old...
If Nvidia pays more, Apple has to match.
Not a system that necessarily works all that well if one player has a short-term ability to vastly outspend all the rest.
You can't let all your other customers die just because Nvidia is flush with cash this quarter...
TSMC isn't running a charity, it sells capacity to the highest bidder.
Of course customers as big as Apple will have a relationship and insane volumes that guarantee them important quotas regardless.
Is the argument that Apple will go out of business? AAPL?
Wait,
> one player has a short-term ability to vastly outspend all the rest.
I assure you, Apple has the long-term and short-term ability to spend like a drunken sailor all day and all night, indefinitely, and still not go out of business. Of course they’d prefer not to. But there is no ‘ability to pay’ gap here between these multi-trillion-dollar companies.
Apple will be forced to match or beat the offer coming from whoever is paying more. It will cost them a little bit of their hilariously-high margins. If they don’t, they’ll have to build less advanced chips or something. But their survival is not in doubt and TSMC knows that.
Well yeah, people were identifying that back when Apple bought out the entirety of the 5nm node for iPhones and e-tchotchkes. It was sorta implicitly assumed that any business that builds better hardware than Apple would boot them out overnight.
That's the take I would pursue if I were Apple.
A quiet threat of "We buy wafers on consumer demand curves. You’re selling them on venture capital and hype"
From TSMC's perspective, Apple is the one that needs financial assistance. If they wanted the wafers more than Nvidia, they'd be paying more. But they don't.
This is the "venture capital and hype" being referred to, not Nvidia themselves.
I get why the numbers are presented the way they are, but it always gets weird when talking about companies of Apple’s size - percent increases that underwhelm Wall Street correspond to raw numbers that most companies would sacrifice their CEO to a volcano to attain, and sales flops in Apple’s portfolio mean they only sold enough product to supply double-digit percentages of the US population.
The giant conglomerates in Asia seem more able to do it.
Google has somewhat tried but then famously kills most everything even things that could be successful if smaller businesses.
Every time a CEO or company board says "focus," an interesting product line loses its wings.
I think Asian companies are much less dependent on public markets and have as strong private control (chaebols in South Korea for example - Samsung, LG, Hyundai etc).
If you look at US companies that are under "family control" you might see a similar sprawl, like Cargill, Koch, I'd even put Berkshire in this class even though it's not "family controlled" in the literal sense, it's still associated with two men and not a professional CEO.
So report the facts but sentences like "What Wei probably didn’t tell Cook is that Apple may no longer be his largest client" make it personal, they make you take sides, feel sorry for somebody, feel schadenfreude... (as you can observe in the comments)
Okay, but this isn't a news article, it's an opinion piece on some guy's substack.
How do you think it got in the LLM training set in the first place?
AFAIK only Apple has been committing to wafers up to 3 years in the future. It would be a crazy bet for Nvidia to do the same, as they don't know how big the business will be.
https://youtu.be/K86KWa71aOc?t=483
That would ruin TSMC and others' independence.
Nvidia already did buy Intel shares so it is a thing.
Nvidia did discuss with TSMC for more capacity many times. It's not about financing or minimum purchasing agreements. TSMC played along during COVID and got hit.
As far as I know there was never a demand dip at any point in there.
Which barely impacts TSMC. Most of their revenue and focus is on the advanced nodes - not the mature 1s.
> As far as I know there was never a demand dip at any point in there.
When did I imply there was a demand dip? I said they built out too much capacity.
https://www.manufacturingdive.com/news/intel-layoffs-25-perc...
> Apple-TSMC: The Partnership That Built Modern Semiconductors
In 2013, TSMC made a $10 billion bet on a single customer. Morris Chang committed to building 20nm capacity with uncertain economics on the promise that Apple would fill those fabs. “I bet the company, but I didn’t think I would lose,” Chang later said. He was right. Apple’s A8 chip launched in 2014, and TSMC never looked back.
https://newsletter.semianalysis.com/p/apple-tsmc-the-partner...
Apple can and should do it again!
I know about the existence of the initiative but I don't know how it is progressing / what is actually going on on that front.
There's ~a dozen in the works or under construction
TSMC plans to have 2-3nm fabs operational in the next 2-3 years
So we're 2-3 years behind the standard (currently 2nm), and further behind on the bleeding edge sub-2nm fabs
Are the majority of the staff still shipped in from Asia?
https://www.tsmc.com/static/abouttsmcaz/index.htm
Then the essential skilled personnel can’t come train people because the visa process was created by and is operated by the equivalent of four-year-olds with learning disabilities. Sometimes companies say fuck it, we’re doing it anyway, and then ICE raids their facility and shuts it down.
I’d post the news articles about the above, but your googling thumbs work as well as mine.
Just look at what people are actually using. Don't rely on a few people who tested a few short prompts with short completions.
What kind of experiments are you doing? Did you try out exo with a dgx doing prefill and the mac doing decode?
I'm also totally interested in hearing what you have learned working with all this gear. Did you buy all this stuff out of pocket to work with?
I don’t know the hedge to position against this but I’m pretty sure China will make good on its promise.
Buy in-demand fab output today, even at a premium price and even if you can't install or power it all, expecting shortages tomorrow. Which is pretty much the way the tech economy is already working.
So no, no hedge. NVIDIA's customers already beat you to it.
Or they could buy out Intel and sell off their cpu design division
I wonder what will happen in the future when we get closer to the physical "wall". Will it allow other fabs to catch up, or will the opposite happen, with even small improvements valued by customers?
You're setting yourself up to have a huge part of your future revenue stream set aside for ongoing chip-fab capex and research engineering. And that's a huge gamble, since getting this all set up is not guaranteed to succeed.
As would almost innumerable others.
Also Nvidia's margins are higher which means that they will be willing to pay a higher unit price.
This seems like an open-and-shut case from TSMC's side.
More likely they will not use the leading-edge fab process, which TBH is fine for the vast majority.
I wish I were in that situation, but I find myself able to use lots more compute than I have. And it seems like many others feel the same.
I really don't care about most new phone features and for my laptop the M1 Max is still a really decent chip.
I do want to run local LLM agents though and I think a Mac Studio with an M5 Ultra (when it comes out) is probably how I'm going to do that. I need more RAM.
I bet I'm not the only one looking at that kind of setup now who was previously happy with what they had.
I’m not saying this is bad or anything, it’s just another iteration of the centralized vs decentralized pendulum swing that has been happening in tech since the beginning (mainframes with dumb terminals, desktops, the cloud, mobile) etc.
Apple might experience a slowdown in hardware sales because of it. Nvidia might experience a sales boom because of it. The future could very well bring a swing back. Imagine you could run a stack of Mac minis that replaced your monthly Claude code bill. Might pay for itself in 6mo (this doesn’t exist yet but it theoretically could happen)
You don't have to imagine. You can, today, with a few (major) caveats: you'll only match Claude from roughly ~6 months ago (open-weight models roughly lag behind the frontier by ~half a year), and you'd need to buy a couple of RTX 6000 Pros (each one is ~$10k).
Technically you could also do this with Macs (due to their unified RAM), but the speed won't be great so it'd be unusable.
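A rough way to sanity-check that claim: single-stream decode is memory-bandwidth-bound, so an upper bound on tokens/sec is bandwidth divided by the active weight footprint. The hardware and model numbers below are ballpark assumptions, not measurements, and real throughput lands below this bound once KV cache and overhead are counted:

```python
def decode_tokens_per_sec(model_gb: float, bandwidth_gbs: float) -> float:
    """Upper bound for single-stream decode: every generated token must
    stream the full set of active weights through memory, so
    tok/s <= bandwidth / weight size (ignores KV cache and overhead)."""
    return bandwidth_gbs / model_gb

# Hypothetical setup: a ~70B-parameter model at 4-bit (~40 GB of weights).
print(f"Mac Studio (~800 GB/s):      {decode_tokens_per_sec(40, 800):.0f} tok/s")
print(f"RTX 6000 class (~1.8 TB/s):  {decode_tokens_per_sec(40, 1800):.0f} tok/s")
```

The gap between unified-memory Macs and datacenter-class GPUs is mostly this bandwidth ratio, which is why the Mac route trades speed for capacity.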
Data is saying demand >>>>> supply.
I mean this is pretty fantastic.
So now Apple, Nvidia, AMD (possibly), and most car manufacturers will be up a creek without a paddle when China invades in 1-2 years. That is unless China's Xi is bluffing to mollify domestic war hawks and reunification zealots by going through the motions of building an army of war machines without intent to use them, but I don't think that's probable. It's possible that Trump already made agreements with Xi to cede "Oceania" if they allow the US to take Greenland and South America for empire-building neocolonialism.
They would benefit in what way?
Because their government seems to benefit a lot from Taiwan existing and being an enemy.
What options do you suppose the military might be working on? Training to surround, and blockade? (Check) Information warfare? (Check) Building high numbers of landing craft? (Check) Building high numbers of modular weapon systems that can rapidly increase the number of offensive ships? (Check) Building numerous high volume drone warfare ships and airborne launchers? (Check)
Keep in mind that there are public language cues that preceded invasion such as declarations of the invalidity of the other country’s sovereignty, declarations that the other country is already part of the invading country. Have you seen any signs of that?
Your persistent doubts require ignorance of strong evidence.
It will likely be a naval plus air blockade to force a political solution to avoid the messiness of an invasion, but time is on China's side there.
Long term: demographics are worsening for China relative to now or 5 years ago.
Short term: China doesn’t yet have viable homegrown replacements for ASML, TSMC, etc.
Really short term: China blockading Taiwan and suffering the economic fallout would be much more painful than US blockading Cuba/Venezuela/etc.
A decisive kinetic action or a very soft political action, rather than a blockade seems more viable in the current state.
They sent warships to Greenland. What level of saber rattling do you expect?
My conspiracy theory is that there is some kind of "gentlemen's agreement" on this topic between the US and China.
As soon as Taiwan is no longer needed by the US for chip fabrication, the US will at the very least lose its grip on it.
Note to commenters: that's my theory, does not mean I endorse it in any way.
(Apple is well known for shoving "lesser vendors" out of the way at TSMC)
https://appleinsider.com/articles/25/08/22/apple-chips-to-be...
https://www.aztechcouncil.org/tucson-chipmaker-tsmc-arizona-...
https://wccftech.com/tsmc-plans-to-bring-3nm-production-to-t...
Also, https://aramzs.xyz/thoughts/dont-post-ai-at-me/