Everyone: things suck, better move my stuff onto a small home server.
The hyper-scaler mafia: NOT ON MY WATCH!
The only silver lining is that newer devices will have to scale down memory, so developers will have to ditch memory-sucking frameworks and start to optimize things again.
How many useless living humans do you know? They go somewhere. Something happens to them. Whatever it is, it’s about to happen to 30% of the population.
Given this has been going on for years at this point, with the high prices of graphics cards through crypto and now AI, it feels like this is the new normal, forever propped up by the next grift.
I don't think this ideology and investment strategy will survive this grift. There's too much geopolitical instability and investment restructuring for it to work again. Everyone is looking at isolationist policies. I mean, Mastercard/Visa is even seen as a risk outside the US now.
Well, a risk has an abstract level and is either increasing or decreasing. You can look at your risk profile over time and work out how to define policy going forwards. It takes a long time to make changes at the country level.
US is medium risk and increasing rapidly. Run away quickly.
> I’m just afraid that prices of $everything will go up soon and will not come down anymore, like they did after Covid.
Just like the price of labour. Your salary went up and doesn't come down
In the UK weekly earnings increased 34% from December 2019 to December 2025.
CPI went up 30% in the same period.
Obviously that CPI covers things which went up more, and things which went up less, and your personal inflation will be different to everyone else's. Petrol prices at the end of Jan 2020 were 128p a litre; at the end of Jan 2025 they are 132p a litre [0]. Indeed, petrol prices were 132p in January 2013. If you drive 40,000 miles a year you will thus see far lower inflation than someone who doesn't drive.
[0] https://www.rac.co.uk/drive/advice/fuel-watch/
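A quick sanity check of those figures (using the rounded percentages quoted above):

    # Rough real-earnings check using the figures quoted above.
    nominal_growth = 0.34   # weekly earnings, Dec 2019 -> Dec 2025
    cpi_growth = 0.30       # CPI over the same period
    real_growth = (1 + nominal_growth) / (1 + cpi_growth) - 1
    print(f"Real earnings growth: {real_growth:.1%}")   # ~3.1%

    # Petrol barely moved: 128p -> 132p a litre over five years.
    petrol_growth = 132 / 128 - 1
    print(f"Petrol price growth: {petrol_growth:.1%}")  # ~3.1%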
> I’m just afraid that prices of $everything will go up soon and will not come down anymore, like they did after Covid.
That's how inflation works. In this case it seems more narrow though, there's hope the prices will go down. Especially if the AI hype finds a reason to flounder.
The main thing that the powers that be have always underestimated is the insane creativity common people show when they want things but are forced to find alternative ways to get them. Not going to say it won't suck, but interesting ways will indeed be found.
You’re going to find what, ways to make hand crafted survival RAM and drives in your backyard chip foundry?
Call me cynical if you like, but I don’t share this optimism that assumes the banal idea that good always wins. That simply isn’t how it works; the bad guys have won many times before. It’s just that “dead men tell no tales” and the winners control what you think is reality.
I come from a time when people had to make do with 1 to at most 48 kilobytes for their entire computer. Later on I once went to Helsinki to watch with my own eyes what people can program within a restriction of 4 to 64 kilobytes. Computers have amazed me, but the people who used them have amazed me many more times.
I wasn't saying it'll be good and that the good guys win, but a lot of insane creativity to circumvent restrictions will pop up.
The Chinese have end-to-end production capacity for lower capacity, lower performance/reliability consumer HDDs, so these are quite safe. Maybe we'll even see enterprise architectures where that cheap bottom-of-the-barrel stuff is used as opportunistic nearline storage, and then you have a far lower volume of traditional enterprise drives providing a "single source of truth" where needed.
Saw this one coming and got my personal stuff out. It's running on an old Lenovo crate chucked in my hallway.
Work is fucked. 23TB of RAM online. Microservices FTW. Not. Each node has OS overhead. Each pod has language VM overhead. And the architecture can only cost more over time. On top of that "storage is cheap so we won't bother to delete anything". Stupid mentality across the board.
One tiny sliver of a silver lining: the “storage/memory/compute is cheap” nonsense has produced all kinds of outsourced human slop code, and that mentality is clearly going to have to die.
It could even become a kind of renaissance of efficient code… if there is any need for code at all.
The five guys left online might even get efficient and fast loading websites.
Honorable mention for the NO-TECH and LOW-TECH magazine site, because I liked the effort at exploring efficient use of technology, e.g., their ~500KB solar-powered site.
https://solar.lowtechmagazine.com/about/
We went from using technology to solve problems to the diametric opposite of creating new problems to solve with technology. The latter will have to contract considerably. As you say, many problems can be solved without code. If they even need to be solved in the first place.
On the efficiency front, most of what we built is for developer efficiency rather than runtime efficiency. Also needs to stop.
I'm a big fan of low tech. I still write notes on paper and use a film camera. Thanks for the link - right up my street!
Unless people notice that they just built lots of useless datacenters and push back towards a mainframe + terminal setup, because ah sorry, modern software just runs much better that way, and you can save money with our inexpensive laptop-with-subscription model.
> The only silver lining is that newer devices will have to scale down memory, so developers will have to ditch memory-sucking frameworks and start to optimize things again.
No. Prices will just go up, less innovation in general.
We aren't just dealing with a shortage; we're dealing with a monopsony. The Big Tech companies have moved from being "customers" of the hardware industry to being the "owners" of the supply chain. The shortage isn't just "high demand", but "contractual lock-out."
It is time to talk seriously about breaking up the hyperscalers. If we don't address the structural dominance of hyperscalers over the physical supply chain, "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.
> "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.
This is the game plan, of course: why have customers pay once for hardware when they can have you constantly feed them money over the long term? Shareholders want this model.
It started with planned obsolescence; this new model is the natural progression. There is no obsolescence even in discussion when your only option is to rent a service that the provider has no incentive to even make competitive.
I really feel this will be China's moment to flood the market with hardware and improve their quality over time.
This is the result of the long-planned desire for consumer computing to be subscription computing. Ultimately, there is only so much that can be done in software to "encourage" (read: coerce) vendor-locked, always-online, account-based computer usage; there are viable options for people to escape these ecosystems via the ever growing plethora of web-based productivity software and linux distributions which are genuinely good, user friendly enough, and 100% daily-drivable, but these software options require hardware.
It's no coincidence that Microsoft decided to take such a massive stake in OpenAI - leveraging the opportunity to get in on a new front for vendor locking by force-multiplying their own market share by inserting it into everything they provide is an obvious choice, but also leveraging the insane amount of capital being thrown into the cesspit that is AI to make consumer hardware unaffordable (and eventually unusable due to remote attestation schemes) further enforces their position. OEM computers that meet the hardware requirements of their locked OS and software suite being the only computers that are a) affordable and b) "trusted" is the end goal.
I don't want to throw around buzzwords or be doomeristic, but this is digital corporatism in its endgame. Playing markets to price out every consumer globally for essential hardware is evil and something that a just world would punish relentlessly and swiftly, yet there aren't even crickets. This is happening unopposed.
Honestly? I don't know. I don't think there really is a viable solution that preserves consumer computation. Most of the young people I know don't really know or care about computers. Actually, most people at large that I know don't know or care about computers. They're devices that play videos, access web storefronts, run browsers, do email, save pictures, and play games for them. Mobile phones are an even worse wasteland of "I don't know and I don't care". The average person doesn't give a shit about this being a problem. Coupled with the capital interests of making computing a subscription-only activity (leading to market activity that prices out consumers and lobbying actions that illegalize it), this spells out a very dire, terrible future for the world where computers require government and corporate permission to operate on the internet, and potentially in one's home.
Things are bad and I don't know what can be done about it because the balance of power and influence is so lopsided in favor of parties who want to do bad.
The problem with any of these tells is that an individual instance is often taken as proof on its own rather than an indicator. People do often use “it isn't X, it is Y” like constructs¹ and many, myself included sometimes, overuse “quotes”², or use m-dashes³, or are overly concerned about avoiding repeating words⁶, and so forth.
LLMs do these things because they are in the training data, which means that people do these things too.
It is sometimes difficult to not sound like an LLM-written or LLM-reworded comment… I've been called a bot a few times despite never using LLMs for writing English⁴.
--------
[1] particularly vapid space-filler articles/comments or those using whataboutism style redirection, which might be a significant chunk of model training data because of how many of them are out there.
[2] I overuse footnotes as well, which is apparently a smell in the output of some generative tools.
[3] A lot of pre-LLM style-checking tools would recommend this in place of hyphens, and some automated reformatters would make the change without asking, so there are going to be many examples in training data.
[4] I think there is one at work in VS which I use in DayJob, when it is suggesting code completion options to save typing (literally Glorified Predictive Text) and I sometimes accept its suggestion, and some of the tools I use to check my Spanish⁵ may be LLM based, so I can't claim that I don't use them at all.
[5] I'm just learning, so automatic translators are useful to check that what I've written isn't gibberish. For anyone else doing the same: make sure you research any suggested changes, preferably using pre-2023 sources, because the output of these tools can be quite wrong, as you can see when translating into a language you are fluent in.
[6] Another common “LLM tell” because they often have weighting functions especially designed to avoid token repetition, largely to avoid getting stuck in loops, but many pre-LLM grammar checking tools will pick people up on repeated word use too, and people tend to fix the direct symptom with a thesaurus rather than improving the sentence structure overall.
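For what it's worth, the repetition-avoidance weighting mentioned in [6] usually amounts to a simple penalty on the logits of tokens that have already been generated. A toy sketch of the idea (illustrative only, not any particular model's actual implementation):

    def apply_repetition_penalty(logits, generated_ids, penalty=1.2):
        # Scale down the logit of every token that has already appeared,
        # so exact repeats become less likely at the next step.
        out = dict(logits)
        for tok in set(generated_ids):
            if tok in out:
                # positive logits are divided, negative ones multiplied,
                # so the token always ends up less probable
                out[tok] = out[tok] / penalty if out[tok] > 0 else out[tok] * penalty
        return out

    logits = {"the": 2.0, "cat": 1.5, "sat": 0.5}
    print(apply_repetition_penalty(logits, ["the", "cat"]))
    # {'the': 1.67, 'cat': 1.25, 'sat': 0.5} (approximately)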
I wonder what the ratio of "constructive" use of AI is, versus people writing pointless internet comments.
It seems personal computing is being screwed so people can create memes, ask questions that take 30 seconds to find the answer to with Google or Wikipedia, and sound clever on social media?
If you think of AI as the whole discipline, there are very useful applications indeed, generally in the pattern recognition and regulation space. I'm aware of a lot of small projects which rely on AI to monitor ecosystems or systems, or which are used as nice regulatory mechanisms. Also, the same systems can be used for genuine security applications (civilian, non-lethal, legal and ethical).
If we are talking generative AI, again from my experience, things get a bit blurry. You can use smaller models to dig through data you own.
I have personally used LLMs twice to this day. In each case it was after very long research sessions without any answers. In one, it gave me exactly one reference, and I followed that reference and learnt what I was looking for. In the second case, it gave me a couple of pointers, which I'm going to follow up myself.
So, generative AI is not that useful for me, it uses way too many resources, and industry-leading models are, well, unethical to begin with.
I do agree with the sentiment of the AI comment, and was even weighing just letting it slide, because I do fear the future that comment was warning against.
ChatGPT does this just as much, maybe even more, across every model they've ever released to the public.
How did both Claude and GPT end up with such a similar stylistic quirk?
I'd add that Kimi does it sometimes, but much less frequently. (Kimi, in general, is a better writer with a more neutral voice.) I don't have enough experience with Gemini or Deepseek to say.
What's next is no custom built PCs. They want us running dumb thin clients and subscribing to compute. Or it will be like phones. We'll get pre-built PCs that we aren't allowed to repair and they'll be forced to be obsolete every few years.
"They"? I see companies jacking their prices up, plain and simple. And us idiots still pay. Ask yourself: does Intel no longer wish to sell CPUs to consumers? It doesn't sound reasonable that Intel would want to decimate their main market so AI companies can rule the world for some reason.
It'll be fine. The supply chain for these components is inelastic, but that means once manufacturing capacity increases, it'll stay there. We'll see lower prices, especially if there is an AI crash and a mass hardware selloff like some people are predicting.
The number of HDDs sold has been in decline for over a decade. I doubt there is massive appetite for expanding production capacity
On the other hand the total storage capacity shipped each year has risen, as a combination of HDDs getting larger and larger, and demand shifting from smaller consumer HDDs to larger data center, enterprise and NAS HDDs. I'm not sure how flexible those production lines are, but maybe the reaction will be shifting even more capacity to higher-capacity drives with cutting-edge technology
Server-grade hardware (rack blades) is already a poor fit for consumer needs, and AI-dedicated hardware straight up requires external liquid cooling systems. It will be expensive to adopt them.
Strip it for usable parts (DRAM/VRAM/NAND chips, maybe also some of the controllers), then recover any valuable metals (copper heat sinks, gold on contact pins), simple as that. :)
> According to Mosley, Seagate is not expanding its production capacities for now. Growth is to come only from higher-capacity hard drives, not from additional unit numbers.
If it takes 2 years to increase, after 2 years everything will be thin clients already. Completely locked in, fully under control and everybody used to it. Very dystopian TBH.
True, if production capacity increases. But it's an oligopoly, and manufacturers are being very cautious because they don't want to cut into their margins. That's the problem with concentration. The market becomes ineffective for customers.
It's not about cutting into their margins; if they end up scaling up production it will take several years and cost untold billions. When the AI bubble pops, if there's no replacement demand there's a very real chance of them going bankrupt.
I'll go against the grain and claim this might be a good thing long term. Yes, it sucks also, I was planning to expand my NAS but guess I'll figure out how to compress stuff instead.
Which goes into why I think this might be good. Developers have kind of treated disks as "oh well", with binaries ballooning in size even when it can easily be solved, and there is little care to make things lightweight. Just like I now have to figure out a different solution to recover space, I'm hoping that with a shortage this kind of thing will become more widespread, and we'll end up with smaller things until the shortage is over. "Necessity is the mother of all invention", or however it goes.
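Concretely, for the "figure out how to compress stuff" part, a minimal sketch of what I mean; the ~/nas/archive path and the 100 MB threshold are made-up examples, and the originals are kept until you have verified the .gz copies:

    import gzip, shutil
    from pathlib import Path

    # Hypothetical example: gzip everything over 100 MB under ~/nas/archive.
    root = Path.home() / "nas" / "archive"
    threshold = 100 * 1024 * 1024

    for f in root.rglob("*"):
        if f.is_file() and f.suffix != ".gz" and f.stat().st_size > threshold:
            dst = f.parent / (f.name + ".gz")
            with open(f, "rb") as src, gzip.open(dst, "wb") as out:
                shutil.copyfileobj(src, out)   # original left in place for now
            print("compressed", f)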
Do the guys that buy out the market have a real use for all the hardware, or is it just hype? A solution against investors trying to corner the market would be to sell virtual hardware. Let them buy as many options on virtual "to be delivered" hardware as they want. We also need an options market for virtual LLM tokens, where the investors can put all their money without affecting real people.
Looks like we need computer hardware reserves, the same way there are regional reserves for food, fuels and other critical commodities?
And for the same reason - to avoid the dominant players going "oh shiny" on short term lucrative adventures or outright trying to manipulate the market - causing people to starve and making society grind to a halt.
The real "computer hardware reserves" is the used market. Average folks and smaller businesses will realize that their old gear now has resale value and a lot more of it will be entering the resale/refurbishment market instead of being thrown away as e-waste.
Still, at least in the short to medium term, there are many companies, institutions and individuals that are used to getting new hardware that is under warranty: developer hardware refreshes, store hardware updates and deployments, schools and academic institutions that provide hardware to students, or even just people getting a laptop for school use.
I think people are used to getting new hardware for all of these and might be pretty much left out in the cold without a machine, or have to opt for a reused or resold machine without the necessary technical knowledge to support it themselves.
I picked up a few hundred TB from a chia farm sale. Glad for it. I think I'm set for a while. Honestly, the second they started buying this stuff I started buying hardware. The only problem for me is that they're even ruining the market for RTX 6000 Pro Blackwells.
It just goes to show how totally corporations have captured western aligned governments. Our governments are powerless to do anything (aside from some baby steps from the EU).
China is now the only solution to fix broken western controlled markets.
If component prices keep going up and the respective monopoly/duopoly/triopoly for each component colludes to keep prices high and supply constrained, then eventually devices will become too expensive for the average consumer. So what’s the game plan here? Are companies planning to let users lease a device from them? Worth noting that Sony already lets you do this with a PS5. Sounds like we’re headed towards a “you will own nothing and be happy” type situation.
"You will use AI, because that will be the only way you will have a relaxed life. You will pay for it, own nothing and be content. Nobody cares if you are happy or not."
We could also vote the politicians protecting these uncompetitive markets out of power and let regulators do their job. There have been too many mergers in the component market.
You also have to look at the current state of the market. The level of investment in data centers spurred by AI is unlikely to last unless massive gains materialize. It's pretty clear some manufacturers are betting things will cool down and don't want to overcommit.
> We could also vote the politicians protecting these uncompetitive markets out of power and let regulators do their job.
Could we though? Even if gerrymandering and voter suppression weren't already out of control, and getting worse, there are very few politicians who could or would do anything about all this.
That's the electronics industry in general though. The shortages are real and a normal part of growing pains for any industry that's so capital-intensive and capacity constrained.
Repairability, upgradability and standards compliance need to be the minimum in consumer products. No to proprietary connectors. No soldered SSDs or RAM. For home use, allow relaxed licensing options for discarded enterprise products like switches, WiFi access points, etc. (Juniper Mist APs are fantastic, but are a brick without their cloud).
Currently, I cannot put a market-bought SSD in my MacBook. I cannot put an SSD in my Unifi router without buying their $20 SSD tray. I cannot put third-party ECC RAM and SSDs in my Synology NAS because the policy has only been lifted for HDDs and nothing else.
I fear the opposite will happen. Only leveraged companies have access to DRAM and NAND, and hence will use it to lock us into their ecosystems, as consumers won't even get access to storage on the open market otherwise.
There are many new data centers being filled with servers. Most servers have at least 2 HDDs (mirrored) for the OS. I would not be surprised if, at a huge scale, even 2 HDDs per server could cause an HDD shortage.
There are likely models which are trained on 4K video, and that has to be stored somewhere too.
Even things like logs and metrics can consume petabytes for a large (and complex) cluster. And the less mature the software, the more logs you need to debug it in production.
In the AI race, investments are, if not unlimited, at least abundant. In such conditions, optimization of hardware usage is a waste of time and velocity is the only thing which matters.
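To put the logs/metrics point in rough numbers, a back-of-the-envelope sketch (every figure here is an assumption for illustration, not a measurement from any real cluster):

    # Hypothetical cluster: 50,000 nodes, each node's pods emitting ~50 MB of
    # logs and metrics per hour, retained for 90 days with 3x replication.
    nodes = 50_000
    mb_per_node_per_hour = 50
    hours = 24 * 90
    replication = 3

    raw_tb = nodes * mb_per_node_per_hour * hours / 1_000_000
    total_tb = raw_tb * replication
    print(f"{raw_tb:,.0f} TB raw, {total_tb:,.0f} TB with replication")
    # -> 5,400 TB raw, 16,200 TB (roughly 16 PB) replicated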
So at least for HDDs it is unlikely they will have super-custom "AI HDDs" like they have with custom High Bandwidth Memory displacing regular RAM in fab output and being harder to reuse later.
So hopefully, once they all go bust, there should be a lot of cheap enterprise HDDs on the market as creditors pick through the wreckage. :)
Honestly, this looks highly suspicious to me. Because OK, they might need some big storage, like petabits. But how can this be in proportion with the capacity that is already needed for everything that is hard-drive hungry: any cloud service, any storage service, all the storage needed for private photo/video/media for everything that is produced every day, all consumer hardware like computers...
GPUs I understand, but hard drives look excessive. It's as if tomorrow there were a shortage of computer cabling because AI data centers need some.
If you're building for future training needs and not just present, it makes more sense. Scaling laws say the more data you have, the smarter and more knowledgeable your AI model gets in the end. So that extra storage can be quite valuable.
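A rough, assumption-heavy sketch of why that storage adds up: the 20-tokens-per-parameter figure is the Chinchilla-style rule of thumb, and the other numbers are placeholders, not anyone's actual pipeline:

    # Assumption-heavy estimate of training-corpus storage.
    params = 1e12                  # hypothetical 1T-parameter model
    tokens = 20 * params           # Chinchilla-style ~20 tokens per parameter
    bytes_per_token = 4            # roughly 4 bytes of raw text per token
    copies = 5                     # raw crawls, filtered versions, shards, backups

    tb = tokens * bytes_per_token * copies / 1e12
    print(f"~{tb:,.0f} TB")        # ~400 TB for text alone; video dwarfs this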
The TV is the best device for unleashing your creativity by upvoting your favourite Sora creators! Become an expert in any field by activating premium prompts from our partners! By connecting a camera you can have meaningful, motivating discussions with your deceased loved ones (camera required for fraud prevention; remember, not looking at the screen during ads is punishable by law).
You have 31/999 credits remaining. What activity would you like to productively participate in today?
I feel traditional "rust" hard disks would be inefficient for AI use. Unless they include SSDs (which I feel these data centers are more likely to be using) in the definition as well...
In a large rack-mount SAN with hundreds or thousands of disks connected to your computing equipment via fiberchannel, the performance difference is pretty negligible.
The real advantage of SSDs in this use case is storage density and power efficiency. On the other hand, your compute resources might be packed in so tight with power-intensive stuff that you appreciate the spinning rust “wasting” space.
> The only silver lining is that newer devices will have to scale down memory, so developers will have to ditch memory-sucking frameworks and start to optimize things again.
If it’s temporary I can live with it.
I guess this was inevitable with the absolutely insane money being poured into AI.
> How many useless living humans do you know? They go somewhere. Something happens to them. Whatever it is, it’s about to happen to 30% of the population.
What’s the opposite of survivor bias?
Sure you have to isolate certain rogue states - North Korea, Russia, USA. Always the way.
Big tech will be deemed "too big to fail" and will get a bailout. The taxpayers will suffer.
Same way the price of groceries going up means people buy only what they need and ditch the superfluous.
> No. Prices will just go up, less innovation in general.
/s
Yep. My take is that, ironically, it's going to be because of government funding the circular tech economy, pushing consumers out of the tech space.
post consumer capitalism
It's so hard to grasp as a problem for the lay person until it's too late.
These things are cyclical.
Why is this allowed on HN?
1) The comment you replied to is 1 minute old; that is too fast for any system to detect weird comments.
2) There's no easy and sure-fire way to detect LLM content. Here's Wikipedia's list of tells: https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing
How do you know that? Genuine question.
The "isn't just .., but .." construction is so overused by LLMs.
In this case “it’s not x, it’s y” pattern and its placement is a dead giveaway.
It's not ironic, but bitterly funny, if you ask me.
Note: I'm not an AI, I'm an actual human without a Claude account.
What's next, the great CPU shortage of 2026?
This is assuming most of what we store is either images or video.
There is appetite in some circles for a consumer boycott but not much coordination on targets.
It's not being used anywhere, lol. Where are they meant to boycott?
That's when I sell off my current hardware and house, buy a cow and some land somewhere in the boondocks, and become a hermit.
> Could we though? Even if gerrymandering and voter suppression weren't already out of control, and getting worse, there are very few politicians who could or would do anything about all this.
Elections aren't going to save us.
Then they came for the RAM, but I did not speak out, for I had already closed Firefox.
Then they came for the hard drives, but I did not speak out, for I had the cloud.
Then my NAS died, and there was no drive left to restore from backup.
Also, the Return of PKZIP.
> They largely come from hyperscalers who want hard drives for their AI data centers, for example to store training data on them.
What type of training data? LLMs need relatively little of that. For example, DeepSeek-V3 [1], still a relatively large model:
> We pre-train DeepSeek-V3 on 14.8 trillion diverse and high-quality tokens
At 2 bytes per token, that's 29.6 terabytes. That's basically nothing compared to the amount of 4K content that is uploaded to YouTube every day.
1: https://arxiv.org/html/2412.19437v1
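Sanity-checking that arithmetic (token count from the quoted paper; the 2 bytes per token is the commenter's own assumption):

    tokens = 14.8e12            # pre-training tokens reported for DeepSeek-V3
    bytes_per_token = 2         # assumption from the comment above
    tb = tokens * bytes_per_token / 1e12
    print(f"{tb:.1f} TB")       # 29.6 TB, a rounding error next to daily video uploads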