2026 is going to suck for hardware, but 2027 might be better if this nonsense blows over. For one thing, AMD's RDNA 5 was announced for 2027, and it's supposed to be far more comparable to Nvidia for compute workloads, including real ray-tracing cores. AMD's recent SoCs have been pretty impressive, so I'm looking forward to AMD SoCs that are competitive with Nvidia discrete GPUs beyond just rasterization, but without artificially constrained VRAM and with lower power requirements.
Who produces the chips that make AMD products? They are the bottleneck. If those fabs are already overloaded, a new product won’t help in any way.
32GB of DDR4-3200 cost me $115 in October; now the listing is out of stock and says $392. A 240% increase.
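For what it's worth, the quoted jump checks out. A quick sanity check, using only the two prices from the comment above:

```python
# Percentage increase from the October price to the current listing.
old_price = 115   # USD, 32GB DDR4-3200 kit in October
new_price = 392   # USD, same listing now

pct_increase = (new_price - old_price) / old_price * 100
print(f"{pct_increase:.0f}% increase")  # prints "241% increase"
```

So if anything the "240%" figure above is slightly understated.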
We’re in the late stages of the AI bubble.
RAM is the new Bitcoin.
I was about to say "at least this is a physical thing, not some completely made-up digital thing with no real value."
Then I realized these prices are partially based on speculation… So I guess it's still just a made-up digital thingy.
Fuck man turns out all money is fake this is bullshit I want a redo
When the yet-to-be-built data centers never get built because the AI slop bubble pops, we will be able to build houses out of RAM sticks for the poor.
They'll just manufacture another reason to keep prices high.
Ahhh, the De Beers technique.
I can’t believe how lucky I was to upgrade my desktop before the surge. This is an outrage!

I did my desktop but skipped my server.
Even decade+ old used surplus server DDR4 didn’t escape the apocalypse.
I upgraded in January last year. My only regret now is not getting 64GB of RAM.
Same … I hadn’t upgraded since 2012, and had some extra cash, so rebuilt in August. Feeling pretty lucky to have done it then, and really glad I went ahead and put 64GB RAM in it.
You waited a long time, holy shit. I regret not getting 64GB in 2024 when I did my current build.
I did as well, which is a polite way of saying I blew all my RAM savings on how overpriced GPUs were at the time.
Same. Feels like I could sell my rig for more than it cost me 18 months ago.
Minus the case and video card, I have an entire 3rd gen i7 machine sitting in a box that would actually make a pretty good machine for a lot of different uses.
I'm on Linux and it requires just as much memory as it did in 2018. No problem here.
I upgraded mine from 16GB to 32GB two years ago because RAM was cheap. I didn't really need it, and have probably never hit 16GB usage anyway.
Meanwhile my work Windows laptop uses 16GB at idle after first login.
Windows has always been wasteful computing, and everyone just pays for it.
Requiring less RAM is good, but conceptually, it’s Linux that is “wasting” the RAM by never using it. It’s there, and it’s reusable, fill it up! Now, does Windows make good use of it? No idea. Wouldn’t bet on it, but I could be surprised.
Linux uses “free” ram for caching, so it’s not really wasted.
Empty ram is inherently wasteful.
Linux doesn’t waste RAM. All unused RAM becomes a disk read cache, but remains available on demand.
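The page-cache point is easy to see for yourself. A minimal sketch, assuming a Linux system with `/proc/meminfo` (the `MemAvailable` field needs kernel 3.14+):

```python
# Read kernel memory counters from /proc/meminfo (values are in kB).
def read_meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            info[key] = int(rest.split()[0])
    return info

m = read_meminfo()
print(f"MemFree:      {m['MemFree'] // 1024} MiB  (truly idle pages)")
print(f"Cached:       {m['Cached'] // 1024} MiB  (page cache, reclaimed on demand)")
print(f"MemAvailable: {m['MemAvailable'] // 1024} MiB  (kernel's estimate of claimable RAM)")
```

`MemAvailable` is typically much larger than `MemFree`, because it counts reclaimable cache; that gap is the RAM Linux is "using" without wasting.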
Storing data in RAM isn't wasteful, though. I have a lot of criticisms of Windows, but memory management isn't one of them. I'd rather have as much predicted content staged in RAM as possible, as long as it's readily dumped when I go to do something else, which matches my experience. I don't earn interest on unused RAM in my computers.

(For reference, I have EndeavourOS, RHEL, Fedora, and Windows machines under my desk connected to a dual-monitor KVM right now, so it isn't like I don't regularly use and prefer Linux. I mostly access the Windows machine via RustDesk for work-related stuff I don't feel like dicking with on Linux, like the purchase order and timecard systems.)

I just don't get this critique.
I wish I had a 32GB RAM laptop.
I can have 3 development IDEs open at once, and with all the browser tabs open and a few other programs here and there, it's stretching the limits on my Mac.
I have 32GB on my Windows laptop and it can't do three at once.
Running the backend (Java) and the frontend (React Native) in IntelliJ uses 29GB of RAM, so I have to run Android on real hardware over ADB+USB. Running an Android emulator pushes it over the edge.
Also: laptops are shit. On Windows, the Tau (the turbo time window) is so bad that the cores are throttled straight after boot because the cooling is rubbish. It almost never hits full speed, and it can't survive more than 40 minutes on a full battery. It might as well be a NUC.
That's the reason I haven't bought a new laptop in years. Everything must be as thin as possible because Apple did it. Fuck that. I want my laptop as thick as a brick, with enough cooling for the CPU, the GPU, and a 6L V8 engine, and a battery that will outlast the sun!
Does Clevo still make the fat laptops? My last was one of theirs and it was almost as thick as my forearm. It also weighed a ton, but on the plus side it was insanely easy to disassemble. I probably should've gotten another one; my MSI is shit to open.
Ya, Macs are definitely more efficient with their RAM.
I'll have Android Studio open for my main work, IntelliJ IDEA for all the backend work, and Xcode when I need to tweak some iPhone things. (edit: usually it's just 2 of the 3, but sometimes it's all 3)
I also mainly use real devices for testing, and opening emulators with all 3 IDEs open can be a problem, and it's so annoying opening and closing things.
Damn never thought the gaming PC I built two years ago would actually be APPRECIATING in value over time.
Just about all electronics older than a year or so have. Even a Switch, which came out 9 years ago, costs more to buy now than it did then!
Wait what? I still remembered it as a recent console…
I feel like my brain is stuck. When I think of most powerful GPU, my brain’s muscle memory replies with 1080 Ti.
It’s truly mental. I don’t think I could afford to build my PC at the same spec today with RAM and SSD prices being what they are.
I have 128GB of DDR5 memory in my machine. I paid $1400 for my 7900 XTX, which I thought was crazy, and now half my RAM is worth that.
Never thought I would see the day where the graphics card was not the most expensive component.
Maybe you should have bought 256 GB
I should not have even gotten the 128.
I can use it, but only barely, at 4600MT/s, because Ryzen chips can't handle four DIMMs of 32GB.
I honestly didn't even bother to check at the time of purchase, and it is still a roll of the dice if I restart.
This article sucks… I think they felt the need to excuse AI lest they upset their corporate masters:
"While it's easy to point the finger at AI's unquenchable memory thirst for the current crisis, it's not the only reason."
Followed by:
"DRAM production hasn't kept up with demand. Older memory types are being phased out, newer ones are steered toward higher margin customers, and consumer RAM is left exposed whenever supply tightens."
"Production has not kept up with demand"… demand being supercharged by AI purchases.
"…newer ones are steered toward higher margin customers"… again, AI.
"…consumer RAM is left exposed whenever supply tightens"… because of AI.
You see, it's easy to blame AI data centers buying all the RAM, but that's only half the story! The other half of the story is manufacturers selling to those data centers.
The LLM writing this feels almost sentient lol.
Apple over here not raising their RAM prices because they’ve always been massively and unjustifiably inflated. Now, they’re no longer unjustifiably inflated.
I dunno. “AI companies bought literally everything” seems like an unjustifiable reason still.
Perhaps. I guess my point is they no longer are as out-of-line with the rest of the market. Comment meant as a backhanded “compliment” toward Apple.
They also buy allotments months in advance. Just waiting to see how much Apple will charge soon.
I mean, they are, just for a different reason
Apple: see!? We do not inflate our ram prices!!!
Me to my 10 year old gaming pc: “I guess it’ll be another couple of years, buddy.”
Give it a lil pat pat for good measure
Saaame. I’m still chugging along with an RTX 2060 😭
I’m running a 1070.
Another couple of decades, even!
They aren’t really making a ton of games that justify a costly upgrade anyways.
I agree. I recently swapped out my aging 6-core 2600X for a 16-core 5950X and upgraded from a 3070 Ti to a 5070 (well, actually more of a sidegrade; my VRAM was simply too small). Before this, my system already had a few years where there wasn't much difference regarding gaming. In the current configuration, and at the glacial speed gaming has developed, I'd say I have a decade before upgrades are really needed.
Honestly, my PC at this point plays FFXIV and that’s basically it. And I’m OK with that.
Can't wait for them to push forced end of life on all computers.

Microsoft tried that with TPM, which you can bypass for the most part with Rufus and a clean install. Some kernel-anticheat games you still can't get around so easily.
I’ve already switched over to Linux, just got one more system to migrate. So far 100% worth it to not deal with Microslop.
What I'm truly afraid of is that they somehow get ahold of Linux and force slopware onto it, including that EoL BS. Keep offline devices as backup.
Same here, running my 3700X with a 3080.
I should’ve pulled the trigger on the 9800X3D last year like I wanted, but thought it was just too expensive.
Welp.
Don’t worry, I suspect Cloud Terminals will be super cheap. You won’t even need that ol’ thing anymore!
I can't believe I snagged a Strix Halo 128GB RAM mini PC for $1600 when it first came out.
I used to scavenge all the RAM I needed from the trash; nothing in months.
Increasing RAM (from 16GB) and SSD space (from 0.5TB) on a laptop now is easily +$1k and up.
Apple users:

for you
- posted from my Lenovo ThinkPad X250