My wife’s old gaming PC lasted from 2016 to 2025, with the only upgrade being a new, larger SATA SSD. I’ve never had a gaming PC last that long before, and the cost per year worked out great.
I really hope the consumer market gets back on track before I need to make any more large purchases.
As an elder millennial I grew up in an era where a few years meant the difference between bleeding edge and obsolete. This continued until the late 2010s and things just seemed to seriously stagnate after that.
I sorta feel like there was a significant drop-off in real-world performance gains* on the higher-end consumer desktop side after the first generation that included DDR4 and PCIe 4, except for GPUs if you specifically used the RTX features. I rarely had issues with my i7-4790K setup, and that was a DDR3/PCIe 3/limited-NVMe-support generation. I only replaced it last year because the mobo died.
*Based entirely on my own use cases.
My GTX 1070 became obsolete after around 4 years. Not “can’t launch games” obsolete, but more “can’t keep a stable FPS in newer games” obsolete. However, my 3070 is still going strong after 5 years, with no real issues in anything. I see absolutely no reason to upgrade it.
Maybe it’s time to value debloated software…
Maybe a mouse driver doesn’t have to take 1 GB of RAM and send usage statistics to the manufacturer…
2026 year of the Linux desktop
I’m so old, I even remember when computers got faster over time.
It’s the end of the Silicon Valley era. In a few years I guess new and faster stuff will come from China.
Unless there is a paradigm or materials breakthrough I wouldn’t expect a major leap anytime soon.
It’s kinda the same with TV quality and video game graphics. We’ve been squeezing a lot out of the current technology, but further enhancements will be incremental.
The paradigm shift will be going back to low-level programming. Quake-type innovations like the fast inverse square root.
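For reference, the kind of trick being alluded to: a minimal C sketch of the classic fast inverse square root popularized by Quake III Arena (the magic constant and the single Newton step are the famous parts; the memcpy is just the modern, well-defined way to do the bit cast).

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Classic fast inverse square root: approximate 1/sqrt(x) with an
 * integer bit trick plus one Newton-Raphson refinement step. */
static float fast_rsqrt(float x)
{
    float half = 0.5f * x;
    uint32_t i;
    memcpy(&i, &x, sizeof i);       /* reinterpret the float's bits as an integer */
    i = 0x5f3759df - (i >> 1);      /* magic constant gives a rough first guess */
    float y;
    memcpy(&y, &i, sizeof y);
    return y * (1.5f - half * y * y);   /* one Newton step sharpens the guess */
}

int main(void)
{
    float x = 10.0f;
    printf("fast: %f  libm: %f\n", fast_rsqrt(x), 1.0f / sqrtf(x));
    return 0;
}
```

On modern CPUs a hardware sqrt or rsqrt instruction generally beats this, so it’s more of a historical curiosity than a recommendation.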
Yeah, I think we’re at the point of technology plateauing. Ways to interact with that tech may change, like VR vs. traditional screens, but we can only pack so many pixels onto a surface before it just starts looking marginally better for more money.
I still watch a lot of YouTube and movies and stuff at no more than 1080p, because the increase in quality beyond that doesn’t improve the experience enough for me to give a shit. I’ve seen 4K and 8K TVs in tech stores like Best Buy, and while they look awesome, they wouldn’t change how I consume the same media.
I remember back in the day you could buy a hard drive that was shown on the box as having something like a 300 MB capacity, then when you opened the box you found out it actually had 500 MB, because the advances in manufacturing were outpacing their ability to print new boxes.
Damn we had it good then.
Kind of unimaginable these days, since you’d expect it to be artificially limited to 300 MB, even if the hardware could support 500.
Like how, on some video cards (the 970?), you could overclock it into being a 970 Ti if you were lucky. That was before they started lasering parts off the chips to stop people doing that with lower-grade cards; binning was just a matter of whether a chip passed muster for the higher-grade cards, and some could be pushed to that level, just not reliably.
20 years ago or so, I was at a computer parts store, pricing up parts to build a computer. I was on a limited budget and had already decided on a CPU, graphics card, and a motherboard.
“Ah, crap, I forgot about RAM”, I said. “No problem”, the shopkeeper replied, “RAM is cheap”
I don’t remember what the CPU I got was, but the GPU was an Nvidia GeForce 6600GT.
> “Ah, crap, I forgot about RAM”, I said. “No problem”, the shopkeeper replied, “RAM is cheap”
That was also the mindset in programming until very recently. Memory is cheap and plentiful, so you didn’t need to worry too much about it except where the most conservative memory management was required.
Strange to think that, with how computers are today, we’re looping back around to resource-limited compute, not because of software bloat or anything, but because people won’t be able to afford to swap their computers out for something more powerful.
I bought my current PC 13 years ago and haven’t changed a single part in it. At the time I bought it, it had a GTX 745, an i7-4790 @ 3.60 GHz, and 12 GB of DDR3 RAM, and I got it for less than $1,000. While I can’t play games like RDR2 or modern AAA titles on it, it still gets the job done for the games I enjoy playing in 2026.
I’m so old, I got excited for a new couch. A few months ago, I got excited because I got a new rug.
I have a video of me being giddy that I have an air fryer with two compartments that you can set up to finish at the same time. If one thing needs to be in there for 15 minutes and the other only for 10, the 10-minute side waits 5 minutes and starts when the 15-minute side counts down to 10, so my protein and the sides get finished at the same time.
That’s awesome for me.
I have one that does that too!! It’s seriously awesome!
wow. That’s an epic appliance! I’m jelly :)
Hell yeah, aging. It’s funny how such simple things can bring satisfaction, but if stuff like that makes you happy, I say you’re doing A-OK 🥂
I don’t have much (I’m long-term unemployed/disabled), so the few things I can afford really do get me excited.
I ‘panic bought’ (OK, not out of panic, but mild concern) a 22TB drive since the price seemed not too astronomical, and the local store had a few left. Just in case.
Seems the supplies really are drying up. Fuck these AI companies. Doesn’t matter if they actually intended to wage a war on personal computation; their hoarding of the supply chain for years to come really is an assault on our ability to afford local self-hosted computing. I hope the bubble bursts, soon and hard.
> Doesn’t matter if they actually intended to wage a war on personal computation
I think they are intentionally waging war on personal hardware. They want everything cloud based so they have direct access and ownership of all user activity. Their wet dream is everyone’s computer being a streaming stick plugged into a monitor.
I know RAM and graphics cards are up, but computers are still getting cheaper. A $400 laptop today is slightly faster than a $400 laptop from 10 years ago, and due to inflation, the newer one is effectively cheaper.
That only works if a regular person’s pay has actually increased in that time.

RAM, graphics cards, SSDs, and HDDs are all up, some multiple-fold in price, and the disruption looks like it may continue for years. Combined with the death of Moore’s Law, a $400 computer at today’s prices might actually be worse than what you’d get for that money a decade ago.
I find that hard to believe. Do you have an example of a laptop from 2016 that was sold for $400 that would perform better than a 2026 laptop costing $400?
The 2026 laptop would need to be $550 to account for inflation.
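Rough math behind that figure, assuming cumulative US inflation of roughly 35–38% between 2016 and 2026: $400 × ~1.37 ≈ $550, i.e. $400 in 2016 buys about as much as $550 does today, so that’s the like-for-like price point.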
The pricing shocks are slower to hit OEMs, but rest assured they will. A $400 laptop from July 2025 is going to cost $800 in July 2026.
10 years ago, sure. But what about 2 years ago? Progress has slowed significantly, with core parts only very recently exploding in price
Computers are still getting cheaper as long as you bring your own RAM and storage. /s
That’s only because Intel/AMD hasn’t figured out how to cash in on the artificial BS yet :)
Give ’em a week; they’ll tell us that all their processors are purchased through 2030.
More energy efficient as well.