• 0 Posts
  • 23 Comments
Joined 2 years ago
Cake day: September 6th, 2023

  • So it just happens that I’ve been unlucky with the hardware in all 3 gaming desktop PCs I’ve built over the last ~14 years, and that’s why they all show this behaviour? (Bluescreens out of nowhere, without any intensive hardware usage, weird behaviour, or anything else that would justify triggering them.)

    I haven’t run Linux on this machine yet, but I’m more likely to go for it with each passing day. I can let you know how it goes and how many crashes I get. I don’t expect any: I’ve been using Linux on my laptops for close to 10 years now, and the only hardware problem I’ve had was when I was using my old laptop as a server and its HDD died - but laptop HDDs aren’t meant for that usage pattern, so that was totally expected tbh.

    I find it super interesting that there is this pushback from you and the other user here saying “Windows doesn’t have any problems”, which now turns into “Ah but that’s hardware and Linux would have it too!” - How can you know?? How can you diagnose so confidently that it’s not a Windows-only issue with practically zero information??

    Because if we are just blindly applying patterns, then yes, I can do the same - I’ve literally never had a running Linux system crash on me, across VMs, cloud VPSs, laptops, and even a Raspberry Pi. And in the same timeframe I’ve had more random blue screens on Windows than I can count. I can tell you that I looked quite hard at a few of them to try to avoid triggering them. The only one I managed to figure out involved a laptop + USB-powered headset combo: I could never start the system up with the headset connected, because if the cable got messed with and the dongle lost connection, it would immediately blue screen the laptop. But if it booted with the USB dongle disconnected and I connected it later, it was totally fine!! - Even with this crash being caused by hardware, how can you say that this is reasonable behaviour for a “production-ready” OS? The latter behaviour shows that it is entirely possible to handle this hardware error without a full system crash, yet it can happen all the same!

    I honestly don’t understand what you or the other user get from being so adamant about Windows being good, or being “not that bad”, or just “Windows isn’t the problem here, you are” 🙃 I hope at least Microsoft pays you well for the advertising!





  • I’ve been loving the series - I’m just working my way through the last book now, and I’m already getting the feeling described in the OP from seeing that there’s not much book left anymore :(

    Somehow I feel it has no right to be this good, yet I’m also enjoying it a ton. It’s a weird feeling, but I’ll take it - it really appeals to my taste.


  • From the blog you quoted yourself:

    Despite improving AI energy efficiency, total energy consumption is likely to increase because of the massive increase in usage. A large portion of the increase in energy consumption between 2017 and 2023 is attributed to AI-related servers. Their usage grew from 2 TWh in 2017 to 40 TWh in 2023. This is a big driver behind the projected scenarios for total US energy consumption, ranging from 325 to 580 TWh (6.7% to 12% of total electricity consumption) in the US by 2028.

    (And likewise, the last graph of predictions for 2028)

    From a quick read of that source, it is unclear to me if it factors in the electricity cost of training the models. It seems to me that it doesn’t.

    I found more information here: https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/

    Racks of servers hum along for months, ingesting training data, crunching numbers, and performing computations. This is a time-consuming and expensive process—it’s estimated that training OpenAI’s GPT-4 took over $100 million and consumed 50 gigawatt-hours of energy, enough to power San Francisco for three days.

    So I’m not sure those 2023 numbers paint the full picture. And adoption of AI-powered tools was definitely not as high in 2023 as it is nowadays, so I wouldn’t be surprised if the real share today were much higher than the reported 22.7% of total server power usage in the US.
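
    As a rough back-of-envelope check on those figures (the ~176 TWh total for US data centres in 2023 is my own assumption, taken from the LBNL estimate this kind of reporting usually draws on; the other numbers come from the quotes above):

    # Order-of-magnitude sanity checks on the quoted figures
    ai_servers_2023_twh = 40                      # AI-related server usage in 2023, from the quote
    total_datacentres_2023_twh = 176              # assumed total US data-centre usage in 2023 (LBNL estimate)
    print(ai_servers_2023_twh / total_datacentres_2023_twh)  # ~0.227, i.e. the "22.7%" share

    gpt4_training_gwh = 50                        # GPT-4 training estimate, from the MIT Technology Review quote
    print(gpt4_training_gwh * 1000 / (3 * 24))    # ~694 MW average draw over three days,
                                                  # which is what "powering San Francisco for three days" implies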






  • That is the fallacious “paper millionaire” argument. They have more than enough liquidity: they can take loans against their “non-liquid” wealth, and they work with multi-year plans to sell assets and stay liquid anyway. I believe this is also explained at https://mkorostoff.github.io/1-pixel-wealth/, where they also point out that the total US market cap is far bigger than their stock holdings (see the rough numbers at the end of this comment). So they could sell a lot in one go - losing some “efficiency”, of course - but the market would be able to absorb it without any issue.

    I think we are on the same side here, judging from the rest of your comment, but I find it important to refute this common argument, because it feeds a kind of billionaire apologism: the claim that they “don’t actually have this money in their bank account to spend”.
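
    To put rough numbers on the market-cap point above (round, illustrative figures of my own, not taken from the linked page):

    # Even the largest individual fortune is a tiny slice of the US equity market
    largest_fortune_usd = 400e9        # order of magnitude of the biggest net worths today (illustrative)
    us_market_cap_usd = 50e12          # rough total US stock market capitalisation (illustrative)
    print(largest_fortune_usd / us_market_cap_usd)  # ~0.008, i.e. well under 1% of the market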






  • Mostly yes, but in general they’re oversimplified (for obvious reasons).

    But it’s more about offensive cyber security than about the Linux part specifically. The Linux part is just file-system navigation and not much more; the rest is the “hacking” part, and that’s what I’m talking about.

    Disclaimer: I did not complete it, but I got pretty far, and I have worked in the cyber security field.

    I would, however, say it’s not a good place if you want to learn, as that’s not really the game’s focus. There are better resources out there for that, like overthewire and linuxjourney.