• 0 Posts
  • 15 Comments
Joined 6 years ago
Cake day: January 18th, 2020

  • BusyBoredom@lemmy.ml to WholeSomeMemes@lemmy.ml · Wednesday!
    16 days ago

    I am not a fan of this culture of identifying yourself by your problems.

    Making a problem your identity just erodes all the great things about the individual. Being an amputee isn’t a personality, and neither is being depressed or having anxiety. People are bigger than that; they have individual interests, skills, and virtues that are much healthier to identify by.

    I think of myself as a friend, a technologically inclined person, an athletic person, a smart person, a person who loves games and octopuses, etc.

    I might also have some of the traits or feelings in this infographic, but they aren’t me. They are problems that I, the cool techy athletic animal lover, am solving.

    Play to your strengths, fix your weaknesses. Don’t make your biggest problem into your middle name.


  • Yep that’s right, below the Planck length you can’t make position measurements without destroying what you’re trying to measure.

    And you are right that it can be fully explained without needing to be in a simulation; that is how these effects were discovered, after all. The simulation angle is pretty far outside the math of respectable physics.

    The reason the simulation hypothesis bleeds into the discussion is that it’s natural to ask “why” things break down at that specific size. Humans don’t like vague answers like “because god likes that number”; we prefer to tell ourselves stories that fit the numbers into physical things in our minds. Just like the Bohr model of the atom was a useful story about how atoms are structured for decades despite never being rigorously proven (and being firmly superseded nowadays), one story we can tell ourselves to make sense of quantization is to view it as an attempt at limiting precision for computation.

    It is only a story at the end of the day though. We don’t really know why physics was set up exactly this way any more than we know why the big bang happened in the first place. Just lots of different people’s guesses, telling plausible stories about the math.
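    For anyone curious where that specific size comes from: the Planck scale falls straight out of the fundamental constants. A quick back-of-the-envelope sketch in Python, using published CODATA values:

```python
import math

# Fundamental constants (CODATA values, SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newtonian gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s (exact by definition)

# Planck length: the scale where the position-measurement
# argument above says things break down.
planck_length = math.sqrt(hbar * G / c**3)

# Planck time: how long light takes to cross one Planck length.
planck_time = planck_length / c

print(f"Planck length ~ {planck_length:.3e} m")  # ~1.616e-35 m
print(f"Planck time   ~ {planck_time:.3e} s")    # ~5.391e-44 s
```

    There’s no free parameter here, which is exactly why “why that number?” has no satisfying answer: the number is just what the constants multiply out to.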


  • Yep that’s right!

    I was using “grid” to be more easily understood, but what we really have is quantization. We get the Planck length from the uncertainty principle, as well as the Planck time, which is reminiscent of common strategies for reducing the computational requirements of simulations (reduced-precision calculations). This pattern is repeated in the quantization of charge, etc.

    So you’re right, it’s not a grid, just a continuous cap on precision. But the point stands: it looks familiar to programmers and is a fair reason to suspect we’re in a simulation. Not proof of course, just a neat little hint.
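    For concreteness, here’s the kind of reduced-precision trick simulation programmers actually use. This is a rough illustrative sketch only: the resolution value is made up, and real physics is a cap on measurement precision, not a literal snap-to-grid like this.

```python
# Illustrative "reduced precision" trick from simulation programming:
# snap continuous coordinates to a fixed resolution so state fits in
# fewer bits. RESOLUTION is an arbitrary made-up value, not a physical one.
RESOLUTION = 1e-3  # smallest representable step (arbitrary units)

def quantize(x: float) -> float:
    """Round a coordinate to the nearest multiple of RESOLUTION."""
    return round(x / RESOLUTION) * RESOLUTION

positions = [0.12345, 0.99999, 3.0004]
snapped = [quantize(p) for p in positions]
print(snapped)  # everything lands on a multiple of 0.001
```

    The appeal of the analogy is that this kind of trick trades fidelity below some scale for a big savings in compute, which is loosely what quantized physics *looks* like from a programmer’s chair.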


  • The high-level idea (the uncertainty principle makes it look like we’re on a grid, which feels familiar to programmers) is fine; that’s a common and fairly solid argument for living in a simulation.

    The code you’ve written, though, has very little (actually none, as far as I can see) connection to that idea. It’s like you’ve tried to do an experiment in physics but all your code is for a modern art exhibit.

    AI is leading you astray here. I have a bachelor’s in physics, and none of this is actual physics. I suggest setting AI aside and doing this the old-fashioned way with textbooks and paper, because the code here isn’t doing anything even remotely related to what you want.


  • I have been doing a bit of compute work on NixOS with both AMD and Nvidia, and I’d say it depends on what you’re doing.

    If you’re doing your compute via compute shaders, you’ll have a great experience on AMD. Zero hiccups for me, I just wrote my shaders and ran them no problem. Vulkan is incredible.

    If you have to interact with other people’s compute crap though, it might be a bad time. Most folks do GPU compute with CUDA, and that won’t be fun for you on AMD. Yes, there are translation layers, and you can make them work for some use cases, but it’s a bad experience. And yeah, ROCm exists… but does it really? Not many cards actually support ROCm, and software support for it is just as sparse.


  • Well yeah sure if you want a set algorithm to perfectly reproduce this exact universe deterministically, that’s not gonna work out so well.

    But a simulation doesn’t have to be perfectly consistent and deterministic to “work”. If anything, the fact that some things can’t be predicted is evidence in favor of us being in a simulation, not against.

    This paper just rules out a class of algorithms; we’re not in that specific type of simulation. Doesn’t mean we’re not in a simulation at all.


  • It seems to be getting a little better lately, but historically they just didn’t give firmware updates. I went 3 years without a stable BIOS update for my 12th gen Intel mainboard on Linux despite known vulnerabilities since launch (it just got its first update last month).

    I actually upgraded my mainboard within that time, so I went the full lifetime of the product with an insecure BIOS and none of the firmware improvements that were promised at launch, like Thunderbolt 4 certification. For all practical purposes, firmware support ended when it left the factory until just last month.

    That said, my new Ryzen AI 350 mainboard just got its first update to patch some vulns that were disclosed a month ago. Still not in time for the coordinated disclosure, but a month is way better than 3 years, so I’ll take what I can get.





  • BusyBoredom@lemmy.ml to Linux@lemmy.ml · Asking About Tuxedo Computers
    7 months ago

    Unfortunately no, that is only for installation via Windows. There is no Linux updater for it. Still no stable update on Linux for the 12th gen :(

    Edit: oh nice, actually I see they added an EFI updater! That’s great progress :) so I stand corrected – 3 years after launch, it got its first and only stable update on Linux! Thanks for the correction.


  • BusyBoredom@lemmy.ml to Linux@lemmy.ml · Asking About Tuxedo Computers
    7 months ago

    Not trying to be disingenuous, I have a Framework 13 and I love it. Just pointing out some real tradeoffs with it that folks ought to be made aware of.

    You can get a Lenovo or a Dell or something else with the same specs for cheaper if you’re willing to give up repairability. I’m not, so I have a Framework, but that’s not everyone.

    For firmware support, the 12th gen Frameworks still haven’t received a single stable update on Linux since launch despite known vulnerabilities. That’s an ongoing issue; it’s not fixed. For some people in security-critical environments, that’s a deal breaker. Similarly, the 13th gen only got 1 stable update on Linux, and the Ryzen 300s (which I have now) have again started to get vulns reported but no patches yet.

    Most vendors patch vulnerabilities on the coordinated disclosure timeline, and Framework doesn’t. Not saying Framework is the devil or anything, just that there’s a real tradeoff for some people.


  • BusyBoredom@lemmy.ml to Linux@lemmy.ml · Asking About Tuxedo Computers
    7 months ago

    Some caveats on the framework recommendation:

    • Its value proposition is in its repairability and upgradeability. You save money by repairing and upgrading the existing laptop later instead of buying a new one. If you don’t like tinkering, it’s not a great value.
    • Framework does not provide reliable firmware support. Most Framework mainboards only receive one or two firmware updates within the normal lifespan of the device. So if you need a secure BIOS, Framework is not an option.
    • The actual performance per dollar of Framework laptops the day you buy them is poor. A Lenovo or a Dell will beat it in perf/$ any day. Framework laptops are only attractive perf/$-wise later, when you start benefiting from cheap upgrades.