Spend money on servers. Verify your players. I don't care how you do it, but you don't get kernel-level access to my machine because some asshole script kiddies are aimbotting. You can never trust the client. This is basic shit, and game devs will make up a whole host of bullshit to try to justify ignoring it. (FWIW: I spent a solid decade as a professional game dev and I was as disappointed in this horseshit then as I am now. At least players are starting to figure it out now, too.)
People say "kernel level" anticheat as if that would be necessary for some reason, but I don't really see it catching on in the Linux world. Steam doesn't even have root normally. Even if it did, not everyone runs exactly the same Linux kernel, and the only practical way to distribute a module that's going to work for most people is through DKMS, which means you build it from source, which means proprietary super-obfuscated shit is not going to have its intended effect (assuming it ever does.)
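For context, a DKMS package is just module source plus a dkms.conf telling the system how to rebuild it for whatever kernel is installed; a minimal sketch (all names invented):

```
PACKAGE_NAME="exampledriver"
PACKAGE_VERSION="1.0"
BUILT_MODULE_NAME[0]="exampledriver"
DEST_MODULE_LOCATION[0]="/kernel/extra"
AUTOINSTALL="yes"
```

The source has to sit readable in /usr/src for the rebuild to work at all, which is exactly the opposite of what an obfuscated anticheat module wants.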
There's nothing stopping them from doing all the same bullshit in userspace instead.
Microsoft even sees it as a big mistake. They're creating APIs that won't require anti-cheat to be in the kernel like that. There shouldn't be any reason it needs to be in the Linux kernel.
That said, "don't trust the client" is a nice thing to say, but it's basically impossible to make games work like that. There are certain protocol design considerations needed for FPS games to work in multiplayer over somewhat laggy connections, and they're not completely compatible with "don't trust the client". If we all had the fiber optic connections and IPv6 that we were promised in the 90s, things would be different. The whack-a-mole game against cheaters is the best that can be done otherwise.
It can be routed more efficiently and has generally lower latency. Though how much it matters in practice is debatable, and real world data has fluctuated.
One thing it definitely enables is easier setup of home servers for games without NAT nonsense.
"Never trust the client" renders entire genres of games inaccessible for a big corporation. But those genres have billions of dollars of potential profit in them. So they will go as far as they can to make the client almost trustable. The average player of a first-person shooter doesn't really think about the implications of kernel-level anticheat at all so it's not a hard choice for them. 95% of them are on windows after all and that already gives kenel access to their PC to some entity they really have no good reason to trust.
Ultimately you either have basically google stadia (with all its technical problems) or you are trusting the client to render the game.
Even if the client only has exactly the absolute minimum amount of information needed to draw all the things that are visible, that still allows a cheat to see the player coordinates and the coordinates of visible entities, which usually makes e.g. an aimbot trivial to write.
Servers would just be checking to make sure a player can actually do something, and if they can't, not letting the client do it.
Then any changes made client-side would only affect the player making the changes.
It boggles my mind that Riot would rather use a hacky method like kernel-level anti-cheat instead of just having the servers mirror the game logic and confirm everything. It's all about saving money at our expense.
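To make the idea concrete, server-side validation of a movement update might look something like this toy sketch (the speed cap, tick rate, and tolerance are all invented numbers):

```python
# Toy server-authoritative movement check, not any real game's netcode.
MAX_SPEED = 7.5    # metres/second allowed by the (hypothetical) game rules
TICK = 1.0 / 64    # assumed 64 Hz server tick

def validate_move(old_pos, claimed_pos):
    """Accept the client's claimed position only if it was reachable."""
    dist = sum((a - b) ** 2 for a, b in zip(old_pos, claimed_pos)) ** 0.5
    if dist > MAX_SPEED * TICK * 1.1:  # small tolerance for jitter
        return old_pos                 # reject: snap the client back
    return claimed_pos                 # accept the move
```

The reject branch is the rubber-banding people complain about on laggy connections, which is where the pushback in the replies comes from.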
No shit sherlock. Rendering requires information about the game, and that information is enough to allow cheating. Aimbots don't need to perform "invalid actions" in order to wreck a game. They just need to be faster and more accurate than most human players. Trying to heuristically detect aimbots is also commonly used alongside other anticheat methods, it just doesn't work (unless you have people manually reviewing individual reported cheaters, but companies try to avoid that because it's expensive and risks false positives).
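For what it's worth, the heuristic approach usually boils down to something like this toy flick-speed check; the threshold is invented, and real systems combine many more signals than this:

```python
import math

# Toy heuristic: flag ticks where the view direction turned faster than
# a plausible human flick. The 25 rad/s threshold is a made-up number.
MAX_TURN_RATE = 25.0  # radians/second

def suspicious_ticks(view_angles, tick_rate_hz):
    """view_angles: list of (yaw, pitch) tuples in radians, one per tick."""
    flagged = []
    for i in range(1, len(view_angles)):
        dyaw = view_angles[i][0] - view_angles[i - 1][0]
        dpitch = view_angles[i][1] - view_angles[i - 1][1]
        if math.hypot(dyaw, dpitch) * tick_rate_hz > MAX_TURN_RATE:
            flagged.append(i)
    return flagged
```

And that's the catch: skilled humans flick fast too, so any threshold tight enough to catch a subtle aimbot also catches them, which is why the manual-review step is hard to avoid.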
Right. Nice to find someone here who actually understands some of the problems that make kernel-level anticheat important. In a modern FPS, server-authoritative everything is just not feasible. No gamer is going to accept the latency that would cause. Or when you look around and momentarily desync, and now your camera jerks back because the server decided you tried to look around too quickly? People are so willfully ignorant about this topic.
Maybe we could get away from needing it as badly if Valve would provide strong attestation that the running kernel is unmodified, with a secure TPM solution, but they haven't, so here we are.
...and just to be clear, this is a multiplatform problem. There's a single mediocre 'easy' option in windows land and a very tinkery option in linux land.
Doesn't seem like any OS has caught up to the idea of fast streaming desktops quite yet. I'm convinced it's the future of computing though. Way better than old VDI options from days of yore.
I set up Sunshine on my Kubuntu machine last night. Took me fucking AGES to figure it out. Recently set it up on my M1 Mac mini, which took me a couple of minutes.
It cannot generate a virtual display; it only uses attached displays, which by default are real, powered-on monitors.
I've gotten around this on Windows with Parsec and a virtual display adapter that someone keeps updated on GitHub, which can spawn backup displays if none are present, but I find it still sometimes fails to spawn them. Parsec is fairly reliable at spawning them when the Windows solution fails, but it's not perfect either.
A hack job virtual display on Linux will be more difficult to work with. It’s going to eat my desktop and be fairly hidden.
Dummy plugs exist, but I specifically don’t want to put dummy plugs in all my remote hosts. Seems like an unga bunga solution to something which should be software.
Something with VNC or simply ssh with some scripting could be the workaround I use to get back in when a virtual display fails to work as expected, but I am lazy and want something effectively bulletproof.
Well, dummy plugs are also kinda useful on Windows - you can't seamlessly switch to the client's resolution without setting up the resolution profiles first, and that requires a device to apply the profiles to.
Also, you can create virtual displays fairly easily on Xorg, but yes, the entire Sunshine setup is infinitely easier on Windows.
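The classic Xorg trick is the dummy video driver (the xserver-xorg-video-dummy package on Debian-family distros); a minimal xorg.conf sketch along these lines:

```
Section "Device"
    Identifier "DummyDevice"
    Driver     "dummy"
    VideoRam   256000
EndSection

Section "Monitor"
    Identifier  "DummyMonitor"
    HorizSync   30-85
    VertRefresh 48-120
    # standard 1080p60 modeline
    Modeline "1920x1080" 148.50 1920 2008 2052 2200 1080 1084 1089 1125
EndSection

Section "Screen"
    Identifier   "DummyScreen"
    Device       "DummyDevice"
    Monitor      "DummyMonitor"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1080"
    EndSubSection
EndSection
```

No help on Wayland, though, which is its own can of worms.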
For a non-ARM Windows client to a Windows host, you can use something like Parsec, where you can adjust resolution on the fly. You can install a fallback virtual display for when the display is off from right inside the app, in a single click + UAC prompt. I really don't like Parsec though, because I know the enshittification will come, and I don't really trust them to be secure or to not abuse their backdoor access.
This project allows you to create virtual displays that are persistent, and you can configure them so that when the primary monitor is off the backup one is enabled in Windows, just by using the default Windows display manager options. You can change the resolution freely... because this is using the same virtual display driver Parsec created: https://github.com/nomi-san/parsec-vdd - this works pretty well overall with Sunshine
Ultimately though, Sunshine and Parsec are the only two things i'm aware of with great low latency and high fidelity remote capabilities, aside from niche implementations like what the PS5 has. If something like Xorg had similar quality and latency parity i'd be interested, but i'm under the impression everything is like old school vnc or rdp where it is functional when necessary but not very pleasant overall.
It creates virtual displays on the fly and runs completely in the background. Requires a little bit of setup as well as Docker running but it's pretty neat.
I've seen this but I don't really want the docker part.
I think it could be phenomenal for some kind of beefy VDI implementation for low demanding games or some kind of monster server with multiple GPUs, but it just feels wrong for an individual who wants to remotely stream their desktop on demand and has no plans on having others share the host.
Maybe i'm overthinking it, or haven't thought it through enough - but my gut says this has more drawbacks than i'm realizing.
I'm in search of the same white whale. There's quite a bit of documentation for multi-seat configurations on Linux, i.e. supporting the use of multiple screens and keyboards for separate simultaneous logins.
However, I'd like to remote into a separate gamescope session with its own human interface inputs and virtual audio and video outputs, as the same primary user who is normally logged into the active desktop session. I'd like the remote and local sessions to not interfere with each other's state, but without necessitating a separate Linux user for each session use case.
That last bit is what makes it trickier and very niche. Supporting what is essentially multiple desktop environments probably demands separate user D-Bus sockets. Using cgroups via containers makes that viable, but like you, I'd also like to avoid containers and extensive volume mounts.
Somehow I feel like mentioning Nix and NixOS is the new 'I use arch btw'.
"I use Nix btw"
Rolls off the tongue in the same way. And, honestly, "I use Arch btw" just isn't the same hipster know-it-all contrarian meme that it used to be. It has a graphical installer now, and a popular retail device (the Steam Deck) comes with a user-friendly derivative of it installed out of the box.
Meanwhile, NixOS has a huge learning curve that's off-putting to most non-technical users and even Linux hobbyists. I mean, really: having to configure everything through a functional programming language masquerading as a configuration file format? That's just the kind of thing that would attract masochists and pedants!
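For the uninitiated, here's a taste; a minimal NixOS configuration.nix fragment (package and user choices are arbitrary placeholders):

```nix
{ config, pkgs, ... }:
{
  # The whole system is declared in one expression: packages, services, users.
  environment.systemPackages = with pkgs; [ firefox git ];
  services.openssh.enable = true;
  users.users.alice = {          # "alice" is just a placeholder
    isNormalUser = true;
    extraGroups = [ "wheel" ];   # sudo access
  };
}
```

Elegant once it clicks, but good luck explaining to a newcomer why installing a package means editing a lazily-evaluated functional expression and rebuilding the system.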
I just left NixOS after about two years, and I'm now on CachyOS (Arch). NixOS is pretty cool in a lot of ways, but trying to stay on the bleeding edge of packages and kernels means living in nixos-unstable land, where broken builds are common. I just got tired of it all.
Fun fact: if you have a Steam Deck, Nix (the package manager) is pretty much the only vendor-approved way to safely install extra packages that aren't otherwise available as a flatpak.
Trying to screw with overlayfs to make pacman usable is/was a thing, and it was a very good way to break the OS install despite it having atomic updates.
Does Steam Deck not have rpm-ostree (or an arch equivalent since RPM is fedora-specific)?
Needs "pac-ostree" or something...
Also, what about distrobox?
I haven't really tried to do anything package manager-related on my Deck, so I'm going on what I know from Bazzite, but there are several ways to install non-flatpak software on it. In fact, I even installed yay on an Arch distrobox, and I can install things from the AUR (as well as the official repositories).
Does Steam Deck not have rpm-ostree (or an arch equivalent since RPM is fedora-specific)?
Steam Deck has a custom solution involving an A/B partition scheme of immutable btrfs filesystems and overlayfs for layering changes on top of that.
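The writable layer on top is plain overlayfs; conceptually it's something like this (paths are illustrative, not SteamOS's actual ones):

```
# Keep the shipped /usr read-only underneath, send writes to an upper
# layer on persistent storage, and present the merged view at /usr.
sudo mount -t overlay overlay \
    -o lowerdir=/usr,upperdir=/home/.overlay/upper,workdir=/home/.overlay/work \
    /usr
```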
Also, what about distrobox?
If there's a way to install containerization software with Flatpak, maybe. Docker isn't available out of the box, though.
I haven't really tried to do anything package manager-related on my Deck, so I'm going on what I know from Bazzite, but there are several ways to install non-flatpak software on it. In fact, I even installed yay on an Arch distrobox, and I can install things from the AUR (as well as the official repositories).
You can use pacman, but it's volatile and requires making intentional changes to restore its functionality.
The first option is to disable the read-only flag on the root filesystem, then set pacman back up so it can pull packages. Whenever the root filesystem image is updated, you'll lose the changes, though.
The second option is to add an overlayfs to persist the changes in a different partition or inside a disk image on the writable storage. There was a tool called "rwfus" that did this, and it worked well enough if you were careful. If you ended up upgrading a package that came installed on the base image, though, it would end up breaking the install when the next update came around.
With all the caveats, when Valve made /nix available as a persistent overlay a couple of years ago, I just bit the bullet and learned how to use Nix to install packages with nix-env -i.
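For anyone else on a Deck, the two routes look roughly like this (htop is just an example package, and the keyring details have varied across SteamOS versions):

```
# Route 1: unlock the rootfs with pacman. Gone after the next SteamOS update.
sudo steamos-readonly disable
sudo pacman-key --init
sudo pacman-key --populate
sudo pacman -S htop

# Route 2: the persistent /nix overlay (assumes Nix itself is installed).
nix-env -i htop
```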
Distrobox works really well in Bazzite, in fact I'm currently typing this comment in LibreWolf in a Fedora toolbox because I was getting a weird lag with the flatpak version. You wouldn't even know if you didn't set it up yourself, since it's just an icon on my launcher like any other program. No noticeable overhead whatsoever either.
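Roughly what that setup looks like, for anyone wondering (the image tag and app are just examples):

```
# Create and enter a Fedora container that shares your home directory.
distrobox create --name fedora-box --image registry.fedoraproject.org/fedora-toolbox:40
distrobox enter fedora-box

# Inside the box: install the app (assumes its repo is configured),
# then export it so it shows up in the host's app menu.
sudo dnf install librewolf
distrobox-export --app librewolf
```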
I’m just now starting my degree in software engineering. I’m 31. I’d gotten comfortable enough with Linux that I wanted to try NixOS to avoid having my system get borked again (in my case, KDE Plasma started having shell crashes at login).
If I was only using NixOS to run a basic computer set up? Sure, no problem. If I want to rice and customize it? No, I wasn’t ready.
But but Phoronix just told me yesterday that Linux users on Steam went dramatically down from 3% to 2.99%! Almost like this weekly/monthly claim of Linux users "crossing" another imaginary threshold line holds as much value as this comment /s
I've had so many problems with Nvidia GPUs on Linux over the years that I now refuse to buy anything Nvidia. AMD cards work flawlessly and get very long-term support.
When people switch to Linux they don’t do a lot of research beforehand. I, for one, didn’t know that Nvidia doesn’t work well with it until I had been using it for years.
To be fair, Nvidia supports their newer GPUs well enough, so you may not have any problems for a while. But once they decide to end support for a product line, it's basically a death sentence for that hardware. That's what happened to me recently with the 470 driver. Older GPU worked fine until a kernel update broke the driver. There's nobody fixing it anymore, and they won't open-source even obsolete drivers.
I JUST ran into this issue myself. I’m running Proxmox on an old laptop and wanted to use its 750M… which is one of those legacy cards now, which I guess means I’d need to downgrade the kernel to use it?
I’m not knowledgeable enough to know the risks or work I’d be looking at to get it working so for now, it’s on hiatus.
Similar for me. For all the talk about what software Linux couldn't handle, I didn't learn that Linux is incompatible with Nvidia until AFTER I updated my GPU. I don't want to buy another GPU after less than a year, but Windows makes me want to do a sudoku in protest... but also my work and design software won't run properly on Linux, and all anybody can talk about is browsers and games.
My point is they don't work together. I can believe Nvidia 'started' it, but it doesn't matter or help me solve my problem. I've decided I want to try Linux, but I can't afford another card, so I'm doing what I can.
You somehow still learned wrong, and I don't understand how any of that happened. Nvidia not working well with Linux is so widely known and talked about, I knew about it, and the actual reason (which is the reverse of what you think), for several years before switching.
I feel like you must have never tried to look anything up or spent any time in a place like Lemmy or any forum with a Linux focus, and basically must have decided to keep yourself in some bubble of ignorance with no connection to learn anything.
This is an uncharitable interpretation of what I said.
Nvidia doesn't tell me it doesn't work. Linux users do. When I first used Linux for coding all those years ago, my GPU wasn't relevant, nobody mentioned it during my code bootcamp or computer science certification several years ago, and ubuntu and Kubuntu both booted fine.
When I upgraded my GPU, I got Nvidia. It was available and I knew what to expect. Simple as.
Then as W10 (and W11) got increasingly intolerable, I came to Linux communities to learn about using Linux as a Windows replacement, looking into distros like Mint and Garuda, and behold: I came across users saying Linux has compatibility issues with Nvidia. Perhaps because it is 'so well known', most don't think to mention it; I learned about it from a random comment on a meme about gaming.
I also looked into tutorials on getting Affinity design software to work on various distros, and the best I could find was shit like: I finally got it to run, so long as I don't [do these three basic functions].
I don't care who started it; I can already believe it's the for-profit company sucking up to genAI. But right now that doesn't help me. I care that it's true and that's the card I have, and I'm still searching for a distro that will let me switch and meet work needs, not just browsing or games.
I'm here now, aware that they don't work, still looking for the best solution I can afford, because I did look up Linux.
Even now, CUDA is the gold standard for data science / ML / AI research and development. AMD is slowly bringing around their ROCm platform, and Vulkan is gaining steam in that area. I’d love to ditch my Nvidia cards and go exclusively AMD, but Nvidia supporting CUDA on consumer cards was a seriously smart move that AMD needs to catch up with.
CUDA is an Nvidia technology and they've gone out of their way to make it difficult for a competitor to come up with a compatible implementation. With cross-vendor alternatives like OpenCL and compute shaders, they've not put resources into achieving performance parity, so if you write something in both CUDA and OpenCL, and run them both on an Nvidia card, the CUDA-based implementation will go way faster. Most projects prioritise the need to go fast above the need to work on hardware from more than one vendor. Fifteen years ago, an OpenCL-based compute application would run faster on an AMD card than a CUDA-based one would run on an Nvidia card, even if the Nvidia card was a chunk faster in gaming, so it's not that CUDA's inherently loads faster. That didn't give AMD a huge advantage in market share as not very much was going on that cared significantly about GPU compute.
Also, Nvidia have put a lot of resources over the last fifteen years into adding CUDA support to other people's projects, so when things did start springing up that needed GPU compute, a lot of them already worked on Nvidia cards.
People buy Nvidia for different reasons, but not everyone faces any issues with it in Linux, and so they see no reason to change what they're already familiar with.
Yeah, I stopped using Nvidia like 20 years ago. I think my last Nvidia card may have been a GeForce MX, then I switched to a Matrox card for a time before landing on ATI/AMD.
Back then AMD was only just starting their open source driver efforts, so the “good” driver was still proprietary, but I stuck with them to support their efforts with my wallet. I’m glad I did, because it’s been well over a decade since I had any GPU issues, and I no longer stress about whether the hardware I buy is going to work or not (so long as the kernel is up to date).
I had an old NVidia gtx 970 on my previous machine when I switched to Linux and it was the source of 95% of my problems.
It died earlier this year so I finally upgraded to a new machine and put an Intel Arc B580 in it as a stop gap in hopes that video cards prices would regain some sanity eventually in a year or two. No problems whatsoever with it since then.
Now that AI is about to ruin the GPU market again I decided to bite the bullet and get myself an AMD RX 9070 XT before the prices go through the roof. I ain't touching NVidia's cards with a 10 foot pole. I might be able to sell my B580 for the same price I originally bought it for in a few months.
I mean... my 6700 XT doesn't have official ROCm support, but the ROCm driver works perfectly fine with it. The difference is AMD hasn't put the effort into testing ROCm on their consumer cards, thus they can't claim support for it.
Inkscape and GIMP etc are fine tools in their own right (I have had them installed for years) but where things have always broken down is when you’re working in larger teams and working towards a larger goal.
Inkscape, GIMP, Krita, LibreOffice is an awful chain when you compare it to say Affinity where you can shift between vector, pixel, and layout workflows within the same tool (or copy and paste seamlessly across Adobe tools).
Until the FOSS community sits down and works with creatives and end users who don’t use the tools (which Audacity did thanks to Tantacrul and the results speak for themselves), we’ll be stuck with proprietary tools.
The problem is when new users turn up to give feedback to, say, Inkscape for some of its weirdness, like opening a blank doc each time the app opens, different tabs for fill and stroke color, weird behavior with fonts changing when you backspace out to an empty box, blah blah, and the community goes “skill issue” or “this isn’t Adobe”.
Yet they fail to understand the design decisions as to why other products have more obvious behaviour patterns - they want the tool to be relatively self explanatory and try and align to user expectations as much as possible.
I really reeeally want to like GIMP, but I’ve never been able to get past the UI / learning curve. There are a number of patterns and interaction models that are significantly different from those adopted by the rest of the creative industry.
I’m surprised that, after 25 years, none of the projects to redo the UX have really stuck and gained significant momentum.
Designers are famously broke. Especially graphic designers. A raster graphics tool, with a half way decent UI, would easily gain traction.
I wonder if it’s because the project just doesn’t have UX designers contributing to design and users testing. A lot of the UX feels like a random idea that someone had. Not something that was actually tested with real humans.
Manually adding alpha channels to layers… I’ve seen so many people knock their heads against GIMP because, for whatever reason, they didn’t just add the channel by default. (Okay, sure it’s probably the default if you’re starting with a blank file but the background layer doesn’t have one and if you start by opening a jpg, then subsequent layers won’t have alpha because… reasons…)
I don’t think it’s because they don’t have UX designers, it’s because they only solicit feedback from existing users rather than researching new user experience and watching how a new user gets on with the program.
I also think very few Adobe or Affinity power users get stuck into GIMP etc because they bounce off it so quickly. So they never get feedback from the very users they want to convince to move over.
Being a very occasional Affinity (and former Adobe) user, I wholeheartedly agree with the last paragraph. GIMP’s UI is simply repulsive. Well, most surfaces of *nix GUI software are (ahem) idiosyncratic, but this is in a category of its own.
Oh, I know (I am a greying wizard), but why should the editor care? In theory, it should assume XCF, with the opened JPEG as the first layer.
Instead, it’s gotchas and RTFM. Which is sadly a very poor approach when developing a tool used by creatives who are vastly less likely to RTFM than the engineers making the tool.
There is a "photoshop UI" GIMP plugin that you could use if you prefer the layout/have muscle memory, but admittedly that's not a perfect solution. It is cheaper tho.
Yeah, but that project isn’t really getting the attention it deserves. Part of me wants to contribute, but I’m totally swamped with a million side things.
Also, something like this feels like a hack more than anything. Maybe it's not; I didn't look inside the code. But just from the outside, I would assume it would break with every GIMP release, and even if not, it would never feel "right".
If you mean PhotoGimp, it's just a config file that rebinds keybinds and sets up the UI layout (location of tools and windows) to be similar to Photoshop.
GIMP is, I think, a more challenging comparison to its proprietary counterparts than the others you listed, unfortunately, but I appreciate and agree with your sentiment here. UI/UX is such a big part of where FOSS creative tools struggle, and while some are better than others, it's a difficult thing to improve because there's a lot less design talent in the FOSS world than dev talent. And UI design for large creative software is REALLY a hard design challenge; it's not as though the design skill involved is trivial.
Anyway, I think Inkscape no longer opens a blank document, at least on fresh installs. Maybe for older installs it keeps the old behavior? You should be able to switch it to a nice little welcome splash with document templates and recently opened files if you'd like :)
I can see how it can be a useful tool to get people into dev
You need to back that up with real learning though
If you can't/don't review and completely understand every line of code you submit to a project yourself, you should not be surprised if the project owner tells you to take a long walk towards the Mariana trench
I've been vibe coding a Discord bot to teach myself Python. I'm finally good enough to get frustrated with the code the AI gives me sometimes and do it myself.
Using LLMs as a semi-incompetent tutor is a good use. They know the basics well enough to explain it to you and have an idea of how to do the more complex stuff.... but if you actually needed the thing done, you'd hire a professional.
As someone who is not a great coder: it can help with double-checking, in addition to learning.
I made a USB HID report & device descriptor, it works fine everywhere except it causes KDE settings Game Controller menu to segfault when identifying USB devices.
I know there is a 99% chance that it is due to my descriptor being wrong. AI found my mistake: I had carried the logical min/max bit count over into the report size, and my logical minimum was 1 too small. Haven't had a chance to test yet, but maybe I saved the KDE maintainers the annoyance of a false-alarm bug report.
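For anyone curious about the distinction that bit me, here's the bog-standard 8-button fragment (from the usual HID spec examples, not my actual descriptor):

```python
# Fragment of a HID report descriptor as raw item bytes. Logical
# Minimum/Maximum describe the VALUE RANGE a field can hold; Report
# Size is the field width IN BITS. Mixing those up was my bug.
BUTTONS_FRAGMENT = bytes([
    0x05, 0x09,  # Usage Page (Button)
    0x19, 0x01,  # Usage Minimum (Button 1)
    0x29, 0x08,  # Usage Maximum (Button 8)
    0x15, 0x00,  # Logical Minimum (0)
    0x25, 0x01,  # Logical Maximum (1) -- a value, not a bit count
    0x75, 0x01,  # Report Size (1)     -- one bit per field
    0x95, 0x08,  # Report Count (8)    -- eight fields
    0x81, 0x02,  # Input (Data, Variable, Absolute)
])
```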
But it sounds like there is still a bug; KDE settings shouldn't segfault on malformed reports/descriptors. Might want to write a bug report or do a PR yourself.
Zorin has a commercial license for additional GUI front ends, installation support, and a bunch of "professional" apps. It's not clear if they've done something to make Adobe/Autodesk/pro audio stuff work on Linux, pre-bundled their FOSS alternatives, or made software themselves.
Personally, if I was looking for something "professional", I'd go PopOS!. But if I were a small or mid-sized business, I'd consider Zorin Pro if I could get a license that includes additional support beyond the installer... or just buy System76 computers with PopOS! pre-installed and support built into their sales pipeline already.
That said, Mint is also very Windows (classic)-like in its GUI experience (intentionally). It also has one of the largest Linux communities focusing on GUI usability.
Which flavor you should go with depends on your use case. But for $50, I'm curious what Zorin's software suite is and might dive in.
I have the pro version, aaaand it's convenient... But honestly, all the applications can be downloaded without much work. Zorin is also so stable that I've never had to contact support. The extra layouts are pretty nice tho.
This is more of a "donation" option imo. If you love foss you should definitely financially support projects that you endorse!
It is in fact a bunch of pre-installed free software. I like Zorin, but Zorin Pro just seems like a way to trick businesses into paying for the distro. I guess having access to a support team is nice, but otherwise it's not worth it at all.
Or just buy System76 computers with PopOS! pre-installed and support built-in to their sales pipeline already.
S76 (and all Linux PCs) are all just insanely expensive and overpowered for the needs of most people. I wish they would just offer a barebones model with an N100 or something for $500 that normal people could afford...
I mean, that's kinda the cost for low volume sales + computer support.
They're not selling computers for the average Linux user, they're selling computers for independent professionals and businesses that need the support licenses to confidently run their operations.
For the average Linux user, they have put out an incredibly stable version of their OS that has a professional in mind (docking station ready, highly optimized GUI workflow tooling, familiar OS styling, and more). We can then go grab a bare machine and toss their OS on ourselves.
I get it though. I've strongly considered snagging one of their laptops in the past, especially since I want to support them. I've even considered some of their other machines for niche purposes (HTPC, home lab VM host), but I always wind up snagging a Lenovo or IBM laptop or building my own desktop instead.
You can buy nearly any budget machine and load Linux off a USB stick, but most people just take whatever they have, and their last Windows action is to download a boot image and write it to a USB stick, then boot off the stick and tell it to use the whole disk.
Then use your backup system to restore your documents, pictures, web and mail configuration, and game settings and saves to your Linux machine.
I'd advise using Mint in place of Ubuntu as your training wheels/potential daily driver, since Ubuntu's developer (Canonical) has the habit of introducing features and restrictions that are absent in the rest of the ecosystem (Snap comes to mind).
Mint has a much larger and more dedicated userbase, so you probably will have an easier time finding answers to questions (Mint's forums are pretty good nowadays), and it's been an established "Windows jumping-off point" OS for quite a few years now. Zorin is the new kid on the block (while they existed in the past, their quality was nowhere near on par with Mint), so I'd wait and see before checking them out.
Zorin vs Mint comes down to: do you like the color green or blue? Jokes aside, they are basically the same. I prefer GNOME (Zorin) over Cinnamon (Mint). I also find Zorin does a better job guiding newbs from Windows. For example, if you download and run a Windows exe, Zorin will show a pop-up telling the user about alternative Linux apps, or it will handle running Windows apps for you through Wine.
As a Linux newb, your choice of distro really doesn't matter too much. Just don't go for difficult stuff like Nix or Gentoo. The desktop environment is where it's really at for newbs. So try out KDE Plasma, GNOME, and Cinnamon, and pick the one that you fancy.
Interesting! This is exactly why I asked. I knew the general consensus would be go with mint. I'm looking for an easy OS cuz I don't want a project car for my daily driver.
Mint develops their own desktop called Cinnamon, which is like a cross between the GNOME 2 and Windows 7 UIs. It looks a bit bland, but some people prefer that.
Zorin uses GNOME 3, but heavily customized to give people a choice between Windows 7, Windows 10, or macOS-type experiences. The UI does look a lot more modern than Mint's. They also have a commercial support option.
Both have a pretty good suite of software for customization and management.
Personally I'm loving Bazzite, which is Fedora based with a lot of customizations for gaming and modern hardware. It's also immutable, which makes it difficult to break.
Zorin and Mint are quite similar in philosophy; Zorin is more “fancy” but Mint is more reliable in my experience (not to say Zorin isn’t, just that it’s hard to beat Mint). Any reason for wanting to use Ubuntu? Mint and Zorin are based on Ubuntu, so pretty much everything that works on Ubuntu works on Mint/Zorin; most of the instructions for Ubuntu even transfer over.
Few years back I thought I might get into cyber security. Had an old gaming PC that I put ubuntu on for learning purposes. Had fun, eventually bricked the PC (it was a Frankenstein anyways).
I figured everyone would say go with Mint. But its nice to hear everyone's thoughts.
Zorin is more out-of-the-box and Mint is lighter. When you install Zorin, it's going to put in LibreOffice, disc burning, Windows RDP, Wine with PlayOnLinux; it's a long list. So it comes down to out-of-the-box (I want a bunch of software I may use to be installed along with the OS) or lighter, where you get what you want later. Zorin is basically a lazier distro, which is why I like it :) While some stuff may be a waste of space, I just want it available right away, or for a situation where I'm offline and didn't think to install it previously for some reason (disc burning is a good example of this kind of thing).
They’re recommending a meme-distro in a serious thread without making it clear that it’s a joke.
In this case someone might read their comment and follow their advice, thus ending up with Nyarch as their first distro. In the future, when they run into problems, they’ll have trouble finding support because it doesn’t have a massive knowledge-base or the support of a big community behind it like all the mainstream distros do.
It might be an obvious joke to a Linux user, but a total Linux newbie might get confused or not even recognise it as such. If this comment was posted in /c/linuxmemes or something I wouldn’t say anything, but it is not.
Both are adequate. Both are based on Ubuntu. The biggest difference is going to be the interface.
Zorin has a Gnome-ish interface. If you pay a few extra bucks, you can customize it to make it function like Mac or Windows or Ubuntu, etc. in one click.
Mint has a (in my opinion) much less modern interface that I don't like. But it's also, I believe, the single most popular Linux distro so there will be endless amounts of community support for it.
I would say try a few out, see which DEs (desktop environments, pretty much the part of the OS that you look at) you like. I recommend trying Fedora Workstation (which is GNOME, a purer form of it compared to Ubuntu's flavour, and is unrelated to gnomes of the fantasy world), a KDE Plasma distro (like Kubuntu or Fedora KDE), as well as Mint (which is Cinnamon) and ZorinOS (I think it uses a custom variant of GNOME, but I'm not sure).
You either love or hate GNOME; there's a lot of padding, and it's very "minimal" and "clean". Some people say the UI is similar to Mac in that it has a bar on the top, and the "dock" (GNOME's version of the taskbar) looks very similar to Mac's. It also has a bunch of touch gestures and such, so it would be good if you have a touchscreen laptop, for example. Make sure to look at GNOME extensions, which help you customise GNOME to your liking (e.g. transparency, or having the dock always shown, kind of like on Windows).
KDE Plasma, out of the box, looks similar to Windows. It, however, is the most customisable and you can change pretty much anything (position of the taskbar, text size of different parts of the UI, colours of all the buttons, etc.) and it's kind of overwhelming. I particularly like KRunner (which is kind of like Spotlight on Mac and lets you search apps, files, settings, etc. straight from the desktop). Currently I'm using KDE Plasma.
Cinnamon is Linux Mint's homegrown take on GNOME, which tries to mimic Windows 10 to ease the transition to Linux. Mint focuses on the usability of its GUI apps, so you won't have to touch the terminal as often (I do recommend you learn how to do simple things like apt to install apps, it's really useful!). If you don't like the excessive padding and touch-optimised design of GNOME or think KDE Plasma isn't for you, Cinnamon might be what you want.
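For example, installing an app from the terminal on a Mint/Ubuntu-family system is usually just:

```
sudo apt update          # refresh the package lists
sudo apt install gimp    # install an app plus its dependencies
```

And Mint's Software Manager is a GUI over the same repositories, for when you'd rather not.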
There's also a bunch of other DEs like Xfce (which is lightweight and meant for lower end devices), MATE (which keeps the design of old GNOME, so it looks less "modern" than other DEs), and COSMIC (which is by System76, recently went from Alpha to Beta so it isn't perfectly stable yet, but it looks pretty cool with tiling and such).
Take your time to pick, you can always swap out your DE (that's one of the cool things with Linux, you can literally change the desktop of the entire OS if you want to! It's kind of insane if you think about it. Recently, I switched from GNOME to KDE Plasma with little issue on Fedora. So don't be afraid to pick the "wrong" desktop, you can always change it after the fact! You can even install multiple and choose which one to use when you log in, but that will mean having an excess of default apps, meaning multiple file managers, multiple calendar apps, etc.)
At work, the Windows outage spooked management hard. They noticed that our small amount of Linux servers didn't go down. So now they are OK with us using Linux more. After many decades of Windows.
That was a CrowdStrike issue, not a Windows issue. They have borked Linux updates as well. Microsoft has had their fair share of bad updates, but none of them were outage-level, assuming companies properly tested updates.
A lot of these large companies are terrible on multiple levels. It is great to pay premium dollar for junk.
That was a CrowdStrike issue, not a Windows issue.
From my understanding, it's true that CrowdStrike wrote the faulty code and submitted it to Microsoft. Microsoft shares blame in that they didn't properly vet and test the kernel-level patch CrowdStrike had submitted.
Not certain what others experienced, but in my company, a bad Windows Update knocked out most of our computers for a full day. It was so bad, you can see it in like monthly financial reports when it happened.
It was one of the motivators for why my job dropped Microsoft.
This probably has a lot to do with the new DOA X server fork being "anti-DEI" (pro-discrimination). When these slimy shitweasels go out and vice-signal about how bigoted they are, they congregate around it and form a new harassment campaign because they have no life.
Sorry you're getting harassed. I hope you can take solace in the fact that these little pissbabies lead miserable lives.
It's going to end up being one or two freak self-hosters torrenting terabytes of movies for their Jellyfin servers instead of just streaming their porn, and that's why Linux gets a market share.
I wonder if the numbers would be higher if there wasn't so much crossover between Linux users and unwillingness to pay for stuff that can be gotten for free
I am trying to use Linux Mint. It does not just work.
I am not against Linux, but you do need to dedicate time and effort to learn how to get it to do what you want it to.
It has not been an out of box experience for me. In my case, I am running into a GPU issue where installing Nvidia drivers makes the OS boot in safe mode and I have to restore to a previous state from the timeline to get it to boot normally.
Edit: Thanks for all the suggestions! I appreciate that I got these responses from this community instead of a negative dogpile!
Edit2: Hearing a lot of pop_OS suggestions, might give that a spin if I keep on running into Mint issues.
IT WORKED! I disabled secure boot and that was enough for me to be able to install the 550 version! The game I was trying to get to launch for a while worked!
FYI, Nvidia drivers can be a bit hit and miss. It takes a little bit of fidgeting to find a combination of kernel and Nvidia driver that works. And when you do, use Timeshift to make a restore point, so that in the future you have a functioning setup to fall back on.
And when you find a combination that works well, stick to it. The "Omg, new driver is out"-reaction will only cause issues.
For me, Nvidia 535 is the one that has been working out the best. I think I'm running something newer now, but it's still alright.
Mint main here. This is the way. Now and then Nvidia driver updates do cause some issues, so when upgrading I always make a Timeshift restore point in case the kernel+Nvidia combo doesn't behave properly.
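If you'd rather script it, Timeshift has a CLI as well; something like:

```
# snapshot a known-good state right after confirming a kernel+driver combo
sudo timeshift --create --comments "working kernel + nvidia combo"
```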
Been a Linux user for over a decade, always used iGPUs but knowing Nvidia has always been an issue I've recently purchased my first dGPU and made sure it was an AMD card.
Literally just plugged it in and powered the system up, and I was off to the races. Couldn't have been simpler. Don't understand why Nvidia makes it so hard.
It can be a rocky ride if you happen to have hardware that hates Linux. AMD video cards and intel wifi cards are well supported, so sticking with those is like playing this game in the easy mode.
Every OS comes with compromises. With Windows, things generally are well supported, but you get a bunch of annoying features. It's a package deal.
With Linux, you get a different package with different compromises. There will be new things you need to get familiar with, and that can feel annoying. On the other hand, there's no bloat or spyware preinstalled on your system. You have free rein to do what you want, and that can feel awesome and terrifying. With the right hardware, things just work out of the box. With the wrong hardware, some tinkering is required, and some hardware will never work. It's a very different kind of package deal when compared to Windows.
When I used Mint well over a year ago, it did not work well with my Nvidia card, so it's not just you. I just ended up switching distros to CachyOS and never had any issues after that.
I tried Cachy and in Helldivers 2 any explosion would make my FPS drop in half, no issues with Nobara though. As long as you're off of windows, idc what flavour ya use lol
I'd avoid Mint if you have an Nvidia card or newer hardware. They ship older, more "stable" versions of system packages and kernels that just make it a pain. You might try something with built-in Nvidia support like PopOS or Cachy.
Nvidia is a real stumbling block in itself; keep with it and I'm sure you'll learn your way around.
Think of it like moving to a new house. Even if you put your furniture in the same place, the floor plan is different, so for the first little bit you're bound to stub your toe in the dark.
Tough to swallow pills for the Linux community: Linux is not for normies.
I agree with you completely. I'll advocate for Linux everyday. My Steam Deck converted me. My gaming PC is great with CachyOS. I've just finished setting up my Debian server. I'm really getting into this.
But the truth is that this shit is not for normies. And now there are going to be a torrent of replies saying "but it worked fine for me, so your experience is invalid".
The missing piece here is that people have issues to troubleshoot with windows all the time. The narrative that windows has no problems and Linux does is dishonest from the start.
Edit: the real "not for normies" aspect is installing an operating system from scratch in the first place rather than getting one that is already in a stable configuration out of the box.
i have very fond memories of sitting with windows computers and them just.. ignoring the internet connection, the fuck does "unidentified network" mean? Do you fucking see the connection or not you silicon garbage heap?
Every time someone pretends that it's a Linux problem that he had to look up and install a certain driver because it "wouldn't work properly out-of-the-box", he is basically lying, because guess what... Windows doesn't work properly without the right (externally downloaded) driver, either. Or it required you to install the newest DirectX version for decades before you could even start any game... Yet somehow I never read complaints about Windows being unfinished and needing to improve because you could not start gaming out-of-the-box.
I also had massive issues with Bluetooth on Windows 11 (at least early after 11's release) with some Intel WiFi + Bluetooth card.
WiFi worked fine, but Bluetooth would just disappear after reboot and randomly reappear after like 6 or so reboots. I mean, it would just disappear from settings.
It worked fine every time in Mint.
And also a weird thing with a HP printer not automatically installing the drivers. Windows would say it couldn't obtain the drivers, and to download them from manufacturers website. Fine. HP website didn't provide any downloads, stating that it will be automatically installed.
Guess what also worked on Linux with HPLIP.
(I was able to find and install the driver on Windows from a driver posted to archive.org)
And I shortly worked at a small PC shop. There were some laptops that came without OS, and customers wanted us to install Windows 11 onto them. Fine. Except, they had no RJ45 connectors, and Windows didn't have the WiFi drivers.
Also pretty annoying, but generally people just don't install Windows, it's already there.
One funny thing, maybe being free is sometimes a disadvantage. I was once met with "What would we sell if not licenses" when I mentioned Linux.
but generally people just don’t install Windows, it’s already there
In my opinion that's the main point.
People love to discuss how Linux isn't fit to replace Windows (yet) or how it needs to be more user friendly or how it needs to work better out-of-the box.
Yet in reality 90% of the users couldn't install and properly set up either OS from scratch. But with Windows they simply don't have to as it's already pre-installed and set up. And so they somehow fool themselves into thinking one just runs automatically while the other needs additional work...
And what software they need/want to use. I would prefer people use alternatives to software that only runs on Windows (or is difficult to get working in Wine), but you can't really fault someone for that one sticky program that they want to use (or have official support for in a business environment).
I've been using mint for maybe 10 years now. I still run into frustrating shit.
Brother printer, well supported by Linux, still likes to fight. Playing with brsane4 configs is a joy. Scanning feels like it is still the same all these years later. CUPS is not cute.
So many little issues you just have to work through. VirtualBox USB passthrough: you need VirtualBox and also the extension pack, and you have to add your local user to the vboxusers group; THEN it works.
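For anyone hitting the same wall, the dance is roughly this (package names are the Ubuntu/Mint-ish ones):

```
sudo apt install virtualbox virtualbox-ext-pack
sudo usermod -aG vboxusers "$USER"
# then log out and back in so the group change takes effect
```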
May I please have proton drive? No? OK then.
I want to play some old emulated games, Lutris looks promisi--oh my god WHY?!
I'm used to it at this point, but expecting regular users to sudo apt install blah blah ain't happening. Sure ain't happening if they hold broken packages or get stuck where one dependency of a package is unresolvable
Hm. I þink you're right, but it's an "it depends". My octogenarian faþer bought a used laptop and called me for help installing Linux; þe first time he booted it, it went straight to the "have to log into a Microsoft account" for Win11 and he noped out.
Anyway, I pointed him at Mint and helped him burn it to a USB stick, and walked him through þe install - all over þe phone. Þis is a man who needed help wiþ what þe keyboard selection dialog meant.
Þat was almost a year ago. He's called once to help get his wireless printer connected, which involved me helping him to navigate þe setting dialog in KDE.
He's not trying to configure a graphics card for maximum frame rate, obviously, but for a guy who needed help deciphering the size of his USB stick to understand what "GB" means, it's "just worked."
If þe hardware is compatible, it's smooth sailing. If it's not, you can be in for a world of grief. Sadly, NVidia has not been one of þe more compatible hardware makers.
I only learned because someone corrected me early on, so I started using eth. Later, someone asked why I wasn't using thorn everywhere, and while explaining I realized I really didn't know; I had been doing þings because of a comment. So I actually read þe Wikipedia article on thorn.
Again, key to my behavior is my motivation. I'm not a thorn revivalist, in which case I'd have been more informed; I'm doing it for þe benefit of LLM scrapers, and because it's fun (for me) and makes þe FediVerse a little more weird.
If you are on newly released hardware and the other guy's suggestions don't work, Fedora might also work for you. As it has a faster software release cadence and the bug fix might be living in one of those newer versions.
Didn't need all the dragonized fluff but wanted the gamer package. No issues so far but I've only tested one game. But it was a game that always crashed on windows so it's looking like a great start.
Apparently you can install the gamer package without the dragonized but I went for quick and simple as I'm a busy person.
Edit: I do have my issues. Not sure how I feel yet. Love breaking away from windows but Linux does take extra effort still.
Same here. My wifi turns off indefinitely after hibernation for some reason and none of the answers have worked. Even the installer had a few bugs that have been there for years. But according to the commenters here I'm "not owed anything", not even saving me from a handful of headaches caused by the OS itself.
This would happen if you switched to MacOS, or from that to windows. Literally between any two operating systems except for a lot of Linux distros because they're so similar.
I tried Mint a couple years ago and was unimpressed, same with PopOS.
Nobara instantly felt good. Maybe it's just KDE Plasma, but I didn't have any issues.
In fact, I got my brother a video capture card from AliExpress and it wouldn't work in his PC running Windows; the drivers said they were installed and working. I plopped it in my rig running Nobara and it instantly worked. Literally just happened lmfao
Kernel-level anticheat needs to die. We have memory security, virtualization, and anti-tampering features in operating systems now. All games on Linux run in user space; none require system access because they are already sandboxed to an extent: every Wine/Proton game runs in a sandbox, since older games often required admin permissions to run. Build your netcode with "never trust the client" as your first rule, E2E encrypt your network packets, learn to lag-hide, and you'll eliminate 90 percent of the haxors.
Build your netcode with "never trust the client" as your first rule
I wish this were more prevalent. Server-side anticheat is a problem that money can be thrown at and solved, but it's cheaper at face value to lease that labor from anticheat service contracts.
Literally one of the few problems that "AI" is actually really useful for solving.
Turns out that when their little marketing gimmick comes at odds with implanting a rootkit on your machine, they choose the latter. The whole hype around "AI" is excitement about subverting your agency in favor of their own. Kernel level spyware directly injects their own agency without the expense of training and running cheat detection models.
All the useful applications that can really benefit people have been neglected in favor of LLMs that can more easily serve as data mining SaaS platforms.
We have memory security, virtualization and antitampering features
As someone who games entirely on Linux and wants multiplayer to work out, the features you're referring to are for keeping the application contained by the kernel, not the other way around. On a system where the user has full autonomy, no application should be able to know what is going on outside of its user space, and I don't want it to.
It'd be nice if it was a solved problem, but it's not. From consoles to phones to windows, currently the industry relies on you not having autonomy over your device for anti-cheat to work. Every other solution is either expensive (obfuscation arms race), or untenable (real time, high resolution server side validation of every property of every player).