I have this question. I see people, with some frequency, sugar-coating the Nvidia-Linux relationship. I get that if you already have an Nvidia GPU, or you need CUDA, or you work with AI and want to use Linux, that's possible. Nevertheless, it's still a very questionable relationship.
Shouldn't we be raising awareness about this for anyone planning to play titles that use DX12? I mean, a 15% to 30% performance loss on Nvidia compared to Windows, versus 5% to 15% (and sometimes equal or better performance) on AMD, isn't that something we should be alerting others about?
I know we wanna get more people on Linux, and NVIDIA’s getting better, but don’t we need some real talk about this? Or is there some secret plan to scare people away from Linux that I missed?
Am I misinformed? Is there some strong reason to buy an Nvidia GPU if your focus is gaming on Linux?
Edit: I'm adding a link to the issue in question because I see some comments claiming Nvidia works flawlessly:
https://forums.developer.nvidia.com/t/directx12-performance-is-terrible-on-linux/303207
Please let me know if this has already been fixed for gaming on Nvidia GPUs in Linux.
From what I've heard, ROCm may finally be getting out of its infancy; at the very least, I think by the time we get something useful, local, and ethical, it will be pretty well-developed.
Honestly, though, I'm in the same boat as you and actively try to avoid most AI stuff on my laptop. The only "AI" thing I use is the occasional image upscale. I find it kind of useless on photos, but it's sometimes helpful when doing vector traces of bitmap graphics with flat colors; Inkscape's results aren't always good with lower-resolution images, so running that specific kind of graphic through a "cartoon mode" upscale sometimes improves results dramatically for me.
Of course, I don’t have GPU ML acceleration, so it just runs on the CPU; it’s a bit slow, but still less than 10 minutes.
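For anyone curious what that kind of CPU-side upscale looks like in code, here's a minimal sketch using OpenCV's dnn_superres module. This is just my assumption of a comparable workflow (the comment above doesn't name a specific tool), and the model file path and image names are hypothetical:

```python
# Minimal CPU-side "AI" upscale sketch using OpenCV's dnn_superres module.
# Requires: pip install opencv-contrib-python
# "EDSR_x4.pb" is a hypothetical local path; the pretrained EDSR/FSRCNN
# models are distributed separately from OpenCV itself.
import cv2

sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("EDSR_x4.pb")   # load a pretrained super-resolution model
sr.setModel("edsr", 4)       # algorithm name and scale must match the model

img = cv2.imread("flat_color_graphic.png")
upscaled = sr.upsample(img)  # runs on the CPU by default, so it can be slow
cv2.imwrite("flat_color_graphic_4x.png", upscaled)
```

The upscaled bitmap can then be fed to Inkscape's Trace Bitmap to get cleaner paths than tracing the low-resolution original directly.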