Comments
-
I'm fairly new to the world of Linux, but isn't Xorg broken by design, with Wayland being the new replacement?
-
I feel your pain.
This isn't what you want to hear, but I also experienced worse performance using the Nvidia GPU than the integrated Intel HD Graphics.
So I gave up and use Windows when I need to do something GPU-heavy, which to be fair is almost never -
@HitWRight Wayland is still at a point where it needs a compatibility layer for Xorg because so many applications depend on it, and that's aside from it being a pain to use with Nvidia.
-
@unsignedint were you using Bumblebee? There's a reason why I'm not using Bumblebee.
-
Anything graphical about Linux is broken by design (or, more accurately, by lack thereof)
I'll make a baseless claim here: I think the developers were more concerned with non-graphical programs, so the graphical stuff always took a back seat and never kept up with the times. It just fell by the wayside.
Take the ANSI standard for terminal colors, for example: technically, a compliant terminal on your Linux system only needs to support 8 colors! Eight!
Linux development is HUGE, so it's difficult to make blanket claims about any part of it, but I do think that only recently has the graphical environment become important to Linux, and we're still standing on shaky legs. -
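For anyone curious what that comment is talking about: those base terminal colors are selected with SGR escape sequences. A minimal sketch that prints each of the 8 standard foreground colors (codes 30-37) using plain `printf`:

```shell
# Print the 8 standard ANSI/SGR foreground colors, one per line.
# \033[<n>m sets the color; \033[0m resets back to the default.
for c in 30 31 32 33 34 35 36 37; do
  printf '\033[%sm color %s \033[0m\n' "$c" "$c"
done
```

Most modern terminals go far beyond this (256-color and truecolor modes), but those extensions are exactly that: extensions, not part of the original standard.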
@unsignedint Are you using the closed-source driver? Development of the (default) open-source driver is actively hindered by Nvidia, so its performance is very poor; only the closed-source one performs reasonably.
Argh! (I feel like I start a fair amount of my rants with a shout of frustration)
Tl;Dr How long do we have to wait for a new version of xorg!?
I've recently discovered that Nvidia driver 435.17 (for Linux, of course) supports PRIME GPU offloading, which -for the unfamiliar- lets you render only specific applications on a laptop's discrete GPU (vs. all or nothing). This makes it significantly easier (and more power-efficient) to use the GPU in practice.
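For context, once PRIME render offload is working, individual programs are sent to the discrete GPU with two environment variables documented in Nvidia's driver README; everything else keeps rendering on the Intel iGPU. A sketch (assuming `glxinfo` from mesa-utils is installed):

```shell
# Ask the GLX vendor library to route this one process to the Nvidia GPU.
# __NV_PRIME_RENDER_OFFLOAD=1   -> enable PRIME render offload for this process
# __GLX_VENDOR_LIBRARY_NAME=nvidia -> pick Nvidia's GLX implementation via glvnd
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia \
  glxinfo | grep "OpenGL renderer"
```

On a correctly configured setup the renderer string should name the Nvidia GPU; without the variables, the same command reports the Intel one.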
There used to be something called Bumblebee (which was actually more power-efficient), but it became so slow that one could get better performance out of Intel's integrated GPU than out of the Nvidia GPU.
This feature is also already included in the nouveau driver, but (at least to my understanding) it has little or no support for Turing GPUs, so here I am.
Now, being very excited about this feature, I wanted to use it. I run Arch, so I installed the nvidia-beta drivers and compiled xorg-server from master, because certain commits are necessary to make use of this feature.
But after following the Nvidia instructions, it didn't work. Oops, I realized: xorg probably didn't pick up the Nvidia card, so let's restart xorg. And boom! Xorg won't boot, because obviously the modesetting driver isn't meant for the Nvidia card but for the Intel one, yet xorg is too stupid to figure that out...
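For what it's worth, Nvidia's README for render offload suggests that the server needs to be told explicitly that GPU screens are allowed, with the iGPU staying on the modesetting driver. A minimal xorg.conf sketch along those lines (BusID values are examples; check yours with `lspci`):

```
Section "ServerLayout"
    Identifier "layout"
    Screen 0 "iGPU"
    Option "AllowNVIDIAGPUScreens"
EndSection

Section "Device"
    Identifier "iGPU"
    Driver     "modesetting"
EndSection

Section "Screen"
    Identifier "iGPU"
    Device     "iGPU"
EndSection
```

Whether this fixes the boot failure described above depends on the exact xorg-server commit in use; it is the documented configuration, not a guaranteed cure.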
So here I am, back to using optimus-manager and the ordinary versions of the Nvidia driver and xorg because of this crap...
If you have a good idea of how to make it work, I'd be happy to hear it.
rant
nvidia
beta
aur
xorg
suggestions
arch