
Looking through the features

  Gnome 3.36
  - X11 fractional scaling.
I got all excited, because this means that now finally Ubuntu will be usable on my Lenovo X1E with High-DPI display.

But then further down below:

  Fractional scaling does not work with the NVIDIA proprietary driver (bug 1870736, bug 1873403).
OK, nevermind, back to Windows with WSL...


Just set font scaling to 1.3x or whatever it is you need and everything works great. I've been doing that for many years for 1440p in a laptop.

Ubuntu actually regressed there since switching back to GNOME from Unity. In Unity font scaling was neatly exposed in the same slider as integer scaling and all apps used that setting. Now I have to set one thing for GNOME and another for Firefox to get the same effect. "Proper" fractional scaling is a waste of resources in most situations. It's only really worth it if you want really consistent sizing of things when you have several screens with very different DPIs.
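For reference, the two-place setup described above looks roughly like this (a sketch; the gsettings key is GNOME's standard text scaling schema, and the 1.3 value is just this example):

```shell
# GNOME: scale all fonts to 1.3x (fractional values allowed, unlike
# the display scale options on X11)
gsettings set org.gnome.desktop.interface text-scaling-factor 1.3

# Firefox ignores that setting; in about:config, set
# layout.css.devPixelsPerPx to a matching value (e.g. 1.3)
```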


As someone who uses Ubuntu on multiple machines, does this and uses a high DPI laptop display AND an external display, this only sort of works and the outcome is much worse than what Windows or MacOS do out of the box.

Honestly, I don't get how the Ubuntu ecosystem is so much worse at handling something so basic, when other OSes have had almost no issues with it for over a decade.


I use external screens just fine. I can't even think of a situation that's a big issue:

- 4K laptop screen and 1080p external, just use integer scaling

- 1440p laptop screen and 1080p external, use 1.3x font scaling and everything looks fine

I currently have a 1440p laptop, a 4K external, and a 1280x800 projector in my work-from-home setup. The external screen replaced a 1200p one before. I use sway that supports fractional scaling per screen in whatever setup I want. And yet I prefer to just set 1.3x font scaling for everything and not touch the scaling at all. Everything looks great.
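The per-screen scaling mentioned here goes in ~/.config/sway/config; a sketch for a setup like the one above (output names and scale values are assumptions, check yours with `swaymsg -t get_outputs`):

```shell
# sway: fractional scaling configured per output (Wayland only)
output eDP-1    scale 1.3   # 1440p laptop panel
output DP-1     scale 1.5   # 4K external monitor
output HDMI-A-1 scale 1     # 1280x800 projector at native 1x
```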

Fractional scaling is a missing feature for some people and Linux desktops are definitely behind. But compared to font scaling I don't see how it's really much more than a little bump in functionality (some controls sized a little better) for a big drop in performance (calculating >2x the pixels in a lot of situations) and even some loss of sharpness.

It's probably a huge preference thing. I've seen people online waiting impatiently for fractional scaling because things were too small in their 1080p laptop screen otherwise. Even though 1080p on a laptop is sort of the definition of 1x.


> I use sway that supports fractional scaling per screen in whatever setup I want.

So not x11/xorg, but Wayland? I recently switched from xorg/i3 to wayland/gnome on 18.04 - and while I miss the tiling, external screens behave a bit better. I'll probably try sway/Wayland when I have the time to upgrade to 20.04.

> And yet I prefer to just set 1.3x font scaling for everything and not touch the scaling at all.

Don't fonts end up too big on a typical 21" 1080p display?


Unity still exists; it got renamed to Lomiri. You should still be able to install that, so maybe that feature still works.


I've switched to sway now and do that manually, like many other things, to make them "just right". I just meant that the default Ubuntu desktop regressed on that with the switch from Unity to GNOME. It was one of the things Unity had done particularly well within the constraints of the software, but very few people seem to know about it. I've explained how to set up font scaling to dozens of people online to fix their issues. For most people it's just a much better solution than the much-awaited fractional scaling.


> I just meant that the default Ubuntu desktop regressed on that from the Unity to GNOME switch.

I already felt like that with the switch from Gnome 2 to Unity. I switched to Mate as a result.


> It's only really worth it if you want really consistent sizing of things when you have several screens with very different DPIs.

So useful for all laptop-users with an external monitor then.


Yet more proof that proprietary drivers are hindering innovation and holding back Linux desktops, not helping them.


I can recommend Kubuntu, KDE has better support for fractional scaling and is a very usable and distraction-free window manager. In general I'd try to stay away from NVIDIA drivers on Linux, unless you're doing GPU-intensive work the embedded graphics is often the better choice as it's less power-hungry as well and sufficient for most tasks.


KDE is also a lot less garish than it used to be. That was the one thing holding me back from using it.


And for KDE+Ubuntu it's always advisable to also check the official KDE distro, KDE Neon, based on Ubuntu LTS. I don't know the timeline for the 20.04-based version, though.


What's the advantage of Neon over Kubuntu?


Up to date and optimally integrated KDE directly from the project.

Looks like 18.04 came to KDE Neon only in August 2018, so it's a somewhat slower process, I suppose.


Can you do scaling with xrandr? I scale GNOME to 2x and then have this line in my .xprofile:

  xrandr --output eDP-1 --scale 1.4x1.4

I disable it whenever playing video games on steam.


Setting up scaling via xrandr when you have multiple (different-sized) displays is probably doable but incredibly tricky.
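To give an idea of the trickiness: a two-display xrandr layout with different scales needs explicit positions and a framebuffer large enough for the scaled result. A sketch, where the output names and geometry are assumptions for a 1440p laptop plus a 1080p external scaled up 1.5x:

```shell
# external 1080p scaled to 2880x1620, placed right of the laptop panel;
# --fb must cover both: width 2560+2880=5440, height max(1440,1620)=1620
xrandr --fb 5440x1620 \
       --output eDP-1  --mode 2560x1440 --pos 0x0 \
       --output HDMI-1 --mode 1920x1080 --scale 1.5x1.5 --pos 2560x0
```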


If you're not letting gnome manage the settings for you, arandr and autorandr can help (the first for adjusting, the second for saving/restoring profiles):

https://christian.amsuess.com/tools/arandr/

https://github.com/wertarbyte/autorandr
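autorandr's workflow is roughly: set up a layout once (with xrandr or arandr), save it under a name, then let autorandr reapply it based on which monitors are connected. The profile names here are just examples:

```shell
autorandr --save docked   # save the current layout as a profile
autorandr --save mobile   # e.g. a laptop-only layout
autorandr --change        # detect connected outputs, apply the matching profile
```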


I'm afraid to update and check the scaling on my different displays. It's going to be absurdly hard to set up in a non-terrible way again, I guess.


Does the Nvidia chip drive the display directly? I think you also have an Intel GPU that drives the display; the Nvidia GPU will only be used to offload rendering.

I don't know how it is implemented, but from my understanding, as long as the Intel chip does the scaling you should be fine.


If you only ever want to use the laptop's screen then you can indeed run the X1E on the Intel GPU only. Unfortunately the display ports are hardwired to the discrete GPU, so you need to run the nvidia chip in that case.
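For the laptop-screen-only case, PRIME render offload (NVIDIA driver 435+) lets the desktop run on the Intel GPU while individual applications use the discrete one; a sketch:

```shell
# run a single application on the NVIDIA GPU; the desktop stays on Intel
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor
```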


Confirming this. When I bought the X1E it was a pain to get 18.04 working, and 18.04 on this device was the worst Linux experience I've had since the 90s :-( It is probably because of Nvidia, but I blame GNOME too: the latency got worse than 20 years ago. I just want to be able to type in a terminal and have a browser.

I'm now about to leave Ubuntu, which I've used for the last 15 years or so. Thanks for helping me out that long. I'm going to NixOS and hope it will make my life easier, even if the start is more involved.


Interestingly enough, fractional scaling worked for me in 18.04, but it doesn't work in 20.04 (the thing just crashes). Anyway, hoping for a fix.


This is surprising to learn, since I am using nvidia-driver-435 and fractional scaling (150%) on one monitor.


Throw your nVidia card in the bin and be free. After many years with nVidia I removed the card and I've been happily using Intel hardware for a year now.


Because God forbid you actually have to use the card you paid for, oh no, we can't have that. No one should work with machine learning, edit video or play games, Intel HD should be enough for everybody /s


Lenovo X1E is a laptop... So removing the GPU will be a bit of a challenge ;)

I thought of turning the GPU off under Linux, but external displays (through the thunderbolt port) can only be used with the nVidia GPU. I need external displays for my work, so disabling the nVidia GPU is not an option for me.


You may be able to either:

- Use Nouveau instead of the proprietary driver

- Use a nested X/wayland server, which should then presumably support fractional scaling

- Use non-fractional scaling and adjust font sizes


I guess you don't play videogames on your computer.


How's ML training on Intel integrated graphics? :)


Do you have to use the nvidia as your graphics adapter if you are only using it for ML training?


It definitely won't work in the bin.


Desolder the NVIDIA card on a laptop?


Or just disable it in bios?


Throw your Linux OS in the bin and be free. After many years with Linux I removed it and I've been happily using Apple/Windows hardware for a year now. /s


The commenter is lamenting the fact that he can't run Ubuntu. I, on the other hand, am perfectly happy with what I have.



