Just set font scaling to 1.3x or whatever it is you need and everything works great. I've been doing that for many years with a 1440p laptop screen.
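If it helps anyone, on GNOME-based Ubuntu that's a single setting. A minimal sketch, assuming GNOME and gsettings are available; 1.3 is just an example value:

    # set GNOME font scaling to 1.3x (any factor works, 1.0 resets it)
    gsettings set org.gnome.desktop.interface text-scaling-factor 1.3

    # check the current value
    gsettings get org.gnome.desktop.interface text-scaling-factor

The same knob is also exposed in GNOME Tweaks under Fonts > Scaling Factor.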
Ubuntu actually regressed there when it switched back to GNOME from Unity. In Unity, font scaling was neatly exposed on the same slider as integer scaling and all apps used that setting. Now I have to set one thing for GNOME and another for Firefox to get the same effect. "Proper" fractional scaling is a waste of resources in most situations. It's only really worth it if you want consistent sizing of things across several screens with very different DPIs.
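For the Firefox half of that, the usual workaround is the layout.css.devPixelsPerPx pref. A sketch, assuming you set it via about:config or a user.js file; 1.3 is an example, and -1.0 goes back to following the system setting:

    // user.js (or edit layout.css.devPixelsPerPx in about:config)
    // the pref is stored as a string; "-1.0" means "follow the system setting"
    user_pref("layout.css.devPixelsPerPx", "1.3");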
As someone who uses Ubuntu on multiple machines, does exactly this, and pairs a high-DPI laptop display with an external display: this only sort of works, and the outcome is much worse than what Windows or macOS do out of the box.
Honestly, I don't get how the Ubuntu ecosystem is so much worse at handling something so basic, when other OSes have had almost no issues with it for over a decade.
I use external screens just fine. I can't even think of a situation that's a big issue:
- 4K laptop screen and 1080p external: just use integer scaling (2x on the laptop, 1x on the external)
- 1440p laptop screen and 1080p external: use 1.3x font scaling and everything looks fine
I currently have a 1440p laptop, a 4K external, and a 1280x800 projector in my work-from-home setup. The external screen replaced a 1200p one before. I use sway, which supports fractional scaling per screen in whatever setup I want. And yet I prefer to just set 1.3x font scaling for everything and not touch the scaling at all. Everything looks great.
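For anyone curious, per-output scaling in sway is just config lines like these (a sketch; the output names are examples, swaymsg -t get_outputs lists the real ones):

    # ~/.config/sway/config
    output eDP-1 scale 1      # 1440p laptop panel left at 1x
    output DP-1  scale 1.5    # 4K external, fractional scale
    output DP-2  scale 1      # 1280x800 projector

But as I said, I leave all of these at 1 and only bump the font scaling.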
Fractional scaling is a missing feature for some people and Linux desktops are definitely behind. But compared to font scaling I don't see how it's really much more than a little bump in functionality (some controls sized a little better) for a big drop in performance (implementations typically render at the next integer scale and then downscale, so you're calculating >2x the pixels in a lot of situations) and even some loss of sharpness.
It's probably a huge preference thing. I've seen people online waiting impatiently for fractional scaling because things were too small in their 1080p laptop screen otherwise. Even though 1080p on a laptop is sort of the definition of 1x.
> I use sway, which supports fractional scaling per screen in whatever setup I want.
So not x11/xorg, but Wayland? I recently switched from xorg/i3 to wayland/gnome on 18.04 - and while I miss the tiling, external screens behave a bit better. I'll probably try sway/Wayland when I have the time to upgrade to 20.04.
> And yet I prefer to just set 1.3x font scaling for everything and not touch the scaling at all.
Don't fonts end up too big on a typical 21" 1080p display?
I've switched to sway now and have that, like many other things, configured manually to be "just right". I just meant that the default Ubuntu desktop regressed on this in the Unity-to-GNOME switch. It was one of the things I thought Unity had done particularly well within the constraints of what the software could do, but very few people seem to know about it. I've explained how to set font scaling to dozens of people online to fix their issues. For most people it's just a much better solution than the much-awaited fractional scaling.
I can recommend Kubuntu; KDE has better support for fractional scaling and is a very usable and distraction-free desktop environment. In general I'd try to stay away from the NVIDIA drivers on Linux: unless you're doing GPU-intensive work, the integrated graphics is often the better choice, since it's less power-hungry and sufficient for most tasks.
And for KDE+Ubuntu it's always worth also checking the official KDE distro, KDE Neon, which is based on Ubuntu LTS. I don't know the timeline for a 20.04-based version though.
If you're not letting GNOME manage the settings for you, arandr and autorandr can help (the first for adjusting, the second for saving/restoring profiles).
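The rough workflow, assuming both tools are installed (the profile name is just an example):

    # arrange screens graphically with arandr, then save the layout
    autorandr --save docked

    # restore whichever saved profile matches the currently connected outputs
    autorandr --change

    # or load a specific one
    autorandr --load docked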
Does the Nvidia chip drive the display directly?
I think you also have an Intel GPU that drives the display; the Nvidia GPU will only be used to offload rendering.
I don't know how it is implemented, but from my understanding, as long as the Intel chip does the scaling you should be fine.
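If it's the usual render-offload setup (Intel drives the panel, NVIDIA renders only on request), something like this should show both vendors. A sketch, assuming a reasonably recent proprietary driver (the offload variables need 435 or newer) and mesa-utils installed for glxinfo:

    # by default the Intel/Mesa driver answers
    glxinfo | grep "OpenGL renderer"

    # ask for this one program to be rendered on the NVIDIA GPU instead
    __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"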
If you only ever want to use the laptop's screen then you can indeed run the X1E on the Intel GPU only. Unfortunately the display ports are hardwired to the discrete GPU, so you need to run the Nvidia chip in that case.
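You can see which GPUs the running X server knows about, and on Ubuntu switch between them, roughly like this (prime-select comes from the nvidia-prime package and needs a logout/reboot to take effect):

    # list providers known to X (e.g. modesetting for Intel, NVIDIA-0 for the dGPU)
    xrandr --listproviders

    prime-select query          # show the current mode
    sudo prime-select intel     # iGPU only: the X1E's external outputs won't work
    sudo prime-select nvidia    # discrete GPU: needed to drive external displays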
Confirming this. When I bought the X1E it was a pain to get 18.04 working. And 18.04 on this device was for me the worst Linux experience since the 90s :-( It is probably because of Nvidia, but I blame GNOME too: the latency is worse than it was 20 years ago. I just want to be able to type in a terminal and have a browser.
I'm now about to leave Ubuntu, which I've used for the last 15 years or so. Thanks for helping me out that long. I'm going to NixOS and hope it will make my life easier, even if getting started is more involved.
Throw your nVidia card in the bin and be free. After many years with nVidia I removed the card and I've been happily using Intel hardware for a year now.
Because God forbid you actually have to use the card you paid for, oh no, we can't have that. No one should work with machine learning, edit video or play games, Intel HD should be enough for everybody /s
The Lenovo X1E is a laptop... so removing the GPU will be a bit of a challenge ;)
I thought about turning the discrete GPU off under Linux, but external displays (through the Thunderbolt port) can only be used with the nVidia GPU. I need external displays for my work, so disabling the nVidia GPU is not an option for me.
Throw your Linux OS in the bin and be free. After many years with Linux I removed it and I've been happily using Apple/Windows hardware for a year now. /s
But then further down:
OK, nevermind, back to Windows with WSL...