I really wish System76 would start offering options for 4K displays. I have an FHD display from them, only a few months old, and it's noticeably worse than my old MacBook Pro display.
This is MY experience. I have run 4K on my ThinkPad X1 Extreme with Linux. It's not been a good experience for me. I am getting old and having vision issues, so I decided to get the 4K screen because my 2015 MacBook Pro had the high resolution and worked well with my eyes; it just didn't have the 15" screen.
Display Managers:
When Linux boots on a 4K screen, the GRUB menu is impossible to read, so I have it memorized. I have tried many display managers; some recognize 4K and adapt with no issue, many do not. GDM and SDDM work well, XDM not so much: fonts are all over the place.
X11 (it has an NVIDIA card):
There are some apps that refuse to adapt to 4K if you are running GNOME (and yes, I use QT_SCALE_FACTOR). Then there are apps written in wxWidgets. Steam menus and items are unreadable to me. Some window managers barely support HiDPI displays (XFCE, I am looking at you). It goes on and on; by the end of the day it's a very frustrating experience. GNOME (and its derivatives) do it best so far; KDE has issues with icon sizes and the panel. XFCE has multiple issues, so many that I gave up on it quickly and can't remember them all (and it looked awful). i3 was a little tricky at first but can be configured to work well, though you need to adapt all the apps and bars too.
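For reference, the per-toolkit scaling knobs I'm talking about look roughly like this; exact behavior varies by toolkit and version, and the values here are just examples:

```shell
# HiDPI environment variables for common toolkits (example values).
# Qt apps: fractional scale factors are accepted.
export QT_SCALE_FACTOR=2
# GTK 3 apps: GDK_SCALE is integer-only; GDK_DPI_SCALE can shrink
# text back down for a net fractional scale (2 * 0.75 = 1.5x).
export GDK_SCALE=2
export GDK_DPI_SCALE=0.75
# EFL (Enlightenment) apps:
export ELM_SCALE=2
```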
Other OSes:
Even MS Windows has frequent problems with the screen magnification.
macOS just gets it right, and I never had ANY of these issues, except small fonts in the firmware screen.
For my next laptop I will go back to 1920x1080 to avoid all these problems, switch to AMD GPUs so I don't have to turn off Secure Boot, and stop using X11.
Tip: install Pop!_OS - Nvidia edition. I also have a (1st gen) ThinkPad X1 Extreme with a 4K screen. In addition to that, I have 3 external 4K screens connected to it (so 4x 4K screens in total), and it runs smoothly without having needed any additional config/customization.
How much work does it take to get Ubuntu to a good state with Nvidia drivers, i.e. the equivalent ease of use of Pop!_OS? I have a ThinkPad X1 Extreme (Gen 3) on the way, and I am researching Pop!_OS vs. Ubuntu 20.04 LTS. I am leaning Ubuntu, but if you tell me there is something magical in the Nvidia Pop!_OS distribution that is hard or impossible to replicate in Ubuntu 20.04, I would consider it.
If I install Xfce instead of Gnome desktop will I break the magic?
tangent: Who the hell let them name Pop!_OS, such an engineer name.
Fwiw, my default method of managing the NVIDIA drivers on Ubuntu 18.04 is installing the System76 driver package using the directions from their site. I found that way easier than doing it manually, especially because I needed to manage two CUDA versions on one machine.
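If I remember right, the steps boil down to something like this (check System76's support page for the current instructions, since package names may have changed):

```shell
# Add System76's PPA and install their NVIDIA driver package on Ubuntu.
sudo apt-add-repository -y ppa:system76-dev/stable
sudo apt update
sudo apt install -y system76-driver-nvidia
```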
I wouldn't guess that switching away from GNOME would break hardware compatibility drivers, but I could be wrong.
Using XFCE 4.14-r2 (Gentoo) and things are OK-ish with the Intel GPU (Lenovo P71, 17" 4K panel). But I did not use the "Window Scaling" option (currently set to 1x); instead, in the Fonts tab, I changed the "Custom DPI" setting to 192.
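In case it helps anyone scripting this, the same setting can also be changed from a terminal with xfconf-query (assuming a running Xfce session):

```shell
# Set Xfce's font DPI to 192 (2x the default 96) instead of window scaling.
xfconf-query -c xsettings -p /Xft/DPI -s 192
# Read it back to verify:
xfconf-query -c xsettings -p /Xft/DPI
```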
The only problem that I have left is with Google Chrome & Vinagre: whenever I move the cursor inside their windows it stops being scaled up (therefore it becomes tiny). Used to work fine, changed during some recent update :(
I'm in the same boat; my eyes are not as good as they used to be, and I often run a scaled display rather than 1:1. I'm using Ubuntu 18.x LTS, which comes out of the box with GNOME, but I primarily use i3wm. I was able to scale my display, and every app, to be 25% larger by adding this line to my .Xresources:
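(The line got lost in formatting; presumably it was the Xft.dpi setting, which for a 25% bump over the default 96 DPI would be:)

```
Xft.dpi: 120
```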
This gets into the technical details more than I am able to, but I agree completely. I also have older eyes, and 4k was just unusable (on a Galago with a 13" screen). I could have scaled down by a factor of two, but that was too coarse.
I exchanged the Galago for a 14" Darter, and 1920x1080 is just right.
Same. I use the Xresources and scale-factor environment variables for my desktop with its 32" screen and I love it. It's great. But on a laptop you get a little extra crispness in the text, and in general there's no reason to get a 4K display on a 13-15" screen. It's not worth all that effort.
> Also, KDE is better in all regards. Ditch gnome.
Yup, did that. GNOME looks nice, but... KDE has some issues with HiDPI, like a mouse pointer that is way too small on window controls. Yes, I did change the cursor size to 48, but it doesn't help with what I assume is a kwin issue, so the cursor hops from one size to another all over the place.
This may be a bug in an older KDE version. I'm on debian testing with running the latest stuff from https://www.preining.info/blog/2020/12/debian-kde-plasma-sta... . I'm no longer running into those HiDPI issues in gnome or kde apps (some wxwidgets stuff is still wonky). Nothing ends up the wrong scale when using Plasma Wayland, but there are still other issues that aren't worked out that keep it from being perfect either. You might quickly try the KDE Neon distro on a live usb disk to see if it behaves better and then you'll know if the bugs you're running into have been fixed. https://neon.kde.org/
Since you are of the opinion that it is a small matter of programming, and it is an open-source project that takes contributions, you should definitely go write that.
Since it has not already happened, you may assume one of the following:
- it is in the works but has not made it to the version you are using.
- nobody else thought of that, congratulations
- it turns out to be harder than just identifying the issue
The MacBook Pro display is better in many more ways than just resolution. It also has a wide gamut, excellent factory calibration, good contrast, and an effective antireflective coating on its glossy panel. I'd prioritize each of these over increasing the resolution from 1080p.
Couldn't disagree more (writing from a MacBook display). I'd take a matte FHD over the MacBook's smudgy, glossy Retina any time of the _day or year_. Glossy displays in laptops should never have been a thing to begin with, and I'm not sure how Apple managed to convince the world that this should continue.
I used to think this way too, up until this summer when I spent a lot of time at my parents' house working from the garden.
My personal PC is a 2013 MBP, so a glossy screen. My work PC is an HP ProBook 430 G5 (~2 years old) with a matte screen.
The one that I would take out in the garden when it was sunny was the mac. Sure, I could see myself in it, I had to set the backlight to 100%, but it was possible to focus my eyes such that the text on screen was actually readable. Of course, this implies black text on white background and to sit in such a way as not to have too contrasty a background.
The matte screen of the HP was completely useless. It was just a big blotch of white, it was next to impossible to see the text. I even took out an actual external monitor once to try it out, a Dell P2415Q with a backlight that's blinding inside. Same problem.
Now sure, when there's not a high level of ambient light, the matte screen is easier to use as there are no reflections. But as soon as there's a source of light shining on it (and there are cases where you can't control that), it's game over.
Protip: clean the macbook screen from time to time. Even if it doesn't look dirty. I wear glasses, so I use the cleaning spray on the screen. It always amazes me how much better it is afterwards.
In general, and for working outside especially, I recommend the Solarized color scheme. It is optimized for good readability and contrast while being easy on the eyes. https://github.com/altercation/solarized
Honest question: what is an example of a good accessible color theme? I'm assuming Solarized is not a good choice for those with color blindness? Please explain, I'm very interested in color and accessibility.
Good for you is the one you enjoy using. Good for me is the one I enjoy using. Even if I didn't have a problem with Solarized's lower contrast, I would still think that its colors are hideous.
The problem for many with low-contrast themes is that they often aren't focusing on keeping contrast high. They're mostly focused on keeping contrast low. They aim to avoid being "too contrasty" without targeting being "goodly contrasty".
> color blindness?
Color blindness is its own set of problems. To the best of my knowledge I'm not colorblind. I just can't easily read poorly differentiated text unless I really crank up my screen's brightness or make the text much larger. The problem, I think, is mostly one of pupil physics: the impact of external light on pupil dilation, and how that affects both focus accuracy (pinhole cameras focus perfectly without lenses) and the amount of light entering the eye (a smaller aperture means less total light, and less light means less difference between light things and dark things). Neural color perception is another issue that is related, but only sort of. Or maybe there's an analog to colorblindness that we just don't talk about that deals with sensitivity to luminosity variations. IDK.
Pardon the choice of words, but it is the only effective way I could convey my opinion re solarized and most color schemes.
Most color schemes try too hard to differentiate everything, lack good contrast, and do not aid in understanding the code. Many of them, solarized being a prime example, lack good contrast. I had so many issues trying to find a good scheme that I resorted to creating my own for JetBrains software and making do with some of the rainglow schemes in VS Code. My themes are mostly black text on a white background, and the highlights are unobtrusive so as not to draw more attention than the content. Otherwise, I spend more effort trying to mute the loudness of the colours than reading the content.
I have the complete opposite experience. I recently switched from a ThinkPad T490s to a MacBook Pro 16". I tend to work outside in the mornings and the glare has been driving me crazy. I'd rather have a slightly less bright screen but actually see the content instead of my own morning bed-hair!
My MacBook Pro screen has less glare, or at least better color than my LG Ultrawide in similar conditions. Even at the worst angles, I can still read everything just fine. I'm not going to be editing photos in an airport, but it doesn't get in my way when coding.
They convinced the world by pumping up the brightness and providing better color than other matte screens.
Just close your lid to find out! Every time I open my MacBook I'm greeted with a lovely print of my keyboard on my screen! You could call me out on my Cheetos fingers, but I use an external keyboard... As if the laptop wasn't heavy enough, I have to carry a whole cleaning suite to actually see something.
It's actually higher res too! My Late 2013 13" MacBook Pro has a resolution of 2560x1600. I wish more manufacturers would remember that there are specs between 1080p and 4K.
This is true for desktop monitors as well. I had to scout around for a 2560x1440 display for my desktop Ubuntu box. You would think they would be relatively common, but as you say, everything was 1080p or 4K, at least when I was shopping in person pre-covid. I ended up with an AOC gaming monitor, bought online, that I really like.
Agreed. The displays, speakers, and touchpads on Apple products are amazingly good. But the pro laptops have been in steady decline as Apple gears them more and more to the average user over the years, instead of the power user they were originally built for. I think System76 has the right direction for power users; shiny screens and crisp sound don't help me hunt through gigabytes of data and millions of lines of code.
I really do wonder where exactly 4K would begin if more and weirder resolutions were commercially available. Like, the fact that 3840x2160 is considered the canonical 4K resolution means you don’t actually need 4000 pixels, just a number that’s close to 4000. I wonder what would happen if someone dropped a 3800x2000 display just to mess with people.
That is just double 1920x1080, and if you shuffle it a little you get 4000x2000 pixels, but that would mean your aspect ratio is different from what people are used to.
I agree with some other posters here that I don't need it to be 4k, but 1080p is bad enough to be a deal killer for me. The only reason I don't buy System76 laptops is their terrible displays — I love PopOS (so much that I donate monthly to it) and would buy their hardware in a heartbeat otherwise.
But given that Linux still doesn't have perfect support for display scaling (compared to macOS, for example) and the fact that it's a 15" screen, I would still probably opt for the 1080p if given the option.
Actually, Linux has my favorite display scaling, if you just have one screen.
I spend all day with Firefox, terminals, and emacs in XMonad, and they all scale perfectly with no raster UI elements. (XMonad's pixel border I didn't bother to scale.)
On Windows and macOS, last I checked, there was a lot more resized raster bullshit.
Even with multiple displays it's not that difficult: just pass the scale you want for a given display when calling `xrandr`, and things generally just work as long as you're downscaling from the "native" DPI you have X using.
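As a sketch (the output name DP-1 is an example; check `xrandr --listmonitors` for yours):

```shell
# Render the DP-1 output with 1.5x more virtual pixels per dimension.
# Combined with a 2x-DPI baseline, this gives an effective ~1.33x scale.
xrandr --output DP-1 --scale 1.5x1.5
```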
It's 2020, we can make rockets land on self-driven barges, and somehow we're still forced to use xrandr manually...? This is the sort of thing that keeps me from going back to the Linux desktop.
You don't have to. Ubuntu 20.04 and Gnome desktop provide a fine hidpi experience. There are a few warts, but it all generally works. If you start dabbling with other desktop environments like xmonad or i3, it gets tricky.
I can never get it to work properly with multiple displays. I have a 4K monitor and two 1080p monitors. Using 192 DPI with 2x scale (passed as params to xrandr), all kinds of display errors pop up (wrong colors, weird fonts, etc.).
Yeah, I wanted to configure HiDPI on Linux recently, when I had to work on an FHD 14" screen without an external monitor for a while. I found no config options in the UI, and the guides on the internet involved xrandr and fifteen different steps. I thought hell no, abandoned the idea altogether, cranked up the font size in the terminal, and worked like that. And I'm someone who actually "can" work in Linux and configure this kind of stuff. As far as I'm concerned, at least some distros/DEs have non-existent HiDPI support right now.
If you're on GNOME, fractional scaling is possible but not enabled by default. I use 125% and it has been fine. I enabled it via GNOME Tweaks, or you can enable it manually [1].
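The manual route, if I recall correctly, is toggling mutter's experimental feature flags; the flag differs by session type, and the X11 one comes from Ubuntu's patched mutter:

```shell
# Enable fractional scaling in GNOME on a Wayland session:
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
# Or on an X11 session (Ubuntu's patched mutter):
gsettings set org.gnome.mutter experimental-features "['x11-randr-fractional-scaling']"
```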
As for me: I haven't figured it out yet; sometimes a fractional scale works, sometimes you get vague errors. In any case, what I have noticed is that quality drops significantly when you use fractional scales.
For my setup at home (1080p - 4K - 1080p) I have a simple script that really was more of a `write once, use always` experience. E.g.
So basically what I do is virtually upscale my other screens to 4K, so that system-wide zoom settings are (more or less) consistent and enjoyable.
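The script itself didn't survive the paste, but the idea would look something like this; the output names and positions are hypothetical and need to match your own `xrandr` listing:

```shell
#!/bin/sh
# Upscale two flanking 1080p monitors to a 4K-equivalent virtual size so
# they share one logical DPI with the native 4K panel in the middle.
# Output names (HDMI-1, DP-1, HDMI-2) are examples.
xrandr \
  --output HDMI-1 --mode 1920x1080 --scale 2x2 --pos 0x0 \
  --output DP-1   --mode 3840x2160 --scale 1x1 --pos 3840x0 \
  --output HDMI-2 --mode 1920x1080 --scale 2x2 --pos 7680x0
```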
No, with `xrandr` you can scale by any fractional amount. For example, I had two 24" external monitors (full-hd, not 4K) and a 13" ThinkPad (a Chromebook booted into Linux). Obviously, the respective DPIs were way off.
Just a quick calculation in a REPL, then called xrandr with the right scaling number, and presto, I had them exactly the same.
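That quick calculation is just diagonal pixels over diagonal inches; a little helper makes the ratio obvious (the 13"/24" sizes are from my setup above, numbers rounded):

```shell
# DPI = sqrt(width_px^2 + height_px^2) / diagonal_inches
dpi() { awk -v w="$1" -v h="$2" -v d="$3" 'BEGIN { printf "%.1f\n", sqrt(w*w + h*h)/d }'; }

dpi 1920 1080 13   # 13" laptop panel -> 169.5
dpi 1920 1080 24   # 24" full-hd monitor -> 91.8
# matching scale factor: 169.5 / 91.8, roughly 1.85x on the laptop
```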
Then, with precise lining up of the displays, it was basically one seamless "sheet of paper". If I had a long web page or pdf spanning the displays (the laptop was directly under the monitors), then scrolling through it was quite "trippy".
I think because this was now a unified display where the bottom third was tilted differently from the top two-thirds (think of a curved ultrawide, but in portrait mode).
Pop!_OS (their own Linux distro) has DPI scaling built in. I can actually turn on 2x scaling, and while it's huge on the poor 1080p display, it works perfectly.
Also, a lot of the big Linux distros now have support for this. I know Manjaro and Ubuntu can do it with minimal effort.
Is there any end in sight for the pixel-doubling hacks? Why do we even use them anymore? Apps have had plenty of time to adapt to higher-DPI displays by now, X11 has supported per-display DPI since the 90s, and a scale factor on top of this seems obviously wrong.
And for the niche case where you have a big disparity between the DPIs of multiple displays plus apps that can't adapt to DPI on the fly, scaling up is surely the wrong solution; you should be scaling down on the smaller screen.
I've tried looking a bit into this at one point and my understanding is that it's more a toolkit issue than an app issue.
Qt has an option to adapt the size of the widgets according to the reported screen resolution, which seems to kind of work. I don't use that many Qt apps, so I'm not sure how this turns out in practice.
However, GTK just doesn't care.
All this is on X11, I hear the situation is somewhat better on Wayland.
I use a Skylake laptop with a 4K screen and a 4K external monitor above it. It is plenty fast, with no tearing. Under Qubes, where app VMs don't get to use the GPU, mpv plays full-screen 1080p video using 3 cores.
(Oddly, the external monitor will do 4K only at 30 Hz. I don't know if that is a limitation of the laptop or of the monitor.)
I have a 13" HP Spectre from mid-2017 with a 4k display. It's noticeably sharper (and brighter) than the 1080p alternatives. Prior to that, I had a QHD laptop from circa 2013. No font size issues on Linux that affected my work.
Ubuntu Unity, which I still use today, has had working fractional scaling (in 12.5% steps) for ages now. In fact, I'm pretty sure it had that before even Windows did.
Gnome seems to have finally caught up as well, even with Nvidia drivers. Like it or not, 4K is the current standard and going backwards makes no sense.
2K is a generic term for resolutions that have a horizontal resolution of approximately 2,000 pixels. In terms of cinema, the DCI standard for 2K is 2048x1080. There is also the flat cropped resolution of 1998x1080 and the CinemaScope cropped resolution of 2048x858. 1080p has the same vertical resolution as DCI 2K, and its horizontal resolution is also approximately 2,000. 2560x1440 is 560 pixels more than 2,000, compared to 1920x1080, which is only 80 pixels away. That is 7x further from exactly 2,000. Calling 2560x1440 "2K" is too big of a stretch. If you wanted to give it a name based upon its approximate horizontal resolution, you should call it 2.5K.
The main problem is that it is low DPI. That means you can't really connect a high-DPI external monitor and then use both displays at the same time, because Linux can't (or at least couldn't within the last year) handle displays with different DPIs.
I have one, and it's an amazing laptop. However, as Purism are focused on privacy and free (as in freedom) software, they won't put in a gpu.
System76, for me, seems to strike a balance between coreboot and as much free software as possible, but with the addition of a full GPU with a proprietary blob.
I could take or leave 4k, but at least 2k (~2560x1440)?
Having their 2020 high-end 15.6" maxing out at 1080p is just ridiculous and I'd probably have bought at least one of their laptops if they had a sub-2kg model with just one step up in resolution.
Anything above 1080p doesn't give you any more usable screen space at that physical size. It's very debatable whether just having richer blacks is worth the battery life hit.
Speaking from experience, having played around with fonts, scaling, and DPI, I still feel a bit hampered switching back to my ThinkPad T470s (14" 1080p) after a long time on my T490s (14" 1440p). I do use pretty small terminal fonts.
Yes it does. The market clearly wants higher-res screens; that's why "Retina Display" has been a major marketing point for Apple for years, and you literally haven't been able to buy a MacBook with a 1080p screen for years. This is not even for tech enthusiasts or professional users; it's basic table stakes in the market now.
> you literally haven't been able to buy a Macbook with a 1080p screen for years.
You are technically right, but to be fair, it has been barely more than one year. Apple sold their 2017 model of the 13" MacBook Air, 1440x900, in July 2019:
No it doesn't. There are mostly PC laptops with 1080p precisely because of people like me who only want 1080p because more pixels are pretty useless. Have you actually shopped for one recently? Top hits on Amazon are all FHD.
> "Retina Display" has been a major marketing point for Apple for years...
Exactly. And that's all it is - marketing, particularly for egoists who want to feel like they're better however they can. The vast majority of the market, mostly those who don't use Macs, don't give a shit.
Right, because Mac users aren't generally known for carrying around this air of superiority by saying things like "everybody wants what I have with my Retina/M1/touchpad gestures/etc."
If you like the way text looks on a Retina display just say that. Instead you're here on a thread about a PC product that you'll probably never use, acting like anyone that doesn't have a 4k display must be missing out. (You're right, I must be projecting :) Personally, I like desktop rigs with 1080p, a 5 button vertical mouse and 32-64gb of RAM in a quiet private office. I don't optimize for working on planes, trains and in meetings even though I do own a PC and Mac laptop as well...
I see two on the top page with 2k. But none with >16GB RAM and they’re all Windows - guess only egotistic people want Linux and 32GB RAM?
Some people do make use of a higher res screen, just because you don’t doesn’t make it useless. I haven’t wanted or owned an Apple computer for 10 years.
> ... none with >16GB RAM and they’re all Windows - guess only egotistic people want Linux and 32GB RAM?
Huh? Hardly anybody sells Linux laptops, mostly not on Amazon either. I think you'll be hard pressed to find many laptops pre-configured with 32GB RAM on Amazon too.
> Some people do make use of a higher res screen, just because you don’t doesn’t make it useless.
Okay, but the person I was responding to said that 4k is "table stakes". It's clearly not. Point being - not everybody wants a 4k screen. I didn't say that they were useless to everybody, I said that essentially "most people don't want or need them".
I've been a professional programmer for over 20 years and I have tried many, many different screen resolutions. I have a 2015 Macbook Pro right here next to my Acer E5-575G with 15.6" Full HD (1080p) screen.
Comparing them directly - I have absolutely no clue what you are talking about. Fonts, images and the UI look just as good if not better. I'll say better since I think the Mac UI is hideous.
You're right though: it's definitely not about screen real estate, because with Retina/2K/4K the OS typically has to scale everything up, so you basically lose any real estate that would have been gained in the first place. That's ridiculous IMO.
The only place I use Retina/2k/4k is for gaming and movies and I definitely am not doing either of those activities on a lowly laptop.
Retina displays are simply more dense but add nothing to the screen real estate; nobody uses them at 1x. I bet many don't even know that every pixel they see is actually 4 pixels on the monitor.
Better to have a larger monitor than 13 inches packed with useless and expensive pixels.
AFAIK they just resell commodity generic laptops under their brand and do some System76-specific things to them, such as open-source BIOS firmware. I may be wrong, but when I looked into them years back, that's what I uncovered. I think if they want to differentiate themselves as a high-end Ubuntu/developer machine, they're going to need the things people want (or don't know they want). My personal needs are high-res, bright, color-accurate, crisp displays (currently loving my Samsung Galaxy Chromebook) and high-quality keyboards. I don't really know of a device that has both.
Yep, kind of a deal breaker for me, which is a bummer because this almost looks perfect for me. It seems this is a weakness across their product range.
AMD integrated graphics needs a bit of explaining for me: is this the performance-monster variety, or an "Intel graphics is actually faster" type of deal? How many external screens can I drive with this setup?
My main use case for this would be using Darktable, and for the same reason I want a good HiDPI screen. Otherwise, this would be a development machine (Docker, misc JetBrains stuff, Homebrew, some data engineering, and all the usual stuff). So I'm looking for RAM and a CPU faster than what I have today (2018 15" MBP).
My MacBook Pro is literally falling apart (the keyboard) at this point, and I'm not eager to jump on the ARM bandwagon yet, as it will mess up my tool workflow short term (Java, Docker, IntelliJ, Homebrew) and cause compatibility headaches I just don't need in my life right now. I am liking the performance, though. All the software that I use runs fine on Linux, so this is not a hypothetical option for me. I'd be up and running in a few hours with little to no loss of functionality.
I don't care about legacy ports; USB-C dongles are cheap and effective and I have a few in my bag. Likewise with hdmi to usb-c. What I do care about is being able to connect my Thunderbolt USB-C phone, fuji camera, hard drives, external screen, etc. I'm thinking of adding a headphone to that mix. 2 USB-3.2 ports seems like it's limited.
I just picked up a Lenovo Yoga Slim 7. The only thing I can fault it on is the screen, the brightness is a bit anemic and there is some bleed. However, the performance is fantastic. I've been using it for image processing, and the 8C16T gets through my problems at the same speed as my desktop Ryzen 2700X. Similarly the onboard GPU is sufficient for playing Witcher 3, at just a slightly lower performance than my RX580.
It has 2 legacy USB-A ports, MicroSD, HDMI and a USB-C-3.2 which I've used to run a 4K screen at 60Hz.
Price was very reasonable. I'm happy, this has stepped up my laptop CPU performance by a factor of 4, the GPU can now do all the stuff I want from a laptop, and it's compact and has good standby life.
I bought an Oryx Pro about 2 1/2 years ago to get a laptop with an Nvidia GPU. I bought the high-resolution version, but sometimes run it at the lower 1920x1080 resolution.
I just looked and they don’t seem to offer the higher resolution right now on that model.
I would prefer 2K res on a laptop. On a 15 inch screen, 4K is overkill - I have to increase desktop scaling to read text, so there's no increase in screen real estate. Higher res drains the battery faster as well.
Not even a 4:3 display! 1080 vertical pixels is almost useless for coding. It seems much of their target market of Linux enthusiasts includes a lot of coders who'd want better than that.
There you go: 3000x2000 on a 13.9" screen with a reasonably current processor. They have a few updated variants out there. (I was shopping for a 3:2 laptop when Dell finally got around to shipping a 16:10 17" display.)
Sure it can work, but it’s certainly not for me since I believe it’s a poor choice of default screen size for anything but primarily watching movies, as the aspect ratio limits the vertical viewing area significantly.
I like the way one blogger describes a 1440p vs a 1080p: “A 1440p monitor will give you 3x code windows side by side and 76 lines of code. This is a pretty big deal. You might not realize it until you’ve tried it for a while but being able to comfortably have 3 editor files open and being able to view about 50% more vertical lines of code at a glance is huge.” ( https://nickjanetakis.com/blog/how-to-pick-a-good-monitor-fo... )
Though someone else mentioned they have a 4K version, not just the 1080p version! That can work better with the right scaling. The rest of the laptop looks nice.
I also use a mac pretty regularly and a 1440p desktop. I was getting at the aspect ratio issue, which it seems like you were talking about first, before now talking about resolution (?). Yes, a higher resolution can fit more pixels. The absolute actual size in inches also matters though for how big the text ends up appearing.
Pretty much every IDE now assumes a wide screen and defaults to a layout with a central editor and navigation/tool sidebars on the sides. And it actually works out great!
I don't think a majority of Linux enthusiasts code mainly in vim or emacs anymore.
No idea. I still use emacs because it's still the most programmable editor. I had some hopes for Atom, but after Microsoft bought GitHub... seems dead-ish.