
> Display: 15.6″ 1920×1080 FHD, Matte Finish

I really wish system76 would start giving options for 4k displays. I have a FHD display from them, only a few months old, and it's noticeably worse than my old macbook pro display.



This is MY experience. I have run 4K on my ThinkPad X1 Extreme with Linux, and it's not been a good experience for me. I am getting old and having vision issues, so I decided to get the 4K screen because my old 2015 MacBook Pro had the high resolution and worked well with my eyes; it just didn't have the 15" screen.

Display managers: When Linux boots on a 4K screen, the GRUB menu is impossible to read, so I have it memorized. I have tried many display managers; some recognize 4K and adapt with no issue, many do not. GDM and SDDM work well; XDM not so much, fonts are all over the place.

X11 (it has an NVIDIA card): There are some apps which refuse to adapt to 4K if you are running GNOME (and yes, I use QT_SCALE_FACTOR). Then there are apps written in wxWidgets. Steam menus and items are unreadable to me. Some window managers barely support HiDPI displays (XFCE, I am looking at you). It goes on and on; by the end of the day it's a very frustrating experience. GNOME (and its derivatives) do it best so far; KDE has issues with icon sizes and the panel. XFCE has multiple issues, so many that I gave up on it quickly and can't remember them all (and it looked awful). i3 was a little tricky at first but can be configured to work well, though you need to adapt all the apps and bars too.
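Since the comment above mentions QT_SCALE_FACTOR: the usual per-toolkit knobs on X11 look roughly like this. QT_SCALE_FACTOR, QT_AUTO_SCREEN_SCALE_FACTOR, GDK_SCALE, and GDK_DPI_SCALE are real Qt 5 / GTK 3 variables, but the factor of 2 is just an example, and how well each app honors them varies:

```shell
# Rough sketch of per-toolkit HiDPI environment variables on X11.
# A factor of 2 suits a 4K 15" panel; adjust to taste.
export QT_AUTO_SCREEN_SCALE_FACTOR=0  # stop Qt 5 from guessing on its own
export QT_SCALE_FACTOR=2              # scale all Qt 5 apps
export GDK_SCALE=2                    # scale GTK 3 UI elements (integer only)
export GDK_DPI_SCALE=0.5              # compensate so GTK fonts aren't scaled twice
```

Put these in something like ~/.profile or ~/.xprofile so apps launched from the session inherit them.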

Other OSes: Even MS Windows has frequent problems with the screen magnification. macOS just gets it right, and I never had ANY of these issues, except small fonts in the firmware screen.

For my next laptop I will go back to 1920x1080 and avoid all these problems, and start using AMD GPUs so I don't have to turn off Secure Boot and can stop using X11.


Tip: install Pop!_OS - Nvidia edition. I also have a (1st gen) ThinkPad X1 Extreme with a 4K screen. In addition to that, I have 3 external 4K screens connected to it (so 4x 4K screens in total), and it runs smoothly without having needed any additional config/customization.


How much work does it take to get Ubuntu to a good state with Nvidia drivers, i.e. the equivalent ease of use of Pop!_OS? I have a ThinkPad X1 Extreme (Gen 3) on the way, and I am researching Pop!_OS vs. Ubuntu 20.04 LTS. I am leaning Ubuntu, but if you tell me there is something magical in the Nvidia Pop!_OS distribution that is hard/impossible to replicate in Ubuntu 20.04, I would consider it.

If I install Xfce instead of Gnome desktop will I break the magic?

Tangent: who the hell let them name it Pop!_OS? Such an engineer name.


I've tried both. There is no magic except a few additional tools to enhance the desktop experience, like the power manager [0] and HiDPI daemon [1].

[0]https://github.com/pop-os/system76-power

[1]https://blog.system76.com/post/174414833678/all-about-the-hi...


Thanks. I’m playing around with popos this weekend.


Fwiw, my default method of managing the NVIDIA drivers on Ubuntu 18.04 is installing the System76 driver package using the directions from their site. I found that way easier than doing it manually, especially because I needed to manage two CUDA versions on one machine.

I wouldn't guess that switching away from GNOME would break hardware compatibility drivers, but I could be wrong.


I tried Pop!_OS, but still have an unreadable GRUB screen.


Is that the only issue there? Sounds like it's not worth going back to 1080p over.



I quite like the QHD ThinkPad displays (2560x1440), very similar to the quality of a Retina.


Specifically regarding GRUB:

Add these lines to /etc/default/grub:

    GRUB_GFXMODE=1024x768x32
    GRUB_GFXPAYLOAD_LINUX=keep

And then run:

    sudo grub-mkconfig -o /boot/grub/grub.cfg

(see https://wiki.archlinux.org/index.php/GRUB/Tips_and_tricks#Se...)


> Display Managers: When Linux boots in a 4k screen, the GRUB menu is impossible to read, so I have it memorized.

Weird, I have a 4K screen but the BIOS, GRUB, etc. are just scaled across the screen. A bit blurry of course, but definitely readable.


Using XFCE 4.14-r2 (Gentoo) and things are OK-ish with the Intel GPU (Lenovo P71, 17" 4K panel). But I did not use the "Window Scaling" option (currently set to 1x); instead I changed the "Custom DPI" setting in the Fonts tab to 192.

The only problem I have left is with Google Chrome & Vinagre: whenever I move the cursor inside their windows, the content stops being scaled up (and therefore becomes tiny). It used to work fine; this changed during some recent update :(


I'm in the same boat where my eyes are not as good as they used to be, and I often run a scaled display rather than 1:1. I'm using Ubuntu 18.x LTS, which comes with GNOME out of the box, but I primarily use i3wm. I was able to scale my display and every app to be roughly 25% larger by adding this line to my .Xresources:

  Xft.dpi: 125
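For reference, a minimal sketch of applying such an override without logging out; xrdb is the standard tool here, and writing to /tmp rather than ~/.Xresources is just to keep the example non-destructive:

```shell
# Write the DPI override to a scratch file and merge it into the running
# X resource database. Xft.dpi is an absolute DPI: the baseline is 96,
# so 120 would be exactly 125% and 125 is roughly 130%.
printf 'Xft.dpi: 125\n' > /tmp/hidpi.Xresources
# Only attempt the merge when an X session is actually present.
if [ -n "$DISPLAY" ] && command -v xrdb >/dev/null 2>&1; then
  xrdb -merge /tmp/hidpi.Xresources
fi
```

Newly started apps pick up the merged value; already-running ones generally need a restart.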


This gets into the technical details more than I am able to, but I agree completely. I also have older eyes, and 4k was just unusable (on a Galago with a 13" screen). I could have scaled down by a factor of two, but that was too coarse.

I exchanged the Galago for a 14" Darter, and 1920x1080 is just right.

The Pangolin looks pretty nice to me.


Same. I use the Xresources and scale factor environment variables for my desktop with its 32 inch screen and I love it. It's great. But for a laptop, you get a little crispness on the text, but in general there's no reason to get a 4K display on a 13~15 inch screen. It's not worth all that effort.


You can change the GRUB loader resolution in config, my dear penguin.

Also, KDE is better in all regards. Ditch gnome.


> Also, KDE is better in all regards. Ditch gnome.

Yup, did that. GNOME looks nice, but... KDE has some issues with HiDPI, like a mouse pointer that is way too small on window controls. Yes, I did change the cursor size to 48, but it does not work with what I assume is a kwin issue, so the cursor hops from one size to another all over the place.
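For what it's worth, one blunt workaround sometimes suggested for inconsistent cursor sizes (no guarantee it helps with the kwin-specific hopping) is pinning the X cursor size globally via the environment:

```shell
# XCURSOR_SIZE forces a uniform cursor size (in pixels) for apps that
# honor it; 48 matches the size mentioned above.
# Put this in ~/.profile or similar so the whole session inherits it.
export XCURSOR_SIZE=48
```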


This may be a bug in an older KDE version. I'm on Debian testing, running the latest stuff from https://www.preining.info/blog/2020/12/debian-kde-plasma-sta... . I'm no longer running into those HiDPI issues in GNOME or KDE apps (some wxWidgets stuff is still wonky). Nothing ends up at the wrong scale when using Plasma Wayland, but there are still other issues that aren't worked out, which keep it from being perfect either. You might quickly try the KDE Neon distro on a live USB disk to see if it behaves better; then you'll know whether the bugs you're running into have been fixed. https://neon.kde.org/


You can change GRUB and add your own fonts, yes:

   GRUB_GFXMODE=1920x1080
   GRUB_GFXPAYLOAD_LINUX=keep
   GRUB_FONT="/boot/grub/fonts/DejaVuSans30.pcf"
but you can't change the framebuffer resolution:

https://unix.stackexchange.com/questions/201965/setting-the-...

..or at least if you can, no one has bothered to answer my question on how for several years.


> You can change the GRUB loader resolution in config, my dear penguin.

But why do you have to? Why doesn't it use a reasonably sized font by default?


Since you are of the opinion that it is a small matter of programming, and it is an open-source project that takes contributions, you should definitely go write that.

Since it has not already happened, you may assume one of the following:

- it is in the works but has not made it to the version you are using.

- nobody else thought of that, congratulations

- it turns out to be harder than just identifying the issue


that's a bit snippy for a response to a fairly reasonable question


Pardon me but that sounds rather entitled.


Tip: check the Arch Wiki's HiDPI page... There is a bit more to it than QT_SCALE_FACTOR.

https://wiki.archlinux.org/index.php/HiDPI


The MacBook Pro display is better in many more ways than just resolution. It also has a wide gamut, excellent factory calibration, good contrast, and an effective antireflective coating on its glossy panel. I'd prioritize each of these over increasing the resolution from 1080p.


Couldn't disagree more (writing from a MacBook display). I'd take matte FHD over the MacBook's smudgy, glossy Retina any time of the _day or year_. Glossy displays in laptops should never have been a thing to begin with, and I'm not sure how Apple managed to convince the world that this should continue.


I used to think this way too, up until this summer, when I spent a lot of time at my parents' house working from the garden.

My personal PC is a 2013 MBP, so a glossy screen. My work PC is an HP ProBook 430 G5 (~2 years old) with a matte screen.

The one that I would take out into the garden when it was sunny was the Mac. Sure, I could see myself in it, and I had to set the backlight to 100%, but it was possible to focus my eyes such that the text on screen was actually readable. Of course, this implies black text on a white background, and sitting in such a way as not to have too contrasty a background.

The matte screen of the HP was completely useless. It was just a big blotch of white, it was next to impossible to see the text. I even took out an actual external monitor once to try it out, a Dell P2415Q with a backlight that's blinding inside. Same problem.

Now sure, when there's not a high level of ambient light, the matte screen is easier to use, as there are no reflections. But as soon as there's a source of light that shines on it (and there are cases where you can't control that), it's game over.

Protip: clean the macbook screen from time to time. Even if it doesn't look dirty. I wear glasses, so I use the cleaning spray on the screen. It always amazes me how much better it is afterwards.


The HP is only 208 nits of brightness. 300 nits is considered by many to be the absolute minimum for being outside. The Mac is probably 500 nits.


In general, and for working outside especially, I recommend the Solarized colorscheme. It is optimized for good readability and contrast, while being easy on the eyes. https://github.com/altercation/solarized


> It is optimized for good readability and contrast

It most certainly is not. Solarized is a low contrast theme by design and fails accessibility standards for text contrast.


Honest question: what is an example of a good accessible color theme? I'm assuming Solarized is not a good choice for those with color blindness? Please explain, I'm very interested in color and accessibility.


> what is an example of a good

Good for you is the one you enjoy using. Good for me is the one I enjoy using. Even if I didn't have a problem with Solarized's lower contrast, I would still think that its colors are hideous.

The problem for many with low-contrast themes is that they often aren't focusing on keeping contrast high. They're mostly focused on keeping contrast low. They aim to avoid being "too contrasty" without targeting being "goodly contrasty".

> color blindness?

Color blindness is its own set of problems. To the best of my knowledge I'm not colorblind. I just can't easily read poorly differentiated text unless I really crank up my screen's brightness or make the text much larger. The problem I think is mostly one of pupil physics: the impact of external light on pupil dilation, and how that affects both focus accuracy (pinhole cameras focus perfectly without lenses) and the amount of light entering the eye (a smaller aperture means less total light, and less light means less difference between light things and dark things). Neural color perception is another issue that is related, but only sort of. Or maybe there's an analog to colorblindness that we just don't talk about that deals with sensitivity to luminosity variations. IDK.


Solarized is diarrhea Christmas lights.

Pardon the choice of words, but it is the only effective way I could convey my opinion re solarized and most color schemes.

Most color schemes try too hard to differentiate everything, lack good contrast, and do not aid in understanding the code. Many of them, Solarized being a good example, lack good contrast. I had so many issues trying to find a good scheme that I resorted to creating my own for JetBrains software, and make do with some of the Rainglow schemes in VS Code. The themes are mostly black text on a white background, and the highlights are unobtrusive so as not to draw more attention than the core content. In the end, I spend more effort trying to mute the loudness of the colours than reading the content.


I have the completely opposite experience. I recently switched from a ThinkPad T490s to a MacBook Pro 16". I tend to work outside in the mornings and the glare has been driving me crazy. I'd rather have a slightly less bright screen and actually see the content instead of my own morning bed-hair!


My MacBook Pro screen has less glare, or at least better color than my LG Ultrawide in similar conditions. Even at the worst angles, I can still read everything just fine. I'm not going to be editing photos in an airport, but it doesn't get in my way when coding.

They convinced the world by pumping up the brightness and providing better color than other matte screens.


> smudgy

Could you elaborate on this? Maybe you've enabled font smoothing?


Leaves visible fingerprints/marks on the screen.


Just close your lid to find out! Every time I open my MacBook I'm greeted with a lovely print of my keyboard on my screen! You could call me out on my Cheetos fingers, but I use an external keyboard... As if the laptop wasn't heavy enough, I have to carry a whole cleaning suite to actually see something.


It's also actually higher res! My late 2013 13" MacBook Pro has a resolution of 2560x1600. I wish more manufacturers would remember that there are specs between 1080p and 4K.


This is true for desktop monitors as well. I had to scout around for a 2560x1440 display for my desktop Ubuntu box. You would think they would be relatively common, but as you say, everything was 1080p or 4K, at least when I was shopping in person pre-COVID. I ended up with an AOC gaming monitor that I really like, which I bought online.


Yes, but except for the screen (and here I mean resolution, not gloss), Apple laptops are a mediocre choice in general.

I mean, that OS experience, man. It's like people are living in an abusive relationship that hasn't improved since 1998.


Agreed. The displays, speakers, and touch pads on Apple products are amazingly good. But the pro laptops have been in steady decline as Apple gears them more and more toward the average user over the years, instead of the power user they were originally built for. I think System76 has the right direction for power users; shiny screens and crisp sound don't help me hunt through gigabytes of data and millions of lines of code.


This heavily depends on the model. My 2017 MBP has a worse (less bright and worse antireflective coating) display than my 2015 MBP.


Apple has never sold a laptop with a 4K panel. Highest resolution of any Macbook was the 2019 Macbook Pro 16" with 3072x1920. https://en.wikipedia.org/wiki/Retina_display#Models


I really do wonder where exactly 4K would begin if more and weirder resolutions were commercially available. Like, the fact that 3840x2160 is considered the canonical 4K resolution means you don’t actually need 4000 pixels, just a number that’s close to 4000. I wonder what would happen if someone dropped a 3800x2000 display just to mess with people.


That is just double 1920x1080, and if you shuffle it a little you get 4000x2000 pixels, but that would mean your aspect ratio is different from what people are used to.


I agree with some other posters here that I don't need it to be 4k, but 1080p is bad enough to be a deal killer for me. The only reason I don't buy System76 laptops is their terrible displays — I love PopOS (so much that I donate monthly to it) and would buy their hardware in a heartbeat otherwise.


> The only reason I don't buy System76 laptops is their terrible displays

You've never used a laptop with a decent touchpad? Or one without a loud fan? Or one with a good battery capacity?

The displays are not the only lackluster thing about System76 hardware.


It would be nice to have the option.

But given that Linux still doesn't have perfect support for display scaling (compared to macOS, for example) and the fact that it's a 15" screen, I would still probably opt for the 1080p if given the option.


Actually Linux has my favorite display scaling --- if you just have one screen.

I spend all day with Firefox, terminals, and Emacs in XMonad, and they all scale perfectly with no raster UI elements. (XMonad's pixel border I didn't bother to scale.)

On Windows and macOS, last I checked, there was a lot more resized raster bullshit.


Even with multiple displays it's not that difficult - just pass the scale you want for a given display when calling `xrandr` and things generally just work as long as you're downscaling from the "native" dpi you have X using.
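As a concrete sketch of that, downscaling a 1080p external next to a 4K panel might look like the following. The output names eDP-1/HDMI-1 are assumptions; check `xrandr --listmonitors` for yours:

```shell
# Render the external 1080p monitor at 2x so its apparent density
# roughly matches a 4K laptop panel sitting next to it.
# Guarded so this is a no-op outside a running X session.
if [ -n "$DISPLAY" ] && command -v xrandr >/dev/null 2>&1; then
  xrandr --output eDP-1 --auto \
         --output HDMI-1 --auto --scale 2x2 --right-of eDP-1
fi
```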


It's 2020, we can make rockets land on self-driven barges, and somehow we're still forced to use xrandr manually...? This is the sort of thing that keeps me from going back to the Linux desktop.


You don't have to. Ubuntu 20.04 and Gnome desktop provide a fine hidpi experience. There are a few warts, but it all generally works. If you start dabbling with other desktop environments like xmonad or i3, it gets tricky.


I can never get it to work properly with multiple displays. I have a 4K monitor and two 1080p monitors. Using 192 DPI with 2x scale (passed as a parameter to xrandr), all kinds of display errors pop up (wrong colors, weird fonts, etc).


Yeah, I wanted to configure HiDPI on Linux recently, when I had to work on an FHD 14" screen without an external monitor for a while. I found no config options in the UI, and guides on the internet involving xrandr and fifteen different steps. I thought hell no, abandoned that idea altogether, cranked up the font size in the terminal, and worked like that. And I actually "can" work in Linux and configure easier stuff. As far as I'm concerned, at least some distros/DEs have non-existent HiDPI support right now.


My configuration panel only has 2 options for scaling: 100% or 200%. Needless to say neither option is very helpful.


If in GNOME, fractional scaling is possible but not enabled by default. I use 125% and it has been fine. I did it via GNOME Tweaks, or you can manually enable it [1].

[1] https://www.linuxuprising.com/2019/04/how-to-enable-hidpi-fr...
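For reference, the manual route on GNOME amounts to flipping one experimental mutter flag; these are real gsettings keys, but the feature is experimental and subject to change. Pick 125% in Settings → Displays afterwards:

```shell
# Wayland session: let mutter offer fractional scale steps (125%, 150%, ...).
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"

# On an X11 session, use this key instead:
# gsettings set org.gnome.mutter experimental-features "['x11-randr-fractional-scaling']"
```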


There should be a slightly hidden custom option if you're using Windows (not in the drop-down).

You'll have to re-log for the scaling to take effect with custom numbers, for some reason.


You can only do integer scales though this way, can’t you?


As for me: I haven't figured it out yet; sometimes a fractional scale works, sometimes you get vague errors. In any case, what I have noticed is that quality drops significantly if you use fractional scales.

For my set-up at home (1080p - 4K - 1080p) I have a simple script that really was more of a `write once, use always` experience. E.g.

    xrandr --output HDMI1 --auto --scale 2x2 --pos 0x0+0+0
    xrandr --output eDP1 --auto --scale 2x2 --pos 7680x0+0+0
    xrandr --output DP1 --auto --mode 3840x2160 --scale 1x1 --pos 3840x0+0+0 --primary
So basically, I virtually upscale my other screens to 4K, such that system-wide zoom settings are (more or less) consistent and enjoyable.


No, with `xrandr` you can scale by any fractional amount. For example, I had two 24" external monitors (full HD, not 4K) and a 13" ThinkPad (a Chromebook, booted into Linux). Obviously, the respective DPIs were way off.

Just a quick calculation in a repl, then called xrandr with the right scaling number, and presto, I had them exactly the same.
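The "quick calculation" is just the ratio of the two panels' physical DPIs. A sketch with made-up sizes (a 13.3" laptop and a 24" monitor, both 1920x1080, so only the diagonals differ; the output name HDMI-1 is hypothetical):

```shell
# DPI = diagonal pixels / diagonal inches. The xrandr scale that matches
# the monitor's apparent density to the laptop's is the ratio of the two,
# which here reduces to 24 / 13.3.
scale=$(awk 'BEGIN {
  diag_px = sqrt(1920^2 + 1080^2)      # ~2203 px on both panels
  laptop_dpi  = diag_px / 13.3         # ~166 DPI
  monitor_dpi = diag_px / 24.0         # ~92 DPI
  printf "%.2f", laptop_dpi / monitor_dpi
}')
echo "xrandr --output HDMI-1 --scale ${scale}x${scale}"
# prints: xrandr --output HDMI-1 --scale 1.80x1.80
```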

Then, with precise lining up of the displays, it was basically one seamless "sheet of paper". If I had a long web page or pdf spanning the displays (the laptop was directly under the monitors), then scrolling through it was quite "trippy".

I think because this was now a unified display where the bottom 1/3 was tilted differently from the top 2/3rds (think of a curved ultrawide, but in portrait mode).

Honestly, it really was that simple.


If you want fractional scaling I’d suggest you look into wayland.


How do you use xrandr in Wayland?


You don't want to. Wayland-based GTK and Qt should do per-device DPI. XRandR cannot.


Pop!_OS (their own Linux distro) has DPI scaling built in. I can actually turn on 2x scaling, and while it's huge on the poor 1080p display, it works perfectly.

Also, a lot of the big Linux distros now have support for this. I know Manjaro and Ubuntu can do it with minimal effort.


Is there any end in sight for the pixel-doubling hacks? Why do we even use them anymore? Apps should have had plenty of time to adapt to higher-DPI displays by now; X11 has supported per-display DPI since the 90s, and a scale factor on top of this seems obviously wrong.

And for the niche case where you have a big disparity in DPIs across multiple displays plus apps that can't adapt to DPI on the fly, scaling up is surely the wrong solution: you should be scaling down on the smaller screen.


I've tried looking into this a bit at one point, and my understanding is that it's more a toolkit issue than an app issue.

Qt has an option to adapt the size of its widgets according to the reported screen resolution, which seems to kind of work. I don't use that many Qt apps, so I'm not sure how this turns out in practice.

However, GTK just doesn't care.

All this is on X11; I hear the situation is somewhat better on Wayland.


I'd guess in Wayland the compositor would be responsible for handling such issues.

I could readily see different filter profiles for different applications, some of which render, and take input cues, at scaled resolutions.

What I hate a LOT more are games and applications that make any sort of connection between Pixel Aspect Ratio and a display resolution.


I recently got a 2020 Mac mini and I can tell you without hesitation: Linux+Wayland has far superior scaling support than macOS.

macOS only allowed me to run my 3440x1440 display at lower resolutions, with no _actual_ scaling support. The font is around 3mm tall.

Meanwhile, Wayland scaling works wonders. The big issue is that Xorg-only apps are blurry (though that's recently been fixed in master).


I use Fedora on a 4K 15" display with an Intel GPU, and the scaling works fine. However, the Intel GPU is a tad underpowered for the task.


I use a Skylake on a 4K laptop with a 4K external monitor above it. It is plenty fast, with no tearing. Under Qubes, where app VMs don't get to use the GPU, mpv plays full-screen 1080p video using 3 cores.

(Oddly, the external monitor will do 4K only at 30 Hz. I don't know if that is a limitation of the laptop or of the monitor.)


The 30Hz thing might be because of an older HDMI version, like 1.4. I have had the same issue and it was a hardware limitation.


I have a 13" HP Spectre from mid-2017 with a 4k display. It's noticeably sharper (and brighter) than the 1080p alternatives. Prior to that, I had a QHD laptop from circa 2013. No font size issues on Linux that affected my work.


Ubuntu Unity, which I still use today, has had working fractional scaling (in 12.5% steps) for ages now. In fact, I'm pretty sure it had that before even Windows did. GNOME seems to have finally caught up as well, even with Nvidia drivers. Like it or not, 4K is the current standard and going backwards makes no sense.


4K is overkill at this size. 1440p is better for battery life, and you would scale 4K to an effective 1440p anyway.


I agree; I would be happy with 2K as well. But this doesn't appear to be offered in laptops for some reason.


1080p == 2K


1080p is 1920x1080 and 2k is 2560x1440


What? Who defines 1440p as 2k?

https://en.wikipedia.org/wiki/2K_resolution


I don't doubt it's used that way, but that makes absolutely zero sense.

For that matter, it's 1080p that's half the (linear) resolution of 4k.


2K is a generic term for resolutions with a horizontal resolution of approximately 2,000 pixels. In cinema, the DCI standard for 2K is 2048x1080; there is also the flat cropped resolution of 1998x1080 and the CinemaScope cropped resolution of 2048x858. 1080p has the same vertical resolution as DCI 2K, and its horizontal resolution is also approximately 2,000. 2560x1440 is 560 pixels more than 2,000, compared to 1920x1080, which is only 80 pixels away; that is 7x further from exactly 2,000. Calling 2560x1440 "2K" is too big a stretch. If you wanted to give it a name based on its approximate horizontal resolution, you should call it 2.5K.



The main problem is that it is low DPI. It means you can't really connect a high-DPI external monitor and then use both displays at the same time, because Linux can't (or at least couldn't within the last year) handle displays with different DPIs.


All my screens are 1440p and the same model, so there's that.


If you need a 4k matte display, consider Purism Librem 15.

https://puri.sm/products/librem-15


I have one, and it's an amazing laptop. However, as Purism is focused on privacy and free (as in freedom) software, they won't put in a GPU.

System76, for me, seems to strike a balance between coreboot and as much free software as possible, plus the addition of a full (proprietary-blob) GPU.

Also, the System76 CPU specs are slightly higher.


I could take or leave 4K, but at least 2K (~2560x1440)? Having their 2020 high-end 15.6" model max out at 1080p is just ridiculous, and I'd probably have bought at least one of their laptops if they had a sub-2kg model with just one step up in resolution.


Anything above 1080p doesn't give you any more usable screen space at that physical size. It's very debatable whether just having richer blacks is worth the battery life hit.


Speaking from experience, having played around with fonts, scaling, and DPI, I still feel a bit hampered switching back to my ThinkPad T470s (14" 1080p) after a long time on my T490s (14" 1440p). I do use pretty small terminal fonts.


Yes, it does. The market clearly wants higher-res screens; that's why "Retina display" has been a major marketing point for Apple for years, and you literally haven't been able to buy a MacBook with a 1080p screen for years. This is not even for tech enthusiasts or professional use; it's basic table stakes in the market now.


> you literally haven't been able to buy a Macbook with a 1080p screen for years.

You are technically right, but to be fair, it has been barely more than one year. Apple was still selling their 2017 model of the 13" MacBook Air, 1440x900, in July 2019:

https://web.archive.org/web/20190708170222/https://www.apple...


No, it doesn't. PC laptops are mostly 1080p precisely because of people like me, who only want 1080p because more pixels are pretty useless. Have you actually shopped for one recently? The top hits on Amazon are all FHD.

> "Retina Display" has been a major marketing point for Apple for years...

Exactly. And that's all it is - marketing, particularly for egoists who want to feel like they're better however they can. The vast majority of the market, mostly those who don't use Macs, don't give a shit.


Eh? It’s not useless and it’s not all marketing, and it’s certainly not all about feeling superior. I feel like you’re projecting.

It’s completely simple: I enjoy reading content on a higher res screen. It’s clearer and easier on my eyes. That’s all there is to it.


Right, because Mac users aren't generally known for carrying around this air of superiority by saying things like "everybody wants what I have with my Retina/M1/touchpad gestures/etc."

If you like the way text looks on a Retina display just say that. Instead you're here on a thread about a PC product that you'll probably never use, acting like anyone that doesn't have a 4k display must be missing out. (You're right, I must be projecting :) Personally, I like desktop rigs with 1080p, a 5 button vertical mouse and 32-64gb of RAM in a quiet private office. I don't optimize for working on planes, trains and in meetings even though I do own a PC and Mac laptop as well...

Look at all the top listings here though- they're all FHD. Nobody cares about 4K on a laptop display - https://www.amazon.com/s?k=laptop&ref=nb_sb_noss


I see two on the top page with 2k. But none with >16GB RAM and they’re all Windows - guess only egotistic people want Linux and 32GB RAM?

Some people do make use of a higher res screen, just because you don’t doesn’t make it useless. I haven’t wanted or owned an Apple computer for 10 years.


> ... none with >16GB RAM and they’re all Windows - guess only egotistic people want Linux and 32GB RAM?

Huh? Hardly anybody sells Linux laptops, mostly not on Amazon either. I think you'll be hard pressed to find many laptops pre-configured with 32GB RAM on Amazon too.

> Some people do make use of a higher res screen, just because you don’t doesn’t make it useless.

Okay, but the person I was responding to said that 4k is "table stakes". It's clearly not. Point being - not everybody wants a 4k screen. I didn't say that they were useless to everybody, I said that essentially "most people don't want or need them".


Most of the market just wants the cheapest thing that will allow them to chat on facebook and watch netflix.

If you are a professional and spend 10 hours a day at your computer, you will realize the quality of the screen matters a lot.

As a developer with 1080p at 14 inches, I cannot stand the ugly fonts you get.

It's not just about real estate. Just put a 1440p/Retina screen side by side with a 1080p one and you will see the difference.

It's OK if you value battery life, good for you. But saying there is no advantage is ridiculous.


I've been a professional programmer for over 20 years and I have tried many, many different screen resolutions. I have a 2015 Macbook Pro right here next to my Acer E5-575G with 15.6" Full HD (1080p) screen.

Comparing them directly, I have absolutely no clue what you are talking about. Fonts, images, and the UI look just as good if not better. I'll say better, since I think the Mac UI is hideous.

You're right though, it's definitely not about screen real estate, because with Retina/2K/4K the OS typically has to scale everything up, so you basically lose any real estate that would have been gained in the first place. That's ridiculous IMO.

The only place I use Retina/2k/4k is for gaming and movies and I definitely am not doing either of those activities on a lowly laptop.


It doesn't.

Retina displays are simply more dense but add nothing to the screen real estate. Nobody uses them at 1x; I bet many don't even know that every pixel they see is actually 4 pixels on the monitor.

Better to have a larger monitor than a 13-inch one packed with useless and expensive pixels.


Yeah, I've got terrible vision, and I can still tell the lack of sharpness at 15" 1920x1080.

Actually, the lack of sharpness makes it even harder to read.


AFAIK they just resell generic commodity laptops under their brand and do some System76-specific things to them, such as open-source BIOS firmware. I may be wrong, but when I looked into them years back that's what I uncovered. I think if they want to differentiate themselves as a high-end Ubuntu/developer machine, they're going to need the things people want (or don't know they want). My personal needs are high-res, bright, color-accurate, crisp displays (currently loving my Samsung Galaxy Chromebook) and high-quality keyboards. I don't really know of a device that has both.


It's crazy that many laptops are 1080p in 2020. Such low resolution is unacceptable to the majority of users.


And here I am with the 1680x1050 monitor... I agree though.


Yep, kind of a deal breaker for me, which is a bummer because this almost looks perfect for me. It seems this is a weakness across their product range.

AMD integrated graphics needs a bit of explaining to me: is this the performance-monster variety, or an "Intel graphics is actually faster" type of deal? How many external screens can I drive with this setup? My main use case for this would be using Darktable; for the same reason I want a good HiDPI screen. Otherwise, this would be a development machine (Docker, misc JetBrains stuff, Homebrew, some data engineering, and all the usual stuff). So I'm looking for RAM and a CPU that are faster than what I have today (2018 15" MBP).

My MacBook Pro is literally falling apart (keyboard) at this point, and I'm not eager to jump on the ARM bandwagon yet, as it would mess up my tool workflow short term (Java, Docker, IntelliJ, Homebrew) and cause me compatibility headaches I just don't need in my life right now. I am liking the performance, though. All the software that I use runs fine on Linux, so this is not a hypothetical option for me. I'd be up and running in a few hours with little to no loss of functionality.

I don't care about legacy ports; USB-C dongles are cheap and effective, and I have a few in my bag. Likewise with HDMI to USB-C. What I do care about is being able to connect my Thunderbolt/USB-C phone, Fuji camera, hard drives, external screen, etc. I'm thinking of adding headphones to that mix. Two USB 3.2 ports seems limiting.


I just picked up a Lenovo Yoga Slim 7. The only thing I can fault it on is the screen: the brightness is a bit anemic and there is some bleed. However, the performance is fantastic. I've been using it for image processing, and the 8C16T gets through my problems at the same speed as my desktop Ryzen 2700X. Similarly, the onboard GPU is sufficient for playing Witcher 3, at just slightly lower performance than my RX580.

It has 2 legacy USB-A ports, MicroSD, HDMI and a USB-C-3.2 which I've used to run a 4K screen at 60Hz.

Price was very reasonable. I'm happy, this has stepped up my laptop CPU performance by a factor of 4, the GPU can now do all the stuff I want from a laptop, and it's compact and has good standby life.


The Adder has a 4K display, and the Bonobo has the option.


But both are Intel/nVidia.


For a 14" display, I think that a 2k display is a better option.


That's only 1k in double-pixel rendering. Much less smooth.


Why would you get a 2k screen and then render a 1k image on it?


I bought an Oryx Pro about 2 1/2 years ago to get a laptop with an Nvidia GPU. I bought the high-resolution version, but sometimes run it at the lower 1920×1080 resolution.

I just looked and they don’t seem to offer the higher resolution right now on that model.

Anyway, it is a fine laptop.


I would prefer 2K res on a laptop. On a 15 inch screen, 4K is overkill - I have to increase desktop scaling to read text, so there's no increase in screen real estate. Higher res drains the battery faster as well.
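To make the scaling point concrete, here's a quick back-of-envelope sketch (the scale factors are illustrative, not tied to any particular desktop environment):

```python
# Effective "logical" resolution after desktop scaling: at 200% a 4K panel
# offers exactly the same workspace as a 1080p panel, just rendered sharper.
def effective_resolution(width, height, scale):
    return (round(width / scale), round(height / scale))

print(effective_resolution(3840, 2160, 2.0))  # (1920, 1080) -> same real estate as FHD
print(effective_resolution(3840, 2160, 1.5))  # (2560, 1440) -> some gain, if 150% is readable
```

So unless you can tolerate fractional scaling, the extra pixels buy sharpness, not space.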


This is by far the biggest reason I don't buy these.


My galp2 has the 4K display; I'm not sure why they changed. It's been great... well, except for tiny GRUB text.


> 1920×1080 FHD

Not even a 4:3 display! 1080 vertical pixels is almost useless for coding. It seems much of their target market of Linux enthusiasts would include a lot of coders wanting better than that.


The cow has long, long left the pen on that one. I haven't seen a 4:3 laptop in years.


Too true, for 4:3 at least. Though the Microsoft Surface is 3:2, and MacBooks are 16:10, which gives about 11% more vertical at the same width.


https://www.amazon.com/dp/B07FRPL763/?coliid=I3UN9SRZOFDBAD&...

There you go. 3000x2000 on a 13.9" screen with a reasonably current processor. They have a few updated variants out there. (Was shopping for a 3:2 laptop when Dell finally got around to shipping a 16:10 7" display)


I love that expression! I only knew about the ship that had already sailed...


Other fun related idioms:

"The genie is out of its bottle."

"The horse has left the barn."

"You can't unring a bell."


Thanks! I saw this over a week later.


Works pretty well with a tiling window manager.


Sure, it can work, but it's certainly not for me, since I believe it's a poor default aspect ratio for anything but watching movies: it limits the vertical viewing area significantly.

I like the way one blogger describes a 1440p vs a 1080p: “A 1440p monitor will give you 3x code windows side by side and 76 lines of code. This is a pretty big deal. You might not realize it until you’ve tried it for a while but being able to comfortably have 3 editor files open and being able to view about 50% more vertical lines of code at a glance is huge.” ( https://nickjanetakis.com/blog/how-to-pick-a-good-monitor-fo... )
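A rough sanity check of those numbers (the ~19 px per line is my assumption, not from the blog post):

```python
# Estimate visible code lines from vertical resolution: assumes ~19 px per
# line (font size plus spacing) and ignores window chrome such as tab bars.
def visible_lines(vertical_px, line_height_px=19):
    return vertical_px // line_height_px

print(visible_lines(1440))  # 75 lines on a 1440p panel
print(visible_lines(1080))  # 56 lines on a 1080p panel
```

That works out to roughly a third more lines rather than the quoted 50%, but the exact gain depends on your font and line spacing.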

Though someone else mentioned they have a 4K version, not just a 1080p version! That can work better with the right scaling. The rest of the laptop looks nice.


I also use a mac pretty regularly and a 1440p desktop. I was getting at the aspect ratio issue, which it seems like you were talking about first, before now talking about resolution (?). Yes, a higher resolution can fit more pixels. The absolute actual size in inches also matters though for how big the text ends up appearing.


Pretty much every IDE now assumes a wide screen and defaults to a layout with a central editor and navigation/tool sidebars on the sides. And it actually works out great!

I don't think a majority of Linux enthusiasts code mainly in vim or emacs anymore.


I'm not convinced on this.

Both Vim & Emacs have hardcore followings and buoyant communities.

Neovim + coc.nvim is massive these days... I know a lot of people who've moved from VSCode (or other big IDEs) over to terminal-based editors...


No idea. I still use emacs because it's still the most programmable editor. I had some hopes for Atom, but after Microsoft bought GitHub... seems dead-ish.



