This is like the Framework laptop but it's a UK-based company that has been around for years, and with an explicit focus on Linux compatibility to boot. I'm surprised I haven't come across them before as I've been looking for Linux-compatible laptops for years.
They do some things better than Framework such as supporting Ryzen processors, and seem a bit cheaper overall. Battery life also looks like it would be better. They have a spare parts store, a full disassembly guide, and an "open warranty". I was never a fan of Framework's swappable ports.
The biggest disadvantage seems to be that the screen is bog-standard 16:9.
I'm curious about the quality of the speakers, webcam, keyboard, and trackpad. I have a feeling they will not be great, at least in comparison to a MacBook.
From reviews it looks like the trackpad is poor because it is not whole-trackpad clickable, and the keyboard is also poor due to the short key travel.
Are there any other decent alternatives available in the UK?
> The biggest disadvantage seems to be that the screen is bog-standard 16:9.
Bigger than that IMO is 14" @ 1080p. It's just too little for daily use for me. Framework are so far the only ones in this market segment who have ever marketed >1080p laptop displays AFAIK. I have a hard time understanding why there is such a lack of higher-res panel options across the market, even when seemingly all other relevant specs are high-end.
I'm sure, like usual when this comes up, some people will reply with "you can't make out the difference anyway", which is just not true. Some of us prefer smaller fonts and higher information density; an 8px font size at 1080p is just less ergonomic and more tiring for the eyes. For 16:9 14" I just won't consider anything below 1440p.
* Asus Vivobook S 14X OLED - 14.5" 16:10 2880x1800 550-nit 100% DCI-P3 120Hz OLED
I'm also keeping an eye out for the next gen refresh of the Tuxedo Infinity Book Pro 14 - they use a 16:10 2880x1800 400-nit display as well (these are a TongFang ID4H1, and XPG and some other OEMs also use a similar chassis).
My last two company laptops were a Lenovo X1 Carbon G8 and a Lenovo X1 Carbon G9.
These machines are so reliable and resilient; no matter how much you travel, they are practically indestructible. I definitely recommend them.
I would not. I've had an X1C7 for three years now. Numerous Linux driver issues (especially audio), issues restoring from sleep, had the screen replaced twice and the trackpad replaced once.
To be fair if you get the on-site repair warranty, that is pretty great.
The audio problems with the C7 were unfortunate, but that was more on Intel for force-pushing the change with that CPU generation while nothing was ready. I got mine 6 months later, right when things were starting to work. I never had much trouble resuming from sleep.
I am currently still using my old X1C2 from 2014, which is quite reliable by my standards. I would prefer to stay with Lenovo due to reliability and good Linux support (there is a dedicated engineer for this who is available on their forums and who helps various distributions, including Debian and Fedora, make things work). I was waiting for the Gen 10, but battery life seems disappointing on Windows, so I am waiting for people to test it on Linux. The Gen 9 only has FHD+ or 4K+ (too low and too high).
I'm also looking for a new Linux laptop for this spring. I thought about the Tuxedo laptops, but after reading around there are a lot of complaints about quality and even Linux driver issues.
I've been pretty disappointed with my Lenovo X1 Carbon and don't want another Chinese laptop, so I'm thinking to go ASUS. Currently maybe one of:
- ASUS Vivobook Pro 14X, 14" OLED, 600 nits (peak), 3K, 90Hz
- ASUS ROG Zephyrus G14, 14" LCD, 500 nits, 120Hz, 2560x1600, AMD GPU
I personally try to avoid Nvidia dGPUs since I usually don't need CUDA and they tend to be a PITA even w/ DKMS drivers and are terrible for battery life.
I think the thing with the currently available Vivobook Pro 14Xs (M4700/N7400) is that they're last-gen Ryzen 5000 and Intel 11th-gen chips. Even though I'd like to try out the OLED display, I'd probably go w/ the G14 atm - Ryzen 6000, and you also get 1 unsoldered DIMM slot, so you could probably get yourself up to 40GB of DDR5.
While Asus doesn't give a crap about Linux, there's a pretty active community (focused mostly on the G/M ROG laptops, maybe another good reason to go w/ the G14): https://asus-linux.org/
It seems like these days there are always s0ix/S3 or other power drain issues w/ non-validated Linux laptops though. A good reason to support companies like Tuxedo, System76, Slimbook, Starlabs etc that at least make some effort to ensure their EC/BIOSes actually play nice.
I agree Linux support would be ideal, but hardware quality is of first importance; many of those companies are just rebranded Clevo.
Until recently I agreed about absolutely no Nvidia, though it might be improving. For a few months now the drivers support the standard Wayland interface GBM, which at least means any compositor should work with them.
I agree you want good QC, support, and product design/features, but the various Lenovos, HPs, and Asus machines are just ODM'd, manufactured by Wistron, Compal, Pegatron, etc, so it's not all that different, except the big brands pay for more exclusivity. As far as repairability goes, you'd be better off with more standard parts, and honestly, lately some of the white-label chassis like the Clevo L141MU or Tong Fang ID4H1 are as good or in many ways better (lighter, unlocked BIOSes, more upgradeable (SODIMM slots!), better cooling, bigger batteries) than what many of the big OEMs have been putting out.
Not all configurations on these white labels are the same though, since OEMs can usually choose what quality of display panel, trackpad, etc they want to use...
My concerns are more with chassis quality. Having a good keyboard means having a rigid body. Many reviews of Tuxedo mention mushy keyboards. Even the keyboard on the Stellaris, which they highlight, doesn't fare well in reviewers' opinions.
I agree most OEMs are bad, but the specific ASUS models I've mentioned, along with Thinkpads in general, seem to have better keyboards. Sadly Lenovo is making them worse every generation (with less travel).
Hey, I'm the one complaining about the lack of options - the Slimbook Executive seems to check all the major boxes and it's the first time I hear of them. Do you have any experience or anecdotes with it?
It looks like the same Tong Fang ID4H1 chassis that the Schenker Vision 14/Tuxedo InfinityBook Pro 14 use (also the XPG Xenia 14, I believe). If you do a search online, NotebookCheck, YouTubers (including some Linux-oriented ones) and various subreddits will probably have reviews. Basically it looks good to me, but ideally for performance/efficiency I'd wait for the 12th-gen refresh (this particular chassis is Intel-only, although TongFang has other AMD models).
Below 45W, Ryzen 6000 outperforms, and its battery life can be up to twice as good as Intel 12th gen in idle/low-intensity tasks like web browsing, so for a thin-and-light, my hope is to see a decent AMD model come out in the next couple of months.
BTW, a couple years ago I did a review of one of the first 4800H Tong Fang systems (versions of this have been available as the Schenker VIA 15 Pro, Tuxedo Pulse 15, KDE Slimbook, and Eluktronics Thinn 15): https://www.reddit.com/r/AMDLaptops/comments/hunyv6/my_mechr...
It was pretty thorough and I answered a lot of questions, so it may still be relevant. Also, one of the things that I liked about the Ryzen laptops is that while it mostly wasn't possible to undervolt per se, enthusiasts have done a great job documenting mobile Ryzen's power and thermal behavior, and you can basically script it to behave how you want, when you want: https://github.com/FlyGoat/RyzenAdj/wiki/Renoir-Tuning-Guide
I use a toggle that runs `ryzenadj -f 50` for example, which allows full turbo speeds but hard-throttles to keep my temps right below my fan hysteresis point, keeping the machine completely silent. This tends to be my favorite run-mode on battery (I have it attached to a udev script that fires when unplugged).
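For anyone curious what that looks like in practice, here's a rough sketch of the kind of udev hook I mean (the rule match, paths and script name are assumptions that vary per machine; `ryzenadj -f 50` is the Tctl temperature limit flag, so the chip throttles itself before the fans need to spin):

    # /etc/udev/rules.d/99-quiet-on-battery.rules (illustrative; adapter attribute names differ between laptops)
    SUBSYSTEM=="power_supply", ATTR{online}=="0", RUN+="/usr/local/bin/quiet-mode.sh"

    # /usr/local/bin/quiet-mode.sh
    #!/bin/sh
    # Cap the Tctl limit so the SoC throttles itself below the fan hysteresis temperature.
    ryzenadj -f 50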
> I'm sure, like usual when this comes up, some people will reply with "you can't make out the difference anyway", which is just not true.
It is true for those people. No doubt that's not true for you, but you're not everyone. With fairly small screens (14" is small compared to a 24" desktop monitor) I genuinely don't notice the difference. I'm sure if you put two screens next to each other I'd be able to spot differences, but in daily use? Not really. I used to have an employer-provided 15" Dell XPS with some >1080p resolution (I forgot which exactly) and I really didn't notice.
There are downsides, too: the computer has to work harder to render all those pixels and the battery life is shorter; my battery life was noticeably shorter than my previous almost identical XPS with a "normal" 1080p screen, so it's a trade-off.
Having an option would be nice for those who care about it of course, but I suspect a large section of people just don't care, and especially as a fairly small shop you can't do everything.
> There are downsides, too: the computer has to work harder to render all those pixels and the battery life is shorter; my battery life was noticeably shorter than my previous almost identical XPS with a "normal" 1080p screen, so it's a trade-off.
Makes me wonder how Apple manages to build laptops that run for hours and have a high-DPI screen, then?
Don't confuse Dell dropping the ball on hardware and software integration with trade-offs.
There are upsides to high DPI too: you can turn off anti-aliasing, since you won't notice the jagged edges when the pixels are too small to see.
For me personally, I definitely do notice the difference; I just don't care about it enough for it to be worth the downsides you mentioned (plus the downside of worse FPS when gaming).
I notice the difference so much that I have a visceral reaction to low res displays, almost claustrophobic as if a screen door is covering my eyes. Can't stand them.
I can live with the lower resolution, as my eyesight is getting worse, but not with the 16x9 aspect ratio. 16x10 (1920x1200) feels so much more productive.
My main monitor is actually 16×10 1920×1200, and at 24" I find it's kind of too low of a resolution: I wish it was more. But it was fairly cheap (second-hand/refurbished from a reseller) and the intersection of "16×10", "not too large" (I don't want a 32" behemoth), and ">1080p" is basically zero, so I had to compromise somewhere, and given the low price point and that I couldn't be arsed to figure out all the display tech and whatnot for the new screens it was an easy choice.
But 16×10 seems to be making a bit of a come-back, at least for laptops.
> We're all different, with different needs and preferences.
Yeah sure; actually, it just so happens I more or less said the same thing in a different comment a few days ago (very different topic, but similar sentiment): https://news.ycombinator.com/item?id=30984309
I mostly just wanted to hook in on why these kind of systems still frequently come with a 1080p screen.
> It is true for those people. No doubt that's not true for you, but you're not everyone.
While what you say is probably true I just wish it was not the constant narrative around laptops that have nice screens. I bought a Samsung Galaxy Chromebook and absolutely loved the 4K OLED screen it had up until the day its WiFi suddenly died. Everything looked amazing on it - even just the boring ChromeOS UI elements looked amazing. However every reviewer had the same feedback - that you can't tell the difference and you should shoot for 1080.
> Framework are so far the only ones in this market segment who have ever marketed >1080p laptop displays AFAIK.
This is weird to me because I feel like I have the opposite problem. I don't want retina displays: I derive no joy from them, they eat more power for that lack of joy, and they cause problems with differential scaling if I plug into a lower-res desktop screen. I find I can't get high-quality laptops that don't force me to have a too-high-density screen.
I have settled for the Framework because everything else about it is perfect for me, but I kind of hope someday they sell a replacement non-hidpi screen for it, and I will very much buy it. The possibility of those kinds of choices down the line is why I bought into it.
1440p is perfect for a 14". It has a close enough dot pitch to my 4K 27" displays that I don't have to do any weird scaling when connected to external monitors. My other laptop is 4K, and it's a pain in the ass to tune it so that things are not too big or too small across displays.
Isn't 1080p @ 14" almost exactly the same as 4K @ 27"? It's 157 ppi vs. 163, whereas 1440p @ 14" is 209 ppi. And 4K at 200% scaling is just sharp 1080p. I guess it depends what scaling you're using on the 27" 4K display, but if it's 100% it should be near perfect, unless scaling isn't working the way I think it does.
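If you want to sanity-check those numbers, the arithmetic is just diagonal pixel count divided by diagonal inches; a throwaway shell helper (using awk for the float math) gives the same figures:

    ppi() { awk -v w="$1" -v h="$2" -v d="$3" 'BEGIN { printf "%.1f\n", sqrt(w*w + h*h) / d }'; }
    ppi 1920 1080 14   # ~157 ppi (1080p at 14")
    ppi 3840 2160 27   # ~163 ppi (4K at 27")
    ppi 2560 1440 14   # ~210 ppi (1440p at 14")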
You need to set up various environment variables to scale up per app and framework, but once you do it should work across both the monitors and the laptop.
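For the record, a sketch of the usual variables (which of these you actually need, and the exact values, depend on your desktop environment and apps; the factor of 2 here is just an example):

    # ~/.profile or wherever your session picks up environment variables
    export GDK_SCALE=2                      # GTK apps: integer UI scaling
    export GDK_DPI_SCALE=0.5                # GTK apps: compensate so text isn't scaled twice
    export QT_AUTO_SCREEN_SCALE_FACTOR=1    # Qt apps: scale from the screen's reported DPI
    #export QT_SCALE_FACTOR=2               # Qt apps: or force a fixed factor instead
    export _JAVA_OPTIONS='-Dsun.java2d.uiScale=2'   # Java/Swing apps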
> Framework are so far the only ones in this market segment who have ever marketed >1080p laptop displays AFAIK.
Dell has sold their Linux-ready "developer edition" XPS 13 with a UHD+ screen for quite a few years now. (They max out at 16GB of RAM, though, unfortunately.)
Dell isn't quite as "open" as something like Framework or System76 (or Star Labs), but they at least sell their Developer Edition laptop with Ubuntu preinstalled, at your option, and claim the hardware is supported properly in Linux.
(I say "claim", because I have the 2018 model, with a fingerprint scanner with no Linux driver.)
The post-Ice Lake models of the XPS series all support 32GB of RAM on the Developer Editions; I have one. Before that it was Intel's fault, because they were behind on their memory controllers; nothing Dell could do about it.
Fingerprint reader support for Linux in general has been a sticking point for a very long time, including on the XPS series. (I still fondly remember the blobs needed for my old ThinkPad's reader to work...) But to my knowledge it's basically the only thing that requires blobs. I don't even bother with it on my stock Fedora install, though it is quite slick in Windows, I admit...
Ah, strange. I'd just looked at Dell's site, but couldn't configure the Developer Edition for anything over 16GB. Probably my fault I couldn't figure it out... plus their order/configure site is terrible.
I had an XPS. The battery swelled and the charger stopped working, then the replacement charger stopped working. The keyboard and trackpad weren't great either.
I don't get this either. I picked up a cheap Samsung Galaxy laptop with Manjaro a while ago to replace my broken MacBook Pro until I can get it replaced (getting the new 14" Pro for work).
The Samsung is fine, but the screen is very mediocre in comparison. The same dreadful 1080p screen that world + dog insists on these days. Gnome makes it more cramped by insisting on a top bar thingy. I'd love to have a much higher resolution and brighter screen. 16:9 is just way too claustrophobic. And why do window managers insist on this silly top bar these days? It just eats into already limited space. Currently using Gnome, and of course the one extension that fixes that (auto-hide the damn thing) promptly broke when I updated Gnome.
Writing this on my good old iMac 5K, the original one from 2014. Now that's a nice screen. My 2017 15" MacBook Pro had the infamously shitty keyboard, which actually ended up breaking the very nice retina screen by virtue of a loose key that inserted itself between the keyboard and the screen when I closed it. So something that should not have been falling apart actually fell apart and did maximum damage. Absolutely disgraceful. I'm glad they ditched that design.
The Samsung, at a quarter of the price, manages a nice keyboard (with numeric keys even), a passable touchpad (multi-touch but mechanical click, sadly) and even a nice aluminium cover. If it weren't for the screen, I'd call it a superior deal. About as fast, same amount of SSD/memory, and it runs a lot cooler (i5 with Xe graphics). Also, no thermal throttling because it just does not overheat. But at this price, I'm not complaining. This laptop with a better screen would be an awesome deal. Somebody needs to start doing this. 16:10, 4K would be what I'd spend money on.
>I have a hard time understanding why there is such a lack of higher-res panel options across the market
There are plenty of options, it's just that most of them aren't cheap or don't fit the targeted price range.
There isn't a lack of high-res panels in supply, but a lack of high-res panels in demand. Or in other words, no one apart from Apple (not that Apple gave its users a choice, but let's ignore that for a moment) has managed to market Retina or high-res panels and create enough demand to sustain a different upgrade option or SKU.
This then cycles back: without economies of scale, not just for one vendor but for the whole industry, high-res panels do not enjoy the unit cost reductions of standardised low-res panels (or 1080p panels, which aren't really "low-res"). So the cost of these specific panels is far higher, increasing BOM, increasing RSP, etc. etc.
The next question that always comes after this answer is:
>"But we have higher prices SKUs, it is not like there isn't a demand for current SKUs +$100 / 200"
Yes. But given the option to choose between paying an extra $100 / $200, the market tends to favour more memory, a faster CPU or a better GPU. Not a higher-res panel.
8px font size works fine with good hinting. In general, 1080p would be a near-optimal resolution given pixel-perfect rendering. Higher resolutions are useful when the rendering is too fuzzy, they're simply about hiding that imperfection.
Come back when you have tried that with CJK. There is no font or rendering that can make that comfortable to read.
If it works for you, great. I'm just tired of people telling me I don't understand what I want, and I shouldn't have to write a blog post justifying myself every time I claim that higher screen res makes a huge difference for my use case. It's not like the Chinese language is an edge case.
Sure, but nobody is using 8pt CJK fonts. Even with a 5K panel, you wouldn't be able to resolve their shapes without the equivalent of a magnifying glass (i.e. focusing on a smaller section of the screen). Talk about tiring for the eyes.
*px, but yes, precisely my point (wouldn't be so sure about "nobody", though...). At higher resolutions you get more pixels in the same physical size. I find 10~12 all good, depending on circumstances.
But physical size is exactly what matters wrt. resolving a shape. Already at 1080p, the pixels are too small to resolve individually when comfortably looking at the screen. So there's really no benefit to such tiny character sizes.
> Already at 1080p, the pixels are too small to resolve individually when comfortably looking at the screen.
No, and people perpetuating this myth is what annoys me - that physical size is comfortable for me and that's what I use on a daily basis.
Try an 8px font of full-width Chinese characters on a 14" 1080p panel next to the same effective physical size on a 4K one, and if you still tell me you can't tell the difference, well, I guess we have different eyesight and preferences.
The so-called perfect rendering you're referring to is just infinite layers of hacks that rely on the subpixel layout of your panel to appear smooth. Monospaced, unaliased bitmap fonts will look great, and I use them too, but most fonts out there are not designed to be pixel perfect but instead to be rasterized at different sizes, and for that, higher resolutions do help.
I do not count Thinkpad as open enough to be in the same segment anymore. I have owned many generations of them and some are still in use but will not be getting a newer one the way things look now.
One practical aspect is how inconsistent they are in making crucial firmware updates available. In theory, they're on fwupd/LVFS. In practice, that can lag behind by months or even years, and often the only way to get the updates needed for security fixes or to get certain hardware working is to boot into Windows.
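For reference, when the firmware actually is published on LVFS, pulling it from Linux is just the standard fwupd workflow; the catch is whether your model's updates show up there at all:

    fwupdmgr refresh       # fetch the latest metadata from LVFS
    fwupdmgr get-devices   # list devices fwupd can update
    fwupdmgr get-updates   # show available firmware updates
    fwupdmgr update        # download and apply (some updates are staged for the next reboot)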
It is only the trackpoint that keeps me locked to ThinkPads. There is IMHO clearly a user base that would be willing to move away from Lenovo if only they could get an open replacement for that single part...
Yeah, me too; the trackpoint keeps me on ThinkPads for my daily driver. I have experimented with others like the Pinebook Pro, but (underpoweredness aside) the lack of a trackpoint kills the deal for me...
> In practice, that can lag behind by months or even years, and often the only way to get the updates needed for security fixes or to get certain hardware working is to boot into Windows.
I feel like I get updates fairly often with my P50 on Ubuntu, and it's 5 or 6 years old at this point.
My gen 9 x1 carbon is the best Linux machine (which includes desktops) I've ever owned.
I wouldn’t use a screen larger than a phone (7”) at 1440p. 4K is the minimum for laptops. Unfortunately 4K is also the maximum for desktops since I can’t find any 5K panels with a decent refresh rate (>=120Hz)
The key is to just stop caring about absolute pixel count. All that matters is the angular pixel density, with regular pixel density being sufficient when compared within a “class” of viewing distance (e.g. handheld, laptop, desktop, couch).
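To make "angular pixel density" concrete, here's a rough back-of-the-envelope calculation (small-angle approximation; the viewing distances are just assumed typical values, not measurements):

    ppd() { awk -v ppi="$1" -v dist="$2" 'BEGIN { pi = atan2(0, -1); printf "%.0f px/deg\n", ppi * dist * pi / 180 }'; }
    ppd 157 20   # 14" 1080p laptop viewed from ~20 inches -> ~55 px/deg
    ppd 163 30   # 27" 4K desktop viewed from ~30 inches   -> ~85 px/deg

So a panel with roughly the "same" ppi can land at a higher angular density simply because you sit further from it, which is why comparing within a viewing-distance class is the fairer comparison.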
Regarding laptop speakers (in general) Dave2D recently uploaded an interesting video [0] where he compared MacBook speakers to best Windows laptop's ones that he had in the studio.
It really does make you wonder why there's such a noticeable difference!
> They do some things better than Framework such as supporting Ryzen processors
For some weird reason, if you pick a non-English keyboard you can only pick the i7-1165G7 (which IMHO is the worst of the three options available if you pick an English keyboard).
I got one of these in December 2021 and frankly love it. Got fed up with being pretty much the only person on earth who had a shitty experience with Apple Silicon, so jumped back to Linux after 17 years away. I don’t use it as a portable, or at least haven’t yet - it’s my daily driver for work and permanently plugged in to a monitor (which also is a USB hub). I opted for Manjaro and enjoy using it way more than I did the M1 macbook pro it replaced.
I was originally planning on waiting until like M2 but ended up needing an upgrade sooner than I had hoped for both my personal (2011) and work laptops (2016) and just have to say that M1 has been awesome so far. Snappy, low power consumption, much better cooling...
I was on an Intel MBP previously and everything I was using it for is supported just fine on M1. Some things require Rosetta but most big apps have updated support for M1 directly at this point.
Our docker development environment was based on x64 but it was pretty trivial to add support for ARM to it.
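For anyone facing the same migration: for a typical setup this mostly amounts to building multi-arch images with buildx (the image name and registry below are placeholders; cross-building on an x86 host may need QEMU/binfmt support installed):

    # one-time: create a builder instance that can target multiple platforms
    docker buildx create --use --name multiarch
    # build one manifest covering both x86_64 and Apple Silicon, and push it
    docker buildx build --platform linux/amd64,linux/arm64 \
        -t registry.example.com/myapp:latest --push .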
Gaming can be a challenge (when hasn't it been with the Mac, though); GamePass games in particular (through Parallels Windows 11) seem to be hit-or-miss. Games through Steam, even virtualized through Parallels, have been fine though. On my M1 Max w/ 64GB memory I can play everything I've tried at ultra settings.
> Our docker development environment was based on x64 but it was pretty trivial to add support for ARM to it.
A couple of examples I've run into that weren't trivial: Chrome on Linux doesn't support ARM (Chromium does), nor does Microsoft's SQL Server Docker image (though their Azure version of SQL Server does, and for most uses there's no real difference).
So I changed jobs in May 2021. I had been a happy Intel Mac user since the Intel switch (my first Mac was a PowerBook). My new employer supplied me an M1 MBP and I was fucking delighted, since I didn't have the means to buy my own at the time.
Contrary to other comments later, I have no niche needs. Well… I’m predominantly a Perl coder, so that’s niche but nothing to do with architecture. I spend most of my days with just iTerm, a couple of browsers, Slack. Not much else. I’ve been an edit-over-ssh developer for 20 years.
My M1 was also my first experience of Big Sur, so I was never sure whether my issue was hardware or software. But issues I had:
• frequently it would stop all ipv4 networking, going ipv6 only (which in practical terms meant I went offline)
• I would often - multiple times a week - lose all sound devices. Suddenly things would go silent, and all inputs/outputs (including the built in speakers/mic) would disappear from preferences. For a while only a reboot would fix it, until I learned to `killall coreaudiod`
• chrome would often say “you don’t have a camera”, which I could only fix by a reboot (we use Google Meet, and safari was always terrible with it. But I just prefer chrome anyway)
I had a variety of other problems but can’t recall them clearly enough to list them, especially from this temporal distance.
[EDITED to add: I did go through hardware checks and company IT checks, but nothing ever came up as faulty]
I did love that it was silent and permanently cool, but that was about it. Since it too was always plugged in, any difference in trackpad or keyboard or battery life made zero odds to me, and that’s why those things don’t matter to me about the Starbook either.
When Monterey came out I upgraded and actually things got mostly (but not entirely) better. But by that time I’d ordered the Starbook, and the desktop Linux experience has been so good - not to mention just a refreshing change - that I don’t regret it one bit. And if I do, well, the work MBP is sat in the corner and I can resurrect it…
Ya, I'm curious too. We have developers at every level of our stack using M1 Macs and nobody has anything but good things to say. I suspect it is an edge case that doesn't affect a lot of users, which is a reasonable complaint and also somewhat expected when switching silicon paradigms.
The only downside I've seen is that I can't seem to tax it, which has made me lazy -- I now have hundreds of chrome tabs open over dozens of windows and it hasn't skipped a beat.
I'd understand this position back when the M1 was first launched, but I'm not sure why anybody would still hold it now unless they have obscure or niche needs. Switched to an M1 MBP a few weeks back and the only issue I've found so far is Parallels won't attach my Xbox controller to Windows 11 Arm for some reason. I'm very happy otherwise.
The MacBook Air requires you to get a DisplayLink hub to support multiple monitors.
The MacBook Pro eliminated this limitation, but it almost feels like it's using the same technology, as it often loses window locations.
Early on software support was limited, but for many apps, that's been resolved. (Even "unsupported" apps can still run, though I typically don't jump through many hoops, especially when I run into Docker images that don't support ARM)
Me as well - I am super happy with M1. Honestly have never experienced anything that is different from running on Intel - and it is light years faster and quieter and cooler.
Tangentially related: I remember reading the Asahi Linux wiki [0] and was shocked by this statement about Apple Silicon machines. It's fascinating that Apple has a reputation for being anti-libre, yet the hardware they built is arguably very libre-friendly.
> This puts them somewhere between x86 PCs and a libre-first system like the Talos II in terms of freedom to replace firmware and boot components; while a number of blobs are required in order to boot the system, none of those have the ability to take over the OS or compromise it post-boot (unlike, say, Intel ME and AMD PSP on recent systems, or the DMA-capable chips on the LPC bus running opaque blobs that exist on even old ThinkPads).
> From a security perspective, these machines may possibly qualify as the most secure general purpose computers available to the public which support third-party OSes, in terms of resistance to attack by non-owners.
>yet the hardware they built is arguably very libre-friendly
Well, they're already in legal trouble for abusing their iOS market dominance under the newly proposed EU legislation. They know the tides are turning against big tech, so they refrained from locking down the MacBooks preemptively, bracing themselves for what's coming, not because Apple has somehow become a FOSS supporter.
Before you applaud Apple for being so libre and think the company has changed direction, keep in mind this is the same company that ships recently launched monitors without user-replaceable power cords and locks out, in firmware, the possibility of upgrading the SSDs on the very expensive Mac Studio, despite teardowns showing that it's physically possible for end users.
Also, as a curiosity, why does every thread related to x86 news need to bring in the Apple M1 fan army banging their drums? They serve completely different markets. For consumers or companies that need to run x86 Windows/Linux binaries that have no Mac ports, M1-based hardware is off the table from the start. And the device in this topic is designed to cater to that market, not to compete with the M1.
> Also, as a curiosity, why does every thread related to x86 news need to bring in the Apple M1 fan army banging their drums?
Because this thread isn't about x86 news; it's about a laptop. And Apple makes laptops, which have the M1 chip in them. It's completely fair to discuss them, especially because one of the selling points of TFA is that it supports coreboot.
> Also, as a curiosity, why does every thread related to x86 news need to bring in the Apple M1 fan army banging their drums?
Honestly? Efficiency & battery life without sacrificing speed/performance.
If you need great battery life (or simply don't want to have to worry about charging your laptop frequently), it's hard to swallow ~4-10h battery life of a typical X86 laptop when MBA/MBP are consistently achieving 10-16+ with real-world use.
...not to mention the other perks MacBooks hold over their competitors, e.g. fit/finish, display quality, speakers, etc.
>They serve completely different markets.
Do they, though? With so many things being web-based these days, it's not very hard for most people (without specific needs) to switch to an M1 as their primary device without experiencing problems.
> when MBA/MBP are consistently achieving 10-16+ with real-world use.
Really? I have a 2019 MBP and it doesn't last more than 3 hrs on a 100% charge. The only usage is web development. And this is my third MBP (the old ones are dead because their logic boards died after the expiry of the 1yr warranty + 2yr AppleCare plan).
The battery drains pretty quickly if I attach it to an external monitor.
Also, the evil `kernel_task` process eats up all CPU cores (for minutes, and the machine remains unusable in the meantime), which drains the battery within an hour if I don't keep my table fan pointed at it (probably because Apple doesn't give a sh*t about the products they sell in India and how they perform at 30-45 deg Celsius, which is normal across the Indian subcontinent).
Then there's something physically wrong with it. My M1 MBP - which is near-constantly at 100% CPU doing neuroimaging tasks - gets about 9 hours. If I'm not running high-cpu tasks and just web browsing, I've gotten 13-18 typically (e.g., two full days of use between charges).
>If I'm not running high-cpu tasks and just web browsing,
You state this as if they are 2 different things. Sadly, more and more shittily designed sites are using more and more resources. Whether that's just a poorly written bit of JS or a maliciously written bit of JS, web browsing is becoming more compute intensive.
I believe the M1 is much better at running JavaScript as well. Comparable Intel x86 CPUs have 60% more branch mispredictions during JS benchmarks, and ARMv8.3-A added the processor instruction FJCVTZS (Floating-point Javascript Convert to Signed fixed-point, rounding toward Zero).
Thanks especially to the better branch prediction, the M1 simply does quite a bit more web browsing with fewer CPU cycles used, and thus less power consumption.
Let me guess: the device is over two years old, your display is always above 70% brightness, you have Chrome open with over a dozen tabs, alongside a bunch of other background apps you don't actually need open. Oh, and probably a VM or Docker. How close was I? ;-)
The only one that matters is if their MBP is over 2 years old. I got my M1 the week it released and I only run it under the conditions you describe. Sometimes I accidentally leave Baldur's Gate 3 / Stellaris / Civ6 running in the background.
It still gets "I showed up to an all-day working session with a friend at 50% charge, forgot my charger, and have no anxiety about that" levels of battery life.
I think the least it’s ever gotten was around 10 hours. I’m still getting 14+ after a couple years of abusing the battery charge levels.
The iPhone 13 Pro Max is also the first phone I truly never worry about battery life since a late-stage flip phone in 2007. (Or the Samsung S5 Active which had a replaceable battery, I just carried a couple extra around in my pocket).
>The only one that matters is if their MBP is over 2 years old
I wasn't just asking because of the potential wear on the battery from abuse, but because M1 laptops did not exist 2 years ago. Although the Intel MacBooks still had great battery life for their size. People who put their device through its paces and wonder why the battery is draining so fast are a special breed, that's for sure.
What use are all those perks to those of us whose apps for work and play are not compiled for M1 nor web based? Meanwhile my "inferior" x86 machine can run everything I need to use.
Why do people insist you need to buy a Ferrari when your work or lifestyle requires an F-150?
Therefore, back to my original question: why do M1 fans need to go off topic on every x86 article and spray the same things over and over again ("but muh' Geekbench score", "muh' excellent battery life", "muh' no fan noise") and insist you're wrong for choosing x86, even though you don't have a choice because of the architecture?
We know the pros, OK? But some of us still need an x86 machine like the one in the article, regardless of whether such machines are for you. And if they're not for you, that's fine, but why always bang on about the M1 everywhere?
The comment section isn't a zero-sum game. The fact that alternatives to x86 are brought up in the comments could be useful for many people; it might not be for you, but it is for many. To use your own analogy, how many people are currently sputtering around the USA in pickup trucks despite not actually needing an F-150?
I need to run Windows video games (new and vintage), Linux tools for penetration testing, and Windows and Linux tools for embedded development and reverse engineering, none of which have M1 ports.
On top of the software I can't run on M1, I can't stand, from an ergonomic perspective, macOS's rigid, opinionated workflow, or Apple's stance on upgradability, repairability and general environmental unfriendliness, plus their pricing on RAM and NAND storage, which carries two to four times the markup over what I can get on the open market for the Framework (RAM and storage are more important to me than screaming M1 performance). But that's beside the point; my main point is that I can't run my software and tools on the M1, so I don't understand why everyone on HN wants to crucify you for using X86 instead of M1.
I get that there are specific use cases that preclude the M1 as a choice; I even noted that by saying "without specific needs" in my post.
>so I don't understand why everyone on HN wants to crucify you for using X86 instead of M1.
No one's "crucifying" anyone, nor "insisting you're wrong for choosing X86".
People are just (rightly) impressed & excited by M1's efficiency/performance/battery life - for many that's a highly desirable set of traits in a laptop, so it's easy to recommend. Sorry this seems to offend you.
>I need to run Windows video games (new and vintage)
Need? I'm not judging what one does in their own time, but need? Is that really true, or are we pushing the term just a bit for dramatic effect? Like, are you getting paid for running these games? Is the use of these games part of your actual J O B? If so, then yes, I'll agree "need" is accurate.
I think you are perhaps getting a bit attached to that one particular word. Mr. Norris89 may or may not need to run old Windows games the way he probably needs to breathe, but he can certainly say “a computer needs to run old Windows games if I’m gonna buy it”. The phrase “I need to run old Windows games” is easily understood as that.
The cheapest M1 laptop is actually really affordable, almost the same price as this laptop's cheapest configuration. This is probably the exact product I would be looking for if I were trying to avoid buying from Apple. I couldn't find anything on cooling/fans, so IDK if this will be another standard Intel jet engine and lap burner.
> Also, as a curiosity, why does every thread related to x86 news need to bring in the Apple M1 fan army banging their drums?
What do you mean? The root post complains about M1 and thus it was brought up. It's not like an Apple bro army parachuted in and started comparing x86 to the M1
I suspect everyone is waiting for the news to drop that either a top-of-the-line ARM machine is released outside of Apple, or that AMD/Intel announce a competitive ARM chip.
Computer enthusiasts have heard for ages that x86 is bloated, and that alternate instruction sets are better if only the tooling could catch up. Now that Apple is pushing ARM across the line and ARM servers are becoming popular, folks want to see the Linux ARM machine.
Err the monitor has a user replaceable power cord. I have one and removed it fine. It’s different because an IEC lead won’t fit in the design envelope. Even the stand can be replaced.
The SSD complaint is also a misnomer. The influencers couldn't put a normal SSD in a mac and make it boot. But they don't understand that there is integration with the crypto at the hardware level. And there's no reason to suspect that Apple won't provide SSD replacements on the open market as part of their self-repair program.
This is the power of crappy influencer bloggers. Distributing misinformation which everyone parrots verbatim. They are a cancer on the planet and so are the followers.
Realistically these are absolutely trivial issues in the scale of things blown way out of proportion by people looking for problems rather than actually using the hardware.
>The influencers couldn’t put a normal SSD in a mac and make it boot.
No, Linus put the SSD from another Mac Studio into the second slot of a Mac Studio and it wouldn't boot, and it wouldn't even run recovery to format everything and do a factory reset. So Apple is obviously firmware locking the Mac Studios to their factory configuration despite only genuine Apple hardware being used in the upgrade.
Yes you can. You need to use Configurator to do it. It has been confirmed by iFixit. It’s the same as the Mac Pro was: https://support.apple.com/en-us/HT210626 . You can’t just swap a disk. You have to actually re-pair it with the crypto and then reprovision it and recovery etc.
Again Linus doesn’t know what he’s doing. Another influencer confirming my point.
This is not a PC. That’s where the assumption goes wrong.
Literally nobody on YouTube, certainly not Linus, has attempted to upgrade the storage modules with a supported configuration that the machine is shipped from the factory with.
There isn't a single other SSD controller on the market that knows how to work with arbitrary quantities of NAND flash chips because it's an unnecessary engineering challenge, I don't know why anybody thinks it's remotely reasonable to expect Apple's to be different.
I always liked the LTT videos and was a subscriber. I don't think there's anything wrong with reviewing products and having a certain opinion, however Linus has a stake in a company which makes laptops whose main selling point is the ability to upgrade and repair the parts (Framework). He has a horse in the race. This doesn't automatically disqualify him from having an opinion, at the same time I don't value his opinion as much as an independent reviewer like Steve at GN.
That is troubling for me and I believe is not a super ethical practice!
This is cart before horse. Linus worked with PCs for years and saw (obvious) problems with Apple's hardware, and so he's invested in a solution. He's been critical of these things since long before Framework was founded.
It doesn't matter with regard to what I said. It's about dissing your competitors. Sure, that doesn't make it a lie, but it's better if people know where you stand: as an independent reviewer vs. a shareholder in a competing brand.
It also affects how he presents the facts and directs the videos.
I say this while loving Framework and hoping that their approach puts pressure on likes of Apple for openness. No sympathy for Apple here.
Everything comes off if you yank on it hard enough. Even the fingers on your hand. That's not how great UX or environmentally friendly repairability works though.
>I think this is mixed messaging to be honest.
Regardless what it is, it's Apple's fault here. They claim it's officially not user replaceable. Good luck with a warranty claim if you yank on it and break something, as Apple's position is clear: You're not supposed to remove it yourself.
And what about Apple's famous "we care about our CO2 emissions so much, we removed the iPhone charger to save the environment", if now you're saying you should generate CO2 emissions driving your monitor to the Apple Store to replace a friggin' power cord that could be swapped at home on any other monitor on the planet?
> Before you applaud Apple for being so libre and think the company has changed direction, keep in mind this is the same company that ships recently launched monitors without user-replaceable power cords and locks out, in firmware, the possibility of upgrading the SSDs on the very expensive Mac Studio, despite teardowns showing that it's physically possible for end users.
I don't get all those accusations that are constantly recirculated on tech forums. A non-removable power cord has a big advantage: it's hard to lose it. I've never damaged a power cord before (or heard of someone who had) so is it really that damning you should go to a repair shop for something like that? Also, Apple never advertised removable SSD modules. I don't care what they put in that box as long as it works as advertised.
>A non-removable power cord has a big advantage: it's hard to lose it.
How could you possibly lose a monitor power cord? It's not a portable device that you travel with. But in a home or office setting I can certainly see the cord getting crimped under the weight of the feet of desks or chairs.
>so is it really that damning you should go to a repair shop for something like that?
Yes, taking your equipment to an Apple Store incurs downtime and extra costs that wouldn't be a problem if you could swap the cable yourself in two minutes.
Heck, they already do that with the excellent removable mag-safe power cable on the Studio display. Come on! Seriously. They proved they can do it but choose not to because fck the consumers and the environment.
>Also, Apple never advertised removable SSD modules. I don't care what they put in that box as long as it works as advertised.
That's the problem. It's not about you; it's about the environment and the e-waste that this company's products generate. While they could be more user-upgradable and repairable, like the old Macs that people still upgrade and use today, Apple now spends resources making sure their new products are not upgradable or repairable, in order to have you buy more of their stuff instead of keeping their old products in use for longer.
This is an interesting point about how the same company can "design" so many different ways to do the same thing. I wouldn't expect them to have a dedicated team to design power cords/sockets, so teams designing different products at the same time will probably end up with slightly varying designs. Then again, that's what VPs are for: to see what's going on within each design team and make sure good ideas from one get shared with the other teams.
In other words, how can something as cool and useful as MagSafe be on one thing and not obviously mandated for all the things?
> How could you possibly lose a monitor power cord?
In a 'flex' office environment where people are regularly relocating I expect lots of people missing small stuff like cables.
> I can certainly see the cord getting crimped by furniture and damaged under the weight of the feet of desks or chair legs.
And yet I've never seen such a damaged cord.
> it's about the environment and the e-waste that this company's products generate. While they could be more user-upgradable and repairable, like the old Macs that people still upgrade and use today
No one in their right mind would want to be dependent on hardware running toward the end of the bell curve for anything 'mission critical'. Also, why do you think those SSD modules are removable? Probably saves you quite a lot when one dies since you don't have to replace the logic board.
It's often useful to be able to replace a power cord with a longer one -- depending on your setup the standard 2m cord might be too short. Extension cables are pretty unsightly, especially with big grounded plugs like the Schuko plug we have in Austria.
It's also useful in some setups to use power cords with right-angled connectors, because they make cable management easier.
Also, moving to another country is easier with standard sockets: just get a couple of new power cables from any random hardware store and you're good to go!
Not trying to nitpick. The parent probably meant "forget it when moving the monitor to a different place". That happens more often than one may think once you have trashed the box the monitor came in.
But yes, it is a bit strange to write it down as an “advantage”.
People have been proclaiming that for years. “Just wait, Apple will only let you install their software on their machines. It’s going to happen soon. Anytime.”
Why only a single USB-C port? Almost everything I buy these days has only USB-C ports/cables/charging.
It's gotten to the point now where I am increasingly frustrated by the fact that there are no good hubs that take a single USB4 USB-C port and turn it into an 8 port USB-C hub, with a mix of 10 Gbps/5 Gbps/480 Mbps speeds.
I can get plenty of USB-C to USB-A hubs, but that is not what I want.
I believe that when companies do this, it's because there's only a single Thunderbolt controller on-board the host, and they think that giving people two USB-C ports, where either port can do Thunderbolt, but you can't use two Thunderbolt devices (in actual Thunderbolt mode, where you're passing through PCIe lanes) at once, would be too confusing. So they force you to buy a hub or dock — pushing the consumer's blame for this confusing quagmire onto the peripheral manufacturer.
> 8 port USB-C hub
And I believe a major reason for the lack of these, is that, if a dock/hub/etc. that consumes a USB4/Thunderbolt host port, wants to be able to expose a full-speed USB4/Thunderbolt downstream port of its own, then that port isn't going to work for peripherals that are picky about speeds/lanes if you have any other USB devices plugged into the hub. So vendors choose "the ability to plug a single Thunderbolt device in, plus other things" over "the ability to plug a bunch of USB4/3.2 devices in."
On the plus side it does have three USB A ports, and it has a barrel charging port instead of one of those shitty USB-C ports for power. And you can plug USB-C-charging things into a USB A port with just an adaptor cable.
Having a barrel charging port is a serious flaw. I have many devices, but I expect to carry a single charger for all of them; if my laptop has a non-USB-C charger then I need a charger separate from the one for my headphones/phone/Kindle/etc, which is extra weight to carry around, extra mess and e-waste.
The world has moved to a single standard for charging all devices; we're just waiting for the "transition period" to run out as the devices with nonstandard charging expire. It's inappropriate to design new hardware with an obsolete charging approach.
Barrel charging ports provide a superior user experience: they are easier to use because you can insert the charger at any angle.
Barrel connectors are physically much more robust: they can rotate around their axis without imposing any torsional stress on the device being charged, and the longer lever arm inside the charging device (typically 15 mm) means that flexural stress around other axes creates much less force on the connector. Consequently, they break much less often than pre-C micro-USB connectors. It's too early to tell whether USB-C will improve on its predecessors in this regard, and it may, but it seems unlikely to reach barrel-connector levels of reliability just based on its geometry.
Barrel-connector ports are easier to repair when they break: there are only two wires, and they are quite thick. This is true a fortiori for the cables.
Barrel-connector charger faults are easier to diagnose: either the voltmeter tells you it's outputting 19 volts open circuit (or whatever the rated voltage is), or it's not, or, rarely, it has the correct open-circuit voltage but sags under load, the diagnosis of which requires a voltmeter and a power resistor. Moreover, they are less likely to occur; you are not going to break a bipolar SMPS with static electricity, not even if it is attached to a 5-volt USB cable, but you can easily do that to the CMOS control chips necessary for USB-C voltage negotiation.
Barrel connectors pose less of a security risk: they do not, in most cases, have a data connection at all, and a malicious charger definitely cannot execute a firmware upgrade attack on your device through a barrel connector.
Against this list of technical advantages you claim that barrel charging ports are "nonstandard" and "obsolete" — not because of any actual USB-C functionality, nor even because there are more USB-C-chargeable devices in the world than devices with barrel connectors, but just because USB-C is newer and currently fashionable. The maximally charitable interpretation of your post is that USB-C chargers are capable of providing a range of different voltages, so the complexity of voltage conversion goes into the charger instead of your headphones. But that's an extremely weak argument; a buck converter capable of deriving 5 V 300 mA from, say, 19 V is already much smaller, lighter, and cheaper than a pair of wireless headphones, and adding USB-C charging support to your product also requires a significant BOM cost, and the necessity to operate on 5 V as well as whatever it prefers, though maybe less weight than the buck.
What's inappropriate is that you're attempting to dictate decisions of technical functionality on the grounds of mere fashion and social approval, then shaming others for disagreeing with your judgment rather than supporting it by any actual arguments.
I predict that >99% of devices made this year that can only charge through USB-C ports will be nonfunctional in 15 years. That is, they are cheap trash, designed to be discarded rather than repaired. You should be ashamed of yourself for attacking my social standing to convince people to accept this inferior technology. The form of your argument — a veiled personal attack — makes it unworthy of being posted on this site, or, actually, anywhere.
I like it when both USB-C charging and barrel-jack charging are available. Then I can use the barrel-jack adapter at home in "desk mode", and only need to travel with the USB-C charger, which can also charge all my other devices.
It also makes a "redundant power supply" possible, in case you use your old laptop as a home server.
Depending on what/where/how I hook up in desk mode, I also have power on USB-C. It mainly depends on how/where I want to route the DP/HDMI output.
Also, PoE-to-barrel-jack is a thing, while PoE-to-USB-C-with-more-than-5V is not.
Yeah, I'd actually use the barrel charger on the road so that it can just live in my backpack and USB-C (provided by an appropriate dock that's connected to a keyboard, monitor, etc) at home.
> my experience with USB charging ports is that they always break, and then they're a huge pain to fix, and my experience with barrel charging ports is that they almost never break, and then fixing them is pretty easy
The opposite for me. I’ve broken a proprietary charge port and had to wait over a week for the replacement. Never broke a USB-C port, but if I did it’d be no big deal since I have plenty of C connectors in my cabinet. I can solder a replacement in 10 minutes.
> So I think of a computer that can only be charged via USB as a piece of cheap trash. Maybe USB-C is different on this axis
Wait are you saying you’ve owned a laptop that charges over USB, but not USB-C? If so that laptop really was cheap trash. Why compare trash with flagship products?
I've owned many computers that charge over USB but not USB-C. They're cellphones, though, not laptops. I'm sorry to hear about your experience with a proprietary charging port.
USB-A (or quasi-USB-A) charge ports on laptops were a weird aberration for a while. I had a Dell with one, and it's now one of the few laptops I've got that I might never be able to power on properly again, because the power delivery over the port is 100% proprietary and the AC adapter for it is busted. I don't think there's ever been a high-wattage PD standard for USB-A, even with USB 3. So any version of it was just some manufacturer's attempt to jump the gate on PD over USB-C.
USB-C is a completely different ballgame here. It has its flaws but it's nothing at all like that weird little era of laptops.
It's funny; my reaction is the opposite. Yes, I think the barrel charging port is good to have, but only because there is just a single USB-C port. I don't want a barrel port! I have USB-C chargers littered around my house already! And that's how I like it!
And I definitely don't want my default to have to be using adapters for my USB-C devices, so they can be plugged into the USB-A ports.
I guess my experience with USB charging ports is that they always break, and then they're a huge pain to fix, and my experience with barrel charging ports is that they almost never break, and then fixing them is pretty easy. So I think of a computer that can only be charged via USB as a piece of cheap trash. Maybe USB-C is different on this axis; I don't know yet. I really miss barrel charging ports on cellphones. I don't know, maybe I just abuse my hardware.
Yeah, I wish USB was designed to be a little more sturdy in general. I find it disconnects a little too easily (compared with most chargers/AC adapters), breaks far more often, and even the cables I encounter are thinner and more easy to damage.
If we're settling on "charge/power everything with USB!", I hope the next version (USB F?) holds up better to more "abuse" than charging a phone on a desk or end table inflicts.
I never had problems with USB-A ports but I'm not using them much. All my laptops had barrel ports and I'm not using external keyboard or mice. A USB-B port started to get loose on a phone of mine after 6 years of use. I didn't use any USB-C port for more than 4 years yet. They look sturdier but time will tell.
8 ports is a high bar; I don't think anything like that really exists, but there is a 3-port TB4 hub by Plugable[1], and I think you can daisy-chain multiple together. Pretty expensive to get up to 8 though, never mind all the power bricks needed to supply the power.
Yes, please! Ideally also with PoE (802.3bt) power input (up to 80W), and compliance with the USB per-port power switching spec, to be able to toggle power to each port comfortably with uhubctl/Home Assistant/etc.
Looks interesting. It starts at $930, which is not too steep vs other products in the same space; the price of upgrades is not stated, which is a bit scary. Several Linux distros are listed as tested, including Ubuntu, but Debian is not in the list. It appears to have a non-removable battery, with nothing said about replacing it. And of course the CPU (whether you pick Intel or AMD) is full of blobs, management engine, etc. I didn't notice a mention of a built-in microphone, but that's a standard thing to find on laptops, so what I was hoping to see was a way to hard-disconnect it.
Edit: aha, I see replacement parts including batteries are available if you select "parts" from the menu.
Ordering screen alert says "Production for the StarLite's has now finished, and are in the final stages of testing. Orders placed now are estimated to ship out in 2-3 weeks."
There are some unstated "if"s in that, so I would say until people actually receive units, it doesn't quite exist.
Anyway I will keep an eye out for this. I'm going to need another laptop at some point and this seems like a possibility, as does the Framework, etc.
> There are some unstated "if"s in that, so I would say until people actually receive units, it doesn't quite exist.
Kudos for being wary of marketing copy. I'd like to note that StarLabs has been shipping laptops of their own design for a few years now. They've been super transparent about manufacturing lead times and their order queue too. I don't own anything of theirs yet, but I've been keeping an eye on them long enough to say I appreciate their way of doing business.
Gah, I like nearly everything about this, except that a single USB-C port is a showstopper for me. I've gone whole-hog USB-C over the past 4-5 years, and at this point I only have one thing (an old mouse) that connects via USB-A (I've permanently connected an adapter to it, so it's fine with USB-C as well).
The other thing is the 1080p screen. I have that on my XPS 13, and it's... fine, but not great. The laptop I had before it had a higher-res screen, and I liked it much better, though I'm sure it negatively affected battery life.
The price point is pretty decent, though. Spec'd the way I'd want it, it comes to around $1550 pre-tax. Shame about the USB ports, though.
So close. I really want a laptop with a "2K" 4:3 aspect ratio 14" screen, 64GB of ECC ram, a Ryzen processor, and an M.2 SSD.
Would be interesting to get a nice web cam too but that is less important to me. I'm also not a huge "thinness" type, thicker is fine if it isn't heavier.
Not really so close, though, right? Using Laptopmedia's database, the only mobile ECC workstations are Xeon, not Ryzen, and the minimum size for those is 15.6". 16:10 is the best ratio you can get for displays. In fact, AFAIK, there are no current COTS laptops with a 4:3 aspect ratio display at all.
I suspect that if all of what you listed are actually hard requirements, the only way you'd ever get it is to build your own: source an industrial 4:3 panel + controller, find the smallest Ryzen board with ECC support, and get a Ryzen Pro APU (none of the ThinkPads will boot ECC SODIMMs on their boards, despite Pro APUs having nominal, unvalidated ECC support). You could try your luck with an A320TM or otherwise, or might need to go embedded Epyc and an M.2 display adapter (like the SM750 ones). You could probably build a "tablet" form factor that's reasonably thin (<30mm?) and light (and use external USB-C PD compatible chargers/battery packs and simple DC conversion to make life easier on the power side).
The XPS 17 is 1920x1200, 16:10. It's a great ratio, and at 17 inches I don't use any scaling in Windows and everything looks the right size for my eyes. You can upgrade the SSD, and I've changed out the memory to 64GB. It's a great rig, but the fans are a bit noisy.
I went to dual-boot with Ubuntu but ran into issues that were going to take more than an evening to resolve. I can't remember exactly what it was now - something about the bootloader and needing to change something in the Windows registry to make it work(!).
You can choose between American Megatrends and coreboot. They have a FAQ with the pros and cons[1]. But I have a hard time seeing why anyone would choose the proprietary American Megatrends firmware, especially since the laptop is targeted towards Linux users. Are they offering the choice for their older customers who are used to the American Megatrends UI? Or maybe to not anger the supplier (American Megatrends) that they've been working with for years?
I'm not well versed in UEFI firmware, but how is this possible:
> Due to how lightweight coreboot is, it will offer better performance and lower power consumption. For example, the LabTop Mk IV combined with coreboot will offer approximately 8% more performance and around 20% longer battery life (with a record of 13 hours and 42 minutes for general use).
I don't think I'd make that security tradeoff. From best to worst, I'd rank the options in this order:
(1) signed open source
(2) unsigned open source
(3) signed proprietary
(4) unsigned proprietary
In some cases I might rank #3 as the worst option if it prevents me from editing the firmware binary (to fix a security bug myself or to remove tracking info like a serial number), reverse engineering the firmware to verify it (because signed code is often encrypted as well), or installing alternate firmware that might be more secure.
One thing I'm very happy to see on their configuration page is the choice of keyboard layout.
It feels like most companies assume that you must want the standard layout for the country you're currently living in, when personally I just want a US layout.
Do wish they had some more options for the display, but still looks like a pretty compelling and reasonably priced product.
When I loaded the website it was slowly stepping through different Intel generations, so I thought for a second they were offering versions with older Intel chips. But the page was just animating slowly.
Do I understand correctly that this is capable of charging over USB-C, and includes a USB-C power supply, but the default charging cable is USB-C -> DC barrel jack?
So you can charge over USB-C, but only if you buy a C-to-C cable and tie up the single USB-C port.
I think it's good that they have the barrel-jack option for people who want to use the USB-C port for something other than charging, but I would just prefer they ditch two of (or even all three) USB-A ports and have 3 or 4 USB-C ports instead.
Yup; at the same time it allows people to use USB-C for charging, which might easily be the case with some of those USB-C hubs that also provide power to the device.
- A second USB-C port.
- You don't need that HDMI if you have enough USB-C ports, since you can turn them into whatever port you need.
- Why the fuck do we still have a USB 2 port?
- We really need options for the screen; 1080p is not enough for a lot of people. I like to have as many screen pixels as possible (because I can display more lines of code).
You seem to be wanting a stationary desktop computer, not a portable laptop used in many different situations.
HDMI is great when you're doing presentations all over the place: no need to keep track of dongles, and most places I present at use HDMI.
USB-A is useful for tons of accessories; personally I only have one (out of five in total) using USB-C, and most USB sticks I come across when consulting are in fact USB-A.
If you're after as much screen real estate as possible, external monitors are for you. A 14-inch laptop is for people on the go.
> HDMI is great when you're doing presentations all over the place: no need to keep track of dongles, and most places I present at use HDMI.
Then you only need to carry an HDMI dongle.
And if you present somewhere that uses DisplayPort, someone can give you a DisplayPort dongle.
> USB-A is useful for tons of accessories; personally I only have one (out of five in total) using USB-C, and most USB sticks I come across when consulting are in fact USB-A.
I know that; I'm not against it at all. Just, can we retire USB 2 now?
> If you're after as much screen real estate as possible, external monitors are for you. A 14-inch laptop is for people on the go.
I code on the go, and I don't want to be on a low-resolution laptop when I'm not at home.
I bought the first ZBook 15 in 2014. I'm running Ubuntu 20.04 on it now. One of the reasons I got that laptop was the physical buttons, plus on-site next-day support. Too bad about the 16:9 screen, and I absolutely hate the number pad: I don't have any use for it and it puts the whole keyboard off axis. It's a world of compromises.
Which ZBook do you own and which Linux do you run on it?
I have a ZBook 15 G3 from 2016. The number pad is one of those things I love 10% of the time and I wish it could just disappear the rest of the time. Perhaps the ideal laptop should come with a USB number pad?
IMO, not all Thinkpads. My E485 was noticeably worse than my X1C6. Never seemed to track as well, nor did it seem to resist "sticking" as much.
I don't quite know how to describe sticking, but it's like when an INU (or a strain gauge in this case) fails to properly self-calibrate and decides that an objectively false input is true. Something similar to video game controllers with "stick drift", but not quite the same AFAIK.
Aside:
Ultimately, I can take it or leave it w.r.t. the TrackPoint. A good touchpad can do wonders. In tight spaces I use my thumbs, then rely on decent tap-to-click and inertial dragging implementations (present since ~2008). Though in recent years I haven't had to commute to work by train or bus anymore, nor have I had to really travel by airplane.
When typing, I mostly rely on keyboard commands for text-heavy work. I still find quite a bit of use in many ThinkPads' keyboard layouts preserving Home/End/Insert/PgUp/PgDn without needing to resort to "Fn" key twister games, even if newer ThinkPad layouts are worse and worse for this (the Insert key got eaten by Fn on some newer laptops, and there are obviously longstanding debates over the 7-row vs 6-row layouts). None of the griping about the "swapped" Ctrl/Fn key positions would even matter if functions weren't being hidden behind Fn keys in the first place, negating the need for an Fn key at all.
I do appreciate the ThinkPad's clustered F1-F12 keys, which make common F keys quicker and easier to find.
I've never used the touchpad on any of my thinkpads.
I think Lenovo have been okay stewards of decent laptops. Not great, especially relative to newer developments on the market.
I have an E585, and I agree it's not quite as good. I think the E series are kind of the "budget ThinkPads" (I got it second-hand in a pinch after my previous laptop broke). It's decent enough I suppose, but I've had a number of curious hardware problems with it (including the drift you mentioned).
> I do appreciate the ThinkPad's clustered F1-F12 keys, which make common F keys quicker and easier to find.
Finally I found someone who cares about this too! I've had laptops without those little gaps, and I found it noticeably harder to use the F keys; for example, brightness up/down is F5/F6, and when watching some TV at night when it's dark it's so much easier to find those keys with those little gaps. Even in regular use it just makes things a wee bit easier.
It's a small design detail, but it so clearly and objectively makes the keyboard better that I don't understand why so many keyboards don't include it. Well, I do understand: the graphic designers have locked all the UX designers in a closet somewhere, but still...
> griping about the "swapped" Ctrl/Fn key positions
You can just change it in the BIOS anyway so that Fn is Ctrl and Ctrl is Fn, so it's a non-issue anyway.
I know disabling JS is a popular security and privacy practice on HN, but JS is a core part of the modern web. Strange side effects seem like they should be expected.
Very cool! Not exactly to _my_ personal taste, but objectively a very nice project! Kudos for USB-A 3.0. Love the openness, Coreboot and the configurator. The warranty is impressive. I wish more products like this existed!
On the design downsides (IMHO): Why microSD? There's enough space to put in a BD-RE drive, not to mention full-size SD. The battery is not user-serviceable, judging by the looks: not made of 18650s, not hot-swappable. And the power button... there... err... ugghh... just... WHY?? (I'd definitely hit it instead of Del, which was placed there on all of my previous laptops.)
I know the people working on Asahi Linux don't have things perfect yet, but it runs on an M1 Air. That machine claims 18 hours of battery life and, in tests on macOS, gets 14.5-16.5 hours depending on who tests it.
The StarBook claims 10 and probably achieves less (everyone does).
The base specs are roughly comparable, except i3 vs M1 and the high resolution screen on the Mac.
The site is quoting me $883 vs $999.
Even ignoring the CPU difference, the battery life alone is probably worth the $100, let alone the display.
I understand the value some people place on its openness, but it's not even a medium sacrifice. It's pretty big.
Asahi Linux will never work as well as a normal Linux machine within any sort of reasonable time frame. Even with Intel CPUs the Linux experience on Macs is kinda shit.
If your goal is to have a Linux workstation then buying a Mac is pretty dumb.
> I understand the value some people place on its openness, but it's not even a medium sacrifice. It's pretty big.
There's open, closed, and then there's the crap Apple pulls off. None of their devices are even slightly serviceable or upgradeable: RAM, disk, and battery are all, physically or otherwise, fixed for the whole life of the machine. Add the huge premiums they charge for going from 8GB RAM to 16, or similar in storage, and the sacrifice becomes much more bearable.
It runs, but without full GPU support (at the moment no 3D acceleration at all, if I am not mistaken), and you have to use Asahi Linux, not your favorite distro (yet). Also, many of us don't want to give our money to Apple.
I bet the trackpad is awful compared to a MacBook, although I have no idea how it feels under Asahi. I reached the same conclusion... These m1 macs are just too good at the price point, nothing else can meaningfully compete. Once Asahi works well I'm going to have to get one.
Damn, I wished I had bought the mini-PC from them instead!
I really don't understand the appeal of laptops; they only make sense for people who are really on the move, like university students or bloggers. Even then, they only need a cheap laptop for browsing (cue memory hog jokes). That, or you prefer to do your work in public spaces like libraries or cafes.
What's the point of spending huge sums of money on a laptop that performs 1/3 compared to its PC equivalent? You're also stuck with that keyboard and monitor for the rest of its natural life.
Space - with a desktop you need at least one dedicated desk.
Mobility - you can do your work in places other than your workplace/home; a change of scenery/environment is trivial. Even inside your own house you can work from your bedroom, kitchen, lounge, garden, wherever you want.
Your last point is baffling, really. Just treat it like any PC: if you don't want to use the screen/keyboard/trackpad, just connect whatever you want to the laptop.
Actually, I don't see the point of mini PCs. They're just desktops that you can lug around without a monitor or any input devices.
What's the point of spending huge sums of money on a mini PC that performs 1/2.9 compared to its PC equivalent?
> Actually, I don't see the point of mini PCs. They're just desktops that you can lug around without a monitor or any input devices. What's the point of spending huge sums of money on a mini PC that performs 1/2.9 compared to its PC equivalent?
It's cheaper for the same specs; for example this StarBook with the 5800U is €1074, but their mini-PC with the same CPU is €786: €288 cheaper! You can also put a 2.5" SSD in it next to the MMC drive (my E585 also allows this, but many laptops don't). And for a lot of "serious work" I use my USB keyboard and HDMI screen anyway: the laptop doesn't really give any advantages here.
Right, but you can get a faster full-fat desktop for the same price that'll let you shove in 3.5" and 2.5" drives if you want, typically has higher RAM ceilings too, etc. Even in my tiny apartment the volume taken up by an ATX case isn't too much.
I haven't regularly sat at a desk using a computer in years. My most common locations are on the couch or in bed. Anything but a laptop would be incredibly awkward.
And yes, I do more than just web browsing. Mid-range gaming (no, I can't play the latest graphics-heavy titles on this kind of hardware, but I don't mind), development, etc.
I'm using this laptop in bed right now, and I took it over to my in-laws' house the other day. And yeah maybe I'll go to a cafe tomorrow, that sounds great.
What do you do if you have to work in the office and there's a meeting where you want to demo something? Or you want to quickly look up an email with something you want to show to others?
I'm in my 30s, have been doing this since I was 23, and have had absolutely zero issues related to that so far. Everyone I know who has RSI from this line of work seems to have been mostly unlucky, so maybe it's just me, but I think it's mostly a matter of getting up and walking or running around for about 15 minutes for every hour of sedentary activity, stretches, underwater ocean swimming for the lungs... all things WAY more likely to injure me just from stepping in the wrong place, or some other exercise-related injury. It's not like I spend hours in strange positions acting like I'm a three-toed sloth.
Hmm, seems like currently the costs of the more "open" laptops are around this:
StarBook ~1000€
Librem 14 ~1250€
Framework ~925€
I can see why they'd focus on this price point, but what are some of the better budget options? Best i can think of are:
StarLite ~500€ (available soon)
Pinebook Pro ~250€ (out of stock)
Raspberry Pi 400 ~100€ (not a true laptop, but interesting)
I actually bought my x86 notebook for around 200€, which was perfect for note-taking at university and now occasionally serves as a low-power, long-battery-life web browsing or typing machine: https://techbite.eu/en/laptopy/laptop-zin-3-14-1 (I got a slightly older revision)
That said, it seems like you should only go for the more popular devices so you don't run into stupid weirdness with drivers, even in popular *nix distros (and 4 GB of RAM just wasn't enough for usable Windows); for example, the manufacturer I linked had problems with the fingerprint scanner in the model I got.
Of course, what I'm after might be better suited to those who prefer desktop computing for most tasks and only need something for being on the go occasionally (or who are just frugal, like me, due to only being able to save slowly).
1920x1080 on a 14" display is unfortunately not something I will be okay with anymore. I feel like 2K resolution is the sweet spot for this size of display.
I thought QHD would be nice enough at that size, but I really like my 3840x2400 14"-ish Dell Precision. The extra resolution means the fonts look printed instead of just not-too-pixelly.
It changed with the "K". The marketing teams decided that 4K sounded better than 2160p. This is all the more ridiculous as 4K usually means UHD, where neither axis has 4000 pixels (3840x2160). In the industry I am in, this would be called 8 MP (megapixels) instead.
After the popularity of 4K, somebody noticed that they could start calling 1080p "2K" to make it sound better than 1080p. If you say 2 MP, does that sound even better? I mean, 2,000,000 is more than 2,000, right?
It never changed; the "p" in 1080p always referred to progressive scan, as it came from movies/TV, and 4K always meant 4096 pixels horizontally.
We still have 2160p, which is 3840x2160.
They never replaced each other; they're just different conventions for naming resolutions.
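To put rough numbers on the naming debate, here's a quick sketch of the megapixel counts for the resolutions mentioned above (the labels are informal marketing/industry shorthand, not any official standard):

    # Approximate pixel counts for the resolutions discussed above.
    resolutions = {
        "1080p / FHD": (1920, 1080),
        "1440p / QHD": (2560, 1440),
        "UHD ('4K', 2160p)": (3840, 2160),
        "DCI 4K": (4096, 2160),
    }

    for name, (w, h) in resolutions.items():
        print(f"{name:<18} {w}x{h} = {w * h / 1e6:.1f} MP")

    # 1080p / FHD        1920x1080 = 2.1 MP
    # 1440p / QHD        2560x1440 = 3.7 MP
    # UHD ('4K', 2160p)  3840x2160 = 8.3 MP
    # DCI 4K             4096x2160 = 8.8 MP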
I know this back-and-forth happens every single time there's a conversation about this on HN, but I just want to point out that trackpads are extremely personal, and I've had nothing but bad experiences with Apple trackpads until recently. The "hinge" models of the previous decade border on unusable for me because of the force required to click them, whereas my (early decade) XPS model has the best trackpad of any laptop I've ever used and it includes physical left/right buttons, which I love.
My advice is to try out the trackpad on a laptop before you buy it if this is something that matters to you, not trust the opinions of Internet strangers.
>My advice is to try out the trackpad on a laptop before you buy it if this is something that matters to you, not trust the opinions of Internet strangers.
I can’t disagree, but trying out Mac hardware is far easier than trying PC hardware. I can walk into an Apple store and try all the major hardware variations.
I had to get my XPS 17 based on online reviews. Because I can’t go out and try it. Buying PC laptops is the very definition of trusting internet strangers.
> My advice is to try out the trackpad on a laptop before you buy it if this is something that matters to you, not trust the opinions of Internet strangers.
This really applies to most products, especially those that you interact with so frequently and directly. It's one reason that in the EU you have a 14-day "right of withdrawal" for all purchases made at a distance (most commonly being internet purchases).
Huh? Apple hasn't used the "hinge" touchpad since 2015, more than 7 years ago. Unlike (AFAIK) every other touchpad out there, the actuation force is actually adjustable. I'm not sure whether this is necessarily a good change, but it certainly is a change, and your data seems woefully out of date.
My comment is literally a response to someone talking about a MacBook Air from 2012.
Regardless, the comment stands, as even though Apple's trackpads have improved I would still strongly prefer a device with physical left/right buttons.
I purchased and returned an XPS last year and the trackpad was unbelievably bad. After using Mac laptops for the previous 10 years, I hadn’t thought the difference in hardware quality could be so far apart. MacBooks are at least a generation ahead of the best PC hardware (generation as in, 10 years)
The XPS isn't a great example of the best PC hardware, but it IS a pretty good example of the best laptop performance one can get for the price of an iPhone. ^^
I mean, to Dell and the XPS's credit, it all appeared great until I started using the trackpad. Structurally it's super rigid and the hinge was satisfying. I also really prefer the rubberized+carbon fiber thing other Dell laptops have going on, much more than the cold and sharp MacBook. Even the packaging is getting Apple-esque with heavy cardboard and near-mechanical boxes. But, that trackpad produced so much dissonance and is such a critical part of interacting with the machine and software.
Likewise, 2021 Dell XPS 15. Had to retire it very early -- trackpad issues, then frame deformation caused by moving the laptop would cause the RAM or the CPU to crash.
> 2021 Dell XPS 15. Had to retire it very early -- trackpad issues
Me over here with a 2021 XPS 15, using a mouse 24/7 because the touchpad lags hard on Linux, but refusing to admit I got burnt lol. I'd be over the moon if Framework released a 15" model.
On the other hand, almost any keyboard is better than what Apple sells :) Their latest MBP is a bit better than the butterfly one, but nowhere near the 2015 and earlier models.
2021 Dell Precision 5750. It's a $3000 laptop and the trackpad is a disaster due to flaky palm (or loose t-shirt) rejection. 15 years after the introduction of the glass trackpads of the first aluminum unibody MacBooks, you'd think that others would have figured it out, but no.
(The rest of the laptop is equally bad, but that's a different story.)
I can't comment on the StarBook, but the Framework touchpad is really great. I had an early model that had a slightly annoying out-of-the-box issue where I had to click kind of hard, but after doing that it works great now. Two-finger scroll is super smooth, and it properly detects 1/2/3 finger gestures, etc.
I think my frame.work is doing the "kinda hard" click thing too. Did you just break it in, or change something mechanically? I mostly use the keyboard, so it's a very minor annoyance to me.
> Pressing the bottom center of the touchpad firmly a few times has resolved the issue in some cases.[1]
Said in a slightly different way:
> There were also a small number of early units produced that may have contact issues on the physical switch on the Touchpad. Try pressing the bottom middle of the Touchpad firmly a few times to see if that resolves the issue. If it does not, please contact Framework Support.[2]
STOP WITH THE MINIMALISM. I don't want my laptop to have _the least amount of features_, I want it to have _as many features as makes it useful to me_. These are not fashion accessories, these are life productivity machines. I need 4 USB-A/C ports (at least 2 that are 3.x and fast-charging), an Ethernet jack in the back (not in the way on the side), a lid that doesn't get stuck at 120 degrees, fans in the back, a monitor port (preferably in the back out of the way), a hard drive light, separate volume buttons, 14-16" screen that isn't crap, headphone jack, and easy to upgrade RAM and hard drive.
Laptops these days are RIDICULOUS: poorly designed, wannabe Jony Ive slabs that look pretty and are frustrating. We had good designs years ago! What happened?! They've gotten hard to use unless you're a coffee-shop blogger; they used to work for everyone from the kitchen-table user to the weekend coder.
What's the equivalent of this for someone who wants to bring their own keyboard?
Is there a Linux tablet that can both act like a tablet and also be the screen (and heart) of a developers "laptop" if paired with a external keyboard?
Some kind of detachable "hub" that gives more IO options would be a nice optional extra.
Oh, that is quite interesting. I did not even know that there were AMD Chromebooks. I don't know who at Google is forcing hardware vendors to publish sources, but they are really awesome. Several ARM vendors seem to only publish sources for their Chromebook SoCs (e.g. Rockchip, MediaTek).
Website doesn’t work all that great on mobile. But I’d be supportive of a Linux laptop that can market itself to regular consumers rather than already-converted Linux folks.
I was interested, but sadly the German keyboards are only available in combination with an Intel 11th-gen mobile CPU. I really wanted the Ryzen variant. I wonder if the keyboard can be bought separately - the layout at least looked compatible with it.
Another point against it is the screen resolution - I really wanted to have a 1440p screen this time around.
It says in the specification it has DP alt mode for the USB C. You just don't get thunderbolt on the AMD models. This is pretty typical and it's the same case on the AMD ThinkPads.
This comes as a reminder that good, high-resolution screens are not a given yet.
Every other spec is configurable or looks decent enough, and it seems to be a well-thought-out, very competent product. I imagine not getting a screen on par with current MacBooks, for instance, was a matter of feasibility rather than an oversight.
You know what I'd buy? A proven laptop brand that a third party revamped/Coreboot-ed/etc. and resold. These mediocre, unproven, semi-bespoke laptops with third-rate keyboards and so-so screens (the two features that really matter to me) are never compelling enough.
Looks very similar to the TUXEDO Pulse 14, to the point that I'm wondering if it's from the same OEM. I've been using the Pulse for a while now; it's okay.
Though I basically only use it plugged in to external everything and frankly would prefer to use a desktop computer, so I guess even after a few months I can't say whether it's a good mobile device.
If anyone that works at Starlabs is here you should know that I tried to use the configuration page on my iPhone and it was pretty buggy. I tapped learn more on operating systems then it was stuck there.
I will never go away from a 3:2 laptop ever again. I am using the Huawei 14, bought it for 700. The only issues are that I can't replace the RAM, and the camera is really bad, being behind the keyboard.
The screen is atrocious. Apple has been using HiDPI 220ppi displays on MacBooks since 2012. And no, we don't need a matte display. Ever seen a matte smartphone?
Tiny bit puzzled by the random USB 2 port? It seems out of place given the rest of the specs. Is there a specific reason why that would be desirable over another USB 3?
I don't understand why the industry hasn't switched over completely to hidpi screens. Once you use one you can never go back. I feel so bad for people still using 1920x1080 screens. Truly the price of freedom is having to squint and lose your eyesight, hoping that font tricks can save you, when you could just be using a display with a modern amount of pixels on it.
Has anyone here had experience with installing and running Windows on one of these? Not for me, but my wife is in need of a laptop, and I'm wondering how well it would run on this platform.
I personally would love to run one of these for work, but IT no likey Linux boxes. Will have to upgrade to an M1 sooner or later (my 2015 MBP is on its last legs).
I've been checking these out for months, unfortunately the 13" is out of stock and apparently isn't listed any more in the site :(
I wanted a decent, light (physically light, I mean) Linux laptop that started at less than 900€, so probably not more than 1100/1200€ after some RAM/CPU tweaking.
This is such a terrible time (opportunity-wise) to build/buy a laptop:
On the one hand, the newer, much more energy-efficient Intel 12th-gen CPUs are out, and if you want to go Ryzen, you should wait for the new Ryzen Rembrandt(?) APUs (with the new RDNA2 GPU).
Looks like the EU version of Framework! Funny enough they do give a Ryzen option which is pretty cool.
Would love to go back to a PC running Linux but for the time being macOS + Brew on MacBook Air/14" is the way to go due to the battery life and performance.
Give me a truly sunlight-readable display (transflective, polarized lamination, or whatever works well), a high-fidelity glass trackpad, a good keyboard, solid case build quality, and decent internal specs, and I'll buy a new laptop.
Looks like a great machine; however, the screen resolution is a little disappointing. Why can't we have retina-class screens on non-Apple hardware? Is this due to Linux limitations, or are manufacturers just "cheap"?
Given that I can easily see pixels on my 4K XPS 15, I wish this one had a higher resolution. Otherwise, I am not sure how the experience would differ from using a laptop that's a few years old.
Maybe I am not a target audience.
I hope it does well and the range ends up expanding a bit; it would be nice to see some options with discrete GPUs (and higher-res displays). The pricing seems reasonable, and more support for Linux is always welcome.
I think it's great that these kinds of initiatives are becoming more widespread. Personally, I have little interest in this kind of laptop, but the more choice and variety there is out there, the better it is for users.
Long shot, but can this power two LG 5K displays? I'm guessing it can at least power one since it has Thunderbolt 4, but I think it depends on how many "lanes" are available, right?
Unless you're considering using an eGPU, Thunderbolt is completely irrelevant to whether the machine can drive two 5K displays (from any manufacturer); Thunderbolt says little more than that the device can route PCIe signals externally. On top of that, Thunderbolt is only available on the Intel model.
Both the Intel and AMD models use their respective iGPUs, so it's unlikely you're going to have a pleasant experience trying to drive those displays without an eGPU.
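On the "lanes" question from upthread, a rough back-of-the-envelope shows why even one uncompressed 5K@60 stream already pushes a 4-lane DisplayPort link. This ignores blanking intervals and DSC, so treat it as a lower bound, not a spec-accurate figure:

    # Uncompressed pixel data rate for one 5K panel vs. a 4-lane DP link.
    # Ignores blanking and DSC, so real requirements are somewhat higher.
    W, H, HZ, BPP = 5120, 2880, 60, 24            # 8 bits per colour channel

    pixels_gbps = W * H * HZ * BPP / 1e9          # ~21.2 Gbit/s of pixel data

    LANES, CODING = 4, 0.8                        # 8b/10b coding, ~80% efficient
    hbr2 = LANES * 5.4 * CODING                   # ~17.3 Gbit/s effective
    hbr3 = LANES * 8.1 * CODING                   # ~25.9 Gbit/s effective

    print(f"one 5K@60 stream:  {pixels_gbps:.1f} Gbit/s")
    print(f"DP HBR2 (4 lanes): {hbr2:.1f} Gbit/s effective")
    print(f"DP HBR3 (4 lanes): {hbr3:.1f} Gbit/s effective")
    # HBR2 can't carry even one such stream; HBR3 carries one but not two,
    # which is roughly why driving two 5K panels from one port is a stretch.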
All those Linux-friendly laptops are cool, but it's quite troubling to announce devices with Intel's 11th gen when 12th-gen ultra-low-voltage parts are around the corner.
I could get used to that, but the letters and space bar seem to be off center. It looks like they added a column or two of keys to the right of the enter key.
Is that a UK thing, or is the US layout also off center?
Looks decent. From the title it seemed like this only supported AMD, but you can get it with Intel CPUs, too.
I've been burned in recent years by trying to use AMD stuff with Linux and just running into even more weird, minor hardware incompatibilities than usual. For now I would not buy an AMD laptop again.
Make sure to use the latest kernels. My Ryzen 2500U laptop has finally been fully stable and usable after about the 5.10 kernel. There were unfortunately a ton of bugs and issues with Ryzen, especially the APUs, in the earlier kernels. Ubuntu 21.10 or similar derivatives have been solid though.
I do, I use linuxPackages_latest from nixpkgs. On my first AMD machine (2020 Threadripper) I even made and carried a kernel patch to correctly address audio output pins for a while. Some of that works upstream now, but the AMD-compatible mainboards are spread too thin to be well-supported.
Ah, those silly number animations where you animate from zero to the desired value. Combine that with a careless form of server-side rendering (though I am glad that all of the rest of the page is correctly server-side rendered with only one or two other minor issues), and this is the JavaScript-free experience:
> Intel® Core® 0 th generation processors
> Ryzen™ 0 processor
> Up to 0 GB 3200MHz memory
> Up to 0 hrs battery life
> Up to 0 GB/s SSD read speeds
And more like them later in the page (like the 0° viewing angle on the 0″ 0×0 display).
Amusing.
(Aside: I wish people would stop doing scroll-linked transitions of any kind for regular content. It just slows everything down for no good reason. Let me see the stuff immediately as I scroll; don’t make it take another few hundred milliseconds until you fade the content in, or a second until you show the correct number.)
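For what it's worth, the fix being described here is straightforward: render the real figures server-side so they're in the markup with or without JS, and treat the count-up animation as a progressive enhancement. A minimal sketch using Jinja2; the template and the figures are made up for illustration, this is not Star Labs' actual stack:

    # Render the real spec numbers server-side; a no-JS visitor then sees
    # "Up to 64 GB", never "Up to 0 GB". (Figures here are placeholders.)
    from jinja2 import Template

    SPEC_TEMPLATE = Template(
        "<li data-animate-count>Up to {{ ram_gb }} GB {{ ram_mhz }}MHz memory</li>\n"
        "<li data-animate-count>Up to {{ battery_hrs }} hrs battery life</li>"
    )

    print(SPEC_TEMPLATE.render(ram_gb=64, ram_mhz=3200, battery_hrs=10))

    # Client-side JS can read the final value out of the markup and count up
    # to it on scroll; if JS never runs, the correct number is already there.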
Everyone wants to do Apple-like product pages, and no one knows how to do Apple-like product pages. (And in fairness, Apple has forgotten how to do a proper product page, too).
A little more than a decade ago when parallax scrolling was new and seen only on a few websites like apple.com, there was a wow effect. Nowadays apple.com and clones are among the most obnoxious and tiring ad-free websites out there.
Do not hijack scrolling. Use CSS scroll snap if you feel like it, but don't do anything beyond that.
(To be clear, this comment is a tangent — TFA doesn't hijack scrolling. I see the same hilarious 0s everywhere effect though.)
While the hamburger menu doesn't work (it should, these can be done in CSS with no need for JS) there appear to be no number animations for me, just the correct numbers.
Funny, without JS all the specs say "0." I guess they wanted it animated for dramatic effect, and now you get something very much the opposite, like they're sarcastically selling you a rock.
This problem happens even with JS enabled. They’re supposed to be dynamic as you scroll but it doesn’t work properly so you end up staring at zeros for quite a while!
Apple Silicon MacBooks instantly turned high-end Windows laptops into paperweights. Why buy a $1000+ Windows laptop with loud fans and much worse battery life, build quality, and trackpad? Desktop PCs are a different story.
First, the article is about a Linux laptop, not a Windows laptop, and the reference to Apple Silicon is a bit of a non sequitur anyway, so I feel like your comment lacks relevance to the OP.
Secondly, the reasons one would want to choose a Linux/Windows laptop over Apple Silicon are so obvious I'm sure you know them already. Many people use software that runs best (or only runs) on Windows or Linux or any x86 computer. Also, Apple's default software is undesirable to some people for a plethora of reasons that have been repeated ad nauseam on HN. And the raw performance of Intel/AMD processors is better than that of equally priced Apple alternatives, with comparable performance at iso-power (see e.g. [1] or numbers from [2][3]).
Apple Silicon's success has been very impressive, and Apple laptops are well built, but they hardly make their competitors obsolete, as Apple's greatest proponents seem to want people to believe. The fact of the matter is that an Apple Silicon laptop is useless to someone who does not want to commit to the Apple ecosystem.
You don't have to run macOS. There's full support for Windows and Linux and x86 applications. This is also upgradable. Run out of storage? You can replace the drive! Add more RAM! Choice is good.
Asahi's making decent enough progress that, eventually, you may not need to run macOS to enjoy Apple's hardware.
Why would I pay ~$930 for this, instead of buying an M1 Air? The comment you're responding to hit the nail on the head - once you've gone fan-less and sleek, it's really hard to go back.
Few vendors, if any, hit the quality feel of Apple's products.
32GB of RAM? Good Linux support today? Don't have to pay $400 for 1TB storage?
Kinda a bad move to buy something that may or may not, down the line, do what you need it to do today. I need features like USB 3 and DisplayPort output over USB-C today, not 6 months or 2 years from now, if it ever happens.
32GB is the only reason... but then again, it doesn't matter, because developers claim RAM is cheap anyway... so there's no problem buying a MacBook Pro instead.
The screen is not 4K, the trackpad is not as good, it's plastic, it has this old-school barrel charger, etc. etc. etc.
Except you can just buy whatever M.2 SSD you want and slot it in and it doesn't even void the warranty. So for the same $400 to get a 1TB MBA you can easily get a 1TB Samsung 980 Pro (~$160) + 32GB RAM (~$130) and still have money left over.
Plastic is a pro IMO anyways, doesn't have to feel cheap and it also doesn't get nicked or dented nearly as easily.