Hacker News — karteum's comments

Possibly, but I don't see why those people would buy a new MacBook rather than a used $100 laptop (which would be better both for their finances and for the planet...)

Have you ever used Windows on a $100 second-hand laptop?

Imagine for a second that you don't know much about computers. You buy something crap like that and turn it on. Windows is of course already installed. Along with 18 antivirus programs and who knows what other junk. The computer will run dog slow. Even if you get rid of all the preinstalled programs, it'll run horribly slowly.

My mum has a computer from her work. It's pretty recent - worth way more than $100. It takes about 5-10 seconds for Zoom or Google Chrome to start, and about 15 seconds for Outlook to open. It's an utterly horrible experience.

If you can afford it, you'll have a way better experience on a MacBook Air from the last few years. In comparison, everything starts instantly. The experience is fantastic. Premium, even.

Personally I think it's criminal that cheap laptops run modern software so poorly. It's just laziness. There's no reason for the experience to be so horrible. But the world being what it is, there are plenty of reasons to spring for a $1000 MacBook Air over a $100 second-hand Windows crapbook if you can afford it. Even if you don't do much with the computer.


> there are plenty of reasons to spring for a $1000 MacBook Air over a $100 second-hand Windows crapbook if you can afford it

Plus you can pick up a used M1 MacBook Air for as little as $300 these days. Despite being 5 years old, it'll still smoke anything on the PC side at much under a grand, in terms of responsiveness.


Battery life?

Go for a midrange Chromebook then. It's all my wife uses, cost about $250 and has better battery life than most laptops on the market.

I don’t know, I kind of like 10 hrs on battery with normal usage and screen fully lit on a 15” screen while not being bulky. Virtually no contenders in that space.

https://www.acer.com/us-en/chromebooks/acer-chromebook-315-c...

10 hour battery life, 15 inch screen, not bulky, $200. A bit more money will get you better specs, screen or battery life from this or another brand, but it hits all the spots for someone who doesn't need downloaded software.


I have happily used the NanoKVM (https://wiki.sipeed.com/hardware/en/kvm/NanoKVM_Pro/introduc... ).

(N.b. unfortunately the ATX board cannot be ordered independently, so be sure to order the "nanokvm-full" package)


Morpheus : "The human body generates more bioelectricity than a 120-volt battery and over 25,000 BTUs of body heat. Combined with a form of fusion, the machines have found all the energy they would ever need."


Not entirely sure, but I heard that the "Pixel 911 bug" (e.g. https://news.ycombinator.com/item?id=37714579) could be related to an IMS bug. Since operators are liable with regard to emergency calls, maybe that explains why they would only allow tested devices to use IMS...


I wish that diff/patch could better take moved data into account (not only as a deletion+addition but with a proper semantic indicating the moved block). This would lead to both smaller and more readable patches.

I noticed that some people have worked on such an algorithm, e.g. https://en.wikipedia.org/wiki/User:Cacycle/diff
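A minimal sketch of the idea, using only the stdlib: run an ordinary line diff, then pair up any deleted block whose text reappears verbatim among the inserted blocks and report it as a move. (This naive exact-match pairing is my own toy heuristic, not the algorithm from the linked page.)

```python
import difflib

def diff_with_moves(old, new):
    """Diff two line lists, then report delete+insert pairs with
    identical content as moves: (del_start, del_end, ins_start, ins_end)."""
    sm = difflib.SequenceMatcher(a=old, b=new, autojunk=False)
    deletes, inserts = [], []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "delete":
            deletes.append((i1, i2))
        elif tag == "insert":
            inserts.append((j1, j2))
    moves = []
    for i1, i2 in deletes:
        for j1, j2 in inserts:
            if new[j1:j2] == old[i1:i2]:  # same block text -> call it a move
                moves.append((i1, i2, j1, j2))
                break
    return moves

# "b" moved from position 1 to position 3
print(diff_with_moves(["a", "b", "c", "d", "e"],
                      ["a", "c", "d", "b", "e"]))
```

A real implementation would also handle near-matches (blocks edited while moved), which is where it gets hard.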


If you squint hard enough, that's also what `git` does at the file level, when it detects renames even if the file changed.




If using a VPN service is deemed unsafe, it's still possible to rent your own VM or server with full root access, and then set up your own OpenVPN/WireGuard connection.
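For the WireGuard route, the server side boils down to one small config file. A sketch (all keys and addresses are placeholders - generate real keys with `wg genkey` / `wg pubkey`):

```ini
# /etc/wireguard/wg0.conf on the rented server, brought up with `wg-quick up wg0`
[Interface]
Address = 10.8.0.1/24
ListenPort = 51820
PrivateKey = <server-private-key>

[Peer]
# the client device
PublicKey = <client-public-key>
AllowedIPs = 10.8.0.2/32
```

You'd still need to enable IP forwarding and NAT on the server for the client's traffic to reach the internet.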


I don't fully understand: aren't Gradle and aapt2 open source?

If you want to build Buildroot or OpenWrt, the first thing it will do is compile its own toolchain (rather than reusing the one from your distro) so that it can produce predictable results. I would apply the same rationale to F-Droid: why not compile the whole toolchain from source rather than using a binary gradle/aapt2 that uses unsupported instructions?


SDK binaries provided by Google are still used, see https://forum.f-droid.org/t/call-for-help-making-free-softwa...


I agree, this should be the case, but Gradle specifically relies on downloading prebuilt java libraries and such to build itself and anything you build with it, and sometimes these have prebuilt native code inside. Unlike buildroot and any linux distribution, there's no metadata to figure out how to build each library, and the process for them is different between each library (no standards like make, autotools and cmake), so building the gradle ecosystem from source is very tedious and difficult.


Having worked with both mvn and Gradle, I always have a good chuckle when I hear about npm "supply chain" hacks.


I was thinking exactly the same: I also don't get it (even though I totally get that someone may lose motivation to work on a project, and certainly has no obligation to continue. But this justification sounds a bit weird to me).

Could this mean that he has been approached by some "men in black" asking him to insert a backdoor in the code or to stop working on it, together with a gag order? (I actually wondered the same thing a long time ago about TrueCrypt, even though to my knowledge no backdoor has ever been found in subsequent audits...)


I suspect there are far fewer indium and selenium mining resources available on Earth compared to silicon, so I wonder to what extent we can speak about scaling to "mass production"...


> Usually, what you want is either the byte or the grapheme cluster.

Exactly! That's what I understood after reading this great post: https://tonsky.me/blog/unicode/

"Even in the widest encoding, UTF-32, [some grapheme] will still take three 4-byte units to encode. And it still needs to be treated as a single character. If the analogy helps, we can think of the Unicode itself (without any encodings) as being variable-length."

I tend to think it's the most questionable design decision in Unicode (but maybe I just don't fully see the need and use cases beyond emojis. Of course I read the section saying it's used in actual languages, but the few examples described could have been handled with a dedicated 32-bit codepoint...)
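The codepoint-vs-byte arithmetic behind the quoted claim is easy to check in Python (the combining-accent and family-emoji strings are my own examples, not from the post; counting grapheme clusters themselves needs a library with the Unicode segmentation rules, e.g. the third-party `grapheme` package, since the stdlib has none):

```python
# One user-perceived character, several codepoints, even more bytes.
e_acute = "e\u0301"  # 'e' followed by U+0301 COMBINING ACUTE ACCENT
family = "\U0001F468\u200D\U0001F469\u200D\U0001F467"  # man + ZWJ + woman + ZWJ + girl

print(len(e_acute))                     # 2 codepoints
print(len(e_acute.encode("utf-8")))     # 3 bytes
print(len(family))                      # 5 codepoints
print(len(family.encode("utf-8")))      # 18 bytes
print(len(family.encode("utf-32-le")))  # 20 bytes: 5 fixed-width units for 1 grapheme
```

So even UTF-32 is "variable length" at the grapheme level, exactly as the post says.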


Can you fit everything into 32 bits? I have no idea, but Hangul and Indic scripts seem like they might have a combinatoric explosion of infrequently used characters.


But they wouldn't have that explosion if you only encoded the combinatoric primitives those characters are made of and then used composing rules?


You still get the combinatoric explosion, but you have more bits to work with. Imagine if you could combine any 9 jamo into a single Hangul syllable block. (The real combinatorics is more complicated, and I don't know if it's this bad.) Encoding just the 24 jamo and a control character requires 25 codepoints. Giving each syllable block its own codepoint would require 24^9 > 2^32 codepoints.
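The back-of-envelope bound checks out, and for comparison the real modern-Hangul combinatorics (19 initial x 21 medial x 28 final consonant/vowel slots, "no final" included) is small enough that Unicode does precompose every syllable block, at U+AC00..U+D7A3:

```python
# Hypothetical "any 9 of 24 jamo" blocks would overflow a 32-bit code space...
print(24 ** 9)  # 2641807540224, far more than 2**32 = 4294967296

# ...but actual modern Hangul syllables are 19 * 21 * 28 combinations,
# which is exactly the size of the precomposed block U+AC00..U+D7A3.
print(19 * 21 * 28)         # 11172
print(0xD7A3 - 0xAC00 + 1)  # 11172
```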


> Giving each syllable block its own codepoint

That's the thing - you wouldn't do that! Only a small subset of frequently used combos would get its own ID; the rest would only be composable.

