pornel's comments

You reuse the optimizer and machine code generator of the C compiler, and you're not tied to a single backend like LLVM.


libpng is reasonably fast, and has SIMD optimizations. Make sure to compile it with a modern CPU target.

The biggest bottleneck in PNG decoding is zlib, which is not part of libpng. There are faster inflate implementations, but nowhere near 5x.

The second slowest thing is unfiltering, but it takes only 10-20% of the decoding time, so even a lightspeed implementation would make little difference.
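
If it helps to picture the work involved, here is roughly what per-row unfiltering does (a sketch of the Sub and Up filter types from the PNG spec, not libpng's actual code):

    // Sketch of PNG row unfiltering for the Sub and Up filter types
    // (illustrative only, not libpng's implementation).
    // `bpp` is bytes per pixel; `prev` is the previous, already-unfiltered row.
    fn unfilter_row(filter: u8, bpp: usize, prev: &[u8], row: &mut [u8]) {
        match filter {
            0 => {} // None: bytes are stored as-is
            1 => {  // Sub: add the corresponding byte of the previous pixel
                for i in bpp..row.len() {
                    row[i] = row[i].wrapping_add(row[i - bpp]);
                }
            }
            2 => {  // Up: add the byte directly above
                for i in 0..row.len() {
                    row[i] = row[i].wrapping_add(prev[i]);
                }
            }
            _ => unimplemented!("Average and Paeth omitted for brevity"),
        }
    }

    fn main() {
        let prev = [10u8, 20, 30];
        let mut row = [5u8, 5, 5];
        unfilter_row(2, 1, &prev, &mut row); // Up
        assert_eq!(row, [15, 25, 35]);
    }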

There is a possibility of a 10x difference when encoding, but that's not because libpng is slow; it's because it's possible to apply worse compression, and there are dedicated crappy-but-very-fast encoders.


And the internal state of a JPEG decoder can be an order of magnitude larger than the JPEG file (especially progressive JPEG that can't stream its output).
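
Some rough back-of-the-envelope arithmetic (the 12-megapixel image size and 4:2:0 subsampling are just assumptions for illustration):

    fn main() {
        // A progressive JPEG can't emit pixels until the final scan arrives,
        // so the decoder has to buffer every DCT coefficient in the meantime.
        let (w, h) = (4000u64, 3000u64);             // assumed 12 MP image
        let coeffs = w * h + 2 * (w / 2) * (h / 2);  // luma + two 4:2:0 chroma planes
        let state_bytes = coeffs * 2;                // one 16-bit value per coefficient
        println!("~{} MB of coefficient state", state_bytes / 1_000_000); // ~36 MB
    }

A file like that typically occupies only a few megabytes on disk, so the working state really can be an order of magnitude bigger.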


Cook has bet on increasing services revenue instead. Financially, this has been very successful, and has given AAPL growth even when they could not grow hardware sales.

However, this strategy has turned Apple into a rent-seeking company that must continuously expand subscription services, while locking down the devices and giving themselves preferential treatment, to avoid competitors undercutting Apple's high prices. This motivates Apple to obstruct integrations with 3rd-party services instead of giving their users choice and good UX.

If Apple didn't try to maximize services revenue, they could have run the App Store with merely a healthy margin, rather than a duopoly fee that keeps alienating developers. There would not have been a messy battle with Epic, and the EU could not have justified a major intervention.


As a developer (just $1,000 MRR from the App Store), I don't care about the 30% fee (actually 15% in my case because of the Small Business Program). The only other developers that are mad are large corporations, which are also fighting for profit.

For me, as an indie developer - I love that Apple takes care of many things for me on the App Store side. I am ok with sharing my revenue with them. I hate the App Store for other reasons, but not the 15-30% share.

Apple became too big. Whatever they do, other companies will sue them for one reason or another.


I'm a smaller developer in a similar boat, and while I still think the 15% Small Business Program fee is too high, it's not what really bothers me about being in the App Store.

It's the Kafkaesque app review process and the inversion of power between platform holders and third-party developers.

I remember a time when I sold my software directly to users over the web. It certainly didn't cost me 15% of my revenue to process payments and refunds, but I'm actually fine paying Apple a few extra percent to take care of that internationally.

Back in those days, the prevailing wisdom was that platforms competed for third-party devs in order to make their platforms useful for their users and increase market share. If you read the arguments around the new EU rules for platforms, there's a lot of talk about the platform holders' "rights" to "monetize their IP" or similar language. Apple says it. Google says it. Gruber and other influencers parrot it. I hate it.

Developing the OS isn't free, but the revenue from end users more than pays for it. Developing the developer tools isn't free, but the $99/year revenue from developers more than pays for it. Combined, they probably more than pay for development and hosting of the App Store.

I get that Apple has no shortage of people wanting to develop for their platform, so they don't need to compete for third-party developers anymore. I don't like it, but I understand it.

I just wish they wouldn't try to rent-seek on top of it. I wish app review weren't so complicated, self-serving, inconsistent, and useless. Apple likes to publish statistics on how many fraudulent apps they keep off the App Store, but in my opinion, it still has far too many trash and scam apps to argue that app review works well for anyone but Apple.


This eternal discussion gets rehashed over and over again.

It's great that you like it. It wouldn't be an issue if it was optional. Then the people who like it could have it and the people who don't could go somewhere else and get what they want. Nobody is trying to take away your cheese.


This is somewhat tautological, because the AppStore is necessarily filtered down to businesses that can exist within this fee structure.

The cut is 15-30% of revenue, rather than profit, so for lower-margin businesses Apple's cut can be larger than their own profits.

And I care as a user, because developers aren't subsidizing Apple – this fee is ultimately paid by me, and I don't have much choice outside the Google-Apple duopoly.


> the HBase version upgrade is a slow and painful process due to a legacy build/deploy/provisioning pipeline and compatibility issues

Is that HBase's fault, or Pinterest's added complexity?

I'm baffled when databases don't support seamless in-place upgrades, and require a full dump and restore instead. At a certain scale, a full rebuild is as complex as replacing the wheels of a moving car.


HBase & Hadoop are painful to upgrade. Honestly doing anything with them is painful.


HBase supports in-place upgrades. Almost* all version upgrades have been relatively painless if you have automation to do the necessary operations across a cluster of nodes. Hadoop upgrades have been a similar story. The minimum automation necessary is high for all stateful data stores of this size.

* A few notable exceptions exist where Hadoop and HBase upgrades were necessary simultaneously. These have been awful experiences that required huge efforts to accomplish.


I wish Debian/APT had first-class support for having multiple versions of the same library.

On a large system it's inevitable that packages will be upgraded at different rates, and a bit of duplication can be a way out of complex system-wide dependency chains held back by the most outdated package.

Despite the lack of tooling support, Debian still does not completely avoid duplicated packages. They merely move the major version number into the package name by hand (foo v10.0 and foo11 v11.0), causing package names to vary between distros and Debian versions. This seems like the worst solution, since APT can't automatically do it where it would be useful, and renamed packages complicate the use of tools like pkg-config and Ansible, as well as providing install instructions to users.


That is an orthogonal issue. Guix has first-class support for versions (where "version" means the version string of the source code), while nixpkgs uses the same naming-based approach as Debian.

The important part is being able to install two packages in isolation, irrespective of whether they represent the same software in different versions or are completely unrelated. To my knowledge, APT can't install two unrelated packages if they contain files with the same path.

As soon as one forgoes putting everything into a single filesystem structure (like the FHS, which most distros follow to some extent), the logical conclusion is to prefix each package's directory with some identifier that meaningfully describes its content (which could be considered an "extended version" of the package). With Nix this is a hash of all the inputs that went into the package (its own source code, compiler flags, build settings, dependencies, etc., and recursively the same for all of its dependencies). This means that if the compiler flags of, say, a transitive dependency five levels deep are changed, then the package that depends on it changes as well, which is only reasonable since its behavior could change.

At that point all dependency resolution has happened at build time and none is required at runtime. Being able to install different versions of the same package is merely a by-product of this approach.
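
A toy sketch of the idea (hypothetical structure and hashing, nothing like Nix's actual derivation format):

    use std::collections::hash_map::DefaultHasher;
    use std::hash::{Hash, Hasher};

    // Hypothetical model: a package's store prefix is a hash of everything
    // that went into building it, including the prefixes of its dependencies,
    // so a change anywhere below propagates all the way up.
    struct Pkg<'a> {
        name: &'a str,
        source_hash: u64,     // stand-in for a hash of its own source tree
        build_flags: &'a str,
        deps: Vec<Pkg<'a>>,
    }

    fn store_prefix(p: &Pkg) -> u64 {
        let mut h = DefaultHasher::new();
        p.name.hash(&mut h);
        p.source_hash.hash(&mut h);
        p.build_flags.hash(&mut h);
        for d in &p.deps {
            store_prefix(d).hash(&mut h); // recurse: dependency changes bubble up
        }
        h.finish()
    }

    fn main() {
        let zlib = Pkg { name: "zlib", source_hash: 1, build_flags: "-O2", deps: vec![] };
        let libpng = Pkg { name: "libpng", source_hash: 2, build_flags: "-O2", deps: vec![zlib] };
        println!("{:016x}-libpng", store_prefix(&libpng));
    }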


One server = one feature = one library version

That's the golden rule for easy upgrades and sane dependency management (both technical and organisational).

There is no good reason not to spawn hundreds or thousands of VMs: just do it, make your life easier, and stop bothering with a whole bunch of issues.


You may "boo" me. But in the end, you know I'm right: who won, the '80s-era supercomputer or the k8s/serverless world? Yeah


Indeed, AmigaOS, despite being ahead of its time in the '80s, was doomed to lose by the end of the '90s.

The biggest problem was that the Amiga did not have an MMU until very late, and the OS had been designed for an unprotected shared memory space. It was crashy, with fragmented, leaky memory, and it could not support fork(). Later, AmigaOS 4 and MorphOS struggled to add full process isolation.

In retrospect, Microsoft was prescient in adding virtual memory to 9x, and incredibly successful in switching to the NT kernel. AmigaOS would have needed the same to survive, instead of just resting on its multitasking-a-decade-before-Windows laurels.


> In retrospect, Microsoft has been prescient in adding virtual memory to 9x

Did you mean protected memory? Didn't OS/2 have both earlier?


Rust has this. Data can be placed in atomic or non-atomic refcounted containers, and the type system disallows sharing non-thread-safe types across threads.

Additionally, data can be borrowed from inside the refcounted wrapper, and used freely within a scope without touching the reference count.
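
A minimal sketch with the standard-library types involved:

    use std::rc::Rc;
    use std::sync::Arc;
    use std::thread;

    fn main() {
        let local = Rc::new(vec![1, 2, 3]);   // non-atomic refcount, cheaper
        let shared = Arc::new(vec![1, 2, 3]); // atomic refcount, thread-safe

        // This compiles: Arc<Vec<i32>> is Send.
        let s2 = Arc::clone(&shared);
        thread::spawn(move || println!("{:?}", s2)).join().unwrap();

        // This would NOT compile: Rc<Vec<i32>> is not Send.
        // thread::spawn(move || println!("{:?}", local));

        // Borrowing from inside the wrapper: no refcount traffic in the loop.
        let items: &[i32] = &shared;
        for x in items {
            println!("{x}");
        }
        drop(local);
    }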


The popular sentiment has changed from enthusiasm about "digital", to disillusionment about big tech inserting themselves into our lives to monetize everything.

In 2009, smartphones were a novelty, and the iPad had not been announced yet. People were wowed by the new capabilities that "multimedia" devices were enabling. They were getting rid of the old, outdated, less capable tools.

Nowadays "multimedia" is taken for granted. OTOH, generative AI is turning creative arts into commoditized digital sludge. Apple acts like they own and have the right to control everything that is digital. In this world, analog instruments are a symbol of the last remnants of true human skill, and of the physical world that hasn't been taken over by big tech yet. And Apple is forcefully and destructively smushing it all into an AI-chip-powered, you-owe-us-30%-for-existing Disneyland dystopia.


I guess back then people must have assumed it was not really possible to replace all those instruments and tools with a small phone.

So the ad was probably punching up in a way back then.

Today there is a real recognition of how pervasive digital devices and AI tech are becoming.

With all the might and influence Apple and tech companies now have - this ad might have evoked a sense of punching down.


Rust's iterators are single-use streams of items, and nothing more.

A Rust iterator can be fully implemented by a closure that either returns the next element or signals it's done. This minimalist interface composes well, especially since it always uses single ownership, so there's only one copy of the state, accessed from only one place.

They can't describe collections. They don't remember their beginning. They don't need to know their end until they reach it. They can't be sorted. They don't need to support splitting or iterating in reverse.
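
For instance, std::iter::from_fn wraps exactly such a closure into a full iterator:

    fn main() {
        // An iterator that is nothing but a closure returning Some(item) or None.
        let mut n = 0u32;
        let evens = std::iter::from_fn(move || {
            n += 2;
            if n <= 10 { Some(n) } else { None }
        });
        // It composes with the usual adapters, but has no notion of a collection.
        assert_eq!(evens.collect::<Vec<_>>(), vec![2, 4, 6, 8, 10]);
    }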

