Thanks! I've since watched the session, you're right — they announced specifically for macOS. The door's still open, though…
"All the demos revolved around Safari web extensions available on the Mac, though the talk makes a point of not targeting user agent strings (information a browser reports about its own version and what system it’s running on), but rather asking about features that might be present. Given that other types of Safari extensions are available on iOS/iPadOS (namely content blockers), might web extensions eventually make their way to Apple’s mobile platforms as well? One can hope."[1]
Agreed. This is a step in the right direction, but "Firefox works on Mac" is still the easiest, most straightforward answer to anyone who's asking me, "how can I get a decent adblocker for the web?"
Having a shared API matters for extensions primarily because it makes it easier for a developer to say, "well, I might as well throw that on the Apple store as well." But if Safari literally can't do the things your extension needs it to do, then the API is kind of a secondary concern.
I haven't looked at AdGuard specifically, but fundamentally, AdGuard's Safari version can't be better than uBlock Origin on Firefox/Chrome, because Safari lacks the APIs to do what uBlock Origin does.
The immediate problem is that Safari has a hard limit of 50,000 rules per extension. Some developers do try to get around this with various hacky strategies like splitting their adblocker up into multiple extensions, or moving their blocker to the OS-level, or trying to recompile the list on the fly. But at the end of the day it's just a really weird hoop to jump through, and it ends up being kind of error-prone, and there's no guarantee that Apple won't break those hacks in the future.
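To make the splitting hack concrete, here's a rough TypeScript sketch (not any particular blocker's actual code) of chunking a compiled rule set into 50,000-rule lists; the rule shape follows Safari's content-blocker JSON format, and the native app would still have to hand each chunk to its own helper extension:

    // Minimal shape of a Safari content-blocker rule (the JSON format Safari compiles).
    interface ContentBlockerRule {
      trigger: { "url-filter": string; "load-type"?: string[] };
      action: { type: "block" | "block-cookies" | "css-display-none"; selector?: string };
    }

    const SAFARI_RULE_LIMIT = 50_000;

    // Split a large compiled rule set into lists that each fit under Safari's limit.
    function chunkRules(rules: ContentBlockerRule[]): ContentBlockerRule[][] {
      const chunks: ContentBlockerRule[][] = [];
      for (let i = 0; i < rules.length; i += SAFARI_RULE_LIMIT) {
        chunks.push(rules.slice(i, i + SAFARI_RULE_LIMIT));
      }
      return chunks;
    }

    // e.g. ~180k rules become four lists, each shipped by its own helper
    // content-blocker extension and reloaded separately by the native app.
    const exampleRule: ContentBlockerRule = {
      trigger: { "url-filter": "ads\\.example\\.com", "load-type": ["third-party"] },
      action: { type: "block" },
    };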
More fundamental is the list itself -- Safari doesn't allow contextual, dynamic blocking, you have to put everything into a static list. This makes it impossible to build the kind of detailed filters[0][1] that uBlock Origin allows -- for example, there's no way to strip HTML content from a GET request in Safari before it gets parsed by the browser.
And so on. See the privacy API[2] and the dns API[3], the latter of which is used to block a newer bypass method where third-party trackers are hidden behind first-party subdomains (CNAME cloaking). Safari also has worse CSS/JS injection, etc...
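For the dns point, here's a rough sketch of the kind of check that API makes possible in Firefox (hostnames are made up, and this is only the idea behind CNAME uncloaking, not uBO's actual code):

    // WebExtension API object (Firefox); requires the "dns" permission.
    declare const browser: any;

    // Hypothetical blocklist entry; real lists track many such tracker domains.
    const KNOWN_TRACKER_DOMAINS = ["tracker.example"];

    // Resolve a hostname and inspect its canonical name (CNAME). A "first-party"
    // subdomain that is really an alias for a known tracker gets treated as third-party.
    async function isCloakedTracker(hostname: string): Promise<boolean> {
      const record = await browser.dns.resolve(hostname, ["canonical_name"]);
      const cname: string = record.canonicalName ?? "";
      return KNOWN_TRACKER_DOMAINS.some(d => cname === d || cname.endsWith("." + d));
    }

    // e.g. isCloakedTracker("metrics.news-site.example") could reveal a CNAME to tracker.example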
I'm not saying everyone else is crap at adblockers, I'm saying static blocklists are fundamentally less powerful and less capable than Firefox's APIs. That's especially true if AdGuard is doing something like system-wide DNS blocking. No shade at those tools, they have a place, but a browser-based extension will always be better than a setup like PiHole, because the browser has more access to a given request's current context.
There are a couple of reasons, but the biggest one is that intercepting HTML at the network level allows you to filter inline script tags from documents[0]. If you want to stop an inline script from running (say to defuse an anti-adblocker popup/redirect), you can't wait until after it gets parsed to remove it. Once it gets added to the page, the browser will execute it.
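A rough sketch of the Firefox-only mechanism that makes this possible (webRequest.filterResponseData, which is what uBO's HTML filtering builds on); the regex here is a toy stand-in, not uBO's real parser:

    // WebExtension API; filterResponseData is Firefox-only and needs the
    // "webRequest", "webRequestBlocking" and host permissions.
    declare const browser: any;

    // Rewrite HTML responses before the browser's parser ever sees them.
    browser.webRequest.onBeforeRequest.addListener(
      (details: { requestId: string }) => {
        const filter = browser.webRequest.filterResponseData(details.requestId);
        const decoder = new TextDecoder("utf-8");
        const encoder = new TextEncoder();
        let html = "";

        filter.ondata = (event: { data: ArrayBuffer }) => {
          html += decoder.decode(event.data, { stream: true });
        };
        filter.onstop = () => {
          // Stand-in for a real HTML filter: strip inline scripts matching a pattern,
          // so they are gone before parsing and can never execute.
          const cleaned = html.replace(/<script>[^<]*antiAdblockOverlay[^<]*<\/script>/g, "");
          filter.write(encoder.encode(cleaned));
          filter.close();
        };
      },
      { urls: ["<all_urls>"], types: ["main_frame", "sub_frame"] },
      ["blocking"]
    );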
Of course, note that uBlock Origin also supports scriptlets[1] that can be used to rein in scripts after they get added to the page, instead of via resource rewriting. You're not restricted to only doing your filtering at the network level; HTML filtering is just another tool that filter authors have at their disposal.
Helpfully, at the scriptlet wiki page I've linked each entry also shows a practical example rule that relies on the scriptlet. You can also take a look at the uBlock Origin's default filter lists[2] if you want more examples of people using both HTML filtering and scriptlets in the wild. Many of these block rules are impossible to implement in Safari.
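For a flavor of what a scriptlet actually does, here's a tiny sketch in the spirit of uBO's abort-on-property-read scriptlet (the property name is made up); the real scriptlets are injected by the extension before any page script runs:

    // Runs before page scripts: reading the trapped property later throws,
    // which defuses scripts that depend on it (e.g. an anti-adblock overlay).
    function abortOnPropertyRead(propertyName: string): void {
      const error = new ReferenceError(propertyName + " is not defined");
      Object.defineProperty(window, propertyName, {
        get() { throw error; },
        set() { /* swallow writes */ },
      });
    }

    abortOnPropertyRead("showAntiAdblockOverlay"); // hypothetical property used by a page script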
Sharing a notion (dumb question) here, since you're smart about this stuff:
At what point will "ad blocking" flip to "content scraping"? All this cat-and-mouse arms race seems like a bad ROI.
I imagine an adaptive super reader view mode. Meaning emphasis on content scraping rules vs ad blocking rules.
Take snapshots of top websites, do some rendered-page-aware content diffing, with some user-directed settings & curation, infer where the content is, distill that down to "good enough" scraping rules.
Anyway. Thanks for answering some of my lingering questions.
I am not an expert on this stuff, I've just spent slightly more time reading some of the wikis than other people. Take what I say with a grain of salt, other people who are actually building ad blockers or managing blocklists would have more insight.
If you want to swap from a blacklist model to a whitelist model, there are a couple of problems to solve off the top of my head:
- you need a way to refer to content that supports re-hosting. You need a way to convert the Facebook/Twitter link someone shares into the scraped version without you loading the original link (a rough sketch follows this list). See IPFS, but also see Invidious, Nitter, and the Internet Archive for a lower-tech, more straightforward version of what that might look like.
- you need good enough copyright exemptions that it's OK to re-host or proxy the content somewhere else. This is kind of a gray area; people are rehosting web content without getting sued, but it's not clear to me if that's scalable moving from sites like YouTube to the Internet as a whole. I guess nobody's called to take down Pocket, so maybe the situation there is better than I think.
- You need the web to stay relatively semantic. There are a lot of sites that don't work with Pocket/reader mode. A lot of the sites that do work only do so because there isn't a highly adversarial relationship between Pocket and site operators.
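Here's the rough link-rewriting sketch mentioned in the first bullet; it only shows the URL mapping (instance hostnames are placeholders) and says nothing about the harder copyright and hosting questions:

    // Map a shared link to a proxied/re-hosted front end without loading the original.
    // Instance hostnames are placeholders; real Nitter/Invidious instances vary.
    function rewriteLink(original: string): string {
      const url = new URL(original);
      if (url.hostname === "twitter.com" || url.hostname.endsWith(".twitter.com")) {
        url.hostname = "nitter.example"; // placeholder Nitter instance
        return url.toString();
      }
      if (url.hostname === "youtube.com" || url.hostname === "www.youtube.com") {
        url.hostname = "invidious.example"; // placeholder Invidious instance
        return url.toString();
      }
      // Fallback: the Internet Archive's most recent snapshot of the page.
      return "https://web.archive.org/web/" + original;
    }

    console.log(rewriteLink("https://twitter.com/someone/status/123"));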
I'm not sure whether a cat-and-mouse game around content scraping would be better or worse than what we have. I can imagine it would be better in the sense that you'd only need to do it once per page, and then distribute the scraped version. But that's assuming that copyright would allow you to do that.
And I suspect that breaking a scraper is easier than breaking an ad blocker. But maybe someone could prove me wrong there.
I hadn't connected those dots. Thanks. Very interesting.
Tangentially, I've been chewing on an idea similar to Quotebacks (recently on HN's front page). My naive take was to create a URL shortener to support my use cases. For example, my shortener would attempt to link to the OC, falling back to the Internet Archive (or whatever). I'll now learn about IPFS, Invidious, Nitter.
Also, I didn't do a good job explaining my half-baked notion for implementing a "whitelist" based content scraper.
I imagined distributing the whitelist to be used by client's browsers to do the actual transformation locally. Your explanation of how uBlock can also rewrite HTML DOM client side sparked this line of thinking. For the whitelist's curators, my notion for capturing and diffing was providing tools, like a better debugger, which could also be used by front-end developers.
Back to your notion of server side processing, for some combo of caching, transformation, render: I love it.
Opera did something similar with their mini mobile browser, ages ago. The server would render pages to GIF (?) with an image map for interaction, pushed out to the mobile device. Squint a bit and that architecture resembles MS Remote Desktop, PC-Anywhere, X-Windows, etc.
I keep hoping someone trains some advertising-hating AI that will scrape content for me.
re cat-and-mouse
No doubt.
I have written a few scrapers in anger. On mostly structured data. Often mitigating compounding ambiguity from standards, specifications, tool stacks, and partner's implementations.
So I just tossed the traditional ETL stuff (mappings, schema compilers) and treated the problem space as scraping. Generally speaking, I used combos of "best match" and resilient XPath-like expressions (e.g. prefer relative to absolute paths) to find data.
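By way of illustration (class names made up, run in a browser context), the "prefer relative over absolute paths" idea looks something like this:

    // Prefer a lookup relative to a nearby stable anchor over a brittle absolute
    // path like /html/body/div[3]/table[2]/tr[7]/td[4].
    function findPrice(row: Element): string | null {
      const result = document.evaluate(
        ".//td[contains(@class, 'price')]", // relative to `row`, survives layout reshuffles
        row,
        null,
        XPathResult.FIRST_ORDERED_NODE_TYPE,
        null
      );
      return result.singleNodeValue?.textContent?.trim() ?? null;
    }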
No, I know that. The issue is that the actual app extension itself (which is written in Swift and useful for things like picking elements from the page, etc.) will pretend like it cannot function without the Electron app running. Some more details: https://github.com/AdguardTeam/AdGuardForSafari/issues/84#is...
I was getting several sites not working properly with AdGuard. Yeah, you can add whitelists and such, but it's annoying. Some of it was Safari itself not working.
In the end I just use a browser with built in blocking (AdBlockBrowser or now Brave) and leave Safari alone.
If AdGuard's VPN adblocking solution is anything like NextDNS', then it doesn't block ads on YouTube, which is a pretty big source of ads for the lay user.
It's not Open Source, and if it's running in Safari it doesn't have access to any of the APIs I mention elsewhere[0].
I'm sure it's a great extension with a lot of work put into it, but there is no realistic way 1blocker can match the best ad blockers in Chrome/Firefox unless it's hacking Safari behind the scenes to inject new APIs.
Doesn't Apple's content blocking API limit the number of rules to 50,000 per extension? uBlock Origin uses 85,000 rules from EasyList alone. That's without country-specific and privacy rules. I have about 180,000 active rules at the moment.
To elaborate further on your comment, you can programmatically enable/disable/change/recompile nested rules from your master extension.
I wonder if half the reason the 50k rule limit exists is so developers don't end up with an append-only infinite rule list that takes 20 seconds to recompile. If you're stuck turning 200k rules into four lists of 50k rules, you can just recompile each one on its own thread and all of it will take a couple seconds max.
That's certainly ideal, and for the most part, Safari content blockers work reasonably well. However, there are things you currently can't block using Safari content blockers that can be blocked with uBlock Origin. One of the most prominent examples is blocking elements by CSS class, and the inability to do that becomes a real pain if you open the mobile Reddit website. Their mobile website disables scrolling if you block their pesky "use our mobile app" popups.
To maintain an Apple developer account you also need a Mac or iDevice with a Secure Enclave, so even if these were actually compatible with WebExtensions, you couldn't publish your WebExtension for Safari unless you were an Apple user.
There isn't literally a rule that says you must own an iDevice, but there is a rule that says you must have your developer account linked to an iDevice, and use it to log in.
I don't know. I mean, yes, I need 'blocked requests' and 'unlimited storage' for my extension (http://fraidyc.at/). However, at least I can give Safari users a taste of it - which could either:
* draw them into fuller platforms for using it. (or)
* pressure safari to support more of the standard.
I’d never heard of your extension before but IMO that is not at all what I think of as a “browser extension” - it seems to me that it’s more using the browser as an application runtime than actually providing more functionality for the current page.
I love the concept of your extension. You're right, the 24/7 feeds from social media encourage spam, and the way out of it is to take control of the view with software like yours. Great work!
Apple can and does prevent VPN apps from being on the store that do things like this. (If you're relying on DNS resolution, then yes, there's not much they could do…unless they at some point hardcode some DNS servers.)
Nobody really uses Safari except for on iPhones, and I suspect it will drop dramatically as they’re adding support for choosing your own browser in iOS 14.
Even among non-technical people I know, nearly everyone replaces Safari with Chrome or FF. Even my grandfather uses Chrome on his Mac and iPhone.
>> Nobody really uses Safari except for on iPhones
Have you got _anything_ to back this up? I’m willing to bet the majority of Mac users use Safari (given it’s the default browser). I use it and it’s a lot less clunky than Chrome and Firefox. I also repeatedly run into an attitude of “if my users don’t use Chrome then I don’t care about their experience” and find it very off-putting.
It's a bit outdated now, but back in May 2017 Gruber reported [0] that 69% of visitors to his site used Safari. This was about 6 months before the release of Firefox Quantum, so I'd be curious to see more recent numbers if the Firefox usage has changed at all.
> Looking at my web stats, over the last 30 days, 69 percent of Mac users visiting DF used Safari, but a sizable 28 percent used Chrome. (Firefox came in at 3 percent, and everything else was under 1 percent.)
Your “nobody” is not a representative sample: if you run a general-purpose website, Safari will be something like 15-20% of the total traffic which is more than most people are going to write off as insignificant.
On iOS, even if they allow different browser apps they have not apparently relaxed the restriction on the embedded browser engine and that means Firefox or Chrome are different UIs on top of a browser engine which is still Safari.
Third party browsers for iOS wrap the same WKWebView that Safari uses. Safari isn't going away any time soon, even with the ability to change default browser.
Apple had a great opportunity to natively support web extensions in Safari. Instead they chose to do it through the app extensions system, meaning a developer needs to have and pay for an Apple dev account. They need to have a Mac to port and compile it. They have to sign it. And then maintain it.
This will certainly affect adoption. Forcing an open standard into a closed system seems intuitively wrong.
But at least kudos to Apple for recognizing the need for webextensions support.
I think it's worth looking at how Apple has gone about similar changes in the past.
It dips its toe in the water. It then eases an ankle in. It finally jumps in fully when it's ready to, and not before.
Along the way, the sort of developers who spend more time posting on Twitter and HN than actually developing complain about each of the first two steps being not enough, and when the third comes moan about "monopoly" this and "lock-in" that.
Apple isn't going to make everyone happy. And it doesn't care to. It brings what it feels its customers want. But it is rarely rushed to something because it's cool or trendy. Apple gets there when it is ready. The times when it has rushed, it's botched the job. So I'm OK with Apple's measured approach.
If what Apple offers you today in Safari isn't what you want, go Firefox.
Also things such as wireless charging, Apple Pay, etc. They were not the first by a long shot. They took their time to do it well and now it's everywhere.
I switched to Firefox at the end of 2019, and I don't think it's entirely clear cut, slowness-wise.
In my experience, Safari slows down to a crawl if I open more than 5 or 6 tabs at once, especially something sadness-inducing (think new Reddit); and it's very eager to... dump things from memory? I am not sure what happens, but as I switch these gargantuan tabs, it seems to re-draw things a lot.
Firefox handles that kind of thing like a champ. Add uBlock Origin, and I have my almost ultimate browsing experience: JavaScript off by default, enabled as I see fit.
There is one nasty oddity where I often found myself having to slow down for Safari: when I edit URL query parameters. If I hit return before some indecipherable arbitrary time, which I guesstimate by the auto completions showing up, Safari will throw away my changes and merely reload the current page. I have unchecked and toggled all possible autofill or search checkboxes in Safari settings, and it still happens. This and the extension apocalypse are what drove me away from Safari. In contrast, I've got all the extensions I want, and never ever have to slow down for Firefox.
Well worth giving it a shot for a few weeks. Container tabs are also awesome.
Why is Firefox slower on a Mac? I know Safari is (supposedly) faster than a lot of the other options, but I haven’t seen anything about Apple slowing down other browsers on Macs[0].
[0] I know they force all browsers to use the same webkit engine as Safari on mobile devices, and used to limit competitors to an older slower version for “security reasons” but AFAIK they don’t slow down competitors on mobile anymore and never have on Macs
It's not Apple that slows down Firefox on Mac. It's just that Firefox cannot match the resources that Apple and Google throw at their browsers. Firefox has to be economical about where its resources are directed, and understandably, macOS-specific optimizations are not the top priority.
Firefox natively suspends unused tabs which dramatically improved my memory usage and battery life.
Chrome has a similar extension but it's nice having it built in to the browser natively.
If anything you could argue that FF is behind Chrome security-wise with their more limited sandboxing and some other memory protections. But performance is largely a non-issue.
Understandable from perspective of how apps are developed, but from the user perspective comparing Firefox on Mac to Safari on Mac is literally comparing software on Apples to software on Apples.
I don’t have a lot of hopes that this will satisfy my needs. I just want these extensions in Safari, to start with:
* uBlock Origin (confirmed as of now as not supported, and unlikely to be supported in the future either, with content blocking handled at the browser engine level)
* Privacy Possum or Privacy Badger
* Session Buddy or some good Session saving extension
I'm with you. I switched from Safari being my main browser, which I did love, to Chrome and then to Firefox. I cannot use a browser without uBlock Origin.
Seriously. Not being able to block ads with Safari makes it an automatic no-go for me. Even if they make their own built-in ad blocker, you can bet they'll let companies pay to become "platinum sponsors". No thanks.
> Not being able to block ads with Safari makes an automatic no-go for me
I actually turned it into a very specific use-case: when I need to be tracked for cashback/referral purposes on big purchases, I use Safari. Everything else goes on Firefox.
It's a bit ironic to use Apple software when I want less privacy, after all their spiel.
I've been happy with Wipr on macOS and 1Blocker on iOS for years. In fact I think you can just use 1Blocker in both platforms now. Neither is as "good" as uBlock Origin but I don't really care.
The full app extension APIs do let you do that (based on my installed ad blocker), there’s just a bunch of restrictions that apply to those extensions to prevent arbitrary content manipulation (IIRC).
I really wanted to stick with Safari a little over two years ago when I moved off of Google products. It was (is) snappy and easy on the battery. Beyond poor extension support, I realized the absolute most important features to me in a browser are password, bookmark, and history synchronization. Since I use Windows and macOS, Safari was out of the question (RIP the Windows edition of it). Thankfully, in the last year Firefox has made leaps and bounds in performance and battery life consumption on macOS (on Windows it is fantastic).
Also at the end of the day, you have what I believe is the most unlimited ad-blocking experience on Firefox compared to Chrome, Chromium, and Safari.
I'm going to be downvoted for saying this but... ublock origin actually does slow down rendering your webpage. I think it's sad that there seems to be so much pushback from vendors regarding options like ublock origin, but I have found equally impressive results with system-wide options like adguard, without it reducing the speed of pages rendering.
Yes, I’m going to downvote you, but only because you misunderstand how ad blocking works. uBlock does marginally slow down webpage render because it has to inject CSS rules (cosmetic filters) to improve blocking quality. If you don’t want this feature, you can disable cosmetic filtering entirely within uBlock’s settings.
The problem with AdGuard is that they charge for system-wide blocking which can be done for free with a hosts file or pi-hole.
Let me please chime in. As an AdGuard developer, I'd like to comment on what you said and explain some details about how it works.
First of all, AdGuard desktop and mobile apps are quite different from hosts files or pi-hole. For instance, they're also able to apply cosmetic rules. Also, and this is really important on Android (unfortunately we can't do that on iOS), AdGuard is able to apply different rules depending on which app makes a request.
The common example is dealing with the Facebook Audience Network. If you need to block Facebook ads in third-party apps with Pi-hole, you'll have the official Facebook apps broken as well. With AG, you can keep the official FB app working and block it in third-party apps at the same time.
Second, on every platform save for iOS, AdGuard filters every network connection and not just DNS queries. There are already multiple examples of apps (for instance, TikTok) that switch to using DoH when they detect that there's DNS filtering messing with their domains. Eventually, every network-level blocker, Pi-hole included, will have to control all connections, as DNS simply won't be enough.
> For instance, they're also able to apply cosmetic rules.
How does this happen? Do you inject the thousands of CSS rules from EasyList in each page? Doesn't this require AdGuard to be able to have access to the response body for secure connections? Doesn't this mean that web pages can easily override injected cosmetic filters since these are not injected as user styles?
1. Yes, it basically injects a CSS stylesheet into every page + a JS script that does additional filtering.
2. Yes, in order for this to work, AG will need to access response body.
3. Yes, and in this case we'll need to use JS-based filtering (so-called extended CSS rules or scriptlets).
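A minimal sketch of what points 1 and 3 amount to in a page context (selectors are placeholders, not AdGuard's actual rules): a plain injected stylesheet for ordinary cosmetic rules, plus a JS fallback for the extended ones:

    // 1. Ordinary cosmetic rules: one injected stylesheet per page.
    function injectCosmeticRules(selectors: string[]): void {
      const style = document.createElement("style");
      style.textContent = selectors.join(",\n") + " { display: none !important; }";
      document.documentElement.appendChild(style);
    }

    // 3. "Extended" rules that plain CSS can't express, e.g. hide by inner text.
    function hideByText(selector: string, needle: string): void {
      for (const el of Array.from(document.querySelectorAll<HTMLElement>(selector))) {
        if (el.textContent?.includes(needle)) {
          el.style.setProperty("display", "none", "important");
        }
      }
    }

    injectCosmeticRules([".ad-banner", "#sponsored-box"]); // placeholder selectors
    hideByText("div.promo", "Sponsored");                  // placeholder extended rule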
And here's an answer to the question you didn't ask: yes, this would be slower than doing the same with a browser extension. To make it faster, a different approach to filter rules is required - fewer generic cosmetic rules, more HTML filtering and removing elements from the page content.
The end goal is to completely get rid of any app or extension. Ideally, I'd like to see all the filtering moved to a server-side application like AdGuard Home.
If all pages were static this would’ve worked beautifully.
But they aren’t, and the uBO approach also implies that there is a mutation observer constantly monitoring DOM changes and applying new rules when they are needed. In a browser extension it uses the browser’s own RPC to pass DOM nodes to the background page, check them, and apply new rules when needed.
In our case we would’ve needed to use something like a websocket to do it, but WS may be quite problematic. And the overall performance gain from this DOM-survey approach is milliseconds, it’s not something that a person can notice.
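For the mutation-observer part, here's a bare-bones sketch of the in-browser pattern being described; real implementations batch and filter mutations far more carefully, and the cosmetic rule below is a placeholder:

    // Re-apply (placeholder) cosmetic filtering whenever new nodes enter the DOM.
    function applyCosmeticFiltering(root: ParentNode): void {
      for (const el of Array.from(root.querySelectorAll<HTMLElement>(".ad-banner"))) {
        el.style.setProperty("display", "none", "important");
      }
    }

    const observer = new MutationObserver(mutations => {
      for (const mutation of mutations) {
        for (const node of Array.from(mutation.addedNodes)) {
          if (node instanceof Element) {
            applyCosmeticFiltering(node);
          }
        }
      }
    });

    applyCosmeticFiltering(document);
    observer.observe(document.documentElement, { childList: true, subtree: true });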
No, AdGuard desktop is basically a full-scale firewall with a network driver that intercepts all network connections.
Regarding Android, this works with the help of the VPN API.
1. Android routes all IP packets to the “tun” interface
2. AdGuard reads them, passes through its own small tcp/ip stack. On one side there is the tun device, on the other side there are real sockets to the IP packets destinations.
3. App detection can be done either by reading /proc/net/tcp or, on newer Android versions, by using special getConnectionUid method.
Regarding secure connections, besides IP filtering (which AG also can do) there is always SNI scanning. Also, there’s an option to MITM connections, in this case AG generates a unique CA locally and does all the certs validation by itself.
I read more about how AdGuard works on the website (which is what I should have done in the first place, instead of making assumptions on how it works). I’m quite impressed, especially with Android, at how much control AdGuard has over network traffic. I’m left with a few questions.
- Many apps ignore the device certificates and instead use their own certs which come installed within the app itself (to prevent MITM attacks). How does AdGuard deal with this?
- iOS is much more restrictive. Other than Safari’s content-blocking API, does AdGuard for iOS only do DNS filtering?
- I’m not surprised that apps like TikTok are using DOH to circumvent filtering, but I can’t find a source online confirming this. Could you point me to an article/repo issue/etc. which confirms this practice and lists other apps that also do this?
1. SSL pinning is not actually that widespread. However, it is used by quite popular apps - Facebook and Twitter. Unfortunately, there is no way to deal with it without patching the apps themselves. Also, modern Android versions limit the trust for user certs; basically, only browsers trust that type of cert. The solution would be to move the cert to the system store, but it requires root.
2. It’s not iOS that’s restrictive but Apple. We can’t even mention anywhere in the app that you can block something using DNS filtering. We’ve been playing this reject-phonecall-reject-appeal game for two years already, and I simply have no confidence that if we bring all the functionality to iOS they’ll allow it. Other than that, it’s possible and rather easy to do; the core filtering engine is implemented in C++ and we use it on all other platforms.
3. This is from talks with filters maintainers. But yeah, this is a good topic for an article, we should write one. Thanks for the tip:)
edit: my desire for moving filtering to the server-side actually comes from the experience of communicating with different stores. Also, Chrome’s upcoming changes contribute to my paranoia. Slowly, step by step, users are losing control over their own devices and apps.
It’s almost as if the browser should provide a method for extensions to identify things to block, in a way that is optimised for performance and privacy, rather than just throwing even more JavaScript into the page.
However, this API could be used to optimize some of (not all of) the networking filtering that uBO does. Network filtering is distinct from cosmetic filtering, the latter is what injects scripts into every page.
The uBO wiki has some nice documentation of the different features, measurements of their overhead, etc. E.g.: https://github.com/gorhill/uBlock/wiki/Doesn't-uBlock-Origin...
Great, so not JavaScript running "in the context" of the page... just JavaScript running... somewhere else, for every single network request that is made, and that still has access to the page.
I know a lot of web developers don't want to hear this, but throwing more JavaScript at the problem isn't always the best solution.
You are being presumptuous about my knowledge in regards to ad blocking. So I'll explain in further detail why I personally use adblock, and why ublock is unnecessary.
Second, adguard is free. There are certain premium features, however. For instance, filtering ads outside of your browser requires a premium license, but if you are comparing it to ublock, that was already not a feature available to you.
Of course, even that comes with an addendum, which is that this is only something that you need to worry about on mobile devices. Should you be using the desktop version you will actually see that adguard is free and available here on github: https://github.com/AdguardTeam/AdGuardHome
In regards to pi-hole: the cumbersome nature of pi-hole makes it undesirable for me. There were many sites which were blocked for inexplicable reasons the last time I used a pi-hole, and creating a whitelist for these sites required more work than necessary. First you must navigate to the portal, and then you must make changes to either the whitelist, or remove a site from the blacklist, depending upon why the site is not accessible. It was not a seamless design. However, those that find this feature desirable will find that adguard actually offers the same experience with adguard home: https://github.com/AdguardTeam/AdGuardHome
And as a matter of fact, you will notice that there are comparisons with ublock origin available on the github page for your review. There are actually a number of features in adguard which are simply not possible with ublock origin. As for the accuracy of these features you'll have to determine that on your own, but you can review them here: https://i.imgur.com/5fSLuHx.png
Also, ublock origin does not marginally slow down webpage rendering. It significantly slows down webpage rendering. As evidenced here:
That means that when you are running ublock origin your browser is working at 87.85% capacity. Or in other words, a 13.15% decrease in speed. To put this in real-world terms, it would be similar to reducing the speed to 50mph on a road that was originally 60mph.
So feel free to keep using ublock origin. As a matter of fact I also use it occasionally in a pinch. (It does, after all, significantly speed up rendering of some webpages. I find it especially useful on webmd.) However, for my daily driver I find it to be a poor solution.
> That means that when you are running ublock origin your browser is working at 87.85% capacity.
No it's not; at this point you are spreading FUD about uBO.
This is a benchmark specialized for one thing, the creation/deletion of DOM nodes, a completely unrealistic scenario in the real world -- web sites do not repeatedly create nodes just to have them deleted as soon as they are inserted, as fast as possible.
uBO's code which deals with the DOM represents a fraction of what uBO needs to do, and yet you extrapolate the benchmark as if it represents all the work uBO does, leading to your nonsensical conclusion.
Here is how uBO compares to other comparable content blockers with an actual real-world scenario, as per recent debugbear benchmark[1] -- it's the least CPU hungry of the bunch: https://twitter.com/gorhill/status/1273263792785326085
Note that the above is a worst-case scenario for content blockers, since this was about loading a page from `example.com`, where nothing is blocked, consequently where a content blocker acts only as an overhead. See reference [1] for a more typical scenario.
I don't know what to conclude about ad blocker perf from Speedometer, but it's not just creation/deletion of DOM nodes. It runs multiple implementations of TodoMVC using various JS frameworks[1], and performs real operations in the web app. The DOM is not the bottleneck. The test stresses many parts of the engine: JS, CSS, HTML, rendering, etc.
Google's V8 team (not the original creators of Speedometer, that was WebKit) did a study of real-world JavaScript and found that it was highly correlated with Speedometer[2]. It's likely to be a good proxy of load time and responsiveness for many web sites, especially ones built with modern JS frameworks.
I don't know why uBlock Origin would slow down Speedometer and I'm actually surprised to hear it's true.
Care to comment about the debugbear benchmarks? Why is the supposed undue overhead as per SpeeDOMeter not showing up in there?
I encourage whoever to actually speculate less and measure page load times from the top 500 Alexa and make the case that uBO is an issue CPU- and memory-wise. I confidently predict that you will find that there is no correlation to the SpeeDOMeter benchmark.
There are other 3rd-party benchmarks out there which also found that uBO does indeed save CPU and memory resources.[1][2][3]
Then my explanation would be that these measurements measure different things than Speedometer. CPU time is a power metric, First Contentful Paint is a page load speed metric, memory is a memory metric, none of these are web app responsiveness measurements. All of these data points can be true at the same time!
Personally, I would expect any content blocker that does a lot of blocking to speed up page load, reduce memory use, and save power, because less stuff loads and less stuff gets processed. That any ad blocker would reduce Speedometer score is kind of a surprise to me, but if it's real, I believe it.
I really have no stake in which ad blocker is best or more efficient. I just wanted to clarify what Speedometer does, and why it's a relevant measurement.
I admire your work, gorhill, I do. However, I'm not convinced of your results. Let's go ahead and run some benchmarks.
How did you come about your results? Did you use Selenium? Or did you catalog all the data by hand? Also, do you have the data available? I'm going to run the tests myself, and any insight would be appreciated. I would especially appreciate it if you could allow me to review the data you have on hand, but I understand if you don't want to provide it.
Also, slightly beside the point but is Raymond.cc your site? Just curious.
Yeah, no. JavaScript manipulates the DOM exactly as you are describing. In fact, JavaScript is directly responsible for much of the rendering on the modern web today. Hacker News goes in circles talking about it lately, in fact.
So any page with JavaScript is going to be slowed down precisely for the reason you are stating.
All you have done is compared a bunch of Chrome extensions. Adguard is not an extension. Being faster than all the extensions is like being the smartest boy who got held back a grade.
I responded to your claim _"It significantly slows down a webpage rendering"_, and provided 3rd-party evaluation to the contrary when actual real world web pages are used.
You counter with a just-trust-me claim that AdGuard not-the-extension is fastest, no objective evaluation provided -- despite AdGuard the-extension not faring the best among the bunch in actual real-world, independent evaluations.
Well, to be honest, AdGuard not-the-extension technically cannot be the fastest. It does a lot of things to do system-wide blocking - analyzing the connection, passing network packets, parsing protocols, this all adds some overhead. Browser extensions simply don't deal with all that stuff and thus any extension, unless it's really poorly written, is not slower.
Moreover, this is not what we are offering. System-wide blocking is about having more control over your system and not about being the fastest.
The point I was trying to make is that cosmetic filtering slows down the browser, period. AdGuard is not more efficient at applying CSS rules than uBlock.
uBlock Origin speeds up most pages because it actually prevents things from loading that would slow down the page. Whereas during a benchmark it's really just sitting there idling.
I'm sorry but that's definitely not true. It does not prevent things from loading. They still load, but they are scrubbed from the page. A DNS filter like adguard and pihole actually does what you are describing.
> They still load, but they are scrubbed from the page
Misinformation, again.
uBO prevents DNS requests from being made in the first place for requests which are meant to be blocked. This way the browser will know faster that a network request is meant to be blocked, it does not have to wait for a DNS resolver to respond.
Additionally, as a result of being in-browser, uBO is also able to collapse the placeholder counterpart of any network requests which were cancelled.
I am skeptical system-wide blockers would slow down less, especially considering the browser has to fire the network requests no matter what, while in-browser blockers prevent blocked network requests from being fired at all.
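For context, the in-browser blocking primitive being described here is roughly the webRequest blocking listener; a minimal sketch (the URL patterns are placeholders, and real blockers do far more sophisticated matching):

    // WebExtension API (Firefox `browser`; Chrome MV2 uses `chrome`), with the
    // "webRequest" and "webRequestBlocking" permissions.
    declare const browser: any;

    // Cancel matching requests before any DNS lookup or connection happens for them.
    browser.webRequest.onBeforeRequest.addListener(
      () => ({ cancel: true }), // the request is never fired
      { urls: ["*://ads.example.com/*", "*://*.tracker.example/*"] }, // placeholder patterns
      ["blocking"]
    );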
As someone who works at a company with a web extension, one big thing I care about is automated deployment. We have automated releases with Chrome and Firefox (although it has to use Puppeteer). The new Edgium recently wanted to get us to publish but I found zero docs on potential automation so that's a blocker. Wonder what process Safari has to offer here?
No blocking requests.
"Unlimited" storage (which is necessary for most non-trivial extensions) is actually limited to 10MB.
You're stuck in Apple's App Store, which is just as draconian as Chrome's and Firefox's, but now you have to pay $99/year.
No IndexedDB.
The extensions aren't usable on mobile.
I care little for Safari's users if I have to deal with all of this.