>It's a bit annoying the first few days, but then the usual sites you frequent will all be whitelisted and all that's left are random sites you come across infrequently.
How does this work in reality? Do you just whitelist every site you come across if it's broken? What's the security advantage here? Or do you bail if it requires javascript? What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
> Do you just whitelist every site you come across if it's broken?
I look at the breakage, consider how the site was promoted to me, and make a decision.
> What's the security advantage here?
Most of the bad stuff comes from third parties and doesn't provide essential functionality. A whitelist means you're unblocking one domain at a time, starting with the first party. If there's still an issue, it's usually clear what needs unblocking (e.g. a popular CDN, or one with a name that matches the primary domain) and what's a junk ad server or third-party tracking etc. You can even selectively enable various Google domains, for example, so that GMail still works while other third-party Google annoyances stay suppressed.
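To make that concrete: in uBlock Origin this kind of per-site whitelisting is done with "noop" rules in the "My rules" pane (advanced mode), layered on top of a global block; NoScript does the same thing with checkboxes. The Google hostnames below are placeholders rather than a verified list of what GMail actually needs:

    mail.google.com gstatic.com * noop
    mail.google.com googleusercontent.com * noop
    mail.google.com doubleclick.net * block

Anything without an explicit noop stays blocked, so new third parties a site picks up later are denied by default.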
> What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
Depends on trust levels, of course, but there's at least some investigation you can do to check that the challenge actually is coming from Anubis or Cloudflare.
First, for context: I use uBO + NoScript + ClearURLs (removes tracking parameters from URLs) + FastForward (circumvents redirect sites like adfly) + a pop-up blocker of your choice (stronger blocking than the default, and whitelist-only in my case). They're all popular Firefox add-ons and should also be available on Chrome, or have equivalents there. You don't need them all; uBO alone is more than fine for most use cases. I've just gotten used to this setup over a few years.
>Do you just whitelist every site you come across if it's broken?
Mostly yes, often just temporarily for that session. If I don't trust a website, though, I leave. What I deem trustworthy or not is just based on my own browsing experience, I guess.
>What's the security advantage here?
You can block scripts, frames, media, WebGL, and so on, meaning no ads and no JS. That helps minimize the more common ways malware gets spread and certain dark patterns, and it just makes browsing certain sites more pleasant without all the annoying stuff around.
>Or do you bail if it requires javascript?
If I don't trust a website, yes.
>What about the proliferation of sites that don't really need javascript, but you need to enable it anyways because the site's security provider needs it to verify you're not a bot?
Not all sites require JS to work, and when they do, they don't require every single JS domain on the page. Many of the popular news sites, for example, try to load JS from 10 different domains or more and only really require one, or none, to be usable. Take CNN: I do not need to whitelist its main domain via NoScript to read articles or navigate, but the moment I whitelist CNN.com, I see a flood of other domains to whitelist which are definitely not needed, like CNN.io, cookielaw.org, optimizely.com, etc.
Take Hacker News: it's perfectly usable without JS; I can read, navigate and comment. But if I want to use the search function, I need to whitelist algolia.com (which powers the search), or else I just see "This page will only work with JavaScript enabled". A broken search function is the most common issue you'll hit if you block all JS by default.
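If you don't want to go full default-deny, roughly the same effect for one site can be had with ordinary static filters in uBO's "My filters". This is only a sketch built from the domains mentioned above; whether these are the only unnecessary third parties on cnn.com is an assumption:

    ! hypothetical: let cnn.com itself run, but drop these third parties there
    ||optimizely.com^$domain=cnn.com
    ||cookielaw.org^$domain=cnn.com
    ||cnn.io^$script,domain=cnn.com

The downside versus a whitelist is that you're enumerating badness per site instead of allowing only the handful of domains you know are needed.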
>You can block scripts, frames, media, WebGL, and so on, meaning no ads and no JS. That helps minimize the more common ways malware gets spread and certain dark patterns, and it just makes browsing certain sites more pleasant without all the annoying stuff around.
>Not all sites require JS to work, and when they do, they don't require every single JS domain on the page. Many of the popular news sites, for example, try to load JS from 10 different domains or more and only really require one, or none, to be usable. Take CNN: I do not need to whitelist its main domain via NoScript to read articles or navigate, but the moment I whitelist CNN.com, I see a flood of other domains to whitelist which are definitely not needed, like CNN.io, cookielaw.org, optimizely.com, etc.
Don't the default uBlock filter lists, plus maybe an extension for auto-closing cookie banners, get most of those?
uBO has a different purpose. It's essentially a blacklist with sophisticated measures in place to fix the breakage it causes. In many cases it selectively filters content that is otherwise allowed through. IIRC YouTube is an example of an extensive cat-and-mouse game of that sort.
A whitelist approach is less nuanced but far more extensive. It defaults to defending you against unknown vulnerabilities.
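For what it's worth, uBO itself can be run in that whitelist style: two global rules in "My rules" (advanced mode) block third-party scripts and frames everywhere, and per-site noop rules become your whitelist. A minimal sketch, with example.com / cdn.example.net as placeholder hostnames:

    * * 3p-script block
    * * 3p-frame block
    example.com cdn.example.net * noop

That's roughly what uBO's documentation calls "medium mode"; the default install instead leans almost entirely on its blacklists.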
uBO can also block JS, yes, and I use both add-ons, but I find NoScript's UI more intuitive for managing JS, and I've been using it for years now.
It depends, but frequently, yes. E.g. if I'm about to read a tech blog and see it's from someone who can't make a couple of paragraphs work without scripting, that raises the odds that whatever they have to say isn't going to be valuable, since they evidently don't know the basics.
It's the frontend version of people writing about distributed clusters to handle a load that a single minipc could comfortably handle.
>It depends, but frequently, yes. E.g. if I'm about to read a tech blog and see it's from someone who can't make a couple of paragraphs work without scripting, that raises the odds that whatever they have to say isn't going to be valuable, since they evidently don't know the basics.
Seems only narrowly applicable. I can see how you can use this logic to discount articles like "how to make a good blog" or whatever, but that's presumably only a tiny minority of the articles you'd read. If the topic is literally anything else, it doesn't really hold. It doesn't seem fair to discount whatever an AI engineer or DBA has to say because they don't share your fanaticism about lightweight sites. On the flip side, I see plenty of AI-generated slop that works fine with javascript disabled, because it's published through some sort of SaaS (think Medium) or a static site generator.
Generally speaking, getting good performance out of a database mostly comes down to understanding how the thing works and then not making it do a stupid amount of unnecessary work. I expect someone who understands that would also not e.g. fetch a script that then fetches the actual text instead of just sending the text. For example Markus Winand's sites work just fine with javascript off.
For ML stuff I'd let e.g. mathjax fly, but I expect the surrounding prose to show up first to get me interested enough to enable scripts.
It's not an exact filter, but it gives some signal to feed into the "is this worth my time" model.
It's also odd to characterize it as fanaticism: scriptless sites are the default. If you just type words, it will work. You have to go out of your way to make a Rube Goldberg machine. I'm not interested in Rube Goldberg machines or the insights of the people that enjoy making them. Like if you own a restaurant and make your menu so that it's only available on a phone, I'll just leave. I don't appreciate the gimmick. Likewise for things that want me to install an app or use a cloud. Not happening.
Rather than fanaticism, I'm going to second that I find it to be a useful signal. The people I find worthwhile to read generally have an interest in tinkering and exhibit curiosity about what's under the hood. Same reason HN lends itself to better discussions than other venues.
Very approximately: there's a group that took the time to understand and attempt to build something robust, a group that has no interest in the web except as a means to an end and so threw it at a well-reviewed static site generator, and a group that spent time futzing around with a Rube Goldberg machine yet didn't bother to seek deeper understanding.
> Do you just whitelist every site you come across if it's broken? What's the security advantage here?
Most websites load their required scripts from their own domain. So you allowlist the domain you are visiting, and things just work. However, many websites also load JS from like 20 other domains for crap like tracking, ads, 3rd party logins, showing cookie popups, autoplaying videos, blah blah blah. Those stay blocked.
Try it out: Visit your local news website, open your uBlock Origin panel, and take a look at all the domains in the left half. There will probably be dozens of domains it's loading JS from. 90% of the time, the only one you actually need is the top one. The rest is 3rd party crap you can leave disabled.
And yeah, if a website doesn't work after allowlisting two or three domains, I usually just give up and leave. Tons of 3rd party JS is a strong indicator that the website is trying to show you ads or exploit you, so it's a good signal that it's not worth your time.
For me, if the site is broken and I'm interested in the content, I sometimes enable JavaScript temporarily without adding it to my whitelist. Deciding what to do when I encounter a broken site is the easy part.
The challenge is sites like StackOverflow, which don't completely break but have annoying formatting issues. Fortunately, uBlock lets you block specific elements easily with a few clicks, and I think you can even sync it to your phone.
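Under the hood, what the element picker saves is just a cosmetic filter line in "My filters", one per hidden element. The selector below is made up for illustration; the real one would be whatever the picker records for the offending element on StackOverflow:

    ! hypothetical filter created by the element picker
    stackoverflow.com##.broken-sidebar-widget

Since it's plain text, the same line can also simply be copied into uBO on another device.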
>For me, if the site is broken and I'm interested in the content, I sometimes enable JavaScript temporarily without adding it to my whitelist. Deciding what to do when I encounter a broken site is the easy part.
But that basically negates all the security benefits, because all it takes to get a 0day payload to run is to make the content sufficiently enticing and make javascript "required" for viewing the site. You might save some battery/data usage, but if you value your time at all, I suspect any benefit is going to be eaten up by constantly having to whitelist sites.
I don't block javascript for security reasons, I block it for performance, privacy, and UX reasons. If there's a 0day that can be exploited by random javascript in the browser, uBlock won't save us.
> if you value your time at all I suspect any benefits is going to be eaten by you having to constantly whitelist sites.
I don't constantly whitelist sites, only the ones I use regularly (like my email provider). Temporarily enabling JS on a broken site doesn't add it to my whitelist and only takes three clicks (which is muscle memory at this point):
1. Click to open the uBlock panel
2. Click to allow javascript temporarily
3. Click to refresh the page
Personally, if some random link I click doesn't work without scripts at all, chances are that it's not worth the effort and potential security/privacy compromise anyway. But in many cases, the content is readable, with perhaps some layout breakage. Might even get around paywalls by blocking JS.
Even if other users do indeed whitelist everything needed in order to make sites work, they will still end up with many/most of the third-party scripts blocked.