The article mentions the functionality that the JavaScript is used for: expanding and collapsing sections, and adding the media viewer to thumbnails shown on the page. Both seem like reasonable use cases for interactivity that is (better) achieved using JavaScript.
CSS has come a long way; I would expect these things to be easy to achieve with just CSS and HTML. For instance: media viewer -> `<dialog>` (I remember this being an HTML thing), collapsing sections -> `<details>`/`<summary>` (?)
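A minimal sketch of what that native-element approach might look like (IDs, filenames, and text here are invented for illustration; note `<dialog>` still needs a one-liner of JS to open modally):

```html
<!-- Collapsible section: no JS needed at all -->
<details>
  <summary>History</summary>
  <p>Section content is hidden until the summary is clicked.</p>
</details>

<!-- Media viewer: <dialog> with the built-in close behavior of
     a form with method="dialog" -->
<dialog id="viewer">
  <img src="photo-large.jpg" alt="Full-size photo">
  <form method="dialog"><button>Close</button></form>
</dialog>
<button onclick="document.getElementById('viewer').showModal()">
  View image
</button>
```

`showModal()` gives you focus trapping and a backdrop for free, which a hand-rolled lightbox has to reimplement.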
There is always going to be a user interaction that is sufficiently complex as to require JS. Arbitrarily limiting to just CSS severely limits what you can do.
It's not arbitrary though. It's a choice to save bandwidth by not shipping unnecessary JavaScript, rather than to make the website easier to develop. At the scale of Wikipedia that isn't unreasonable.
It's not unnecessary if the feature is something you want?
There's this pattern on HN: people value a feature as having 0 utility and then become annoyed that someone has paid time/performance/money for them. Well duh, if you discount the value of something to 0, it will always be a bad idea. But you're never going to understand why people are paying for it if you write off their values.
At my last job there were countless pieces of UX to make things smoother, more responsive, better controlled by keyboard or voice reader, etc., that required JS. It was not possible to make our site as good as possible with CSS alone, and it certainly wasn't worth the tradeoff of loading a bit faster (not that its loading time couldn't have been improved -- just, cutting JS was a nonstarter).
Fairly certain that's literally the point of simplifying interfaces. Do what you need with what you have. Don't try to shove a racehorse into a VW Beetle.
Surely at the scale of Wikipedia that wouldn’t be a factor ?
Considering the dominance of a few browsers (Chrome, Safari on iOS), will most users notice any difference? The first visitor (with that UA) to hit the site after a new build will have warmed up the cache key for everyone else, right?
Mediawiki itself is built to support 10-year old phones (which is why the Moto G makes an appearance in the post - it's the official low-end Android benchmark) and older desktop operating systems. https://www.mediawiki.org/wiki/Compatibility#Browsers
Makes sense, since the Moto G is also what Lighthouse and a lot of tools driven by it use. So PageSpeed Insights (pagespeed.web.dev), the Lighthouse tab in Chrome DevTools, and even external services like https://totalwebtool.com all generally evaluate mobile performance using it to simulate a slower experience.
The sections could be collapsed by CSS by default without render-blocking JS, and adding a media viewer to random media bits and pieces can definitely wait until later. So time-to-interactivity is unnecessarily hampered here.
Who is using mobile without JS? They are going to have many problems beyond collapses... Likewise, crawlers these days generally run with JS enabled in a browser/VM because so much breaks completely if JS is disabled. This is not a real problem, sorry, NoScript fans.
(Anyway, if you are using CSS + the relevant semantic HTML elements, it can be more accessible, not less, because you are expressing priority and emphasis, so they can skip over collapsed stuff. Although I have my doubts whether screenreaders etc make any good use of it, given that they apparently still do not do basic things like stripping soft hyphens.)
I love details/summary and want them to succeed, but current implementations have some issues. A big one is that VoiceOver for iOS currently doesn't announce what they are or their state, something that's very straightforward and reliable when making a custom disclosure widget.
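For reference, the custom disclosure pattern being compared against is roughly this (a sketch following the ARIA disclosure pattern; the IDs and text are made up):

```html
<!-- Custom disclosure widget: the button role and aria-expanded
     state are what screen readers announce reliably -->
<button aria-expanded="false" aria-controls="sect1"
        onclick="const open = this.getAttribute('aria-expanded') === 'true';
                 this.setAttribute('aria-expanded', String(!open));
                 document.getElementById('sect1').hidden = open;">
  History
</button>
<div id="sect1" hidden>
  <p>Section content, hidden until expanded.</p>
</div>
```

The tradeoff is exactly the one in this thread: you get predictable accessibility, but now you're shipping JS for something `<details>` does natively.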
"Better interactivity" is subjective. You could argue that faster is better. Expanding and collapsing sections can be achieved faster and with zero JavaScript (which also makes it work on browsers with disabled JS) with a hidden input[type="checkbox"].
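A sketch of that checkbox technique (class names and IDs are invented for illustration):

```html
<style>
  .section-toggle { display: none; }           /* hide the checkbox itself */
  .section-body   { display: none; }           /* collapsed by default */
  /* Show the body only while the hidden checkbox is checked */
  .section-toggle:checked ~ .section-body { display: block; }
  .section-label  { cursor: pointer; font-weight: bold; }
</style>

<input type="checkbox" id="sect-history" class="section-toggle">
<label for="sect-history" class="section-label">History</label>
<div class="section-body">
  <p>Expands and collapses with zero JavaScript.</p>
</div>
```

The general sibling combinator (`~`) has been supported essentially forever, so this works even on the 10-year-old browsers mentioned elsewhere in the thread.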
As for the media viewer, it could be a great exercise to try and make it in a similar manner, with input[type="checkbox"] for opening/closing the media viewer and input[type="radio"] for switching to the prev/next image. This one probably requires browsers to support the `:has` CSS selector.
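A rough, untested sketch of the idea (all names invented). If the radios and the overlays can live as siblings, plain sibling combinators are enough and `:has` isn't even needed; `:has` only becomes necessary when the inputs sit deeper in the tree:

```html
<style>
  .viewer input { display: none; }
  .viewer .full { display: none; }
  /* Open the matching overlay when that image's radio is selected */
  #img-1:checked ~ .full-1,
  #img-2:checked ~ .full-2 { display: block; position: fixed; inset: 0; }
</style>

<div class="viewer">
  <input type="radio" name="viewer" id="img-1">
  <input type="radio" name="viewer" id="img-2">
  <input type="radio" name="viewer" id="closed" checked>

  <label for="img-1"><img src="thumb1.jpg" alt="thumbnail 1"></label>
  <label for="img-2"><img src="thumb2.jpg" alt="thumbnail 2"></label>

  <div class="full full-1">
    <img src="large1.jpg" alt="image 1">
    <label for="img-2">Next</label> <label for="closed">Close</label>
  </div>
  <div class="full full-2">
    <img src="large2.jpg" alt="image 2">
    <label for="img-1">Prev</label> <label for="closed">Close</label>
  </div>
</div>
```

Prev/next is just a label pointing at a different radio in the same group; the "closed" radio acts as the close button.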
Also, if you want to further speed up your site, just like you said, the fastest way to speed up the site is to delete JavaScript, get rid of jQuery.
It's just opening and closing detail panes (personal recipe site); there's no reason to put the open/closed state in the URL.
But if I can do it with all HTML, that's better than a function to add/remove a CSS class and a document.onclick handler to find which recipe you clicked on and adjust the CSS class.
Probably more useful for a nested menu. The default thing almost works out of the box; you just need some CSS to set a proper cursor and style the clickable text properly.
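Something like this sketch covers the two styling gaps mentioned (selectors are the standard ones; Safari may additionally need `summary::-webkit-details-marker { display: none; }` to hide its default triangle):

```html
<style>
  summary { cursor: pointer; list-style: none; }  /* pointer cursor, no default marker */
  summary::after { content: " \25B8"; }           /* closed: small right triangle */
  details[open] > summary::after { content: " \25BE"; } /* open: down triangle */
  details details { margin-left: 1em; }           /* indent nested menu levels */
</style>

<details>
  <summary>Recipes</summary>
  <details>
    <summary>Desserts</summary>
    <a href="/recipes/brownies">Brownies</a>
  </details>
</details>
```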
I assume the author of the blog post just wanted to optimize the current situation, not completely change how these features work (which would most probably be a much more elaborate change).
Same for dropping jQuery - that will probably be a few weeks or months of work in a codebase the size of Wikipedia/Mediawiki.
Hard disagree, I use the hover-to-preview-article feature all the time. Sometimes I just want to peek at a page to see what it is instead of opening it.
On Wikipedia's mobile view, collapsed sections are super useful (as the table of contents is not visible via the sidebar) and media viewer makes it possible to view details of an image/thumbnail without navigating away from the page.
> collapsed sections are super useful (as the table of contents is not visible via the sidebar)
Yes and no. On the one hand, some sort of table of contents is useful (though note that you could also just display it inline, the way it used to be done in previous desktop skins). On the other hand, those collapsed sections break scroll position restoring when reloading the page after somebody (your browser or the OS directly) kicked it out of your RAM. This is because your (absolute) scroll position depends on which sections were expanded and which collapsed, and that information gets lost when the page reloads -- all sections end up collapsed again and the scroll position your browser remembered no longer makes sense.
(There is some sort of feature that tries to restore the section state, but a) it only works within the current session, but not if the OS low memory killer kicked the whole browser out of your phone's RAM and b) even when it does work, it runs too late in relation to the browser attempting to restore the previous scroll position.)
So now that the mobile Wikipedia's full JavaScript no longer runs on e.g. older Firefoxes (e.g. one of the last pre-Webextension versions), the lack of a TOC is somewhat annoying, but other than that, somewhat ironically my browsing experience has become much, much better now that my browser can finally reliably restore my previous scroll position because now all sections are permanently expanded.
> The better question is: Why use Javascript at all for a static website?
I've been developing websites professionally since 1996. HTML/CSS/JS and SQL.
I am still amazed that there's a crowd out there that is "anti-JavaScript". They run NoScript to only allow it in certain places, etc.
It's 2023, a small amount of JavaScript on a page isn't going to hurt anyone and will (hopefully) improve the UX.
For the record, the last site I deployed as a personal project had 0 JavaScript. It was all statically generated HTML on the server in C#/Sqlite that was pushed to Github Pages. So I get it, it's not necessary.
That's for my little personal site, mind you; I'm also the senior lead on an enterprise Angular project.
JavaScript is fine, it's not going anywhere.
And yes, there are way too many React (specifically) websites that don't even need a framework at all, but it's become the go-to. That annoys me too. But some JavaScript in 2023 is fine.
I won't argue that JS doesn't add something to the web experience. My main reason for having JS switched off by default is because JS adds something to the web experience. Namely, a whole load of annoying things like outrageously illegal cookie banners, adverts over the top of the content you're trying to read, auto-play videos on an unrelated topic when you just want to read the article, paywalls, "invitations" to have notifications or sign up to a newsletter, browser fingerprinting, active tracking, and outright malware.
It's a signal to noise ratio thing. There's a small reason to have JS switched on and a large reason to block it. I'd love it if the reason to have JS switched on was even smaller, and that is in fact the recommendation (that JS should enhance an otherwise functional site). I enable JS on sites where I regard the reward as worth it, but most of the time it just isn't, and I don't trust all the rubbish that gets included in the average site.
Wikipedia probably wants to support every user of the internet, which means even very old browsers. You can’t rely on relatively new CSS features when supporting browsers that are 10 years out of date.
The fact that they're still using jQuery, probably for similar compatibility reasons, is good evidence of that.
Now, there are ways to use polyfills that only load when necessary, but just about everything is very difficult at Wikipedia's scale. We can't solve their problems from an armchair.