
He does it "the right way™". Use the platform. Don't use any framework or generic library. Go straight to the point and code what you need, when you need it. Don't minify or bundle anything, and give the people who are learning and curious a straightforward way to connect the dots, without forcing them into a GitHub repository with 90% of the code unrelated to the thing and existing just to glue 1000 pieces written by 10000 people together. Every essay by Bartosz is so top-notch and such a breath of fresh air! He gives me hope in humanity and I am immensely grateful for what he does.


I strongly disagree that this is "the right way". I think that the platform provides low-level primitives that are _designed_ to have abstractions built upon them.

Doing it like this has the potential to be the most performant, but it does so in the same way as writing your programs directly in assembly is potentially performant.

I also don't find the source code particularly readable; it contains lots of magic numbers and very imperative code. I would personally find it a lot more readable if it were written in some sort of declarative way using a library, even if I had to look at a GitHub repo instead of view-source.


> but it does so in the same way as writing your programs directly in assembly

> contains lots of magic numbers and very imperative code

Well, we really don't know if the code was written in this form by hand, do we?

It could have been compiled into this, to use your words, "assembly with magic numbers and imperative code", from a much more elegant form. We may see this form only because this is what browsers understand.

I am not saying it was compiled, just speculating that seeing pure WebGL does not mean it was pure WebGL to begin with.



Graphics code tends to be imperative and have lots of magic numbers. I suppose it's the math-intensive nature of it.

Personally I'm not a fan of the magic numbers either, but as I study more and more of it, it's everywhere.


When there's physics, graphics, and mathematics, there are magic numbers: results of formulas that only need to be computed once, material properties, or nature's constants.

Also, nature and graphics work like an imperative parallel machine, so the code mirrors that.
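
For instance, here's an invented sketch of what such code tends to look like (all numbers and names are made up for illustration, not taken from the essay):

    // Imperative WebGL setup: literal constants everywhere.
    const gl = canvas.getContext("webgl");
    gl.viewport(0, 0, 920, 575);               // hardcoded canvas size in pixels
    gl.clearColor(0.913, 0.905, 0.894, 1.0);   // page background, chosen once
    const balancePeriod = 0.2;                 // 5 Hz balance wheel -> 0.2 s period
    const omega = 2 * Math.PI / balancePeriod; // angular frequency, rad/s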

It's not deliberately written this way. Code just comes out like that when you strip away all the libraries, fluff, and other less-related stuff.

I also write a scientific application, and yes, This is the way.


it depends on whether you are doing something to get paid, or to last, or to be really good. only in the first case do i ever consider a heap of abstractions


Abstraction is the only thing that makes any of our advancements possible. Not even the simplest of math theses could be proved without a “framework” of relevant lemmas, nor could you write even a single hello world without the layers upon layers of abstractions written carefully over the decades. Sure, there is also bad abstraction, but the problem is the bad part, not the concept itself.

Without abstractions you wouldn’t be able to read a text stored on a remote computer, with accompanying style information, displayed the same on both of our devices, with embedded 3D graphics doing the same thing on vastly differing devices, be it a top-of-the-line GPU or a simple low-end phone. Is that not abstraction?


Well, if the abstractions were peer reviewed and put through the same rigour as mathematical proofs, that's a whole different topic.

The equivalent would be a mathematical-services company that created "free" abstraction packages which required you to rewrite all your math, away from the scientific community's standards, to fit their abstractions, and that also made money on consulting and selling books. And the big benefit of it all is really that they only abstracted away writing summaries of your papers, which is actually the easiest part and quite irrelevant to your research.


But it is not math; we only have empirical evidence, and not even much of that.

Who is to tell whether the OSI model is ideal? It more than likely is not, but we can’t measure these things up front, there is an insane cost associated with changing it, etc. Yet again, what is the alternative? We can’t manage complexity any other way, and essential complexity can’t be reduced.


> Who is to tell whether the OSI model is ideal?

The current idea of the OSI model was also retrofitted from what it originally was.


Ooo, you're giving me LaTeX vibes!


i mostly mean the heap of stuff people often throw at problems. of course you can't do anything without abstractions. it helps to understand them better though.


This is so backwards.


it really depends on what you're doing, mate!


On a scale of 1 to 10 how strongly are we talking here?


9.5. Your PR will be held up for at least a month with the back-and-forth.


> has the potential to be the most performant

It also has the potential to evolve in the most efficient way.


I mostly agree with you, but I don’t mind minification when appropriate, as it can serve a functional purpose with tangible end-user-friendly benefits (less downloaded over the network = faster response times).

But if you want to be friendly to the tinkerers, you could always host both the *.js and *.min.js versions and have the webpage pull only the latter; anyone who wants the unminified source can remove the “min” part from the URI, while the majority of end users will still benefit from pulling the minified JS.
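
A minimal sketch of that setup (paths are made up):

    <!-- index.html: end users get the minified build... -->
    <script src="/js/watch.min.js"></script>
    <!-- ...while /js/watch.js stays on the server for anyone who
         strips the ".min" from the URL. -->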


Minified JS is not greatly smaller than gzipped JS. I think the whole minification thing is a swing and a miss, and now we have to deal with source maps and shit, and build pipelines, etc.

    $ ls -la
    -rw-r--r-- 1 jack 197609 330905 May  4 22:56 watch.js
    -rw-r--r-- 1 jack 197609 152172 May  4 22:55 watch.min.js

    $ gzip watch.js
    $ gzip watch.min.js
    $ ls -la
    -rw-r--r-- 1 jack 197609 43690 May  4 22:56 watch.js.gz
    -rw-r--r-- 1 jack 197609 32507 May  4 22:55 watch.min.js.gz


To the surprise of no one, Brotli does better on both:

    $ ls -l *.js
    -rw-r--r--  1 mrd  staff  330904  5 May 01:04 watch.js
    -rw-r--r--  1 mrd  staff  152172  5 May 01:10 watch.min.js
    $ brotli watch.js
    $ brotli watch.min.js
    $ ls -l *.br
    -rw-r--r--  1 mrd  staff  34461  5 May 01:04 watch.js.br
    -rw-r--r--  1 mrd  staff  27122  5 May 01:10 watch.min.js.br

If I were serving this content, and if my web server and all of my target browsers supported Brotli, I'd be somewhat more content to ship an un-minified + Brotli-compressed file than an un-minified + gzip'd one. I'm sure it's some rule of thumb stuck in my head from the Web 2.0 era, but a JavaScript payload in excess of 40KB crosses some warning line in my head. (Probably 40KB / ~4KB/s throughput on a good dial-up connection = 10s transfer time, about the longest you'd want to wait for even a pretty spiffy page to load.)


> I'd be somewhat more content to ship an un-minified + Brotli-compressed file than an un-minified + gzip'd one.

Whoops, typo: I meant to say that I'd be somewhat more content to ship an un-minified + Brotli-compressed file than a minified + gzip'd one. That is, I'd be more happy to serve the 34.4KB watch.js.br than the 32.5KB watch.min.js.gz.


> and now we have to deal with source maps and shit

Yeah, minification is only really for obfuscation. The small and unpredictable difference is absolutely not worth the ridiculously complex "solution" of source maps. Just the fact that your debugger doesn't really work right is a deal breaker in and of itself, not to mention all the time spent configuring and fighting with webpack.

I don't think any form of "compilation", i.e. bundling, transpiling, minification, etc., is needed at all. JavaScript can already dynamically load (additional) code files when needed; I don't understand why you need to bundle it in the first place.

I don't buy that the HTTP request overheads are so big that they motivate all this complexity, and in the average case a user doesn't use every single page of the application anyway, so by bundling everything you are always serving "too much" compared to just dynamically loading additional code.
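
For what it's worth, a minimal sketch of native dynamic loading (the module path and function names here are made up):

    // main.js: fetch page-specific code only when the user needs it.
    document.querySelector("#open-watch").addEventListener("click", async () => {
      // The browser downloads watch.js (and its imports) on first use.
      const { startWatch } = await import("./watch.js");
      startWatch(document.querySelector("#watch-canvas"));
    });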


Gzipped JS is generally much smaller than minified JS, but minified-then-gzipped JS is smaller still. The minification (assuming gzip) doesn't make much of a difference in this case only because the input file is not that large and compression algorithms have a natural bias toward larger inputs. You can (rightly) claim it is bad to have a JS file large enough that minification makes a difference after all, but you'd be moving the goalposts then.


True, but it also removes the comments and the whitespace, leading to slightly better performance and memory usage. There are also fewer bytes to gzip on the server side.


Slightly, but is it enough to warrant the extra steps?

I don't think the difference is significant enough in this case.

That said, I do think there should be an alternative to minification + gzipping, e.g. a compiled version of JS that is more optimized than what a browser's own JIT compiler can produce. Mind you, that might end up being a larger package than the JS source code.


WebAssembly is* pretty much that

* hopefully will be
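
If it does pan out, usage could be as simple as this sketch (the file and export names are hypothetical):

    // Stream, compile, and instantiate a precompiled module.
    const { instance } = await WebAssembly.instantiateStreaming(
      fetch("/watch.wasm"),
      {} // import object: anything the module needs from JS
    );
    instance.exports.tick(); // call an exported function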


A discoverable version would be to include source maps that link to the original as well. That way a browser console will automatically pull up the original.
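
Roughly, the wiring looks like this sketch (map contents abbreviated):

    // Last line of watch.min.js points devtools at the map:
    //# sourceMappingURL=watch.min.js.map

    // watch.min.js.map is JSON whose "sources" field references the
    // original file, so the console can fetch and show watch.js itself:
    // {
    //   "version": 3,
    //   "file": "watch.min.js",
    //   "sources": ["watch.js"],
    //   "names": [...],
    //   "mappings": "AAAA,..."
    // }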


People measure minification in byte size (unfortunately I guess CDNs charge you by that metric too?). In reality, everything text-based compresses really well over the wire. In either case, importing tons of libs left and right is going to vastly outweigh any minification savings, yet most front-end devs are very liberal with dependencies.

Minification strips comments too though, which may be undesirable in many cases.


That's simply not a very well-followed (and thus discoverable) standard. Especially for hand-crafted code, minifying function and variable names only obfuscates what is written, and minifying whitespace often has only minimal benefits.

In practice this seems to be a lost cause, and links to alternatively hosted source code are more common. Sadly this makes it simple to introduce subtle, harmful differences between the source and what is hosted.


The pattern is extremely common on CDNs that serve JS.


It's hard to guess that extra assets exist on the server if they aren't being pulled down by the site itself.

Seems better just to have pre-massaged source available in a repo somewhere, or called out on the page itself as a downloadable archive.


> He does it "the right way™". Use the platform. Don't use any framework or generic library.

Hard disagree. "Use What's Right For You™".

Of course there is value in understanding the platform beneath your framework or generic library, but that's just an extension of "understand what you're using and why".


We need a ciechanow.ski explainer for how ciechanow.ski explainers are built


Where are the comments in his code? :-)


He made this in the spirit of watchmaking. Super impressive and interesting website!


> Don't minify or bundle anything

Yeah, in this case it doesn't need it; there are no extraneous or unused code or documentation blocks, gzip (and comparable) compression is good enough, and minification doesn't actually reduce the downloaded code size by that much.


the obvious downside is that it's a lot of work and takes a lot more time... so it might be "the right way™" for some cases, but it's definitely not a rule of thumb...


The tradeoff is that there is basically nobody else who has the expertise or time to do the same thing at a similar level of polish. We're not going to see more Ciechanowski-level posts unless new libraries and frameworks make it more accessible.


We definitely won't if people are taught that frameworks are the only option and never allowed to just write a full program on their own.


Nobody is stopping you from not using a framework, and yet there is basically nobody else at Ciechanowski's level. It's not going to happen; you can't expect everyone to become a hardcore WebGL expert (have you tried?). If we want more cool interactive visualizations, we have to make it easier. Otherwise, we're stuck waiting for those with the time and expertise to pull it off.


Maybe, maybe not. We should do the experiment, though.


What experiment do you mean?



