
>What I want from a build often isn't the artifacts, but the side effects of producing the artifacts like build output or compilation time

You frequently build things not to get binaries but to spend time compiling?



The point is that there's often no way to express "I want side effects" in declarative tools, and the number of side effects that might be useful is vast.

For example, sometimes I profile the build times to see where I should focus effort.

Sometimes I want to watch the build to quickly check for issues where adding some dependency header causes build times to blow up by 100% in downstream dependencies during cold builds.

Another common occurrence for me is trying to debug a platform, toolchain, or standard library issue where the build system either doesn't detect changes in those components or only exposes them through an internal cache that's subject to invalidation issues. You'll usually get the wrong artifact or test results in those cases.

Some other systems (e.g. bazel/blaze comes to mind) actively try to hide side effects like stdout.

In all of these cases, the only way to actually get these side effects is to reach into the tool's internals by blowing away caches/output folders or reading live log files. That's a failure of the build tool.


> The point is that there's often no way to express "I want side effects" in declarative tools, and the number of side effects that might be useful is vast.

Shake (https://shakebuild.com/) is pretty good about letting you specify that a specific step produces multiple artifacts.
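
Roughly (a sketch from memory; the protoc invocation, paths, and file names are all made up), a rule whose one step produces two artifacts looks like:

    import Development.Shake

    main :: IO ()
    main = shakeArgs shakeOptions $ do
      want ["gen/foo.pb.cc"]

      -- one step, two declared outputs: Shake tracks both
      ["gen/foo.pb.h", "gen/foo.pb.cc"] &%> \[_hdr, _src] -> do
        need ["proto/foo.proto"]
        cmd_ "protoc" "--cpp_out=gen" "proto/foo.proto"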

I suspect Nix can do the same?

> Some other systems (e.g. bazel/blaze comes to mind) actively try to hide side effects like stdout.

Yes, blaze isn't all that great. You can tell, because Google folks check generated artifacts into their repositories instead of wrestling with getting blaze to build them.


Generally my aim with both Nix and Bazel is that, while they are the source of truth, day-to-day development and debugging happen with language-native tools. So the only touch point for local development is when you're modifying the dependency graph in some way.

It's definitely more work (you need to maintain compatibility with two different build systems), but worth it for exactly these reasons.


I haven't used it, but it sounds like make's --assume-new flag does exactly what you want for the first part. It lets you rebuild everything that would result from a changed file, including all side effects, without needing to first update the file.
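
If I'm reading the manual right, the invocation would be something like (untested; src/config.h is a made-up example):

    make --assume-new=src/config.h

i.e. pretend config.h just changed and re-run everything downstream of it, without actually touching the file.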


Alas, Make is really, really awful in most other respects.


Really? It's the one part of the traditional c build system I actually still use. Easy to write, easy to debug, relatively small—what's the issue? I hear people complain about make incessantly but people rarely have substantial criticism to offer. Is it the syntax? Reliance on the filesystem? Inconsistencies between implementations?


As an actual builder it has limitations, such as no built-in way to know whether it can still do an incremental build after you've changed some build option. That can result in inconsistent builds.
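
A sketch of the classic failure mode (assuming a makefile that doesn't track flags, which is most of them):

    $ make CFLAGS=-O2     # compiles foo.o with -O2
    $ make CFLAGS=-O0     # "up to date": foo.o is newer than foo.c,
                          #  so the flag change is invisible to make

You end up with objects built under different options unless the makefile explicitly depends on something like a flags file.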

The main problem is that you often require more logic than makes sense to write in make, but it kind of has a language built into it so people try to use it. But as a language it's terrible (no scoping, many missing features). So people end up implementing their build logic in a bastard combination of make and shell which is very opaque and difficult to debug.

For example, I was recently trying to figure out how the OpenWRT makefiles do something, and it was really painful: because make has no scoping, any part of the system can end up affecting the piece you are looking at. There is a lot of dropping into shell to get stuff done, and a lot of the targets are themselves expanded variables, which makes it really opaque. A lot of it doesn't gain anything from being written in make; they could do with rewriting large parts in a real language, but it would be a huge job. And that's where a lot of makefile systems end up.

That's why you get tools like ninja where they decided not to allow any logic at all.


> That's why you get tools like ninja where they decided not to allow any logic at all.

Or you get shake, where your logic is the logic of a real programming language.
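
e.g. you can write ordinary Haskell helpers and call them from rules (a sketch; the layout, flags, and file names are invented):

    import Data.List (isPrefixOf)
    import Development.Shake
    import Development.Shake.FilePath

    -- plain Haskell, callable from any rule
    optFlag :: FilePath -> String
    optFlag src = if "src/vendor/" `isPrefixOf` src then "-O0" else "-O2"

    main :: IO ()
    main = shakeArgs shakeOptions $ do
      want ["_build/app"]

      "_build/app" %> \out -> do
        srcs <- getDirectoryFiles "" ["src//*.c"]
        let objs = ["_build" </> s -<.> "o" | s <- srcs]
        need objs
        cmd_ "gcc" "-o" out objs

      "_build//*.o" %> \out -> do
        let src = dropDirectory1 out -<.> "c"
        need [src]
        cmd_ "gcc" (optFlag src) "-c" src "-o" out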


> Inconsistencies between implementations?

That's actually not too much of a problem in practice: almost everyone just uses GNU Make.

> Easy to write, easy to debug [...]

Alas, Make becomes hard to write and really hard to debug past a certain complexity threshold. And you reach that complexity threshold very quickly.

> Is it the syntax?

Yes, the syntax of Make is awful, and I'm not even talking about ergonomics. Thanks to Make's abysmal syntax, special characters in your files make it barf completely. And by 'special' I mean something as mundane as spaces: a prerequisite named 'my file.c' is simply parsed as two prerequisites, 'my' and 'file.c'.

But everything you mentioned is far from the worst. See eg https://news.ycombinator.com/item?id=17088328 for a more comprehensive overview of Make's sins.


--always-make/-B is more in line, but yeah. Make has grown imperative models within its vast declarative morass.
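
(If I remember the semantics right, that's just

    make -B

which re-runs every recipe whether or not its target is up to date.)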


So do you just not use incremental builds at all? That's insane.


Of course I do, but this isn't a thread about all the things that fit well in the paradigm.



