Each of the following is in the 1000 hours saved club (in no particular order):
jq, perl, grep, GNU textutils, GNU parallel, bash, xargs, gkrellm, nload.
Make deserves a special shout-out. If you think it's only useful for building software, you've completely missed the point (similarly, if you think some language-specific tool is superior, you're doing things completely wrong):
Anyway, this document saved me at least 10,000 hours:

https://www.gnu.org/software/make/manual/html_node/index.htm...
Other people's Python has the distinction of being the only software in my -1000 hours saved category. (And if you consider maintaining code written in the above "write-only" languages, you'll realize how big a feat that is!)
> "If you think it's only useful for building software, you've completely missed the point"
I clicked on your link, which has 16 pages of nothing but contents links, to find out what the point is, and all I found was "the make utility automatically determines which pieces of a large program need to be recompiled, and issues commands to recompile them".
It automatically parallelizes any large-scale data-processing pipeline (assuming your data set is already broken into files). People use it for things like documentation generation. I've used it to process the results of web scrapers, for large-scale data-cleaning tasks, etc., etc.
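To make that concrete, here's a minimal sketch of such a pipeline (the directory layout and the `clean.py` step are hypothetical stand-ins for any per-file job): one pattern rule describes a whole stage, and `make -j8` runs independent files in parallel while skipping anything already up to date.

```make
# Hypothetical layout: raw scraper output in raw/, cleaned files in clean/.
RAW   := $(wildcard raw/*.json)
CLEAN := $(patsubst raw/%.json,clean/%.csv,$(RAW))

all: $(CLEAN)

# One pattern rule covers every file in the stage; under `make -j8`,
# independent instances run concurrently, and only out-of-date
# targets are rebuilt.
clean/%.csv: raw/%.json
	python clean.py < $< > $@
```

Adding another stage is just another pattern rule whose inputs are the outputs of this one; make derives the DAG and the parallelism from the dependencies.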
It takes about an afternoon to set up NFS + SSH primitives that automatically distribute the computation across a cluster of machines. Since a make run is restartable, it automatically tolerates hardware faults (up to dozens of machines, in practice).
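A sketch of what that can look like, assuming a shared NFS mount and passwordless SSH (the worker hostname and the `process` command are made up): the recipe simply runs remotely, and because finished targets persist on the shared disk, rerunning `make` after a crash redoes only the missing ones.

```make
WORKER := worker1.example.com   # hypothetical host sharing the NFS mount

# The job executes on the worker; $(CURDIR) resolves to the shared path,
# so input and output land where the local make can see them.
output/%.dat: input/%.dat
	ssh $(WORKER) 'process < $(CURDIR)/$< > $(CURDIR)/$@'
```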
Basically, you get MapReduce, but for arbitrary data-processing DAGs, and it supports any language that works well on a Unix-style operating system.
There are two things to note about make's built-in rules: (1) you can turn them off so they don't get in the way of other use cases, and (2) they include non-build stuff like version control, linting, and document generation.
There's also a whole chapter on ar file maintenance. That's mostly useful for linking these days, but ar is the precursor to tar, so it also used to be useful for running filesystem backups:
No, that absolutely is the point of `make`. You quoted the "overview of make" page[0] which does say "The make utility automatically determines which pieces of a large program need to be recompiled, and issues commands to recompile them."
However, that same page, about five lines down, expands on this:
> Indeed, 'make' is not limited to programs. You can use it to describe any task where some files must be updated automatically from others whenever the others change.
Which is exactly what the original poster is claiming.
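A two-line Makefile in the spirit of that quote, with nothing compiled anywhere (pandoc here is just a stand-in for any file-to-file tool): the HTML page is regenerated only when its Markdown source changes.

```make
# Rebuild site/foo.html whenever docs/foo.md is newer than it.
site/%.html: docs/%.md
	pandoc --standalone -o $@ $<
```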