
Years ago I somehow wasted hours and days of computer time and my own time compiling and fine-tuning my Gentoo system, god knows why, only to format it the next day to install a newly arrived Ubuntu CD.


Everyone please realize that just because you did something and no longer do that thing does not mean it was wrong to have done that thing.

All of us who at one point compiled our own kernels and no longer do are the killers we are partly because we did things like that, at least for a bit. It only makes sense not to do it now, after having done it.

It's not true to suggest (or to read these stories as a new bystander and come away with the idea that) "if I were smarter I never would have wasted time on that"


I hear your point that the act or process of doing the learning is good, even if the end result is that you shouldn't do the thing again; for example, learning assembly and then only doing web development professionally, where you don't need to know assembly.

However, I think that the statement below might be better with a bit of nuance.

> It's not true to suggest (or to read these stories as a new bystander and come away with the idea that) "if I were smarter I never would have wasted time on that"

I would say it's "not always true"; in some cases doing the action really wasn't worth the time.

Related to this, I believe the regret people feel about wasting time on some endeavour comes from a misalignment with what their intention was to begin with.

For example, if someone wanted to compile their own kernel because they wanted to learn and understand more about their computer, it's unlikely that they would walk away from that experience with regret. However, if they wanted to compile their own kernel because they believed that doing so would make them 10x more money in the long run (through learning so much), and that goal failed to materialize, they would likely tell others not to waste their time learning to compile their own kernel.

Not trying to be pedantic or argumentative; I agree with your point deeply, but I wanted to discuss it a bit further. Let me know your thoughts.


Same, but I learned so much while doing it. Eventually I got tired and moved to Arch and got most of the same benefits without always fighting broken packages. But I still use the knowledge I gained dealing with random low-level issues when they crop up.


I had a similar experience with the Gentoos/Arches of the world. I'd never use Gentoo or a Gentoo-like as my primary OS for anything, likely for the rest of my life, but it still ended up being one of the most valuable operating systems for me to spend some time on.


You and me both. But, we learned a lot. Nowadays I feel like Linux is my super power, OS, VM, Container, Nix-shell, WSL2. It’s all Linux. And you can drop me on any command line (even a BSD one to some extent) and I will feel at home and can solve problems. I’d like to think that’s where my happy time with Gentoo led to.


I agree with you, but we're also lucky (or unlucky?) that the variety of systems out there has dropped a ton.

If you can, try to get access to OpenVMS or Cisco IOS, it's an entirely different world in terms of user experience.


And no matter whether it's true (and it might well be!), the overarching tendency to look for reasons to explain time spent with Gentoo should probably tell us something.

At best, we are at least a bit confused about it all.


It was confusing because you can't install it without understanding stuff like fstab, GRUB, user creation, etc. It sets you up to be a sysadmin; it requires you to be a sysadmin. Ubuntu, on the other hand, looks and acts more like an iPad than Windows.


I never used Gentoo, but the time I spent screwing around in Arch was more educational than the video games I would have been playing anyway.


I had fun.


Gentoo’s biggest attraction for me was always the USE flags - being able to turn off the X integration of mpg123 where CentOS demanded an entire X install to get the command line player.

The flags were just icing on that cake.
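For anyone who never ran it, the mechanics are simple: flags live in /etc/portage/make.conf globally or /etc/portage/package.use per package, and Portage rebuilds whatever is affected. A rough sketch (the flag names here are illustrative; the exact IUSE varies by ebuild and version):

  # /etc/portage/package.use (sketch)
  # drop the GUI-ish integration for the command-line player
  media-sound/mpg123 -X -sdl

  # /etc/portage/make.conf (sketch)
  # or set a global policy and opt back in per package as needed
  USE="-X -gtk -qt5"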


I too used to obsess over customizing my OS. Now I just install Debian, a handful of programs I use daily and that's it. I can recreate my setup on another machine in 20 minutes.


They talk about the shell as an IDE. My entire desktop is a 14-year experiment in tuning productivity. My ~/bin/ folder has around 100 scripts, and maybe 20 are little scripts I wrote in conjunction with i3. Pretty cool how it stacks up over time.


What handful of programs do you use daily, if you don't mind sharing?


Nothing interesting. I spend like 99% of my computer time in the web browser, IDE and terminal.

Chrome, git, ssh, docker, netbeans


Oh wow, I did not expect to see NetBeans here. What do you use it for and why NetBeans?


General web development. PHP, JavaScript, CSS. I used to do some Java projects as well, but not lately.

I know it's not mainstream to use NetBeans these days, but I don't care, I'm just used to it and it gets the job done. Maybe I'm just getting old.


Thanks for answering. I'm not a NetBeans fan myself, but there's absolutely nothing wrong with using the tools you like that get the job done.


Same but with macOS.

The only cool thing about it is that it's declarative: nix-darwin everything, and a fully working, customized machine is up in 10 minutes with one command.


Do you have some docs or writeups on your setup? I'm planning to move to macOS in a few weeks.




Of course you customize your Debian installation over time.

Over time is the key here. A package here, a small config there, and after some time, that installation becomes so unique that it starts reading your mind.

Non-breaking updates are the icing on the cake.

The biggest point is you install Debian once, or when your processor architecture changes.

--Sent from my 6-year-old Debian installation.


Not sure if you mean that sarcastically, but it's a really boring and stable OS; the setup is just a few clicks and hitting Enter a few times. I think I haven't even bothered to change the wallpaper on my newest laptop.

There's nothing really special about Debian either; it could just as well be Ubuntu, Mint or anything else that's plug and play. I'm just used to Debian, and it comes with less junk I don't need installed by default.


Debian is my set and forget OS as well.

After running my home servers on it, one release wasn't too far behind to run as a desktop, and I've been happy here ever since.

Xfce desktop, move the panel to the bottom, install applications. Use it for the next days/months/years until I decide to look around again.


When every ounce of power mattered, fine-tuning your OS made sense.

Nowadays most people are swimming in CPU cores, gigabytes of RAM and terabytes of solid-state storage, so fine-tuning is a waste of time (unless you play bleeding-edge games). But it wasn't always so.


> so fine-tuning is a waste of time (unless you play bleeding-edge games).

Unless you run JavaScript 'applications'. Games are already optimized.


> Games are already optimized.

Yeah, I'm calling bullshit on this one. At least, it doesn't line up with my experience. In my experience, games are optimized _just enough_ for a decent playing experience (and not always then). Game devs, as a whole, are the worst offenders at expecting their users to just throw more money (hardware) at the software to achieve a usable/enjoyable experience. There are, of course, exceptions. But for every Carmack, there are tens of thousands of developers scrambling to make their deadline, doing just enough to ship.


I have heard of people recompiling the kernel to improve gaming performance (mostly to use a different scheduler or what have you), but I don't recall seeing anything beyond single-digit percentage improvements. Which makes sense, since you can only recompile the kernel and a subset of the open source libraries that the game may use, and those are going to be fairly well optimized to start with.

The games themselves, though, are a different story. Outside of open source games (which are usually less demanding than commercial ones), you don't have the source code to rebuild them. Even if you did, enabling optimizations beyond what the developer used risks breakage, so you would have to do your own testing. Even then, simply rebuilding the software wouldn't address the quality of code created by developers who are scrambling to meet a deadline with as little effort as possible.


I'll be the first to admit that I'm not a game developer and my exposure to commercial games' source has been very limited. The most exposure I've had was to Civ4 due to Firaxis releasing the source for the core game DLL for modding. Civ4 also used Python for scripting and Python (undeservedly, here) gets the blame for the game being slow, especially during late-game play.

Back in the day, I spent a fair amount of time working on gutting the DLL because, frankly, it was atrocious. My memory is a little fuzzy as it's been 10+ years since I looked at it, but things I remember seeing:

* Overuse of C/C++ preprocessor macros where an inline function would have been more appropriate, to, say, get the array/list of CvPlots (tiles) that needed to be iterated over all the time.

* Lack of hoisting loop invariants out of loops (see the sketch below). It was common to see the non-trivial macros above used in the bodies of tight loops, often tight nested loops. Optimizing compilers are great, but they're not _that_ great.

* The exposure of the C++ types to Python was done... poorly. It was done using Boost Python (which, while a great library for its day, had a _huge_ learning curve). For every Cv type in the C++ DLL, there was a corresponding Cy type to expose it to Python. Collection types were _copied_ on every call into Python code, which was quite frequent. The collections should have been done as proxies into the underlying C++ collections instead of as copies.
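To make the loop-invariant point concrete, here is a made-up sketch (not actual Firaxis code; names like getAllPlots and scorePlot are hypothetical stand-ins) of the before/after shape of that kind of fix:

  #include <vector>

  struct CvPlot { int iYield; };             // stand-in for the real tile class

  std::vector<CvPlot*> getAllPlots();        // assume this rebuilds the plot list on every call
  void scorePlot(CvPlot* pPlot, int iPass);

  // Before: the invariant plot list is rebuilt on every pass through the outer loop.
  void scoreAllPlotsSlow(int iNumPasses) {
      for (int iPass = 0; iPass < iNumPasses; ++iPass)
          for (CvPlot* pPlot : getAllPlots())   // recomputed each pass
              scorePlot(pPlot, iPass);
  }

  // After: compute it once, then reuse it across passes.
  void scoreAllPlotsHoisted(int iNumPasses) {
      const std::vector<CvPlot*> plots = getAllPlots();
      for (int iPass = 0; iPass < iNumPasses; ++iPass)
          for (CvPlot* pPlot : plots)
              scorePlot(pPlot, iPass);
  }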

Most of the changes I made were quite small, but over the course of a month of part-time hacking on it, I'd gotten late-game turns down to a couple of minutes from 30 minutes, and memory usage was dramatically reduced. I never did get around to fixing the Python wrapper, as it would have been too intrusive to fix properly. I could have made more aggressive changes if I had full access to the source, but being constrained by DLL boundaries and exported C++ types limited what could be done without breaking the ABI (I had to be extremely careful about not changing object sizes, affecting vtable layout, etc.).

Frankly, I doubt the developers spent much time at all, if any, with a profiler during the course of developing the game.


Yeah, but you picked the one game where some dude patched the Civ 4 binary for a 3-4x increase in rendering performance :)

Civ had been a 2D game until then; it was their first 3D title.

Not to mention that it was turn based strategy, and the main performance problem was AI turn length in the endgame.


> Games are already optimized.

What AAA titles have you played around launch in the last 5 or so years?


I think the biggest case where Gentoo still makes sense is when you have a large fleet of identical machines. In that case, the effort put into tuning your installation will be multiplied across the number of machines it's applied to.

For a single machine home install, the biggest value Gentoo has to offer is the learning experience. I ran it for about a year like 4 years ago, and I definitely learned a lot in that time. Hopped around a bit and I've since landed on GNU Guix, and I'm probably set for life now in the GNU world.


I'm a Gentoo daily driver and I'm also looking real hard at Guix. I already live inside of Emacs, having everything in Lisp seems kind of nice.


> When every ounce of power mattered, fine-tuning your OS made sense

I used to believe that and was a huge Gentoo user for years, back when it was initially released. Then one day I benchmarked Gentoo and a default Red Hat install on every workload I cared about, and everything was within the margin of error.


I like gentoo because I have it set up to compile everything with symbols and the source code in the image, and I can gdb anything I am curious about.
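For the curious, it's mostly a couple of lines of Portage configuration (reconstructed from memory, so treat it as a sketch rather than a recipe; the Gentoo debugging docs have the current recommendations):

  # /etc/portage/make.conf (sketch)
  CFLAGS="-O2 -pipe -ggdb"
  CXXFLAGS="${CFLAGS}"
  # split debug info under /usr/lib/debug and install sources under /usr/src/debug
  FEATURES="splitdebug compressdebug installsources"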


Made an ancient computer with a very limited CPU usable for my siblings with Gentoo. The secret is to do USE="-* minimal" and enable things that are required from there. Compiling a custom kernel was actually necessary because it had a really old NVIDIA card that was no longer supported and I had to patch something to do with MTRR. Installed XFCE 4 and it idled with 70 MB of RAM used. It could play YouTube videos without the whole thing freezing, whereas Debian could not. Gentoo is great.
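In make.conf terms the starting point looked roughly like this (a sketch; the flags you add back depend entirely on the hardware and software you actually need, and "alsa"/"X" here are just examples):

  # /etc/portage/make.conf (sketch)
  # start from nothing and opt in explicitly
  USE="-* minimal alsa X"
  # plus per-package exceptions in /etc/portage/package.use as they come up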


The time I wasted with Gentoo in 2004 was enough to never try it again.

A full day compiling stuff only for the base install, let alone everything else I would eventually need.

In my case, I decided to become another Scientific Linux tester.


The thing is, that was 20 years ago. I basically did the same thing, at almost the same time, as you.

But computing power is much higher now. The same compilation now would probably take 1-2 hours, max. Updates would be super fast.

Gentoo itself is considered generally stable and a pretty solid distribution, or it used to be.

I wonder if these days the flexibility and the engineering behind Gentoo might be worth taking another go at it.


For kicks and giggles, I just set it up on a new system a couple of weeks back.

It's really not much different than working with Arch in terms of complexity. Initial setup takes a bit, but if you've installed Arch you are pretty familiar with everything you need (in fact, the Arch docs are helpful for a Gentoo setup :D).

The docs are VERY good and easy to google.

Compilation time can be nasty depending on what you install, but it's not terribly bad. I just rebuilt the world because a GCC update broke the LTO I'm running. With about 2k packages, that took about 6 hours to complete on a Ryzen 7950.

General updates take almost no time at all (especially using git for syncing): usually less than 10 minutes, often less than 1. As I write this, I'm rebuilding KDE; rebuilding doesn't really get in the way of using your computer, especially if you are already working on a multicore system.
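For reference, the day-to-day loop is only a couple of commands (a sketch; the full @world rebuild after a toolchain change is the expensive exception, not the rule):

  emerge --sync                                  # or: emaint sync -a, with git-backed repos
  emerge --ask --update --deep --newuse @world   # routine update
  emerge --ask --emptytree @world                # full rebuild, e.g. after the GCC/LTO breakage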


“But computing power is much higher now. The same compilation now would probably take 1-2 hours, max. Updates would be super fast.”

I’m not so sure. A lot of the power comes from multiple cores. Years ago I had one core, now I have eight. A lot of the compiles don’t use all the cores.

Software has also gotten bigger. rustc is huge, for example. It didn’t even exist when I used Gentoo years ago.

These days I'm on the Mac, and I just switched to Homebrew after using MacPorts for years. It was for one of the same reasons I stopped using Gentoo: compiling takes too long. Whenever I upgraded macOS versions, MacPorts required me to recompile everything. This was no problem at all for, say, tree. But something was pulling in gcc (emacs needed it for some reason??) and that took ages to compile.

At least MacPorts worked, though. When I used Gentoo, it took so long to compile things that I would leave it overnight, and often in the morning I would see that the compilation had stopped halfway through because something was broken. Hopefully that's improved. Or maybe the binary packages will help with this.

But if I wanted a build-your-own, rolling-release binary system, I don’t see why I wouldn’t just use Arch.


Even 1-2 hours is too much for me.

I'd rather use programming language ecosystems that favour binary libraries, for a reason.


A lot of the reason depends upon what you hope to get from the labour and the overall environment that you are working within.

I was working with a 486 around 1995. Compiling your own software was the norm and compiling your own kernel could have significant performance benefits (even if it was just to conserve the limited memory supported by machines of the day, to head off some of the swapping). By the time I learned of Gentoo, that was not really the case: most of the software one could obtain was provided in binary form and compiler optimizations were much less relevant (unless you had a special workload).

The tooling provided is important too. I was using NetBSD for a while. For the most part you just started the compilation process and walked away until it was done. (I imagine Portage is similar.) You didn't get the instant gratification, but it was not time intensive in the sense that you had to attend to the process. That was very much unlike my earlier experiences in compiling software for Linux, stuff that isn't in the repos, since it did have to be attended to.


It surely wasn't the norm for me. In the summer of 1995 I got my first Linux distribution, Slackware 2.0; everything was already compiled, and when choosing to download tarballs I would rather pick already-compiled ones.

Later on, to take advantage of my Pentium-based computer, I would get Mandrake, with its i586-optimized packages.

Most of my Linux based software would be sourced via Linux magazines CD-ROMs, or Walnut Creek CD-ROM collections.


The time I "wasted" with Gentoo in 2005 taught me enough about how Linux works to land me my first real IT job. I will forever be grateful to that distribution.


Sure, but you didn't need to suffer compiling stuff from scratch to know how UNIX works.

I first used Xenix in 1993-1994, and naturally wasn't compiling it from scratch.


I had a Toshiba Satellite with a whopping 96 MB of RAM... as my main computer, even... it happily ran Windows 98... then I got the book "Linux From Scratch" and the rest is history... now I am a happy Mac user.


Have you tried "macOS from scratch"? It's even harder ;)


But then you have to deal with big upgrades that might break your system, and with old packages (or you start randomly adding PPAs etc.). A rolling distro means you can continually keep up with small changes and only adopt big new pieces (like systemd, pipewire, wayland etc.) if/when you are ready to.

I've installed Gentoo literally two times. Once per PC. Been using it for years. It's not like you have to keep tweaking it. It does help if you run a basic system like me, though (no DE, simple tiling WM, don't use much apart from Emacs and Firefox).


I never used Gentoo. Making sure the graphics and wifi card in a laptop worked after every update on Ubuntu was hard enough for me.


Yea, same here back in the day. Stage 1 installations of Gentoo really made me interact with the kernel and software in a different way. It did not just work, but while solving the issues I learned a lot about how things worked internally. It's a great way to get really familiar with the workings under the hood.

But yea, nowadays I'm on Ubuntu LTS.


Same, this was around 2004-2006ish, when I maintained a Gentoo build for my Pentium 4 box. There was a certain draw for me in compiling my own binaries highly optimized for my processor, and Portage mostly worked. But my gosh, the gcc build times killed the fun. When Ubuntu arrived and I saw my peers being productive, I switched.


On my first PC, assembled from used parts, I was able to squeeze every bit of compute out of Gentoo. Being able to build smaller binaries by excluding dependencies seemed to help a lot. I used it until the first Ubuntu was released, and it just worked, and worked well. The only problem was that it was an ugly brown.




