While somewhat irrelevant for a majority of developers, it feels great to understand how computers actually work.
Even many higher-education IT programs barely cover the underpinnings anymore.
I'm currently embarking on a fun little "from first principles" challenge:
Imagine a post-apocalyptic world where computers are a relic of the past.
You stumble upon a functioning device with a screen and keyboard. When powered on, all you get is a bare metal "assembly interpreter". You can type in riscv64 assembly instructions and they are executed. You can print to the screen via a UART. So you don't even get an assembler/linker to build on.
From there you have to build up an entire new software stack, with a proper language with compiler/vm, an operating system, a standard library, basic userspace tools, a UI, ...
Simulating the device and console is easy with qemu-system-riscv64 and a bit of assembly.
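To make that concrete, here's a minimal sketch of the kind of thing the bare-metal "assembly interpreter" would execute. I'm assuming QEMU's default "virt" machine here, which maps a 16550-style UART at 0x10000000, so printing a character is just a byte store to that address:

    li   t0, 0x10000000    # UART base on QEMU's virt machine (assumed memory map)
    li   t1, 72            # ASCII 'H'
    sb   t1, 0(t0)         # a store to the transmit register sends the byte
    li   t1, 105           # ASCII 'i'
    sb   t1, 0(t0)

Something like "qemu-system-riscv64 -machine virt -nographic -bios none -kernel hello.elf" boots it, and the characters appear on the console. This ignores the UART's status bits entirely, which is fine for a first "hello".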
It's very challenging, but also rewarding to study up/relearn and implement all the basic underpinnings of computers, all the way from hardware (CPU, memory, IO devices, ISAs) up to a GUI stack.
Hacking together a simple OS isn't actually all that hard when your hardware is constrained to a specific device. And writing a very simple language in assembly also isn't that bad. You can build progressively better languages on top of that.
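To give a flavor of what "a very simple language in assembly" looks like, the first interpreter can be little more than a fetch/dispatch loop over a handful of opcodes. A rough sketch (the opcode numbering, register assignments, and stack layout here are my own invention, not anything standard):

    # s0 = bytecode instruction pointer, s1 = data stack pointer
    # (stack grows downward, 8-byte slots)
    next:
        lbu   t0, 0(s0)         # fetch the next opcode byte
        addi  s0, s0, 1
        beqz  t0, halt          # opcode 0: halt
        li    t1, 1
        beq   t0, t1, push_imm  # opcode 1: push the following byte
        li    t1, 2
        beq   t0, t1, add_top2  # opcode 2: pop two values, push their sum
        j     next              # silently skip unknown opcodes in this sketch
    push_imm:
        lbu   t2, 0(s0)         # immediate operand
        addi  s0, s0, 1
        addi  s1, s1, -8
        sd    t2, 0(s1)
        j     next
    add_top2:
        ld    t2, 0(s1)
        ld    t3, 8(s1)
        add   t2, t2, t3
        addi  s1, s1, 8         # pop one slot, then overwrite the new top
        sd    t2, 0(s1)
        j     next
    halt:
        # the program's result now sits at 0(s1)

From there you can bolt on a tokenizer and a tiny expression compiler that emits this bytecode, and each new layer gets easier to write in the previous one.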
I'd love to turn this into a somewhat guided, interactive learning experience that runs in the browser via tinyemu [1] or similar, but I won't have time for that.

[1] https://bellard.org/tinyemu/
It's a popular course about building a computer from NAND gates and eventually building a Tetris-like game on it. All of this is done on a hardware simulator, of course.
It's a series of videos where Ben Eater explains how to build an 8-bit computer using breadboards and basic elements of circuits. I find that his explanations are quite good as well.
I've seen some "practice oriented" bachelor's degrees like "Software Development" that don't cover these topics at all.
I think it's more common to have a few courses on it, but only as a small part of the curriculum, most of which is forgotten by graduation.
I guarantee that asking the average application developer about IO ports, out-of-order execution, cache misses, AST passes, stack machines, how a GC works, or even just how to implement a hash table or the tradeoffs between arrays and linked lists will earn you mostly blank, confused stares.
Let's be honest, 90% of everything you "learn" at university is forgotten by graduation. I've seen folks fresh out of uni who couldn't even run a Java program without an IDE, or who forgot how to write a switch statement.
I find it fascinating how inefficient the brain is: even though it consumes a huge share of the body's resources, it's actually not very good at picking up complex new information.
Is it not? If the programme teaches Java, "learning how to compile and execute Java programs" would surely be one of the top priorities of such a course.
Is it the job of a computer scientist to run programs on servers though? That seems like specific technical training, not the subject of a computer science degree.
I think if you take it to the extreme, the job of a computer scientist isn't to run any programs. It is to write them. To quote Knuth:
> Beware of bugs in the above code; I have only proved it correct, not tried it.
No, as a computer scientist, your job isn't to run programs on servers (or, at all).
As a person who needs to eat and pay rent, though, I think it is fairly reasonable to expect you to know how to run the programs you write, with or without an IDE. The reason for running programs without an IDE is that most often, executing the program won't be a "javac Main.java && java Main", but some rather complicated maven/ant/gradle/shell/... command that you'll have to configure your IDE to run anyway. If one can't even use the simple javac/java commands, one might have a very difficult time with those concepts.
And the overarching point is: does one, as a student, mindlessly click the green run button in Eclipse, or is one curious (and paying attention, because javac and java are usually taught in the first lecture) and actually try to understand what's going on after pressing the green button?
I'd go as far as to say that this curiosity about "what happens after I press this button" is indeed part of the job of a computer scientist.
Just my 2c, and why I expect fresh graduates to know a bit more than "class Dog extends Animal".
My point is the precise technical details of how to run one particular language in one particular environment are irrelevant and not the business of a university. You can pick up this practical skill anytime you need.
You're supposed to be learning the concepts. You can do that using an IDE.
I have no idea how to run an Erlang program on a server. Could I pick it up in five minutes if I needed to? Sure. So why do I need to learn it? Why do you care if I can do it coming out of university?
> Could I pick it up in five minutes if I needed to? Sure.
Yes, because you do know the basic concepts of using a CLI, the concepts around building your project, and the concepts of project metadata, build tools, etc.
When people can't run their programs without an IDE, it's because they don't know any of that. That represents, at minimum, six months of learning the unwritten culture of the profession; it is not something one can pick up in five minutes. It is also a huge signal that those people are missing other fundamental and important pieces of knowledge.
(That is, unless the OP is literally complaining that he threw people onto a random computer's CLI and they couldn't make their code run there. Personally, I have never seen anybody making that point, but I guess it's possible.)
But IDEs on developer devices can remotely interface with servers, and how many developers ever have physical access to a server rather than using it remotely from a workstation?
As a developer (though not by education a computer scientist), the vast majority of the software I cause to be run on servers is triggered automatically by updates to git repositories, and most of the rest is done via web consoles. Sure, I know how to do more, and that's sometimes even relevant to my work in terms of scripting what happens in CI/CD pipelines, etc., but running software on servers directly isn't really a central job duty.
And for the more pure ops people for whom it is, they are even less likely to have jobs that look for a CS background.
Not many folks are developing on servers. Now, I will agree that devs should be able to write the Makefile/Dockerfile/whatever to tell the CI system how to build the artifacts that are deployed to the server, which does probably exceed what an IDE can do.
I go to a pretty good uni (not elite, but global top 30) and they don’t have a compilers course, which I was very disappointed by. It’s even a theory heavy degree!
This really surprises me - looking down the THE and QS global rankings, I know that at least all the universities outside the Far East teach compilers. I guess you're not saying which university you're at to keep it anonymous, but I'd be interested to hear which it was.
Which universities are you looking at? Mine technically had a compilers course but it wasn't required coursework and was generally offered every 2-3 years or so
To be fair, they’ve given me a great education and very valuable opportunities. I’m very appreciative! I just wish I didn’t have to self teach compilers.
I took a couple of compiler courses in university, and I think for the most part it made me a better programmer, mainly because it forced me to understand how specific programming concepts and constructs are generally implemented. That, and it gave me a more intense look at the "heap of abstractions" you're generally working with when you write in a higher-level language.
I've always advocated that, if you want to really understand the tools you are using, you need to understand at least one level below the "surface abstraction" you are working at. Even better if you can understand two or three levels down.
There's a great talk from game developer Jonathan Blow [1] that describes how knowledge is lost "generationally" due to our lack of understanding of the black boxes we build things on top of. Not sure I 100% agree with his thoughts, but it's an interesting take.
I'm currently working on a compiler from scratch for fun/learning. Right now it just compiles Standard ML without modules, because that presents an interesting challenge, compilation-wise.
I'm self-taught (but have been programming 10+ years), so it's been a really interesting experience - I've gotten exposure to (and actual use of) some different data structures and algorithms that I'd never used before.
If you're into this kind of thing, I suggest checking out the ProgrammingLanguages subreddit; there's a Discord server as well.
I spent a couple of days writing a compiler for a custom shader language. It's a very simple Lisp with minimal features, but enough to do everything I use in graphics. I compile this into GLSL plus a struct and functions for the CPU side. It's a lot faster to prototype when I only have to modify my shader, and I can switch the backend between WebGL and OpenGL. It should be quick to add support for Vulkan/Metal/DX12 when I need that. I never would have thought of this solution if I didn't already know how to write compilers. Somebody might say "oh, you could have used ____" - maybe, but it wasn't much effort and I can make it work in a way that's convenient for me.
    (in position f2)
    (in uv f2)
    (all sprite tex)
    (all color f4)
Even if you never write or work on a compiler in the sense of a program that turns source code text into assembly, many programs (or parts thereof) are compilers in the sense of turning data or code in one form into another form.
This is especially true of "code generators" or "source-to-source" transformation tools. I recently wrote one of each, and often had to consult the compiler literature to make progress.