I've been doing the nim track on exercism.io (no affiliation, just a user) for the last two weeks. The mentors on the track have been fantastic. If you're looking for a "solve a problem and get some external feedback" approach to learning the language it's pretty good.
Personally, I stopped using Exercism when it started requiring me to work for Google for free and wouldn't let me log in if I didn't. It's not an ethical platform, unfortunately.
Exercism co-founder here. We get bots spamming the site in every way possible if we don't have a reCAPTCHA. Happy to accept other suggestions for stopping bots from signing up if you have any more ethical alternatives that are as effective?
Perhaps I would. I've supported many projects in the past, so I think that if it were really good and did not violate my privacy, there is a fair chance I would.
Dup is very interesting to me. Does that mean that in Nim all functions will be impure by default, and that it's considered idiomatic to modify the input? I guess it's nice that they're trying to make behavior consistent. It just seems like the opposite of the recent directions I've seen other languages take.
I haven't played with Nim for a few months, but only parameters marked with `var` will get mutated. I suspect the concern is that you want to use someone else's cursed function and don't want it to poison your blessed code.
I see. I know nothing about Nim but I do see in their example they used var when declaring the function. This way seems like it's hidden to the consumer of the function but hopefully editor tooling helps with this.
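For illustration, a minimal sketch of how `var` parameters behave (toy procs, just to show the idea): mutation is only allowed on parameters declared `var`, and the caller has to pass a mutable variable, so it's visible in the signature on both sides.

```nim
proc appendOne(xs: var seq[int]) =
  xs.add(1)            # allowed: xs is a `var` parameter

proc total(xs: seq[int]): int =
  # xs.add(1)          # would be a compile error: xs is immutable here
  for x in xs:
    result += x

var nums = @[1, 2, 3]
appendOne(nums)        # the caller must pass a mutable variable
echo total(nums)       # 7
```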
If you would like to see some simple to parse Nim code and get a feel for what a "1 month of experience during weekends" Nim looks like, check out my project:
Nim is a very interesting language, with a lot of upsides. It just needs that killer framework to get the ball rolling and allow devs to solve business needs quickly. Once that's done I think Nim will grow a lot! Tiny binaries, static binaries, fast, low memory usage, macros, compiles to C.
It really doesn't! Its GC story is pretty great, because you can choose from several GCs, one of them being good for real-time systems: https://nim-lang.org/docs/gc.html
'TLSF' sounds very interesting, I'll have to keep an eye on that project. What are the reference-counting options for, though, given the absence of weak references? Under what circumstances is it acceptable to permit memory leaks in case of cycles? Short-lived processes (i.e. arena-based memory management)?
When I say 'primitive GC', I mean Boehm, which is obviously decades behind the state of the art, but easy to get off the ground. GNU Guile uses it, for instance. Serious garbage collectors are tremendously complex, and are something less popular languages unfortunately tend to lack. I believe even D is miles behind Java.
Can't say I know much about Go's GC.
Edit: Oops, I misread; TLSF isn't a GC, it's an allocator. It sounds then like Nim doesn't have a proper real-time solution, does it? Is useRealtimeGC actually a leaky reference-counting system?
It isn't acceptable to just permit memory leaks in case of cycles. There's no way we'd ever see something like that in a serious JVM.
Unless there's some serious machinery to provide assurances against reference cycles, of course. Which sounds like an interesting research project, come to think of it.
> It isn't acceptable to just permit memory leaks in case of cycles.
That's only in the new --gc:arc, which is still somewhat experimental. The old GC, --gc:markandsweep IIRC, does detect cycles and has a deadline schedule (i.e. you can tell it "I now have 5 ms, collect as much as you can"); but it has per-thread heaps, which means passing data between threads often entails a copy. You also have --gc:boehm (no cycles, shared heap) and --gc:none (no GC; you take care of freeing memory yourself).
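For anyone curious, a rough sketch of that deadline-style usage with the default GC (the `GC_step` call is described in the Nim GC docs; `gameFrame` is a made-up placeholder):

```nim
proc gameFrame() =
  discard   # update game state and render one frame here

while true:
  gameFrame()
  # "I now have ~5 ms of slack, collect as much garbage as you can"
  GC_step(5000)   # argument is in microseconds
```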
Primitive in what sense? Nim has a soft real-time garbage collector, which doesn't sound primitive to me. Actually, Nim supports more than one GC, including Boehm, so you really have a lot of choice here.
As others have said, Nim has pluggable garbage collectors, some of which are quite advanced.
In addition, there is active work on a Swift-like reference-counting, lifetime-based memory management option. This will be suitable for hard real-time use cases.
The collect macro is awesome, and way more idiomatic Nim than list comprehensions. Nim continues to hit the sweet spot between performance and developer ergonomics.
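For those who haven't seen it, a small sketch of what `collect` from the `sugar` module looks like (my own toy example):

```nim
import sugar

let oddSquares = collect(newSeq):
  for x in 1..10:
    if x mod 2 == 1:
      x * x           # the trailing expression becomes the collected element

echo oddSquares       # @[1, 9, 25, 49, 81]
```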
Nim might be great. But how can a great language get traction if nobody big is behind it?
Java and Python are being pushed by Silicon Valley and universities, Go is being pushed by Google, Kotlin is being pushed by JetBrains and Google, and C# is backed by Microsoft and some big companies.
Who can help Nim, or Crystal, or Rust or other new language?
I look at many of them from time to time and see interesting developments and exciting things, but I refuse to learn the languages, their ecosystems, and their frameworks for fear that I will lose lots of valuable time, since I might not be able to earn my living using them.
Surprisingly, a lot of large companies use Rust in-house. I work at Nando's, and one of our biggest systems is in Rust; we're a chicken restaurant business, so it's not just limited to bleeding-edge SV companies.
Hoping we can open-source some of this stuff in the near future too to kind of show the world that Rust is ready.
I feel like dark mode is a sign that people care. I think it's a user feature that is easy to do, but not done often. Mostly done by people who care. It brings so much polish to the experience.
It's not necessarily _that_ easy to do, especially depending on how your CSS was implemented (e.g. do you use variables for all your colors, and do they have a sane, consistent, reversible brightness scale?).
However, I certainly agree with your "you do this because you _care_" sentiment.
Source: implemented dark mode for an application where the "sane css colors using variables" property was true.
There's a large button. It loads before the text does. When the text loads, you'll see in big letters, right next to the button, Dark Mode, italicization and all. If you click on the button, suddenly you get a very blue-purple dark mode experience.
I really wish more people used Nim for web development. It really seems like the best of all worlds (e.g. perf, developer ergonomics, productivity, etc.).
As a fan of Wirthian languages and GC-enabled systems languages, it looks nice to me, but it is still miles away from the tooling experience available to Java and .NET developers. And yes, you can make use of AOT or JIT caches with them.
But the more the merrier, so I look forward to its bright future.
My guess is the GP misunderstood and was speaking about the Prolog programming language, not the web framework mentioned.
To address your question, the only complete back-end web framework for Nim that has significant traction (that I know of) is Jester. Karax for the front end. The Prologue framework looks interesting, though.
I've been building and deploying Nim web apps for clients over the past couple of years. The language is just getting better and better, and the library ecosystem is filling out nicely (not to detract from the stdlib, which is very complete too). It's been a really great experience.
Also: available to hire if anyone is looking to work with Nim.
How would the dynamism and expressivity of something like Django port to Nim? I love the whole declarative paradigm of the Django forms and models API. Is Nim good at this kind of DSL-ish plasticity?
Extremely well I'd say. Nim has a very powerful macro system, and DSLs are very common across various modules. Haven't used Django myself, but something like Jester (https://github.com/dom96/jester) or Karax (https://github.com/pragmagic/karax) is a good place to start if you want to do web development in Nim.
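For a taste of the DSL style, here's roughly what a minimal Jester app looks like (treat it as a sketch from memory; assumes `nimble install jester`):

```nim
import jester

routes:
  get "/":
    resp "Hello from Nim!"

  get "/hello/@name":
    # @"name" pulls the URL parameter captured in the route pattern above
    resp "Hello, " & @"name"
```

The `routes` block is an ordinary Nim macro, which is the same mechanism you'd reach for when building Django-style declarative forms or models.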
I'd say so: I think many Rails/Django concepts are translatable to metaprogramming in Nim. Not always with the same idioms, but usually there is a way to do it, and you can typecheck some things.
Having only played a little with both (Kotlin and Nim), I enjoy Kotlin's syntax, but overall it feels as heavy as Java (need to set up an IDE even to get started, choose and understand some package manager, slow compile times, etc.).
Nim feels as lightweight as Go, but with great syntax.
The main thing holding me back now is the lack of libraries, but it's definitely on my radar.
Nim is older than Kotlin, so your question is kinda backwards. But also:
1. no JVM (Kotlin Native isn’t considered a stable thing, or is it?),
2. subjectively nicer language,
3. An optional non-GC runtime upcoming (already in there as beta), which is interesting for low-memory applications... which in the time of the Cloud is potentially any application. The more (AWS) lambda time you can get for free, the better, right?
4. Writing command-line tools. Practically nobody is writing them in Kotlin. Everybody's writing them in Rust, though, for some reason...
Full disclosure: I've only read about Kotlin, of course, not written a single line. It seems like an excellent language, and if I had to do JVM work, I'd have it as a top-3 candidate for sure.
> subjectively nicer language
This is the real conversation I want to have.
Everybody talks about the indirections (JVM, build system, devtools), but the really interesting conversation for PL enthusiasts is about language features, semantics, and syntax.
If you start on the wrong path, optimizations can only go so far. That's why Android still sucks after all those years of investment. That's why IntelliJ is still slow as a dog, even on the beefiest modern machine. I wish JetBrains had written their IDE in something else.
Even code written in C or assembly can be dog slow; it is a problem between chair and monitor, not necessarily with what the tooling is capable of.
IntelliJ would be still dog slow even if written in C.
Non-stop indexing of files on every startup isn't something that changes with the programming language, nor is the slowness of code completion on C and C++ code, despite using clangd.
Look up Kotlin Native. It doesn't use a JVM (and compiles to native via LLVM).
There's also Kotlin JS, which targets the JS ecosystem, and comes with tools that let you import TypeScript type definitions into Kotlin, so you can easily interact with third-party JS modules.
A pure Kotlin library could easily be used on JS, JVM, and native/LLVM seamlessly, which is pretty impressive. All your non-platform-dependent logic could be placed in a decoupled, shared, pure Kotlin library, which avoids the code duplication that is so common when you need web + iOS + Android client apps. (And the Intel Multi-OS Engine could be used to write your iOS app in Kotlin.)
I have nothing against Kotlin, it is a fine language, a pleasure to work with. Nim, however, has different priorities (utmost simplicity, native compilation, near-C performance, opt-out automatic memory management, excellent FFI capabilities, best in class macros).
The tooling is still JVM based and feels very heavy compared to Nim. (And the Kotlin/Native compiler is still very slow, although they're working on it.)
I'm not sure why so many people describe JVM based software as "heavy"? Is it possible to describe what "heavy" here means in more scientific terms?
Does it mean that startup is slow (possibly due to AOT)?
Does it mean that it consumes a lot more memory than the equivalent C/C++/JS/Python/etc. program? (I don't think so.)
Does it mean that code execution is too slow? (This hasn't been true for over a decade, or close to two; the JVM's JIT is one of the best ever.)
Does it mean the GC pauses are too painfully apparent? (I don't think this is true at all.)
Does it mean that users of JVM languages have a tendency to write code that is inefficient/slow/bloated on average? (Again, I'd say this isn't true; I've seen code tend toward the inefficient much more with dynamically typed languages in general.)
So, what really does "heavy" mean here?
(Lastly, while the core compiler tooling for Kotlin is probably reliant on the JVM, you don't have to use a JVM-based IDE for sure - VS Code (which is written in TypeScript/JS) should work fine as well.)
In an unrelated thread, someone posted this quote from the Mithril documentation on the differences from React, which sums it up pretty well: https://news.ycombinator.com/item?id=22777320
Java and React are fast by bringing a lot of sophisticated (heavy!) machinery. Nim and Mithril are fast by being small and simple.
For example, the JIT makes Java fast – eventually. But initially it's interpreted and slow, with the additional overhead of bringing up the JIT compilation machinery in the background. AOT compiled code reaches its normal speed from the start. So Java programs take a while to get fast, which makes them feel heavy.
Startup is slow. Java 11 is slower than Java 8 which is slower than Java 6. Class-data sharing can make it faster – sometimes. You still have to load all that data, so when it's not cached in RAM and you have a slow disk it's still slow. A smaller program is always fast to load. This makes Java feel heavy.
When it comes to memory, Java does clever optimizations like escape analysis at runtime so that the programmer doesn't have to bother with deciding between allocating on the heap or the stack. This can also make it fast in certain scenarios, after warm-up, but a language with explicit value types can be made fast from the start. (Which is why Project Valhalla is coming.)
The solution to that is to cache/save a pre-compiled binary on disk/persistent storage. This is what modern Android does.
You would just essentially need an installation step, where you compile the binary (maximally optimized for the architecture that it's running on), and save that to disk. All of the problems you described disappear with that -- no startup/AOT delay nor any JIT compilation delays.
Pre-compiling stuff is a small price to pay for the benefit of better-optimized higher-performance execution.
Another thing: you could do memory safety and other static analyses and security checks during the pre-compile/install phase. There's a lot of benefits to that.
For example, if you are able to statically verify and guarantee (i.e. mathematically provably) that the code will not commit any memory violations, then you could optimize away many of the bounds checks and other related checks. This sort of verification must be done on the machine that is actually executing the code, since you can't simply trust a third party and must verify such assurances/claims yourself.
Sure, an AOT compiler for Java would solve some of these issues, at the cost of larger binaries and loss of the dynamic runtime optimizations that make Java fast for long-running programs (which is why Android uses a combination of AOT and JIT). Some people on this site often point out that expensive commercial AOT Java compilers have been around for a long time. JetBrains even used to provide an AOT-compiled binary of the Kotlin compiler for a while.
However, you were asking why people consider Java to be heavy. When people normally use or talk about Java, Java means a JRE derived from Sun's Java implementation. If you download Java to run the Kotlin compiler or IntelliJ, you're not downloading the Android runtime or some hypothetical AOT compiler - you're using something based on OpenJDK, which suffers from the heaviness I described.
Startup is always slow (I am really sensitive to latency, and I prefer to have some immediate response from software). You either need to tell people to install a JRE beforehand (which nobody will do), or pack it with your distribution, adding 100 MB or so even for simpler programs. JVM-based software is fine on the server side, but I'd prefer something else on the client side.
And somehow Nim has more support? This is double-standard nonsense.
Btw, you can address the startup time (which is reasonable, and 2x faster since Kotlin 1.4-M1) without AOT, with tools like http://martiansoftware.com/nailgun/
I'm just saying that currently it's not trivial to compile a kotlin/jvm project to native. Once you start using dependencies things start to get really hairy. It's not as easy as `go build`.
I myself am hoping that either Kotlin or Scala can finish their native compilers, or that Graal can make compiling JVM bytecode trivial.
> If you were relying on some edge case and/or buggy behaviour which is now fixed, try if --useVersion:1.0 gives you back the results you expect to see. If that doesn’t work and you need the old behaviour, please open an issue in our bug tracker and we will try to make it work in future bugfix releases
So useVersion doesn't simply check out the old codebase, but they still promise to maintain all prior versions' bugs? Seems like a losing battle to me... why not just tell people relying on version 1.0 behaviour to use version 1.0, and create tooling to make sure switching between versions is easy (à la nvm)?
Do you think the gcc -std flag would exist if every distribution of gcc included a command to easily download alternate versions and swap them out depending on project configuration? -std seems more like a bandaid around a difficult install/config than a feature, in my opinion.
Obviously Nim is free to implement whatever options they want, but it seems to me like every second spent triaging/fixing/etc. bugs about not properly maintaining broken things (not to mention the end-user time spent experiencing and debugging them!) is time that could be better spent improving the language.
Yes, the -std flag would definitely still exist: the thing being changed by that flag is a lot of stuff in the parser, some stuff in the standard library, and some semantics bits further towards the backend, but at some point the versions converge and go through the same optimisation pipeline. A pipeline that definitely has seen improvements in more recent versions, so it's totally worth it to use a new compiler with an old language standard on old code: it might very well produce a faster executable. Just using an older compiler would also revert back to the old performance.
Additionally, a C/C++ compiler in an older standard mode would disable certain features, of course, but not completely: it's still able to give diagnostics in the vein of "this thing isn't valid with the current version setting, but it would be valid in C++17", for example. That's useful to a programmer.
A newer GCC will likely have better optimizations, better safety checks, or what have you that would benefit old codebases using older versions of C's syntax. Similar deal with a newer Clang/LLVM, for that matter.
That is, if we had a GCC version manager that worked the same way things like choosenim or rustup or pyenv or rvm or what have you work, I can guarantee you GCC will ship with an -std flag, because you can bet someone needs to compile a C89 codebase and wants the latest and greatest compiler smarts to do it. Same deal for Clang, or Visual Studio, or `zig cc`, or whatever.
Great example of this (that I just had to deal with this week, lol) is Wine, which mandates C89-compliant code, and yet absolutely benefits from any compiler optimization improvements introduced with newer versions of GCC.
I am currently working on a tiny game in Nim using Raylib. I had some initial friction with Nim, but I like using the language now.
You'll do well to stop trying to use Nim as an OOP language and just use it as a better C.
The language syntax is clean, and the std lib has all that I need.
One bit which I still hate is simply how imports work in Nim. My project pretty much has a header-like file containing function and variable signatures which can be called from other files.
For someone used to modern languages, it just feels so archaic.
But overall it is still a fantastic language, and I am excited to try the ARC GC.
> One bit which I still hate is simply how imports work in Nim. My project pretty much has a header-like file containing function and variable signatures which can be called from other files.
You mean how `import` works or are you using `include`? If the latter then I would strongly discourage you from using it.
As for the former, you can use `from module import nil` everywhere if you really want.
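To illustrate the difference (with a hypothetical `utils` module of my own invention):

```nim
# utils.nim (hypothetical)
#   proc greet*(name: string): string = "Hello, " & name

# main.nim
from utils import nil

echo utils.greet("Nim")   # every use must be qualified with the module name

# compared to the usual `import utils`, which brings the exported
# symbols into scope so you could just write greet("Nim")
```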
As someone who's heard of Nim, can someone who's used it tell me a little bit more about its benefits and how it compares to other common languages like JavaScript, Python, etc?
The major features are: Python-like syntax, a Lisp-like macro system, and the potential for C-like performance (YMMV, of course).
This is my personal opinion here:
Nim is part of the "new generation of systems programming languages" from the last decade or so. Some other examples in this category include Zig, D, Go, Swift, Rust, etc.
The whole idea is to provide "modern high level" language features and ergonomics while still producing efficient low level code.
The familiar syntax is attractive to Python programmers looking for a more performant language (data science is a prime example).
Nim is also attractive to Lispers looking to move to a more traditional Algol- or C-like language while still keeping much of the power and flexibility they are used to from macros.
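A tiny illustration of that Lisp-ish flavor, using a toy macro of my own (not from the stdlib): it rewrites the call at compile time by manipulating the AST.

```nim
import macros

macro dbg(e: untyped): untyped =
  ## Expands `dbg(1 + 2)` into `echo "1 + 2", " = ", 1 + 2` at compile time.
  let code = e.toStrLit
  result = quote do:
    echo `code`, " = ", `e`

dbg(1 + 2)       # prints: 1 + 2 = 3
dbg(max(3, 7))   # prints: max(3, 7) = 7
```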
I hope the Nim developers port a lot of the Python data science tools (e.g. pandas, NumPy, etc.) so they can increase adoption of the language. Julia is supposed to fill that space, but I feel strongly that Nim is better positioned.
As I said in another comment: the familiar Python-like syntax, while having the potential for much better performance.
Many data scientists are familiar with Python. They can write similar code, and it will run much faster by default (or with very little tweaking).
Then, if that's not fast enough, the language is positioned such that you can optimize the code to a much higher degree than you can with Python.
I'm not saying that Python code can't be optimized, but there is a limit to the speed of Python code even with optimization. At some point you have to drop down to a C extension (which is what NumPy, Pandas, etc. do), and then you are just writing glue code in Python to drive the C. You could also use one of the optimizing Python compilers, but that has limits and edge cases as well. There is "developer friction" there.
In Nim you can just have the whole program in one language. The program can be optimized to the same level as optimized C code with no special tools. This is much less friction.
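To give a sense of it, plain Nim for simple number crunching reads close to Python while compiling to native code (a toy example of mine, nothing library-specific):

```nim
import math, sequtils

proc mean(xs: seq[float]): float =
  xs.sum / xs.len.float

proc stddev(xs: seq[float]): float =
  let m = xs.mean
  sqrt(xs.mapIt((it - m) * (it - m)).sum / xs.len.float)

let data = @[2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
echo data.mean     # 5.0
echo data.stddev   # 2.0
```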
What's missing is the libraries. But that will come with time.
I would say gc:arc is beta quality. Very usable for many programs, but it needs more battle hardening. Please try it and send bug reports if you find any! That's how we improve!
> would you say "always use gc:arc" is a safe bet?
This is the first release to offer `--gc:arc` so there are still some rough edges, but I would follow @treeform's and @rayman22201's advice from sibling comments: give it a try, it might work for you without any problems, and if not - report it.
I would still try out --gc:arc on your programs. For mine, it just runs without issue. But keep in mind that if you do have issues, you might have to create a bug report and wait for a fix. You should not rely on --gc:arc working just yet. But it's fun to play around with. It's real now.