Nim version 2.0.0 release candidate (nim-lang.org)
184 points by treeform on Dec 21, 2022 | hide | past | favorite | 118 comments


Default values and named parameters, thank you! Golang has neglected both of these. Both are seemingly rare yet highly productive features to find in a compiled language.
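For anyone who hasn't seen them, a minimal sketch of both features in Nim (the proc and parameter names here are made up):

```nim
# Parameters can take default values and be passed by name.
proc connect(host: string; port = 8080; useTls = false): string =
  host & ":" & $port & (if useTls: " (tls)" else: "")

echo connect("example.com")                 # defaults fill in port/useTls
echo connect("example.com", useTls = true)  # name the argument to skip port
```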

Love the quick compile times and go-like single binary compilation.

Case-insensitivity seems very cool for interoperability. A totally underappreciated gem.

My only hangup is that the syntax feels wordy, but it's leagues better than Rust or Go in that regard. A bit concerned about the performance cost of using macros to cut down on the wordiness (I would do this).

Otherwise great to see a 2.0 and always keeping a close eye on the Nim ecosystem as I would definitely consider adopting it.


Macros shouldn’t have any performance implications.

The whole point is that they happen at compile time.


He did mention "quick compile times" as a draw, and macros can slow down compile times, so that kind of performance can suffer; but sure, the run-time performance of the compiled code should be unaffected.

EDIT: FWIW, the compile-time impact mostly just depends on what he wants to do. The macro evaluator is not that slow a virtual machine. Simple substitution macros tend to run quite fast. But it's also easy (with loops!) to generate a large pile of Nim code that generates a ginormous pile of C code that the backend might have to chew on for quite a while.
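To make the compile-time point concrete, here is a tiny substitution-style macro; it expands into ordinary code before the C backend ever runs, so only compile times are affected (the macro itself is illustrative):

```nim
import std/macros

macro dump(x: typed): untyped =
  # Runs in the compiler's VM; emits plain `echo` code into the program.
  let name = x.toStrLit
  result = quote do:
    echo `name`, " = ", `x`

let answer = 42
dump(answer)   # expands to: echo "answer", " = ", answer
```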


100% about compile times. Sorry, wasn't clear about that.


I have written quite a lot of macros; unless you're doing something really crazy you should be fine. The one time I really ran into macros slowing down compilation a lot was when I wrote my protobuf library, which parses a .proto file during compile-time and generates all the required types and procedures. And that was mostly because the parser I used was really poorly optimized.


As a casual Nim user, there's plenty I probably don't appreciate enough in there, but the thing I'm most looking forward to is default values for objects.
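For context, the 2.0 feature lets defaults live right in the object definition (the type and values here are made up):

```nim
type
  Config = object
    retries: int = 3        # Nim 2.0: per-field defaults
    greeting: string = "hi"

let c = Config()   # fields start from the defaults, not binary zero
echo c.retries, " ", c.greeting
```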


I am looking forward to them supporting distinct DateTime types (basically user-defined clones of the DateTime type with custom behaviour). Currently I'm using the constructor package to do the same thing, but being able to throw out a dependency is always nice.
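Roughly what that looks like with a plain `distinct` today; the `{.borrow.}` pragma selectively pulls procs through from the base type (a sketch, not the constructor package's approach):

```nim
import std/times

type LaunchTime = distinct DateTime  # a clone of DateTime with its own identity

proc `$`(t: LaunchTime): string {.borrow.}  # reuse DateTime's stringification

let t = LaunchTime(now())
echo t
```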


I suspect that is a headliner "syntax" feature for many more serious Nim users. Initialization, like error handling, is one of those things that is easy to mess up.


This looks great. I’m loving nim for game dev right now. Godot 3.x bindings are working great.

There is also a really nice looking binding for Unreal Engine 5 that is pretty far along.


Are the nim godot bindings also good for full cross-platform development?


Among Nim, Zig, and Rust, I'm most likely to learn Nim. It has been around for a while, it is solid, and it has so much good stuff in it; it just needs more 'marketing'. In particular, Python should really help its popularity, as their syntaxes are similar and both are very expressive.


Is syntax really that big of an obstacle to learning a first programming language? The semantics of a C-like language like Nim could hardly be more different from Python's; they're pretty much at opposite ends of the stack.


It can be. I would've loved to pick up Rust, the patterns for code structuring etc. that are evolving from it seem interesting to me. But getting the syntax right is just fairly difficult compared to other languages and at some point you just lose your patience. At the same time there's not enough learning material that works for me to dive into the deeper contexts to make it "click" for me.

Contrasting that with my experience with Nim, I could get going almost immediately and thus didn't get held up much with the basics, which gave me more time to dive into the more interesting concepts it provides, like templates, compile-time evaluation (new for me as a Pythonista back then), macros, etc. I still got stuck here and there, mind you, but that was for more complex questions than "How do I get a config from place X without the compiler yelling at me", and the Discord was very helpful for that.


Nim feels much more like python than c to me, and the minimal number of non-alphanumeric characters in source is a big part of that.


> Is syntax really that big of an obstacle to learning a first programming language?

Of course. Compare, at the extremes, languages like APL or Brainfuck to something like Python or Scratch. Syntax is a huge factor in people wanting to learn to code or not.


Those languages have actually very different semantics too. I would say lisp is a better example of the importance of syntax.

People obsess (wrongly) about superficial details in my opinion.


I know of a firsthand example where only syntax mattered: OCaml vs ReasonML. ReasonML is merely a syntactic change from OCaml, as it's simply OCaml underneath, but people, especially those coming from web languages like JavaScript, liked ReasonML more than OCaml and were more willing to use it.

Another example, Gleam (an Erlang BEAM based language like Elixir) [0]:

> Shot in the wind, but is there a plan for different syntax? For those who've become comfortable in different fp camps, I can't really see myself picking up curly brackets and C-style code again. I really like the idea(s) of this project, but it's the thought of staring at that kind of code that's just a bit too much. I'd rather do something like [Nim,Scala,Clojure,F#]->JS if I needed to have that.

> > Unlikely I'm afraid. We used to have an ML style syntax but once we switched to a more mainstream syntax we had a big surge in popularity and interest.

> > I am very fond of the ML syntax, but I think it is the semantics and type system that really matter, so I am very happy to sacrifice syntax in order to make Gleam more widely accessible.

[0] https://news.ycombinator.com/item?id=27063049


Syntax is pretty much the entire reason that people stay away from languages like Erlang and Lisp


Or you know, types. Or too much expressivity.


Nim community is pretty much against python developers. The founder is vocal about that.


This is not really true; most of the time the discussion goes like this:

Q: "Why haven't you implemented feature X -- python has it, so nim should too"

A: "Nim is not python, it is a different language, not all things implemented there are good, so we don't implement everything"

So this is not "against Python"; it is against a specific attitude towards Nim where people think it should be a Python clone instead of a separate language.


Given that I used to be a Python dev myself before I picked up Nim, and since then have helped out a fair number of Python folks picking up Nim, I'm not sure what that's based on.

The fact that nim doesn't have exactly the same stdlib defined as python (though there's a package for that)? That Araq has said that nim isn't a variation of python?


No one is against Python developers, you're clearly exaggerating, I would ask for a source but we both know it doesn't exist. What people try to convey the whole time to people coming from python, which I was years ago, is that nim is not compiled python. It's a totally different language. I am sorry that message sometimes gets lost in translation but I don't know what would be clearer than this recent thread.

https://forum.nim-lang.org/t/9737


There's my problem with nim. I was really excited about it two years ago and tried to get into it. Unfortunately I found the community not very welcoming. All the issues I had, had been raised before and dismissed.

I've since switched to Rust. The syntax isn't quite as nice but it's outweighed by the fact that the community is great and the memory safety is a game changer!


My problem with Rust is that the community is overzealous in promoting their language, even at the cost of spreading lies, misinformation, and harassment. I was routinely attacked by anonymous accounts. The common thing they say is that people using languages other than Rust are dumb. A few weeks ago some people joined the Nim Discord and, besides promoting Rust, they started name-calling. I stay miles away from Rust; it has by far the worst community. It's shameful.


There is an entire wiki page to welcome developers coming from python, and show where the two languages differ: https://github.com/nim-lang/Nim/wiki/Nim-for-Python-Programm...


Nim’s arc and sink/lent annotations seem a lot simpler than Rust’s owned pointers and region annotations. Has anybody had the opportunity to compare the costs and benefits?


> Has anybody had the opportunity to compare the costs and benefits?

Heap allocation has a very significant cost over stack allocation. Nim is designed for different use cases. Rust statically checks memory usage, and provides Arc for use cases that can only be modeled dynamically.


ARC in Nim does not mean the same thing as in Rust. I'm still not fully up to speed on ARC/ORC nor went very far on Rust, so I might be wrong here, but from what I understand:

In Nim using ARC/ORC the compiler statically checks the memory usage, possibly helped by lent and sink annotations (similar to borrow and lifetime annotations in Rust, but not as powerful yet) and for what can't be proved statically it adds reference counting. And even under ARC you will probably be using stack allocations a lot in Nim.

I think the main difference between Rust and Nim approaches to memory management is opt-in vs opt-out. In Rust you are forced from the start to think about it and be more explicit about it, while in Nim the lifetime annotations are more like optional optimizations that may be added gradually to help the compiler, that might otherwise be adding reference counting and copies behind your back to maintain correctness.

For some use cases, being forced from the start to think at a lower level of abstraction is helpful and makes the performance more predictable and transparent. People still use C partially because of that. But I like Nim's approach, where I can use it as productively as a scripting language, avoiding premature optimization, but can easily delve as low-level as I want to write a keyboard firmware or a GBA game.
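A small sketch of what those opt-in annotations look like in Nim (names made up): `sink` lets the callee take ownership and avoid a copy, while `lent` returns a borrowed view:

```nim
proc store(dest: var seq[string]; item: sink string) =
  dest.add item        # `item` is moved in rather than copied

proc first(s: seq[string]): lent string =
  s[0]                 # borrowed view; no copy of the string

var names: seq[string]
names.store "alice"
echo first(names)
```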


The name is confusing, but Nim's Arc is not the same as Rust's: https://nim-lang.org/blog/2020/10/15/introduction-to-arc-orc.... Rust (and apparently Swift) uses Arc to refer to atomic reference counting, whereas Nim's means automatic. There's still overhead, but it would be more comparable to Rust's Rc, I believe.


Swift uses ARC to mean automatic, and its use comes from Objective-C's ARC.


Huh, the Nim blog post appears to be wrong, then, or I misunderstood it.


Nim has good stack value support too. Heap vs stack is treated as more of an optimization detail. Rust's borrow checker is most useful for heap memory, not stack.

For stack based values Nim's 'var' and 'openArray[T]' are roughly equivalent to simple implicit lifetimes. I don't know the proper parlance vs more complicated lifetime situations like "capturing" a lifetime.

It's not too different from C++ references either. It sounds like D will also check for similar "lifetime" violations soon as well.

The real difference is heap memory. There, Rust's lifetimes allow you to give ownership away. Nim's var parameters can't do that, but instead it gives you fast and cheap non-atomic ref counting. Rust's way does encourage slightly faster code on average, at the expense of more effort from programmers.

Whether Rust or Nim, allocations are expensive. Though in my experience Nim's allocator is amazing; it can sometimes be faster to use refs than stack values.


Rc and Arc require a general heap in Rust (pending support for local allocators, or Storages or whatever they end up with).


I wonder if anything from this forum thread [1], which is summarized here [2], got implemented.

[1] https://forum.nim-lang.org/t/9132

[2] https://gist.github.com/j-james/08cd36b7475d461d6291416381ce...


Looking at the summary [2], many of the things got implemented; just a quick list:

* Remove backwards-compatible switches / deprecated features

* Remove every memory option except orc, arc, none, go

* Shrink stdlib

* Comprehensive stdlib cleanup

* Support default values in object construction

* Language features designed for parallelism

* Lock in --threads:on and --mm:arc/--mm:orc

* Official support for overloadable enum names

* Robust concurrency/multithreading

* Fix bugs!


> * Remove every memory option except orc, arc, none, go

I feel like the memory option removals remain more "planned/maybe revisable/maybe not really agreed upon as the right direction", with only a change of AMM defaults happening in 2.0. (I'm also not sure what the value of removal really is, except less to learn about and more inflexibility.) The rest of the list seems represented, though.


In my understanding, Nim at the moment is really a transpiled language, instead of compiled. Transpiled to C, then tooling uses clang or gcc to do compilation from C to target platforms.

It is like TypeScript to JS in C/C++ world.

Very clever to stand this way on the shoulders of giants, but the number of moving parts is staggering and horrifying.


The distinction between compilation and transpiling is a bit imprecise, but yes. Oddly, though, the basic C that Nim compiles to is fairly resilient and stable. I'd actually expect the Nim compiler to bitrot far less than languages based on LLVM, which have a very short half-life.

So Nim's approach has some challenges, but surprisingly has fewer moving parts than you'd think. I'd be confident I could get the current Nim compiler up and running in 5 or 10 years with minimal effort.

However, there was a lot of effort to get things like pointers and strings in the stdlib to play nicely across the NimVM, JS, and C backends. There are some gross details there. But in the end it's beautiful.


Besides @elcritch's and @pjmlp's excellent points {language "level" is vague & all compilers translate}, C as a backend target has other virtues (though is by no means The Only Way). E.g., TinyC/tcc optimizes for time to compile (it is a single pass over its input) not generated object code qualities like run-time/file-size. This can give Nim code almost interpreter-like edit-compile-debug devel cycles, but even just changing the default gcc -Og to gcc -O0 can be a huge improvement. [1] (`passL:"-lm" --mm:markAndSweep` and recently `--threads:off` became necessary for tcc due to atomics/C11 dependencies, but it is already a kind of "debugging compilation mode".) Similarly, work on space-saving/alternate libc's like musl [2] can be leveraged, [3] although using `config.nims`/NimScript can also lengthen compile-times by 100 milliseconds or so.

[1] https://forum.nim-lang.org/t/8677 [2] https://musl.libc.org/ [3] https://github.com/kaushalmodi/hello_musl/blob/master/config...
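The tcc-related flags mentioned above can be collected in a project-local `config.nims` so they apply only when you opt in; a sketch (the `-d:tcc` define name is invented for the example):

```nim
# config.nims -- NimScript, evaluated before compilation
when defined(tcc):
  switch("cc", "tcc")            # fast, single-pass C backend
  switch("mm", "markAndSweep")   # sidestep atomics tcc lacks
  switch("threads", "off")
  switch("passL", "-lm")         # link libm explicitly
```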


It is a compiled language; how the backend is implemented is an implementation detail. "Transpiled" was a term the JS community came up with, which you won't find applied to languages like Eiffel, which has always compiled to native code via C for the last 30 years.

Or C++ and Objective-C in their early days, or the P2C Pascal compiler from 40 years ago.


Well, then I have to agree

    the program is nothing more than a sequence of plain characters, stored in a text file. This abstraction is a complete mystery for the computer, which understands only instructions written in machine language. Thus, if we want to execute this program, the first thing we must do is parse the string of characters of which the high-level code is made, uncover its semantics—figure out what the program seeks to do—and then generate low-level code that reexpresses this semantics using the machine language of the target computer. The result of this elaborate translation process, known as compilation, will be an executable sequence of machine language instructions.
    Of course, machine language is also an abstraction—an agreed upon set of binary codes. To make this abstraction concrete, it must be realized by some hardware architecture. And this architecture, in turn, is implemented by a certain set of chips—registers, memory units, adders, and so on. Now, every one of these hardware devices is constructed from lower-level, elementary logic gates. And these gates, in turn, can be built from primitive gates like Nand and Nor. These primitive gates are very low in the hierarchy, but they, too, are made of several switching devices, typically implemented by transistors. And each transistor is made of—Well, we won’t go further than that, because that’s where computer science ends and physics starts.
Excerpt from https://www.nand2tetris.org/book


C has been seen by some for a long time as a "high-level assembler" :D, especially if you choose a safe subset. Given how the LLVM pipeline and other compilation works, and how old and widespread C is, I think it is a clever approach to portability and reuse, and the fear of too many parts is exaggerated; it's not much different from other transpiling-and-compiling pipelines (that distinction is also a very blurry one and depends on viewpoint; I don't think there is one precise definition for it). Nim is also not the first to do so.


What isn’t these days?

Even the C code (on many platforms) is going to be translated to LLVM IR.


> In my understanding, Nim at the moment is really a transpiled language, instead of compiled. Transpiled to C, then tooling uses clang or gcc to do compilation from C to target platforms.

If I understood correctly, like the Vala language: https://vala.dev/ (Note: Vala is strongly integrated with GObject).


Greatly looking forward to experimenting with zero-overhead interop, as that is a prerequisite for using it as a configuration language for certain types of high-performance servers.


Any die-hard lovers of Nim? Or heavy users? Why do you use it over other languages? What was your ah-ha moment?


For me, the biggest advantage of Nim compared to other high-performance languages (C, C++, Rust, …) is that it doesn't overcomplicate simple things. I wrote an article about it: https://xigoi.neocities.org/nim-doesnt-get-in-the-way.html

The syntax is very pleasant to write and read, even if I have some small issues with it. No unnecessary noise like braces and semicolons, no misuse of the less-than and greater-than signs as brackets.

Also, it uses terminology more correctly than other languages. Procedures are procedures, not “functions”. Resizable arrays are called sequences, not “vectors”. Immutable variables are immutable variables, not “constants”.

The standard library is quite good (though it could use improvements) and extensive enough that you don't get hundreds of dependencies per project like in JavaScript or Rust.


As a Python user, I had my a-ha moment with Nim when I realized I didn't need a repl.

With Python, I used repl all the time. There's bpython, ptpython, ipython, and probably a couple more great repls because repl is really important for Python development.

Nim's INim is no match for those. But here's the great part: with Nim you don't have to test code snippets all the time. I get all my error messages before they happen. This felt liberating after Python.


Agree to disagree, I use inim all the time to test out those code snippets folks post in nim's discord to see what's broken in them :-P


I'd just save a snippet to a file and wait a second for nimsuggest to highlight the errors :-)


I only know a handful of languages, so make of that what you will. I came from a Python background and hadn't learned a compiled language before, so I wanted to experiment. I tried Rust for quite a while, but found the syntax not clicking with me, though a lot of ideas seemed intriguing (pattern matching and Result types). I didn't want to try C/C++ because I wasn't that keen on figuring out how memory works, and at the same time was pretty shocked at how C treats types.

That only leaves Go, which I heard about around the same time as Nim. I started out with Nim and I liked it enough that I stuck with it.

Why do I use it over other languages? I feel like I can express myself in English-readable sentences (which gives it the Python vibe for me), and at the same time I have static typing, a really nice type system in general, and very little chance of crap like null pointers occurring.

My first ah-ha moment was pretty much 3 hours in, when I noticed I could already write simple stuff and only needed to consult the stdlib docs here and there. The second was when I reimplemented a webserver backend that I previously had in Django (not for practical reasons, just to see how fast it could go with even very little optimization) and found it performing roughly 2-5 times faster (measured by looking at request response time), despite having optimized Django's ORM down to as few queries as physically possible to get the data I needed. (For reference: that's quite surprising given that a decent chunk of that time is literally just network latency.)


I tried Nim on a lark 3 years ago. I was writing a learn-to-code tool in Godot and was getting frustrated with some of the limitations of GDScript, so I decided to build it again in Nim. It was very much a "I don't want to do what I'm supposed to be doing right now, so instead I'll play with something shiny" moment, and I fully expected to hit a wall within a few hours and get back to doing things the normal way. But the wall didn't come, and after a few days with Nim I tossed the GDScript version.

In no particular order, things I like about Nim:

- It has most of the benefits of a scripting language, without most of the tradeoffs. Hello World is a one-liner, there isn't much syntactic noise, and it's very easy to write short, simple, useful programs that look like pseudocode, live in a single file without any dependencies, and can be kicked off with a shebang line. However, unlike scripting languages, it scales very well to large projects.

- Progressive disclosure. You can use Nim effectively while knowing very few of its features, but there's a lot of functionality available when you're ready.

- The "if it compiles, it works" factor is quite high. Not as high as Rust or Haskell, but higher than Java or Go, in my experience.

- The "it compiles on the first try" factor is also quite high. Higher than any language I've tried. "I wonder if this will work..." code usually does, the advanced features of the language mostly stay out of the way until you need them, and I rarely find myself working just to make the compiler happy. Just as an example, unless you add specific constraints, generics are "duck typed". If I pass a type to a generic proc the compiler will verify that it has the properties and functions the proc needs, but I don't have to define a specific interface up front.
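A quick illustration of that duck-typed generics point (the types and names are invented for the example):

```nim
# No interface declaration needed: the compiler checks at instantiation
# that T actually has the fields/procs the body uses.
proc describe[T](x: T): string =
  x.name & " (" & $x.size & ")"

type Widget = object
  name: string
  size: int

echo describe(Widget(name: "knob", size: 3))
# Passing a type lacking `name` or `size` fails at compile time.
```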

- Similar to the above, the productivity vs safety balance seems right, at least for me. Code is fairly safe by default, but it's easy to work around the compiler if you need to do something it doesn't want you to do. It's also pretty easy to enforce additional safety when you need it, like ensuring a function can't throw exceptions or have side-effects.

- It's very good at building abstractions and eliminating boilerplate. Nim templates and generics are easy to use and quite powerful, and macros are there if you need something more advanced. Many features that need explicit compiler support in other languages, like async/await and string interpolation, are implemented in the Nim stdlib with macros, not the compiler.

- Nim produces standalone binaries that are both small and fast.

- The compiler is faster than most.

- The compiler can be imported as a library, making it pretty easy to write tools that understand Nim code.

- Nim programs are usually compiled, but there's also an interpreter. The compiler uses this for macro evaluation and for running build scripts, and it's easy to embed if you want your program to be scriptable.

- Nim can run on pretty much anything.

- Nim can be used to build almost anything. You can use it for systems programming, webdev (frontend and backend), games, ML, scripting, scientific computing, and basically anything else. There are definitely some domains where library support is lacking currently, but the language itself is suitable for any type of program.

- It's very flexible. Most Nim code is imperative, but it's easy to write functional, declarative, or OO style code if that's your thing.

If you prefer languages like Go that favor an abstraction-free style, where everyone's code looks more or less the same, you probably won't like Nim. However, if you want something more expressive like Ruby or Lisp, but don't want to sacrifice performance or safety, Nim is definitely worth a look.


I'm a full time Nim developer and have been for over half a decade. It's completely spoiled me for other languages, it is just ridiculously productive. Rather than an ah-ha moment, it was more a gradual transition from "where's the catch?" to "there's no catch!".

The simple answer is this: anything I want to do, I can do in Nim easier than other languages, while also having direct access to C/C++/JS ecosystems as well.

Productivity:

1. I write pseudocode, it compiles fast and runs at C speeds. Programming is fun again!

2. No `::!>><>^<<@{};` cruft everywhere. Write as you want to read, even spanning multiple lines is clean and without separators.

3. Procedural: only data types and code. No need for OOP/Trait abstractions to be forced into everything (it's there if you must).

4. UFCS and overloading by parameter types make everything naturally extendable: `len("string")` can also be written `"string".len` or `len "string"` - you don't have to remember which way round it goes, and 99% of the organisational benefit of OOP emerges from this syntax rule. A compile-time error if you use the same name and parameter types, so no ambiguity.
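For readers new to Nim, the same call written the interchangeable ways described above:

```nim
let s = "string"
echo len(s)   # ordinary call
echo s.len    # UFCS: the first argument moves in front of the dot
echo s.len()  # identical; pick whichever reads best
```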

5. Sensible defaults everywhere: type inference, stack allocated and pre-zero'd variables by default, extend with GC/manual management by type (`type HeapInt = ref int`), GC is deterministic with scope based destructors and move semantics as an optimisation rather than a straitjacket, detailed opt-in control down to assembly as you wish.

6. Arguably the best compile time support of any language.

7. AST procedural metaprogramming is a core language feature. It's hard to express how powerful and well integrated this is. You can just chuck code around and recombine it effortlessly. Whether it's simple DRY replacements, automating serialisation from types, custom DSLs at your convenience, or even generating entire frameworks at compile time from data, you effectively have another dimension to programming. I can't go back to flatland, now.

8. Flexible static typing that's as strict (`type specialId = distinct int`) or generic as you want, with concepts matching any statement against a type. You can also calculate or compose types from static values which is really nice.

9. Low overhead and high control makes it great for embedded: https://github.com/EmbeddedNim

10. Fantastic FFI that can even use C++ templates, along with loads of converters/wrappers like c2nim, futhark, pas2nim that add even more sugar to FFI interop.

Portability and glue:

- Single portable executable output.

- Compiles to C89/C99, which covers basically every piece of hardware.

- Compiles to C++ so you have C++ ABI compatibility.

- Compiles to JavaScript.

- Compiles to ObjC.

- Compiles to LLVM.

- Excellent Python interop (see: Nimpy).

- Libraries for interfacing with C# and .Net.


Thank you for your thoughtful reply. Is it fair to say that Nim and Dart share similar mission statements? Any idea of a comparison between the two?


I was hoping case sensitivity would be implemented but it seems like it was too controversial. I'm thinking at this point there might be too much resistance.


Case sensitivity is worse in every way IMO. Look at Nim's implementation of UFCS, which means `foo(bar) == bar.foo`. This means only one possible `foo` that takes the type of `bar` can be invoked. There's no ambiguity about whether `foo` is a method, a proc, or a field; if there were, you'd get a compile-time error.

Similarly, case insensitivity, of course, means `foo == fOO == fOo == foO`. Reducing these to one identifier means less ambiguity for the programmer and encourages using sensible names that can't easily be confused, and/or unambiguous mechanisms like the type system, which is designed for this purpose.

Say you have a constant for 'light enabled'. Instead of naming it `lEn`, it's better to use `type LightState = enum disabled, enabled` and write `light.set enabled`, for so many reasons. The same goes for pretty much everything. It's worth noting as well that the language is very good at resolving overloads by type, so if you have your own object and create a `len` for it, it's not going to get confused with the string `len` or even a `lEn` constant. Even if you have a `lEn` and a `len` that are used in the same context with the same types (why, though?), you can qualify them anyway with `module1.lEn` or `module2.len`.

The language has so many easy tools to make things more explicit and easier to read at the same time. Ultimately, case sensitivity only really ends up encouraging bad naming practice without adding anything back.
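Sketching the `LightState` suggestion above as a complete snippet (the `Light` object and `set` proc are invented for illustration):

```nim
type
  LightState = enum disabled, enabled
  Light = object
    state: LightState

proc set(light: var Light; s: LightState) =
  light.state = s

var porch = Light()   # starts as `disabled`, the first enum value
porch.set enabled     # reads cleanly; no cryptic `lEn` flag
echo porch.state
```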


The `--styleCheck:error` flag can be passed to make everything follow the NEP1 style guide [0]. `--styleCheck:usages` can also be passed to instead make identifiers follow their original naming (basically turning off style insensitivity).

- [0]: https://nim-lang.org/docs/nep1.html



I think I agree with Carmack on this. I'm hyper aware of case now, but this was probably the major cause of errors for me when learning (didn't help that tooling was less good then, so there were no linters pointing out my errors).


I guess for me it's one of those things I got tripped up on initially but came to appreciate later. It taught me how to be more precise about characters and reasoning about them in code.

There's also cases where case sensitivity should matter in naming in my experience, and it's not possible without it.


Personally, this preciseness is just a distraction, but it sounds like you have learned from the constraints and can now take the training wheels off again.

Very curious about those cases, yet to find one myself that was not because of a bad design decision. Though I am also very big on the idea of `ambiguous grammar, rigorous implementation`, for natural language too. (Toki Pona is a somewhat extreme example but the contextual grammar, and holistic minimalism is fascinating)

Not to make this longer than it needs to be, but take the example code in this readme: https://github.com/guzba/mummy

Being able to quickly script this without thinking about cases or style on the first pass is a really lovely way of reducing friction imho.


> There's also cases where case sensitivity should matter in naming in my experience, and it's not possible without it.

I'm really curious, do you have any examples? I can't think of any that wouldn't be better served by using the type system.


I've come to increasingly accept it over time? I mean, I'd really like it if my_function() and myFunction() just weren't allowed in the same section of code but if you allow the user to have both it's probably better that they refer to the same function than different functions.


It is a strange feature but it does have a good underlying idea.

Why would having two functions, named makeFile and make_file in the same program ever be a good idea?

Think of it as less of a language syntax feature and more of a code style enforcement paradigm.

Good lsp support makes it mostly fine imo.


IMO it goes beyond that. You have folks from Java etc. coming over with their camelCase, as well as Python folks with snake_case. Now, despite both of them writing in their own styles, their code can interact, because when the Java programmer gives you a `validateObject` procedure from their package, the Python developer can just use it as `validate_object`.
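Concretely, under Nim's partial style insensitivity (everything after the first letter ignores case and underscores), both spellings resolve to the same symbol:

```nim
proc validateObject(x: string): bool =
  x.len > 0   # library author's camelCase definition

echo validate_object("hi")   # caller's snake_case: same identifier
```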

Not being forced into other people's style choices is a really nice boon to me.
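That rule can be illustrated with a small sketch (hypothetical code, not Nim's actual implementation): Nim compares identifiers by keeping the first character as-is and ignoring case and underscores in the rest.

```python
def nim_normalize(ident: str) -> str:
    # Nim's partial style-insensitivity: the first character is significant,
    # the rest is compared with case folded and underscores stripped.
    return ident[0] + ident[1:].replace("_", "").lower()

def same_ident(a: str, b: str) -> bool:
    return nim_normalize(a) == nim_normalize(b)

print(same_ident("validateObject", "validate_object"))  # True: same identifier
print(same_ident("validateObject", "Validate_Object"))  # False: first char differs
```

So the Java-style `validateObject` and the Python-style `validate_object` name the same procedure, while `validateObject` and `ValidateObject` remain distinct.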


It's not so much that anybody thinks that mix is a good idea, but it's just what happens immediately in any C/C++ codebase if you're using any nontrivial combination of libraries at all. The feature is about providing consistency, not ignoring it.


my_function vs myFunction is fine, but do_me vs dome start to differ, and do_me vs d_ome vs dom_e make things very much different.

That's the real problem


Rarely a problem. Why would you have two procedures named do_me and dome with the exact same signature?


Because the other procedure does something with some dome?


Then it should be named doSomethingWithDome.


It's apparent from the RFC discussion that there is no consensus on the topic. With little (proven) tangible benefit of making the change, and a large potential for backlash within the small existing community, I'm not certain it's worth the risk for Nim to make this move.

https://github.com/nim-lang/RFCs/issues/456


Anyone who wants to use grep will just keep ignoring Nim. Case insensitivity is pretty silly, and underscore insensitivity is just really silly.

The way that modern languages force a single style is great, and it's a big strike against nim, which is an otherwise nice language.


Agree to disagree. I don't want to force my users to use snake-case just because I use camel-case and vice versa.

There are options to turn it off, and options to turn style issues into warnings so you can make them consistent. It also prevents similar names that shouldn't exist in the first place by causing compiler errors (if you have is_ok you should not have isOk in the same scope; what are you doing?). It is just not a big deal, at all. And as stated, it is not hard to ensure consistency within your codebase by heeding compiler warnings/errors.


As someone who has been writing Nim for many years now, using grep, rg, and various other search tools, I can say that I've never had any issues with finding things I wanted. This is just one of those imagined issues that you realize doesn't exist when you actually try to do it.


Thank you, I have tried it, and it was frustrating.


I had this gripe about it as well, but then I learned that you can force style in your own projects with a switch.


As someone who has never explored Nim but was just about to follow the link, you have effectively convinced me not to bother. I can imagine the arguments for case insensitivity (not agreeing with them, but I could live with them for certain tradeoffs); I can't find a place in my brain where "underscore insensitivity" could have existed as a concept before reading those words.


You should still try it. It's a pretty decent language that builds fast and runs fast. I think of it as a slightly better Go.


Just treat Nim as style-sensitive and you'll probably never run into a problem with it. Seriously.


It's not an actual issue; it's the projected experience of another language and/or IDE tooling. FYI, the newest GitHub search ignores case. I have been using grep on Nim code daily and it has not been an issue.


Oh yes, just like you can never find anything in Google because it's case insensitive.


Ironically, they reject tabs and require space indentation. Something something the complexity of supporting both. Yet we have style insensitivity...


That actually makes whitespace syntax in Nim much better. Can't recall the number of times I copied Python code snippets and had to track down lurking tabs.


You can't mix tabs and spaces in the same file since Python 3, which completely solves this problem without imposing one or the other choice on the users.
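Concretely, CPython's tokenizer raises TabError when comparing two indentation prefixes would depend on the tab size; a quick sketch:

```python
# One line indented with a tab, the next with spaces: Python 3 rejects
# the mix at compile time instead of silently picking a tab width.
src = "if True:\n\tx = 1\n        y = 2\n"
try:
    compile(src, "<example>", "exec")
    print("compiled")
except TabError as e:
    print("TabError:", e)
```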


To be fair, you can tell your IDE to just turn tabs into spaces and call it a day, that's an issue for all of 5 minutes.

What you can't do is tell your IDE to interpret camel-case as snake-case if you want (or vice versa), just because this package you really need uses camel-case and you want to use snake-case.


> It's apparent from the RFC discussion that there is no consensus on the topic.

That's your take? I can see a couple of actual nim users complaining, the rest are some randos that tried to derail any productive discussion with "I have an old C library that uses three different styles for init, how would I bind it to Nim?" and although a bunch of workarounds exist, they were too loud and stubborn. For me it showed that most nimmers prefer style insensitivity as an option, although they don't use it in the same project.


I didn't know this existed. There's a lot I like about Nim, but total rejection of tab indentation, and now this style insensitivity thing, mean that for me personally, Nim will just stay an interesting language I see on HN from time to time.


The partial case-insensitivity thing is not new, it's been part of Nim for a long time, maybe since its very beginning but I'm not sure about that.

https://nim-lang.org/docs/manual.html#lexical-analysis-ident...


I am not a fan of case insensitivity either. But it must be easy to write a case-insensitive grep; I'd wager a Nim grep already exists.

I do agree that this is a red flag decision.
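It is indeed easy to approximate. A sketch (illustrative only, not how nimgrep actually works) that expands a Nim identifier into a regex matching any style-equivalent spelling:

```python
import re

def nim_ident_pattern(ident: str):
    # First character matches exactly; each later character matches either
    # case and may be preceded by an underscore.
    first = re.escape(ident[0])
    rest = "".join(f"_?[{c.lower()}{c.upper()}]" for c in ident[1:].replace("_", ""))
    return re.compile(first + rest + r"\b")

pat = nim_ident_pattern("validateObject")
print(bool(pat.search("x = validate_object(y)")))  # True
print(bool(pat.search("ValidateObject(y)")))       # False: first char differs
```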


https://nim-lang.org/docs/nimgrep.html

If you use choosenim to install Nim several executables get built/downloaded along with the compiler:

   $ pwd && ls
   /Users/me/.nimble/bin
   choosenim  nim  nim-gdb  nimble  nimgrep  nimpretty  nimsuggest  testament
https://github.com/dom96/choosenim#readme


It's one of those things that, as a Nim hobbyist programmer, I think is a neat feature to have, but I could see being a big hesitation for use in professional environments.


Given that you tend to use linters in those environments as well, and that you can turn it off anyway, I'd disagree on that one. There are other factors that would make me hesitate much more in a professional environment, such as community size and talent pool.


Why?


Well, crap. I just bought his book about version 1 lol.


> Don’t panic! One of our design goals was to make it easy to write code that works with Nim version 1 and 2.


The language didn't change that much.


Please share with me if you don't mind.


Share the book? Unfortunately I had to buy a physical copy because Andreas refuses to make a PDF version (to avoid piracy, I think?).

Honestly though, I'm a bit let down in that it doesn't seem to cover what I find the most interesting aspect of Nim: its ability to compile to multiple intermediate programming languages like C, C++ or JavaScript and use their libraries. I was hoping to find out how to write VSCode extensions entirely in Nim (I know it's possible because the Nim VSCode extension itself is now 100% Nim, but there seems to be no tutorial for how to do it online).

I had started reading "Nim in Action" and I might finish that first since it does cover FFI, but it is rather old (it was released before version 1.0).

You can find Nim in Action on the Manning website: https://www.manning.com/books/nim-in-action


I am pleased to see a language that focuses on good multithreaded support.

I tend to use Java and C for multithreaded problems, but having Nim available, with its Python-like look, is really promising.


If you want good multithreaded support and don't care much about ensuring perfect thread-safety at all times, you can't beat Go for real-world use.


Yes I like Go but I am yet to do any professional development in it.

Go uses an M:N scheduler, with M kernel threads and N lightweight threads.

I wrote a 1:M:N lightweight scheduler which preempts hot loops. I don't know when Golang preempts goroutines outside of a channel send; I believe it's at stack growth, or in other words when a method is called.

My userspace scheduler preempts while-true and for loops by setting the looping variable to the limit.

I wrote it in Java, C and Rust.

https://GitHub.com/samsquire/preemptible-thread

Does anybody know if Nim loop variables can be easily mutated from another thread?

It looks like they cannot
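The "set the loop variable to its limit" trick described above can be sketched in a few lines (illustrative Python; a shared dict stands in for the scheduler's access to the loop variable, and all names are made up):

```python
import threading
import time

LIMIT = 100_000_000
state = {"i": 0}  # shared loop variable the scheduler can reach

def hot_loop():
    # A hot loop with no explicit yield points.
    while state["i"] < LIMIT:
        state["i"] += 1

def preempter():
    # After a time slice, force the loop condition false. Re-writing the
    # value covers the race where the worker's increment overwrites ours.
    time.sleep(0.05)
    while worker.is_alive():
        state["i"] = LIMIT

worker = threading.Thread(target=hot_loop)
watchdog = threading.Thread(target=preempter)
worker.start(); watchdog.start()
worker.join(); watchdog.join()
print("preempted:", state["i"] >= LIMIT)  # True
```

Without the watchdog, the loop would churn through all 100M iterations; with it, the loop condition fails shortly after the time slice expires.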


No, not directly, but Nim supports async. Effectively it turns the function into an iterator whose state can be passed around; those are put into Futures.

It's possible to write your own async engine, or you could use the iterator yourself if you really wanted. It's actually not too hard.

Check out: https://github.com/status-im/nim-taskpools

Edit: also, Nim compiles to C, so it's possible to do whatever you can do in C with a bit of fiddling. There's also coro: https://nim-lang.org/docs/coro.html
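A rough analogy in Python (Nim's actual async transform differs in detail, but the shape is similar): the function body becomes a resumable state machine, like a generator, and the suspended state is an ordinary value a scheduler can store and resume:

```python
def task():
    # Each yield is a suspension point, playing the role of an await.
    print("step 1")
    yield
    print("step 2")

t = task()     # creates the paused state object; nothing has run yet
next(t)        # runs to the first suspension point, printing "step 1"
try:
    next(t)    # resumes where it left off, printing "step 2"
except StopIteration:
    pass       # the task completed
```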


Elixir fits in here as well, if the workload is thread-bound and not computation bound.

I think Elixir's syntax and idioms are better than Go's, but that's personal preference.


I haven't heard of Nim until now. I've been a Go programmer for years now, and I see a lot of potential. I wonder why Nim hasn't taken off. What's the catch?


A contributing factor may be lack of a major sponsor bootstrapping the user base. Go had Google, Rust had Mozilla, Nim was/is largely indie.


Like the most popular language per TIOBE, Python.


To be fair, Python back then didn't have great competitors in its space. I guess Perl, but from what I still remember of Perl before I dropped it like a hot potato a couple of days in, it wasn't particularly great.

Nim does.


Python vs Ruby seemed like a real fight for a while, especially when there was Rails and Python did not really have anything equivalent.


Python languished as an obscure niche language from its inception in the late 80s until at least 2000. I recall discussing it with friends all during the 90s to blank stares and shaking heads. Rapid adoption (of, well, anything) is a network effects game. In the very early 2000s Python really took off (basically Python 2.x).

It surely helped that well known companies like Google (pre-IPO, "ad free & proud of it" back then) started openly saying they used it, but like many network effect things, it's probably hard to pin down any one "cause" (but easy to fool yourself into thinking you have, e.g. Google also said they used Perl). The Numeric module existed, but data science was not very strong until Travis united the numarray/Numeric things into NumPy. [1]

https://en.wikipedia.org/wiki/History_of_Python has more details, but it does not have a nice plot/chart of "How weirdly do your software friends look at you when you bring up Python..." which has, shall we say, a "data collection problem". :-) Proxies for such data might be interesting case studies in the sociology of programming languages, though.

[1] https://en.wikipedia.org/wiki/NumPy


Actually a really good point! I entirely forgot about Ruby! Though in terms of competition, that still seems like a less crowded field than one where the other contenders are C, C++, and Rust (and for application and web development there's also Go, and to some degree Java and Python).


Nim 1.0 was released in 2019. Python 3 was released in 2008, Go 1.0 in 2012, TypeScript 1.0 in 2014, Rust 1.0 in 2015, Kotlin 1.0 in 2016.

It might be premature to judge Nim's popularity, respective timelines considered. At the same time language options are probably nicer in 2022 compared to 2010, so it will be harder to take off.


they are missing a cute mascot

like gophers for go, crabs for rust, dinosaurs for zig


To be honest, Go's mascot just makes me hate the language even more.


Are there any products/companies that use Nim in production?


There's a list in the very announcement: https://github.com/nim-lang/Nim/wiki/Organizations-using-Nim

though I'd imagine the number will remain small as Nim's too artisanal for enterprise to grok.


Goodboy Galaxy is a game in development for the Game Boy Advance, written in Nim. They did a presentation about coding for the GBA at NimConf 2020: https://www.youtube.com/watch?v=sZUM7MhWr88


Status makes heavy use of Nim in production:

https://github.com/orgs/status-im/repositories?q=&type=all&l...



