Correct me if I'm wrong, but I don't think D has the ability to just import C headers and seamlessly use them without having generated or manually written `extern` declarations?
C (or even C++) functions can be called directly from D. There is no need for wrapper functions, argument swizzling, and the C functions do not need to be put into a separate DLL.
I think Walter Bright achieved this by implementing a full-blown C++ parser in the dlang compiler. A [God-Tier] achievement.
Not quite as seamless as Zig, but dstep is an external program that leverages libclang to do the same thing (and generates a D module for you), as well as, e.g., smartly convert #define macros into inlineable template functions :)
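To make the "not seamless, but close" point concrete, here is a minimal sketch of the usual pattern: D calls C at the ABI level directly, but each C function still needs an `extern (C)` prototype, either hand-written, generated by dstep, or taken from the ready-made `core.stdc` bindings. The commented-out `add` binding is a hypothetical example, not a real library:

```d
// Calling C from D: no wrapper functions, no argument swizzling,
// just a matching prototype. core.stdc ships pre-written extern (C)
// declarations for the whole C standard library.
import core.stdc.stdio : printf;

// A hand-written binding for a hypothetical C function `int add(int, int)`
// that you would compile and link separately:
// extern (C) int add(int a, int b);

void main()
{
    // Calls the C runtime's printf directly, same ABI, no marshalling.
    printf("2 + 2 = %d\n", 2 + 2);
}
```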
It may outclass it in performance. However, all my life I've been waiting for something that runs over 10h on a single charge. I am genuinely surprised that this is not something people truly appreciate in a laptop. Hell, I even bought a cheap Atom laptop once just to have more battery life.
So as of now, there's literally no decent competitor to M1 laptops. One must be living under a rock to buy anything but Apple and this is coming from someone who doesn't have Apple products and always hated their walled off ecosystem. I am reconsidering my life choices :)
What if I want to use the same machine both at my desk and at work? Being able to use the machine on the go is one thing, but actual portability between places is another.
But you already knew that, didn't you.
Not saying I agree with the "One must be living under a rock to buy anything but Apple" part, that's nonsense.
Sure, having a laptop in that case is unavoidable, but you won't _require_ 10 hours of battery life either.
In the long term I don't think laptops are exactly the right answer for portability. I think the ideal would be that when we get up from our desks, all of our running programs (even the whole OS) would migrate to our phones. As soon as we open our laptop, they would all migrate there.
That’s just your opinion. You don’t need large monitors to code (lines of code are 80-100 characters), and moving around and changing positions (desk to couch etc) every couple hours while working is generally better for health and concentration. Not having to hunt around for a power adapter makes it that much easier and better.
Using a laptop for 10h is super useful on days where you have to travel somewhere, have an appointment, and then travel another 5h back by train. It avoids having to keep it constantly plugged in on the train, and allows using it even in trains without power sockets.
What?! There are static languages that compile faster: Nim, D... Honestly, I used the ggplotD library and it takes roughly two seconds to compile and milliseconds to show the plot.
Julia is fast, but not faster, and very memory-hungry. If you care about squeezing out every bit of performance you generally go to C++. If you're too lazy for C++, you go with D.
Hence to me, Julia is an odd tool. Interesting but more as a language with some cool scientific libs that Python might not yet have.
> It's amazing to realize that writing down the first thing that comes to your head is usually like 80% as fast as a good, performant implementation. (As someone who has done a decent amount of work in performance engineering for embedded platforms, I really enjoy squeezing out the last drop of performance from most programs, but doing this for every single first-pass at a program, like in Python, is rather annoying if this is what is needed to get a usable implementation.)
Yes, Julia is not necessarily faster than a good C implementation (one that doesn't leak, etc.), but, as with Python: where one would otherwise write a 300-line C implementation, carefully managing types through an abstraction that is really rather complicated for something mathematically simple, we can usually write 20 lines of very performant Julia that is 95% as fast.
Attempting to do something relatively similar in Python is often slow enough that a practical implementation essentially needs to be coded in C and interfaced with Python, which brings us back (again!) to the same problem we had before: writing a 300+ line C file for something that should be rather simple, mathematically speaking.
Another important consideration is library development. As someone developing a large library, when we were previously developing in C++ we would encounter large barriers where some design goal we had either couldn't be done in the language in a practical sense or would require hours of tweaking things like template constraints to work correctly. Our users also had such a hard time extending our library's features they usually gave up and just waited for us to add features for them.
Now that we're developing in Julia we can realize our designs as we envisioned them with very little code. Features like multiple dispatch have been a lifesaver for us. And we are getting performance that is quite close to C++. Now we're looking forward to using features like composable multithreading.
One could wonder "why not use python?" but for performance reasons one ends up with the 'two-language problem' which we wanted to avoid.
True, the D numeric ecosystem cannot be compared to Julia's, and I would pick Julia if I were a scientist, of course. Note that I was talking about performance, though.
Which is also meaningless if the ecosystem doesn't grow as much as the competition's.
D users put the language on a pedestal of language design, but that isn't what grows an ecosystem; getting new users and libraries does.
I used to love the language, but so many mistakes have been made during the last 10 years, that it will hardly recover unless some company champions it, Swift/Kotlin style.
I agree. However, I wish more people valued solid foundations over tooling, because no tooling is worth building on a "good-enough" basis. If that were ever true, we would never have ended up in the situation where we desperately seek alternatives and create numerous solutions, each coming with its baggage of gotchas like "yeah, but you need to use this PackageCompiler library to make it faster".
So after reading the release notes about 6-second start-up times, with people still complaining about warm-up, I am smirking at how Julia is called a dynamic language when compiling and executing a script written in a static language (other than, say, Scala/Rust/C++) can be comparable or even faster in some situations.
Being a dynamic language is not about start-up time, interpretation or compilation, but about types being a part of the value instead of the container (the variable definition). Julia is definitely dynamic.
The warm-up period is definitely an annoyance, but a surprisingly small one. Even if it takes one minute to compile all the libraries and code I'm working on, my programming session is usually much longer than a few minutes, so that warm-up becomes insignificant: I keep the program alive during the whole development process, and any new addition is compiled pretty much instantaneously (unlike static languages, which have to be recompiled frequently; it's faster even than incremental compilation in languages like Scala). At the same time, running faster after warm-up saves time over the session compared to an interpreted language as well. I never bothered with PackageCompiler.
It's a matter of different workflows, and since Julia isn't the same as the usual dynamic languages or the usual compiled languages, it's easy to end up with suboptimal ones, especially at the start (which I assume does hurt the image of the language, as first impressions are key). That said, I'd definitely want the ability to create small static binaries for deployment or end users (even if they don't help during development, with which I'm already more than satisfied).
Anonymous lambdas instead of comprehensions, delegates instead of closures. D does not have decorators in the sense Python does, well function attributes might come close. What I do enjoy in D syntax-wise is UFCS: https://tour.dlang.org/tour/en/gems/uniform-function-call-sy...
Array slicing, ranges, compile-time execution. Probably the most succinct and easy-to-read syntax of the many compiled languages. It's an investment (as with any other language), but the learning curve is not steep, if C++ or Rust scared you away for that matter. Julia comes to mind too in terms of syntax, but it has no UFCS, and its interoperability with C/C++ looks like a lot of work compared to D.
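As a small illustration of the UFCS point (a sketch using only standard Phobos ranges, nothing from the linked tour beyond the feature itself): any free function call `f(x, args)` can be written `x.f(args)`, which is what makes range pipelines read top to bottom:

```d
// Uniform Function Call Syntax: iota(10).filter!pred is rewritten by
// the compiler to filter!pred(iota(10)), so free functions chain like
// methods without belonging to any type.
import std.algorithm : filter, map;
import std.array : array;
import std.range : iota;
import std.stdio : writeln;

void main()
{
    auto result = iota(10)             // 0, 1, ..., 9 (lazy)
        .filter!(n => n % 2 == 0)      // keep the evens
        .map!(n => n * n)              // square them
        .array;                        // materialize into int[]
    assert(result == [0, 4, 16, 36, 64]);
    writeln(result);
}
```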
True. I forgot about implicits (because I try not to use them :).
Unfortunately it looks more like a kludge, tbh. Sort and uniq also get hidden in a class, which can be tucked away from your eyes, and that is never good.
Anyways, since Walter is probably reading this: Any new insights around this quote five years later?
> I know a lot of the programming community is sold on exclusive constraints (C++ concepts, Rust traits) rather than inclusive ones (D constraints). What I don't see is a lot of experience actually using them long term. They may not turn out so well. –Walter Bright
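For readers unfamiliar with the terminology in that quote: a D constraint is just a boolean compile-time expression attached to a template; if it evaluates to false, the overload silently drops out of consideration, with no separate concept or trait declaration. A minimal sketch:

```d
// A D template constraint: the `if (...)` clause is an ordinary
// compile-time boolean; any type satisfying it is accepted.
import std.range.primitives : ElementType, isInputRange;
import std.stdio : writeln;

long sumRange(R)(R r)
    if (isInputRange!R && is(ElementType!R : long))
{
    long total = 0;
    foreach (e; r)
        total += e;
    return total;
}

void main()
{
    writeln(sumRange([1, 2, 3]));
    // sumRange(42) would not compile: int is not an input range,
    // so this overload is simply excluded from the candidate set.
}
```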
Yes, they are frequently compared. If you like style-insensitive languages (case-insensitive, underscores ignored) with a hacker mentality, Nim is for you. That is simply a no-go for me. I briefly tried Nim before even knowing about D, and it never stuck with me due to its awkward syntax (subjective). Later I learnt that Nim (being a younger language) frequently breaks backward compatibility. Maybe that has changed by now, but all of the above was enough for me to pass on it and move on.
The way to interpret that statement IMO is that D is a good alternative to using a scripting language. I use it that way all the time - it's almost completely replaced Ruby, which was my previous scripting language of choice.
On Linux, put this into a file, chmod +x it, and execute. The first time it will take a while (downloading and compiling packages). After that on my crappy laptop it takes:
0.66user 0.06system 0:01.25elapsed 57%CPU
That is a lot more convenient than venv in Python.
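The actual script from that comment isn't reproduced here; a hypothetical single-file dub script in the same spirit looks like the following (the `ggplotd` dependency and its version are illustrative, matching the plotting library discussed upthread, and not taken from the original post):

```d
#!/usr/bin/env dub
/+ dub.sdl:
    name "demo"
    dependency "ggplotd" version="~>1.2.0"
+/
// A single-file dub "script": the embedded dub.sdl recipe lists the
// dependencies; dub fetches and compiles them on the first run, and
// later runs reuse the cached build, hence the fast timings above.
import std.stdio : writeln;

void main()
{
    writeln("dependencies resolved, off we go");
}
```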
I do not follow D closely, but I get the impression that the language breaks backwards compatibility every now and then - I remember some posts here or on Reddit some months ago by someone complaining that Walter Bright introduced some changes to the language that broke existing code.
IMO a mature language is a language that you can depend on for your existing code to keep working in a timespan of decades - like C and C++ for example. A language that willingly breaks backwards compatibility is a toy, not something to be taken seriously for long term work.
The Wikipedia page history section [0] talks about stability. The most relevant part is this:
> The release of Andrei Alexandrescu's book The D Programming Language on June 12, 2010, marked the stabilization of D2, which today is commonly referred to as just "D".
In other words, D has been backwards compatible for 10 years now. At least, I don't know of any breaks, and the little code I have in D has never broken.
The transition from D1 to D2 did break backwards compatibility in 2007. The change is comparable to the Python 2 to Python 3 transition, but in a much smaller community. Outdated news from that time still pops up sometimes. Maybe you heard something related to that?
So basically, if I write some D code now, it'll keep working (assuming no OS ABI changes) and compiling 20 years from today? I'm OK with very minor changes due to compiler bugs or whatever.
Then I guess C and C++ are toys as well, given there are a couple of breaking changes.
K&R C, gets, Annex K, VLA, gone by now in C17.
gets, exception specifications, export templates, std::auto_ptr now gone, RVO semantics changed, and a couple of other minor semantic changes by C++20.
I know, I've tried a bunch of them and have several C compilers installed, but AFAIK none has removed support for gets or VLA (or anything else they bothered to implement). When I wrote "none" I wasn't referring to the standard but to the actual compilers. I do not care what the standard says is deprecated or to be removed; what I care about is what compilers actually do, since that is what affects existing code.
They were made optional in C11, so any C compiler that was yet to implement them doesn't need to actually bother implementing them to achieve compliance with more recent ISO versions.
If you look beyond clang and gcc, there are a couple of candidates, especially among proprietary embedded toolchains.
Thanks for the link. Interesting reasoning around the choice of the 'negative' __STDC_NO_VLA__ macro, rather than a 'positive' macro like __STDC_VLA__.
What happens sometimes is that changes in the compiler fix issues that were not well defined in the specification, or were bugs in the implementation. Programs that used the feature wrongly, or that were in fact buggy, then break when compiled with the better-specified feature.
It's comparable to compiling a K&R-conforming C program with C89 or C99 enforcement: it will reveal bugs that were not bugs in K&R (type punning, uninitialized variables, prototype violations, etc.).
Rust is not a very good example here, considering the amount of time the OP might have spent waiting for the compiler and the steep learning curve of the language...
With all due respect, Nim is a great language performance-wise, however I fail to see how it is different from D. Many advertised features have been present in D for a long time.
Syntax-wise, Nim is a step back. It's hard to read and understand, while any C/C++ dev will need next to no effort to read through D code. And of course, while D is C ABI compliant, it interoperates with C++ well too.
I seriously doubt that Nim or any other language in this regard has better metaprogramming than D.
Nim might have a speed overhead in certain tasks but that depends on a benchmark. Besides, does Nim have anything similar to NumPy which is actually faster? D does.
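For a taste of what the metaprogramming claim rests on, here is a small sketch of D's compile-time machinery: an ordinary function evaluated at compile time (CTFE) whose string result is pasted in as code via `mixin`. `makeGetter` and `Point` are made-up names for illustration:

```d
// CTFE + string mixin: makeGetter runs inside the compiler, and the
// source string it returns is compiled as if written by hand.
import std.format : format;
import std.stdio : writeln;

string makeGetter(string field)
{
    return format("int get_%s() { return %s; }", field, field);
}

struct Point
{
    int x, y;
    mixin(makeGetter("x"));  // generates: int get_x() { return x; }
    mixin(makeGetter("y"));  // generates: int get_y() { return y; }
}

void main()
{
    auto p = Point(3, 4);
    writeln(p.get_x(), " ", p.get_y());
}
```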
* For a full comparison of features, D vs Nim, see [0], or a more succinct selection of items by Nim's creator [1]; these also contain some remarks about metaprogramming and C/C++ interoperability.
* Syntax-wise you could also argue that it is a step forward: it embraces Python syntax, and it would be easy to read and understand for all Python devs.
* Nim does have a "faster NumPy" by the same author of the article: [2]
Thanks. So here are the points for D according to the table: very few breaking changes, because the language is pretty mature. Not sure if Nim is as stable, but just this one fact is enough for me personally.
D has slices, ranges and lazy evaluation which is a joy when you do data number crunching. Nim does not have them.
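The slices and lazy-evaluation points can be sketched briefly: a slice is a cheap view into existing storage (no copy), and a range pipeline does no work until it is consumed:

```d
// Slices are views, not copies; ranges are lazy until consumed.
import std.algorithm : map;
import std.array : array;
import std.range : iota, take;
import std.stdio : writeln;

void main()
{
    int[] data = [1, 2, 3, 4, 5];
    int[] view = data[1 .. 4];   // a view into data, no allocation
    view[0] = 20;
    assert(data[1] == 20);       // writes go through to the original

    // A billion-element range: nothing is computed here...
    auto squares = iota(1_000_000_000).map!(n => n * n);
    // ...and only four elements are ever evaluated below.
    writeln(squares.take(4));
}
```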
Also, no mention of dpp in the table, which is surprising tbh.
D PRs take time mostly because they go through a rigorous community review. And by that I mean your PR does need to hit the quality bar, which is only good and does not turn the language into a sandbox of community features like C++. See the PR for macros in D and why it didn't happen. Another reason is the lack of people, of course.
I read through the Arraymancer readme and the rationale and got very mixed feelings about the project and its purpose. It's everything and nothing at the same time: NumPy-like syntax and functionality, sklearn algorithms, and look, deep learning too, here is an example and some screens which we borrowed from SciPy. Seriously? Sorry if that might be too judgemental, but it sounds and looks amateurish. And I thought of benchmarking it against D's mir... Also, the name... And this is what I feel about Nim in general: rushed and undercooked.
I agree that it is strange that in all of the discussions surrounding relatively modern languages and new technologies, while Nim and Rust and many others frequently get mentioned, so rarely does anyone talk about D. I wonder why that is? I'm genuinely curious.
Because D is effectively dead/has lost its momentum completely?
If you look at the development surrounding D, the stdlib got its allocator module around 5 years back, IIRC, in experimental. It is still not stable. Same for many other modules. Due to lack of resources, not much work is being poured into the stdlib. Same for developer tooling: some people have created IDE tools and such, however they are not AST-based, and so most of the things that should work don't.
Yet, instead of pouring resources into these issues, the resources are poured into developing 3 compilers: dmd, GCC-based D, and LLVM-based D.
Also, the readability issues posted by the OP are subjective. To me, and many others, Nim is much, much more readable and elegant than C/D/C++/Rust.
Also regarding OP's
>Many advertised features have been present in D for a long time.
Nim has macros, and many features are based on or around macros, whereas D will never get macros. IIRC, D uses reflection-based facilities instead?
> I seriously doubt that Nim or any other language in this regard has better metaprogramming than D.
Well, you are wrong (unless you like reflection more than macros). Nim, Haxe, etc. are languages with very powerful AST-based macros.
Are you the same person who attacked people on the D forum? The nickname is familiar, and so is the style ;)
Chill. All languages have dusty corners, and given the number of people D has, it's obvious that not all things get fixed in time, but they were and they will be; history has already proved it. Besides, complaining is easier, right?
> Are you the same person who attacked people on the D forum
Nope. I am the person who asked about dlang VS plugin and new ds module recently. I don't remember attacking anybody.
> All languages have dusty corners, and given the number of people D has, it's obvious that not all things get fixed in time, but they were and they will be; history has already proved it.
Well, it was you guys who asked why D is not discussed/used more. I just gave my honest opinion. D's IDE tooling is very bad compared to newer languages, even Zig. Its stdlib is not being worked on. Now, what do you expect? Why would people use D? How is creating 3 compilers not a waste of resources when there are 2 important issues that are not being worked on by anybody? Do you think more people/orgs will flock to D because it has 3 compilers even though its tooling is very bad?
> Besides, complaining is easier, right?
I don't like how this gets posted everywhere somebody points out flaws in something. I am not a compiler developer. I don't have much knowledge about low-level stuff. I want to use a programming language for my field of expertise. If the language/ecosystem is not good for that, what do you expect me to do? Do you want me to leave everything aside, set apart 2 years for learning compilers and how their tooling is made, then leave everything aside for 2 more years and start contributing to D? Is that what you want me to do?
When you release a project, if you want it to be used by people, it is your responsibility to make sure that it is usable. It is not the customer's responsibility to fix the product.
Look man, I intend no disrespect to anybody. I respect Walter and other people. However, what I said is a fact. D/Nim/Crystal etc. have few to no resources for development. It is up to their leads to prioritise things for sustainability. I understand how difficult it is for a project this size. But you people can't be putting the blame on people like me for not contributing to the project, or for saying that the reason people don't use D is that its tooling is nonexistent and its stdlib is dead. AFAIK, many core people also agree that dmd should be deprecated and focus/resources should be put elsewhere.
Ok. But going back to the issue: I live in Munich and know at least two big companies that use D in production and organize meetups. I have yet to see a single company that uses Nim.
Well, mainly Status.im uses Nim, and there is one company based in China, I think (some of their employees are regulars in the Nim gitter/irc/discord). Also, there is https://github.com/nim-lang/Nim/wiki/Companies-using-Nim , though I don't know how up to date it is.
Anyhow, there are not many jobs for either Nim/D/Haxe/... So they are mostly used for personal projects. I wanted to use D, however it didn't work out, so onto other languages :(
Basically, D was too little, too late, with a lack of build tools, while Rust, Nim etc. were designed from the beginning to have certain features and easy-to-use build tools like cargo.
There is a perception that D has shot its bolt, having been around for quite some time without making a noticeable impact, either by itself or by influencing the mainstream.
Depends on where you read ;) But tbh, the D community concentrates around the D forum and in the majority consists of former C/C++ veterans. Being a younger language, Nim lures younger and more active members who have probably never heard of D anyway.
Historically, D never advertised itself enough, imho, and that is a shame. I blame the lack of proper leadership and management at the start. Its early development was a rocky ride many languages would not survive at all, but it did, and for me personally that is a testament to a maturity and resilience newer languages still have to prove.
> Many advertised features have been present in D for a long time.
They have also been in Nim for a very long time. The initial Nim version was released in 2008, and is comparable to D2 (started 2007) rather than D1 (started 2001). So they are effectively contemporary, and gained a lot of the features at comparable times (and with some cross pollination, I'm sure, although I wasn't there to witness it).
> Syntax-wise, Nim is a step back. It's hard to read and understand, while any C/C++ dev will need next to no effort to read through D code.
There is no accounting for taste, but Nim's syntax is Pythonic; Python is generally regarded as one of the easiest (if not THE easiest) real programming languages to pick up. Both D and Nim get hairy when you use advanced metaprogramming features, but at the surface level Nim is likely to be easier to pick up, except if your audience is exclusively C/C++ programmers (and maybe even then).
> I seriously doubt that Nim or any other language in this regard has better metaprogramming than D.
You should learn some Lisp then, the mother of all metaprogramming systems :) And also, you should look at Nim; it's not quite Lisp, but it goes as far as (and perhaps farther than) Lisp-without-reader-macros. This example [0] embeds a compile-time type-checked, syntax-checked SQL dialect into Nim.
> Nim might have a speed overhead in certain tasks but that depends on a benchmark. Besides, does Nim have anything similar to NumPy which is actually faster? D does.
Yes, Nim has Arraymancer (by mratsim, who wrote this raytracer as well), which is considerably faster than NumPy and also natively supports CUDA and OpenCL, IIRC, even though it's still younger, so it's not as complete as NumPy. But Nim also has Nimpy, which lets you mix Python and Nim with the least friction (and generates one executable that works, and works equally well and quickly, with whatever Python you happen to use at runtime - Py27, Py36, Py37, Py38 - I'm not familiar with anything else that does this). There was also NimBorg, which provided similar mixing with Lua, but it seems to be abandoned now for lack of interest.
> And of course while D is C ABI compliant it interoperates with C++ well too.
Nim is source-level as well as ABI-level compatible with C, C++ and Objective-C (you can use C++ exceptions and objects natively, no need for an "extern C" wrapper; same with Objective-C). It is also natively compatible with Javascript (though obviously not in the same compilation unit...). D is under-appreciated, for sure, but Nim is definitely not lesser, and it's about as old.
> Syntax-wise, Nim is a step back. It's hard to read and understand, while any C/C++ dev will need next to no effort to read through D code. And of course, while D is C ABI compliant, it interoperates with C++ well too.
On the other hand, people coming from Python, Ruby, Pascal, or Ada will appreciate the minimal amount of sigils.
> I seriously doubt that Nim or any other language in this regard has better metaprogramming than D.
Can you write an embedded DSL + compiler running at compile time in D?
Can you generate a state machine that lowers down to optimized computed gotos with no dynamic allocation, suitable for multithreaded runtimes and embedded devices, and able to display the actual graph at compile time?
Can you emulate classes with ADTs to solve the expression problem, avoid cache misses and multithreading problem due to OOP and the double indirection due to the visitor pattern?
> Nim might have a speed overhead in certain tasks but that depends on a benchmark. Besides, does Nim have anything similar to NumPy which is actually faster? D does.
I suspect D's metaprogramming is not as good as Nim's (I haven't used the latter), though it's better than the metaprogramming of every language I've used that lacks macros.
That said, it can do all of the metaprogramming-related tasks you mention.
I gave quite a few statically-compilable languages a try a few years back before settling on Nim as my "fat binary" language of choice.
D was a contender, but ultimately the reason I dropped it was that I was unable to compile a hello world on my laptop. Now, that machine is not fast, but it is actually pretty new, a very low-end 2017 Dell machine, and it turned out that bootstrapping a D environment requires a mid-spec machine or it literally cannot complete.
I dunno, man, D seems to have a "last 5%" problem. It looks good on the surface, but as you start looking into it you discover that the bootstrap tools are fat as hell, the core library has a weird split in GC styles, and the docs are inconsistent. Everything you do in D is 5% harder than it needs to be; nothing is buttery smooth. Overall, all those 5%s multiply together to make for a 20-30% worse experience, although I couldn't point at any one thing and say "that is what has killed D".