
There must be something else, because most of what Rust brings to the table is what functional languages have been providing for ages, just with rebranded names.


Existing functional languages had their own issues:

1. Haskell: had to deal with cabal hell

2. Scala: Java toolchain, VM startup time, dependencies requiring differing Scala versions.

3. F#: .NET was considered too Microsofty to be taken seriously for cross-platform apps.

4. OCaml: "That's that weird thing used by science people, right?" - Even though Rust took a decent number of ideas from OCaml, Rust got validated by early users like Mozilla and Cloudflare, so people felt safer trying it.

5. Lisp: I don't think I need to retell the arguments around Lisp. Also, a lot of the things Rust touts as FP-inspired type-system benefits aren't actually built into Lisp, since it's dynamically typed; those come more from the Haskell/Scala school.


> OCaml

Also, about as much multithreading as Python (i.e., only useful for I/O).


There is truth to that; the "something else" is a different set of trade-offs for some other things that have usually been associated with FP languages.

Rust feels like the love-child of part of OCaml (the sum types), part of C (very small runtime, ability to generate native code, interop with C libs, etc.), part of npm (package manager integrated with tooling, large discoverable list of libraries), etc.
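To make the OCaml heritage concrete, here's a minimal sketch (names are illustrative, not from the comment above): Rust's `enum` is essentially an OCaml variant/sum type, and `match` must be exhaustive, just like OCaml's pattern matching.

```rust
// A sum type: a value is exactly one of these variants.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

fn area(s: &Shape) -> f64 {
    // The compiler rejects this match if any variant is left unhandled,
    // the same exhaustiveness guarantee OCaml gives.
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

fn main() {
    let shapes = [
        Shape::Circle { radius: 1.0 },
        Shape::Rect { w: 2.0, h: 3.0 },
    ];
    let total: f64 = shapes.iter().map(area).sum();
    println!("total area: {total:.2}");
}
```

The difference from C is that the tag and the payload travel together, so there's no way to read the wrong union member.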

Borrow-checking seems a bit newer-ish - but I'm pretty sure there is an academic FP language that pioneered some of the underlying research.

No-one is planning to give Rust the medal of best-ever-last-ever language any time soon.

And none of that is a "bad thing" (tm.)


In my case, I use it because it is dead simple to get a standalone, lean, fast, native executable (on top of the other functional programming features). Cargo is a huge part of what I love about Rust.


I have a great example. We have 100s of Markdown files. I needed a link checker with some additional validations. Existing tools took 10-20 minutes to run.

I cooked up a Rust validator that uses the awesome pulldown-cmark, reqwest and rayon crates. Rayon let me do the CPU-bound bits in parallel, and reqwest with streams made it dead simple to do the 1000s of HTTP requests with a decent level of concurrency. Indicatif gave me awesome console progress bars.

And the best part: the CPU-bound part runs in 90ms instead of minutes, and the HTTP requests finish in around 40 seconds, primarily limited by how fast the remote servers are over a VPN halfway around the world.

No attempt made to optimise, .clone() and .to_owned() all over the place. Not a single crash or threading bug. And it will likely work one year from now too.
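The "fan the CPU work out across cores" part of a tool like this is a one-liner with rayon's `par_iter`; since that's an external crate, here's a std-only analogue using scoped threads. The `count_links` body is a hypothetical stand-in for the real pulldown-cmark parsing, not the actual validator.

```rust
use std::thread;

// Hypothetical per-document work: the real tool parses Markdown with
// pulldown-cmark; here we just count naive "](" link openers as a stand-in.
fn count_links(doc: &str) -> usize {
    doc.matches("](").count()
}

// std-only analogue of rayon's par_iter().map().sum(): split the docs into
// chunks, give each chunk to a scoped thread, and sum the partial results.
fn count_links_parallel(docs: &[&str], workers: usize) -> usize {
    let chunk = ((docs.len() + workers - 1) / workers).max(1);
    thread::scope(|s| {
        docs.chunks(chunk)
            .map(|c| s.spawn(move || c.iter().map(|d| count_links(d)).sum::<usize>()))
            .collect::<Vec<_>>() // spawn all threads before joining any
            .into_iter()
            .map(|h| h.join().unwrap())
            .sum()
    })
}

fn main() {
    let docs = ["[a](x) [b](y)", "[c](z)", "no links here"];
    println!("links: {}", count_links_parallel(&docs, 2));
}
```

With rayon the whole `count_links_parallel` function collapses to `docs.par_iter().map(|d| count_links(d)).sum()`, which is why "no attempt made to optimise" can still saturate all cores.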


Reading your comment made me realize another thing: Using rust often feels like the language is a successful attempt to take the best parts of a lot of other languages and put them together into a single, rational collection of features. Most of what's in there isn't new, but it all fits together well in one place so I don't feel like I have to make a devil's bargain for important features when I start out.


Most good languages seem to boil down to this.


Are the checks somewhat time-stable? Couldn't some of the checking (and network requests) be avoided by caching? For example by assuming that anything OK'd within the last hour is still OK.
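The caching idea suggested here is small to sketch; this is a hypothetical in-memory version (the type and method names are made up, and a real tool would persist the map to disk between runs):

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

// Remember when each URL last passed, and skip re-checking anything
// OK'd within the TTL (e.g. one hour, per the suggestion above).
struct OkCache {
    ttl: Duration,
    last_ok: HashMap<String, Instant>,
}

impl OkCache {
    fn new(ttl: Duration) -> Self {
        Self { ttl, last_ok: HashMap::new() }
    }

    /// True if the URL still needs a fresh network check.
    fn needs_check(&self, url: &str) -> bool {
        match self.last_ok.get(url) {
            Some(t) => t.elapsed() > self.ttl, // stale entry
            None => true,                      // never checked
        }
    }

    fn record_ok(&mut self, url: &str) {
        self.last_ok.insert(url.to_string(), Instant::now());
    }
}

fn main() {
    let mut cache = OkCache::new(Duration::from_secs(3600));
    let url = "https://example.com";
    if cache.needs_check(url) {
        // ... do the real HTTP check here, then on success:
        cache.record_ok(url);
    }
    println!("needs check again? {}", cache.needs_check(url));
}
```

Since many docs tend to link to the same handful of URLs, even a within-run dedup cache like this cuts request volume before any cross-run persistence is added.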


> most of what Rust brings to the table is what functional languages have been providing for ages

In a relatively familiar package and paradigm, with great package management (Haskell is the best of the functional languages there, and it's a mess), and with predictable and excellent performance.


That thing is runtime performance.



