As we've been discussing in another thread, it's possible! One can "opt out" of Julia's dynamic behaviors on a function-by-function basis using the JET.jl tool for static analysis: https://aviatesk.github.io/JET.jl/dev/optanalysis/
Why not make this the default everywhere? Well, there are a lot of scientific use cases where it's convenient to have Python-style dynamic typing and interactivity. A cool thing about Julia is that it allows that, while _also_ allowing you to achieve high performance, all within a single language.
For the record, I do also love the static type system and overall design of Rust. But for my day job (research in numerical methods and computational physics), I find Julia to be the most efficient way to get the job done -- rapid algorithm prototyping, data analysis, plot generation, etc.
It may be worthwhile to pin down where dynamic typing is actually helpful, since this gets mentioned a lot. Python and other dynamic languages are increasingly reliant on static type checkers.
> We've had test code that goes from taking 10 minutes to run to over 2 hours because of type instability in a single line of code.
For those who might not be familiar, tooling can sometimes help a lot here. The ProfileView.jl package (or the @profview macro in VSCode) shows an interactive flame graph, with type instabilities highlighted in red and memory allocations highlighted in yellow. That makes it easy to identify the exact line where the instability or allocation occurs. Also, to prevent regressions, I really like using the JET.jl static analysis package in my unit tests.
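To make that concrete, here's a minimal sketch (the function and numbers are made up). `unstable_sum` starts its accumulator as an Int and then adds Float64s, which is a classic type instability; profiling it will light up the offending frames in red:

    # Hypothetical example of a type-unstable function: the type of `s`
    # changes from Int to Float64 mid-loop, forcing dynamic dispatch.
    function unstable_sum(xs)
        s = 0
        for x in xs
            s += x
        end
        return s
    end

    using ProfileView   # the Julia VSCode extension also provides @profview

    # Red bars in the flame graph mark dynamic dispatch; yellow marks allocations.
    @profview for _ in 1:10_000
        unstable_sum(rand(1_000))
    end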
If they are so easy to identify, why not just make it a JIT error? Manually inspecting all of this sounds awful. I'd rather my compiler just do it for me.
Dynamic behaviors can be a nice default for all the spots where performance is not critical. So Julia lets you code like Python in places where performance doesn't matter, and then code like C++ or Fortran in places where it does.
Julia is my tool of choice for writing numerical code where performance is critical. I work in computational physics, and have found Julia and its ecosystem to be far nicer than Rust in this space.
It's true that accidental dynamic behaviors are a real concern and can be a performance killer. Fortunately, the language has nice tooling. In VSCode, I often use the visual profiling tool `@profview` to get a flame graph. Anything dynamic gets highlighted in red and is quick to diagnose. There are also nice static analysis packages like JET.jl. During development, one can use `report_opt` to statically rule out accidental dynamic behavior, and such checks can also be incorporated into a project's unit tests. In practice, it's not much of an issue for me anymore. But to be fair, there is a big learning curve for new Julia users. See, e.g., https://docs.julialang.org/en/v1/manual/performance-tips/
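For example, a JET check inside a test suite looks roughly like this (a sketch; `kinetic_energy` is just a placeholder for some real performance-critical function):

    using Test, JET

    kinetic_energy(vs) = 0.5 * sum(abs2, vs)   # placeholder hot function

    @testset "no accidental dynamic dispatch" begin
        # @test_opt fails the test if JET's optimization analysis finds any
        # runtime dispatch in kinetic_energy or its callees for this signature.
        @test_opt kinetic_energy(rand(100))
    end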
Those tools didn't actually exist the year or so I was writing Julia professionally (and that was only 3 years ago), so it's nice to see the language coming along.
At the same time, I would expect the Rust ecosystem to overtake Julia's in that domain in the next couple of years. Polars is already nicer than pandas, I've seen some promising work on numpy-style tensor libraries, and I'm pretty impressed by the progress with getting Enzyme integrated into Rust (I could never make it work with Julia). Here's a nice example repo I saw recently:
Not the person you're responding to, but Rust is never going to have a REPL and is likely to never compile very quickly. For a lot of numerical and scientific use cases that's a fundamentally limiting factor. WRT performance, you're ofc correct, but that's not always as paramount as it may seem if you need to tweak the data dozens or hundreds or thousands of times. In that case, having to wait more than, say, 5 seconds or so is prohibitively annoying.
I think something in between Zig and Rust will emerge someday as a sort of optimal compromise between compile speed, safety, and programmability wrt memory and performance tradeoffs.
Agreed. Julia's combination of REPL + JIT + Revise.jl can feel like magic. Revise automatically detects changes to your source code, and you get hot-code reloading of fully optimized machine code, in the blink of an eye!
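For anyone who hasn't tried it, the workflow is roughly this (file and function names are made up):

    using Revise
    includet("simulation.jl")   # "include and track": Revise watches the file for edits

    run_simulation(1_000)       # edit simulation.jl, call again, and the changed
                                # methods are recompiled and picked up automatically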
Also, it's worth emphasizing that the user experience of Julia has been improving greatly, even in just the last 3 years. Julia 1.9 introduced caching of native code [1], and now at Julia 1.11 the time-to-first-plot in a new Julia process is typically less than a second.
Having said all this, Rust is an absolutely fantastic language too, and might be preferred for large-scale software development efforts where static analysis is prioritized over an interactive development workflow.
There was also another one, if I remember correctly; I saw it in the comments of https://youtu.be/eRHlFkomZJg, but now I can't find it. I think it was just evcxr.
It is really easy to write python bindings for Rust, which is probably the easiest way to “consume” a high-performance library (e.g. a physics simulator, data-frames, graphing, a type-checker, etc).
I'll speak as someone who began as a Julia skeptic, but now finds it invaluable for my day-to-day work. Here are some reasons why you might prefer Rust today:
- Julia's lack of a formal interface specification. Julia gets a lot of flexibility from its multiple dispatch. This feature is often considered a major selling point in allowing code reuse: many Julia packages in the ecosystem can be combined in a way that "just works". Consider, for example, combining CuArrays.jl with KrylovKit.jl to get GPU acceleration of sparse iterative solvers (https://github.com/Jutho/KrylovKit.jl/issues/15). But it's not always clear who actually "owns" such integrations. Because public interfaces aren't always well documented in Julia, things are prone to breakage, and it can sometimes feel like "action at a distance". This was especially painful with the OffsetArrays.jl package, which suddenly introduced arrays that could begin at any integer index. (That was the major theme of Yuri's blog post, and the simple solution for most people was to avoid OffsetArrays.) Rust's community philosophy and formal trait system err on the side of providing static guarantees for correctness. But these constraints also take away flexibility to fit distinct packages together. For example, Julia has always had excellent support for type specialization, and this has been notoriously challenging to fit into Rust, even in a very limited form: https://users.rust-lang.org/t/the-state-of-specialization/11.... Conversely, there have been many discussions about designing a formal interface system in Julia, but it remains a challenge: https://discourse.julialang.org/t/proposal-adding-optional-s...
- Julia is designed around just-in-time compilation. For example, every time a function is called with new argument types, it is freshly compiled for that specialization. This is great when you care about getting optimal performance. Also, because Julia lets you reify syntax as value-level objects, you can assemble Julia code that is custom-optimized to run-time values (a small sketch of this is below, after the list). All of this is amazingly powerful for certain kinds of number-crunching codes. But carrying around a full LLVM system is clearly a blocker for distributing small, precompiled binaries. Hence the LWN discussion about the preview juliac feature, which will offer a mode for fully static compilation.
- Rust's borrow checker is something to envy. In any other language, I miss the ability to safely pass around references to stack-allocated variables, or to know that a referenced value cannot be mutated.
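Regarding the second point above, here is a tiny sketch of what assembling code from run-time values can look like (the coefficients are made up): build a Horner-form expression for a polynomial whose coefficients are only known at run time, then compile it into a function with those constants baked in.

    coeffs = [2.0, -1.0, 3.0]                # run-time data

    # Horner's rule, built as an Expr: 2.0 + x * (-1.0 + x * 3.0)
    body = foldr((c, acc) -> :( $c + x * $acc ), coeffs)
    poly = eval(:( x -> $body ))

    Base.invokelatest(poly, 1.5)             # calls the freshly compiled code, constants folded in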
Finally, I would probably recommend Python (not Rust!) for most machine learning or data analysis projects that aren't too "bespoke". There's just so much momentum behind PyTorch and JAX. The Julia community is developing some very interesting packages in this space, notably Lux.jl, Enzyme.jl, Reactant.jl, and all of SciML. These are super powerful, but still very researchy. For simple things, Python will probably involve less friction.
The best language will depend on your use case. Julia serves its niche very well, even if it doesn't fit every possible use case.
Can you expand on the intriguing comment that "because Julia lets you reify syntax as value-level objects, you can assemble Julia code that is custom-optimized to run-time values" (ideally with an example)?
Another tool in this regard is https://github.com/JuliaLang/AllocCheck.jl, "a Julia package that statically checks if a function call may allocate by analyzing the generated LLVM IR of it and its callees using LLVM.jl and GPUCompiler.jl"
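Usage is roughly like this (a sketch; `axpy!` here is just a placeholder kernel, not the LinearAlgebra function):

    using AllocCheck

    axpy!(y, a, x) = (y .= a .* x .+ y)   # in-place placeholder hot loop

    # Returns a list of every potential allocation site found in the compiled
    # code for this argument signature; an empty result means provably allocation-free.
    check_allocs(axpy!, (Vector{Float64}, Float64, Vector{Float64}))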
Makie has grown to become a very powerful tool and an important part of the Julia ecosystem. A few Makie features are especially worth highlighting:
1. Using the "observable" system, it's quite easy to put together simple GUIs with toggles, sliders, etc. This is really nice for interactive data visualization, real-time physical simulations, and so on (a small sketch follows this list).
2. One can (for the most part) smoothly switch between the various backend options. GLMakie is the high-performance OpenGL backend. WGLMakie builds on WebGL and can be used in a browser context, which is nice in Jupyter notebooks, or in HPC scenarios where the calculation is being performed on a remote cluster. CairoMakie allows for exporting publication-quality vector graphics.
3. A couple of years ago there were severe issues with "time to first plot". I'm happy to say that this is now (almost completely) fixed! The initial Makie installation takes a couple of minutes to precompile and cache object files, but after that, the time-to-first-plot in a new Julia session is only about 1 second.
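To give a flavor of point 1, a slider-driven plot is only a few lines (a sketch; the layout and names are just one way to do it):

    using GLMakie   # or WGLMakie / CairoMakie, depending on where it should render

    fig = Figure()
    ax = Axis(fig[1, 1])
    freq = Slider(fig[2, 1], range = 0.1:0.1:10.0, startvalue = 1.0)

    xs = range(0, 2pi; length = 300)
    # lift() builds an Observable that recomputes whenever the slider moves,
    # so the plot updates automatically.
    ys = lift(f -> sin.(f .* xs), freq.value)
    lines!(ax, xs, ys)

    fig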
The town of Los Alamos is beautiful, but it's small. It can be a great place if you already have a family that enjoys outdoor activities. Many people prefer to commute from Santa Fe (a 45-minute drive with mountain views and negligible traffic). Certain groups allow a flexible hybrid home/office working mode.
I was only a guest scientist at LBL, but LBL is practically in Berkeley.
But yeah, generally I assume this is true, though you may just find that you need a car and that if you have one it's not so bad. For instance, NREL (where I was also briefly a guest scientist) is in an incredibly gorgeous area near Golden which isn't super small but everything is still very spread out and it'd take a while to walk anywhere for sure.
It also depends on your definition of "middle of nowhere", I suppose. Golden is hardly Oakland, but I am pretty sure I could find people to date as long as I included Denver... and if you have a car in Colorado, you will find that Denver is considered close to a lot of things you might not at first consider it close to. It's only an hour's drive from Boulder (where I lived and drove to NREL as needed).
I'm a computational physicist at Los Alamos and would echo these sentiments.
Note that there are two main types of DOE labs: NNSA (Sandia, Los Alamos, Livermore) and Office of Science (Brookhaven, Berkeley, Oak Ridge, Argonne, ...). Although the former is more focused on "nuclear weapons stockpile stewardship", there is still much basic science at all DOE labs, especially where computer science meets physics and other domain sciences.
Perhaps relevant to HN, I would mention the Applied Computer Science group at Los Alamos, which is in hiring mode (https://www.lanl.gov/org/ddste/aldsc/computer-computational-...). Besides supporting computational physicists in code development efforts, this group does a variety of researchy things like designing programming models, doing compiler development, and building ML models, especially with an eye towards large-scale scientific computing. The pay at a DOE lab is less than FAANG (PhD student interns might be around $80k/yr and starting staff scientists maybe $130k/yr), but the tradeoff for some people would be the research flavor of the work, and the flexibility. Many of the LANL codes being developed are open source, for example. Other DOE labs have similar computer science divisions. For example, Oak Ridge, Argonne, and Berkeley all have "leadership computing" facilities.
Curious, is there room at LANL for a senior full-stack generalist engineer who hasn't necessarily been doing published research work? I've looked at LANL Careers and can't gauge how biased you guys are toward research backgrounds.
LANL definitely needs and uses generalist engineers. I interned there as one, many years back. Standard CRUD-app stuff to help the lab do its main goal of nuclear stockpile stewardship.
But it will help if you enjoy research culture. The location is rather isolated and you will probably be friends / co-workers with people doing research. (I say this because not everyone likes being around PhDs; I've met several people in software who disdain academics.)
Think they need any janitors/maintenance techs? I can't do much but I can relocate to Oak Ridge fairly easily and I can surely unclog a toilet and clean up blood and paint some baseboards pink for breast cancer awareness month and reset some tripped breakers and change your backup generator fluids and swap out a jiggly door handle and refill the printer paper and repair the reserved parking sign that fell over and set rat traps on the drop tile ceiling space and swap out a malfunctioning automated transfer panel and buff the floors and refill/repair the vending machines and hang the pictures of the boss on the wall and upgrade the circuit breaker to accommodate all the space heaters and check/replace the fire extinguishers and AEDs and rekey the RFID door readers and I dunno, whatever else that "engineers" don't even realize needs to be done. HMU I got an expired TS but nothing precluding a renewal except for (currently) slightly disagreeable ideas that I don't express beyond internet fora.
But you can only work for these labs if you're a US citizen, am I right? :) Also are there any such labs or outposts in Hawaii? Would be a great beautiful place to live
You need to get a security clearance for most roles - even joint citizenship may be problematic:
Sandia is required by DOE to conduct a pre-employment drug test and background review that includes checks of personal references, credit, law enforcement records, and employment/education verifications. Applicants for employment need to be able to obtain and maintain a DOE Q-level security clearance, which requires U.S. citizenship. If you hold more than one citizenship (i.e., of the U.S. and another country), your ability to obtain a security clearance may be impacted.
Applicants offered employment with Sandia are subject to a federal background investigation to meet the requirements for access to classified information or matter if the duties of the position require a DOE security clearance. Substance abuse or illegal drug use, falsification of information, criminal activity, serious misconduct or other indicators of untrustworthiness can cause a clearance to be denied or terminated by DOE, resulting in the inability to perform the duties assigned and subsequent termination of employment.
I work at Berkeley Lab and there are tons of internationals here. I think it differs between labs - just an hour away at the Livermore Lab there are a lot fewer internationals because of what they do. We don't do anything classified and this lab hands out visas like they're candy.