Hacker News

There are a few other active projects porting weather, ocean, and climate models to Julia, C++, or Python.

Getting a weather or climate model from zero to production grade requires approximately 100 person-years, or $20M (personal experience). Because of the extremely high scientific expertise needed to implement such models correctly, it's harder to leverage open source contributions from a broader community than it is with web or ML frameworks. So most of the development in these projects is done by full-time hires, and to a lesser extent by contributions from early adopters.

The key technical arguments that I hear/read for such transition projects are ease of portability to accelerators (e.g. GPUs), and higher availability of programmers in the workforce.

My intuition is that a $20M budget, if carefully allocated to compiler and hardware accelerator teams, could solve the problem of running Fortran on any accelerator in its entirety.

With Fortran's tooling and compiler support for offloading standard Fortran to accelerators slowly but surely improving over time, the rationale for porting to other languages becomes increasingly questionable and is being revisited.

But regardless of technical arguments, it's a good thing for large models and frameworks to be written and explored in diverse programming languages.



Coding for GPUs is about more than language support. I'm currently rewriting a CFD engine, originally implemented in Fortran, with compute shaders, and there are several cases where the logic of the code has to be rebuilt from scratch to fit the GPU model.
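A toy illustration of the kind of restructuring the comment above describes (my own hypothetical example, not code from that engine): a serial loop with a data-dependent branch has to be recast as branch-free, whole-array operations before it maps well to shader-style execution, where all threads run the same instructions.

```python
import numpy as np

def smooth_serial(u):
    # CPU-style: loop with a data-dependent branch (causes warp divergence on GPUs)
    out = u.copy()
    for i in range(1, len(u) - 1):
        if abs(u[i + 1] - u[i - 1]) > 0.5:   # limiter: only smooth sharp jumps
            out[i] = 0.5 * (u[i - 1] + u[i + 1])
    return out

def smooth_parallel(u):
    # GPU-style: the same result expressed as masked whole-array operations,
    # so every "thread" executes identical instructions
    left, right = u[:-2], u[2:]
    mask = np.abs(right - left) > 0.5
    out = u.copy()
    out[1:-1] = np.where(mask, 0.5 * (left + right), u[1:-1])
    return out

u = np.array([0.0, 0.0, 1.0, 1.0, 0.0, 0.0])
assert np.allclose(smooth_serial(u), smooth_parallel(u))
```

The vectorized version computes both branches everywhere and selects with a mask, which is wasteful on a CPU but exactly what GPU hardware wants; in real stencil codes this reshaping can ripple through the surrounding data structures, which is why it can amount to a rewrite rather than a translation.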

Also, as I understand it, the only Fortran compiler that supports GPU offloading is NVIDIA's, which is proprietary. I'd prefer to rely on open source for a code base that has to last at least ten years...

But reading this HN thread, I gather that Fortran is more alive than I thought. How much new development is done in Fortran? I mean, to me, Fortran is a bit like Cobol: it is so entrenched that, obviously, it still has a lot of activity, but the momentum is moving towards more modern languages... But, well, those are all guesses and impressions...


For example, the Exclaim project at ETH Zurich is trying to rewrite ICON in Python (granted, they use special decorators that JIT to GPU instructions). It looks like a bit of a fool's errand to me to try and write such complex, performance-sensitive software in a dynamic language. I wonder why Chapel hasn't seen widespread adoption yet.


The recent "Neural General Circulation Models" pre-print [1] from Google Research indicates that the team built a spectral dycore from scratch using JAX; in Appendix A they note that it comes in at just over 20,000 lines of code! This model isn't quite "production grade" from the perspective of NWP (it lacks microphysics and therefore any prognostic precipitation in the forecast), but it's the only project I can think of that has rapidly produced such an atmospheric model "from scratch" with a high degree of engineering rigor.

[1]: https://arxiv.org/abs/2311.07222
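For readers unfamiliar with what a "spectral" core computes at its smallest scale, here is a minimal sketch (my own illustration, not code from the paper): spatial derivatives of a periodic field are taken by transforming to wavenumber space, multiplying by ik, and transforming back. This one-dimensional numpy version shows the basic building block; JAX makes the same pattern differentiable and GPU-friendly.

```python
import numpy as np

def spectral_derivative(u, L=2 * np.pi):
    # Differentiate a periodic field u on [0, L) by multiplying its
    # Fourier coefficients by i*k, the core operation of spectral methods
    n = u.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
du = spectral_derivative(np.sin(x))
assert np.allclose(du, np.cos(x))   # d/dx sin(x) = cos(x), to machine precision
```

For smooth fields the accuracy is spectral (error decays exponentially with resolution), which is part of why spectral dycores can be so compact compared to grid-point models.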


There is a fairly major assumption being made here, which is that none of the $20M cost comes from Fortran itself. Part of the reason for porting the models is ease of development, and if you make development easier, that $20M can go down pretty quickly.


As a real-world counter-example, CLiMA (Julia) lists 54 people on their Sci&Eng team, and the project has been going for several years now. It is far from production ready. Granted, they tackle everything at once (atmosphere, ocean, data assimilation, machine learning etc.). I'm a big fan of the project and wish it the best. This is to illustrate what a huge endeavor it is to build a system like this, regardless of the implementation language.


Developing projects like this in Fortran is not any more difficult than other languages. I can't speak to Julia because I don't understand it well enough, but in my experience writing this kind of code in Fortran is more natural and easier than in Python + numeric/array/accelerator libraries. The key unfulfilled need is to have Fortran compilers do offloading efficiently for you. Then, there's no comparison.



