
How does this compare to MLX? As far as I understand, MLX is equivalent to PyTorch but optimized for Apple Silicon.

Is this meant for training MLX models in a distributed manner? Or what is its purpose?




It looks like MLX is a part of this initiative. https://github.com/apple/corenet lists "MLX examples" as one of the components being released in April.


As mentioned in "mlx_examples/open_elm": "MLX is an Apple deep learning framework similar in spirit to PyTorch, which is optimized for Apple Silicon based hardware."


Just skimming the README, it looks like it's a layer above MLX. So it looks like a framework around it to ease ML development.


It's a layer on top of PyTorch, and it has code to translate PyTorch models into MLX.
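In case anyone wants a concrete picture of that translation step, here's a minimal sketch (not CoreNet's actual converter, and the function name is made up) of copying a PyTorch state_dict into MLX arrays and saving them in a format MLX can load, assuming both torch and mlx are installed:

```python
import torch
import mlx.core as mx

def torch_state_dict_to_mlx(state_dict):
    # Convert each PyTorch tensor to an MLX array by round-tripping through NumPy.
    return {name: mx.array(t.detach().cpu().numpy())
            for name, t in state_dict.items()}

# Usage: convert any PyTorch module's weights and save them for MLX.
torch_model = torch.nn.Linear(4, 2)
mlx_weights = torch_state_dict_to_mlx(torch_model.state_dict())
mx.savez("weights.npz", **mlx_weights)  # later: mx.load("weights.npz")
```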


So, is CoreNet the equivalent of Keras, whereas MLX is the Jax/PyTorch equivalent?


Sounds reasonable. Apple writes the following about MLX: "The design of MLX is inspired by frameworks like NumPy, PyTorch, Jax, and ArrayFire."
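For a concrete sense of that NumPy flavour, here's a tiny MLX snippet (just an illustration, assuming mlx is installed on Apple Silicon):

```python
import mlx.core as mx

a = mx.arange(6).reshape(2, 3).astype(mx.float32)  # NumPy-style array API
b = mx.ones((3, 2))
c = a @ b      # operations are recorded lazily...
mx.eval(c)     # ...and evaluated on demand
print(c)
```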


Not quite. The closest equivalent would be something like fairseq. It's config (YAML) driven, roughly like the sketch below.
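To make the config-driven point concrete, here's a toy example of the general pattern (the YAML keys and the build function are made up for illustration, not CoreNet's actual schema):

```python
import yaml
import torch.nn as nn

# Hypothetical config; real CoreNet/fairseq configs are much richer.
CONFIG = """
model:
  hidden_dim: 256
  num_layers: 4
optimizer:
  name: adamw
  lr: 0.001
"""

def build_model(cfg):
    # Build a simple stack of linear layers from the config fields.
    dim = cfg["model"]["hidden_dim"]
    layers = [nn.Linear(dim, dim) for _ in range(cfg["model"]["num_layers"])]
    return nn.Sequential(*layers)

model = build_model(yaml.safe_load(CONFIG))
print(model)
```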



