It may not be quite as mature, but it's getting there quickly.
It's also far more interoperable because of Julia's multiple dispatch and abstract types.
For example, the https://github.com/alan-turing-institute/MLJ.jl ML framework (sklearn on steroids) works out of the box with any table object that implements the Tables.jl interface, not just with DataFrames.
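A minimal sketch of what that interop looks like (the model and column names are illustrative, not from MLJ's docs): even a plain NamedTuple of vectors satisfies the Tables.jl interface, so MLJ accepts it directly with no DataFrame conversion.

```julia
using MLJ

# A NamedTuple of equal-length vectors is already a valid Tables.jl table,
# so MLJ can consume it as-is.
X = (a = rand(100), b = rand(100))
y = rand(100)

# Load a model type from a registered package and fit a machine on the table.
Tree = @load DecisionTreeRegressor pkg=DecisionTree
mach = machine(Tree(), X, y)
fit!(mach)
```

The same code works unchanged if `X` is a DataFrame, a CSV.File, or any other Tables.jl source, which is the point being made about multiple dispatch and abstract interfaces.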
I was using Julia before getting frustrated by how hard it is to use cutting-edge AI with it, and decided to move to Python (fast.ai specifically, though I'm moving more toward PyTorch, since fast.ai 2's hook system made things much worse than a simpler design would have).
I don't want mature; I want to use cutting-edge AI algorithms and data pipelines on the newest NVIDIA GPUs. So to get me to move to Julia, the researchers have to move first.
At the same time for my non-AI data processing work I'm staying with Julia.
To be clear, I'm a big Julia fan and would love to be able to move to Flux.jl (I have not looked at the framework you linked but will now). But Julia is still overall more fragmented, sparsely supported, and not as easy to work with as Python for ML projects. If I have a budget to work against, I can't justify trying to deal with the rough edges in Julia.