I think it is a truly beautiful thing how Flux and DiffEq are totally separate, and it takes only ~100 lines to make them work together.
Now suddenly you don't just have 2 or 3 ODE solvers, you have dozens.
I wonder if this type of collaboration is easy to achieve in other languages? Of course, not many have an awesome ODE ecosystem like Julia's. But it feels like a unique strength of Julia that it's easy to integrate heavy-duty libraries and have them work together!
I think Numba and the Cython ecosystem will struggle to achieve this with this level of (relative) ease, although I see the original implementation was done in PyTorch. The ODE and ML experts probably made it look easy!
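To make the idea being discussed concrete, here is a minimal sketch of the "neural ODE" concept in Python: the time derivative of the state is given by a small fixed-weight MLP, and a general-purpose ODE solver integrates it. This is only an illustration of the concept, not the actual torchdiffeq or DiffEqFlux code; the layer sizes and weights are arbitrary, and the real implementations also differentiate through the solve so the network can be trained.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical tiny MLP with fixed random weights; in a real neural ODE
# these weights would be trained by backpropagating through the solver.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(8, 2))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(2, 8))
b2 = np.zeros(2)

def mlp(y):
    """Small MLP mapping the state to its time derivative."""
    h = np.tanh(W1 @ y + b1)
    return W2 @ h + b2

def dydt(t, y):
    # The ODE right-hand side is just the network's forward pass.
    return mlp(y)

# Integrating from t=0 to t=1 plays the role of the network's forward pass.
y0 = np.array([1.0, 0.0])
sol = solve_ivp(dydt, t_span=(0.0, 1.0), y0=y0, rtol=1e-6)
print(sol.y[:, -1])  # the state at t=1 is the "network output"
```

The point the thread is making is that once the solver and the network are decoupled like this, any solver in the ecosystem can be swapped in for the integration step.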
Now I wonder about the relative performance of DiffEqFlux vs the original implementation.
The numerical performance ecosystem in Python is historically more siloed than Julia's, due to its reliance on C-like languages for performance. Relative latecomers like Numba and Autograd (compared to NumPy) didn't get enough traction for it all to cohere nicely.
(I don't use Julia, but have done several evaluations for our HPC work.)
Fascinating! Such a great demonstration of Julia's strength. I wish more academic papers were written in this fashion, or included a tutorial-like, reproducible post like this.
This is so true! I was struggling with a paper yesterday, hoping it gave more examples. The paper felt terse, but I think this could be a space-constraint issue with physical journals. In the internet age, papers should be longer form, with more examples and clearer explanations. Heck, they shouldn't even be called papers!
Even when physical copies aren't being manufactured, length constraints would still exist. Different publication venues serve different purposes, and these purposes are best served by articles of a certain length.
The purpose of a short Nature article is very different from that of a journal with 100+ page submissions.
https://nextjournal.com/ pretty much tries to establish this kind of paper writing: everything fully reproducible and runnable in a Docker container :) It also blurs the line between normal blog posts, a Jupyter-like IDE, and papers ;)
Really nice work. I just started using Julia and the Flux machine learning library a few months ago. Julia is great for numeric processing - I would say much better than Python except for the deep learning libraries like TensorFlow I rely on for work. That said, for my personal projects I have been using Flux and Julia and am happy with the ease of setup and the dev environment.
Great. It’s already here. I remember Chris had to convince some commenter on HN about why someone would want to use ODEs and neural nets together. The NeurIPS best paper award went to Neural ODEs. Great paper, long live Julia.