The Julia Programming Language
Scientific machine learning combines differentiable programming, scientific simulation (differential equations, nonlinear solvers, etc.), and machine learning (deep learning) in order to impose physical constraints on machine learning and to automatically learn biological models. Given the composability of Julia, many have noted that it is positioned as the best language for this set of numerical techniques, but how does one actually “do” SciML? This workshop gets your hands dirty.
In this workshop we’ll dive into some of the latest techniques in scientific machine learning, including Universal Differential Equations (“Universal Differential Equations for Scientific Machine Learning”), Physics-Informed Neural Networks (“Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations”), and Sparse Identification of Nonlinear Dynamics (SInDy, “Discovering governing equations from data by sparse identification of nonlinear dynamical systems”). The goal is to get workshop attendees familiar with what these methods are, what kinds of problems they solve, and how to use Julia packages to implement them.
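To make the SInDy idea concrete before diving in, here is a small sketch of its core loop, sequentially thresholded least squares, recovering the Lotka-Volterra equations from simulated data. It is written in plain NumPy purely for illustration (it is not the DataDrivenDiffEq.jl API), and it cheats by using exact derivatives; real workflows estimate derivatives from noisy measurements.

```python
import numpy as np

# Lotka-Volterra: dx/dt = 1.5x - xy,  dy/dt = -3y + xy
def rhs(u):
    x, y = u
    return np.array([1.5 * x - x * y, -3.0 * y + x * y])

# Simulate a trajectory with RK4 to generate "data".
dt, n = 0.01, 2000
u = np.empty((n, 2))
u[0] = [1.0, 1.0]
for k in range(n - 1):
    k1 = rhs(u[k])
    k2 = rhs(u[k] + dt / 2 * k1)
    k3 = rhs(u[k] + dt / 2 * k2)
    k4 = rhs(u[k] + dt * k3)
    u[k + 1] = u[k] + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Exact derivatives at the sampled states (a simplification; in practice
# these are estimated numerically from the measured time series).
du = np.array([rhs(ui) for ui in u])

# Candidate library of terms: [1, x, y, x^2, xy, y^2]
x, y = u[:, 0], u[:, 1]
Theta = np.column_stack([np.ones(n), x, y, x**2, x * y, y**2])

# Sequentially thresholded least squares: fit, zero out small
# coefficients, refit on the surviving terms, repeat.
Xi = np.linalg.lstsq(Theta, du, rcond=None)[0]
lam = 0.1
for _ in range(10):
    small = np.abs(Xi) < lam
    Xi[small] = 0.0
    for j in range(2):
        big = ~small[:, j]
        if big.any():
            Xi[big, j] = np.linalg.lstsq(Theta[:, big], du[:, j], rcond=None)[0]

print(np.round(Xi, 3))  # sparse: only the four true terms survive
```

With exact derivatives the regression recovers the coefficients 1.5, -1, -3, and 1 on the true terms and zeros everywhere else, which is exactly the "discovering governing equations from data" promise of the SInDy paper.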
The workshop will jump right into how to model the missing part of a physical simulation, describe how universal approximators (neural networks) can be used in this context, and show how to transform such problems into an optimization problem which is then accelerated by specialized automatic differentiation. The set of packages involved is somewhat intense, using many tools from JuliaDiffEq (DiffEqFlux.jl, DifferentialEquations.jl, DiffEqSensitivity.jl, ModelingToolkit.jl, NeuralPDE.jl, DataDrivenDiffEq.jl, Surrogates.jl, etc.) combined with machine learning tools (Flux.jl), differentiation tooling (SparseDiffTools.jl, Zygote.jl, ForwardDiff.jl, ReverseDiff.jl, etc.), and optimization tooling (JuMP, Optim.jl, Flux.jl, NLopt.jl, etc.), all spun together in a glorious soup that automatically discovers physical laws at the end of the day. Thus this workshop has something to offer everyone: new users of Julia will get a nice overview of the unique composability of the Julia package ecosystem, while experienced Julia users will learn how to bridge an area they are comfortable with (such as machine learning) to a whole new set of phenomena. Meanwhile, even those who are only knee-deep in coding can gain a lot from learning these new mathematical advances, meaning that even a casual observer likely has a lot to learn!
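The "model the missing part of a physical simulation" idea can be sketched in miniature: embed a universal approximator inside a differential equation, then fit it by minimizing a trajectory loss. The toy below is plain NumPy for illustration only (the equation, the tiny network, and the helper names are all made up for this sketch, and finite-difference gradients stand in for the real automatic differentiation that Zygote.jl and friends provide).

```python
import numpy as np

rng = np.random.default_rng(0)

# True dynamics: dx/dt = -x + sin(x).  Pretend only the linear decay -x
# is known physics, and learn the missing sin(x) term from data.
def true_rhs(x):
    return -x + np.sin(x)

def nn(p, x):
    # Tiny one-hidden-layer scalar network: 4 tanh units, 13 parameters.
    W1, b1, W2, b2 = p[:4], p[4:8], p[8:12], p[12]
    return np.tanh(W1 * x + b1) @ W2 + b2

def ude_rhs(p, x):
    return -x + nn(p, x)  # known physics + learned correction

def simulate(rhs, x0=1.5, dt=0.05, n=100):
    xs = [x0]
    for _ in range(n):
        xs.append(xs[-1] + dt * rhs(xs[-1]))  # explicit Euler
    return np.array(xs)

data = simulate(true_rhs)  # "measurements" from the true model

def loss(p):
    pred = simulate(lambda x: ude_rhs(p, x))
    return float(np.mean((pred - data) ** 2))

def grad(p, eps=1e-6):
    # Central finite differences; the real workflow differentiates
    # through the solver with AD instead.
    g = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p)
        e[i] = eps
        g[i] = (loss(p + e) - loss(p - e)) / (2 * eps)
    return g

p = 0.1 * rng.standard_normal(13)
l0 = loss(p)
for _ in range(100):
    g, step = grad(p), 0.5
    while step > 1e-8 and loss(p - step * g) > loss(p):
        step /= 2  # backtracking keeps the loss from increasing
    p -= step * g
print(f"loss: {l0:.5f} -> {loss(p):.5f}")
```

The trained network's output can then be handed to a symbolic regression step (as in the SInDy portion of the workshop) to turn the fitted correction back into an interpretable equation.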
Torchdiffeq benchmarks: https://gist.github.com/ChrisRackauckas/cc6ac746e2dfd285c28e0584a2bfd320
Neural ODE training benchmarks: https://gist.github.com/ChrisRackauckas/4a4d526c15cc4170ce37da837bfc32c4
torchsde benchmarks: https://gist.github.com/ChrisRackauckas/6a03e7b151c86b32d74b41af54d495c6
Source
The concept is very powerful. Thanks for the presentation.
I loved when people were asking for a break! Chris is a machine, such a dense and well-explained lecture! I love it. Thank you very much for sharing, can't wait to take a look at the publicly available MIT lectures about the same topic.
Excellent talk!
Intro ends at https://youtu.be/QwVO0Xh2Hbg?t=748
How could someone be this incredible?
So much great stuff here. I agree that Chris is incredible: a great teacher/explainer. Keep up the great work Chris and everybody!
How did you get the image of rabbits and wolves in VS Code?
Chris is just an incredible speaker/teacher, thanks so much Chris!!
Great presentation Chris. Thank you!
Amazing work! The live coding part was incredibly helpful, thanks!
Could you please add timestamps? It would be easier to navigate the talk, as it is very long.
Excellent work! I am very excited to start using this in my research. Thank you for all the hard work on this by the whole community
Great presentation!
I am getting an error message indicating that the format is incorrect when I try to download the video through a video downloader for later review on my iPad, etc.
Very good presentation. I am interested in applied math problems, and the focus on Lotka-Volterra equations is questionable with respect to predator/prey models, where it has recently been shown that the cycles are more likely due to repeating weather patterns. In fact, so much of applied differential equations is driven by potentially unknown forcing functions, which brings an entirely different focus to solution approaches.