Solving the Euler-Lagrange equation for the system, we get the following set of equations.
As long as we can put the equation in this form, we do not need derivative data: the right-hand side can be evaluated from the Lagrangian alone and passed to a differential equation solver, and the resulting states can be used in further calculations.
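A reconstruction of that set of equations, following the standard Lagrangian Neural Network derivation (the symbols here are assumed from context, with $q$ the generalized coordinates):

```latex
\ddot{q} = \left(\nabla_{\dot{q}} \nabla_{\dot{q}}^{\top} \mathcal{L}\right)^{-1}
           \left[\nabla_{q} \mathcal{L}
                 - \left(\nabla_{q} \nabla_{\dot{q}}^{\top} \mathcal{L}\right)\dot{q}\right]
```

The acceleration is expressed entirely through gradients and Hessians of the scalar Lagrangian $\mathcal{L}(q, \dot{q})$.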
The following code returns the time derivative of the state.
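A minimal sketch of such a function, assuming the state packs `[q, q_dot]`; the function name and state layout are assumptions, not necessarily the author's exact code. It computes the acceleration purely from gradients of the Lagrangian, as in the equation above:

```python
import jax
import jax.numpy as jnp

def equation_of_motion(lagrangian, state, t=None):
    """Return d(state)/dt = [q_dot, q_ddot] using only gradients of the Lagrangian."""
    q, q_t = jnp.split(state, 2)
    # q_ddot = (d2L/dq_t^2)^-1 @ (dL/dq - (d2L/dq dq_t) @ q_t)
    q_tt = jnp.linalg.pinv(jax.hessian(lagrangian, 1)(q, q_t)) @ (
        jax.grad(lagrangian, 0)(q, q_t)
        - jax.jacobian(jax.jacobian(lagrangian, 1), 0)(q, q_t) @ q_t
    )
    return jnp.concatenate([q_t, q_tt])

# toy check on a 1-D harmonic oscillator (L = T - V), used here only for illustration
L_ho = lambda q, q_t: 0.5 * jnp.sum(q_t ** 2) - 0.5 * jnp.sum(q ** 2)
out = equation_of_motion(L_ho, jnp.array([1.0, 0.0]))
```

For the oscillator at `q = 1, q_dot = 0`, the recovered acceleration is `-q`, matching Newton's law for this system.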
To recover the trajectory from the learned Lagrangian Neural Network, the dynamics are expressed as equations in the gradients of our LNN and integrated over time using odeint, a built-in ODE integrator (available, e.g., as scipy.integrate.odeint or jax.experimental.ode.odeint).
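A self-contained sketch of this integration step, where a simple harmonic oscillator stands in for the actual LNN dynamics (the stand-in `f` is an assumption for illustration; in the real pipeline it would wrap the gradients of the learned Lagrangian):

```python
import jax.numpy as jnp
from jax.experimental.ode import odeint

def f(state, t):
    """Stand-in dynamics: simple harmonic oscillator, state = [q, q_dot]."""
    q, q_t = jnp.split(state, 2)
    return jnp.concatenate([q_t, -q])  # q_ddot = -q

x0 = jnp.array([1.0, 0.0])                 # initial position and velocity
t = jnp.linspace(0.0, 2.0 * jnp.pi, 100)   # times at which to report the state
x = odeint(f, x0, t)                       # trajectory, shape (len(t), 2)
```

Note that odeint integrates the first-order state form once; positions are effectively obtained by integrating the acceleration twice.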
x0 = np.array([3*np.pi/7, 3*np.pi/4, 0, 0], dtype=np.float32)
t = np.arange(N, dtype=np.float32) # N time steps, 0 to N-1
x_train is generated with the function solve_analytical : (initial state, times) → numerical integration of the dynamics equation (f_analytical).
xt_train is generated by applying f_analytical to x_train.
x_train = jax.device_get(solve_analytical(x0, t)) # dynamics for first N time steps
xt_train = jax.device_get(jax.vmap(f_analytical)(x_train)) # time derivatives of each state
y_train = jax.device_get(analytical_step(x_train)) # analytical next step
Note that xt is generated by applying the RHS function to the solution that was itself produced by numerically integrating that same function.
noise = np.random.RandomState(0).randn(x0.size) # fixed-seed Gaussian noise, one value per state dimension
t_test = np.arange(N, 2*N, dtype=np.float32) # N time steps, N to 2N-1
x_test = jax.device_get(solve_analytical(x0, t_test)) # dynamics for next N time steps
xt_test = jax.device_get(jax.vmap(f_analytical)(x_test)) # time derivatives of each state
y_test = jax.device_get(analytical_step(x_test)) # analytical next step
Now replace the analytical Lagrangian with a parametric model. The function learned_lagrangian takes the learnable params and returns the Lagrangian function, defined inside it as a neural network.
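A hedged sketch of such a closure, assuming a small MLP with `params` as a list of `(W, b)` pairs; the layer sizes and initialization below are illustrative assumptions, not the author's exact architecture:

```python
import jax
import jax.numpy as jnp

def learned_lagrangian(params):
    """Return a Lagrangian function L(q, q_dot) parameterized by a neural network."""
    def lagrangian(q, q_t):
        x = jnp.concatenate([q, q_t])   # network input: the full state
        for W, b in params[:-1]:
            x = jnp.tanh(W @ x + b)     # hidden layers
        W, b = params[-1]
        return jnp.squeeze(W @ x + b)   # scalar Lagrangian value
    return lagrangian

# illustrative initialization for a double-pendulum-sized state [q1, q2, q1_dot, q2_dot]
key = jax.random.PRNGKey(0)
sizes = [4, 32, 32, 1]
keys = jax.random.split(key, len(sizes) - 1)
params = [
    (0.1 * jax.random.normal(k, (m, n)), jnp.zeros(m))
    for k, n, m in zip(keys, sizes[:-1], sizes[1:])
]
L = learned_lagrangian(params)
val = L(jnp.array([0.1, 0.2]), jnp.array([0.0, 0.0]))  # a single scalar
```

Because the network outputs a scalar, the same gradient-based equation of motion used for the analytical Lagrangian applies unchanged to `learned_lagrangian(params)`.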