AttentionLayer.jl
Implements the attention mechanism in Julia as a modular Lux layer.
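For orientation, a single-head scaled dot-product attention layer can be written against Lux's explicit-parameter interface as sketched below; the `SimpleAttention` struct and its parameter names are illustrative stand-ins under that assumption, not AttentionLayer.jl's actual API.

```julia
using Lux, Random

# Minimal single-head scaled dot-product attention as a custom Lux layer.
# Illustrative sketch only; the real package's API may differ.
struct SimpleAttention <: Lux.AbstractLuxLayer
    dim::Int
end

function Lux.initialparameters(rng::Random.AbstractRNG, l::SimpleAttention)
    init() = randn(rng, Float32, l.dim, l.dim) ./ sqrt(Float32(l.dim))
    return (Wq=init(), Wk=init(), Wv=init())
end

function (l::SimpleAttention)(x::AbstractMatrix, ps, st)
    q, k, v = ps.Wq * x, ps.Wk * x, ps.Wv * x      # x is (dim, seq_len)
    s = (k' * q) ./ sqrt(Float32(l.dim))           # similarity scores
    a = exp.(s) ./ sum(exp.(s); dims=1)            # column-wise softmax
    return v * a, st                               # attention-weighted values
end

rng = Random.default_rng()
layer = SimpleAttention(8)
ps, st = Lux.setup(rng, layer)
y, _ = layer(randn(rng, Float32, 8, 16), ps, st)   # 8 features, 16 positions
```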
Discovering deep physics models with differentiable programming
Many physics models feature terms that are either partially unknown or too expensive to simulate. Discovering effective equations that represent such terms is a fundamental challenge in computational science. Multi-scale models are a prominent example: the large-scale behaviour is of primary interest, but it cannot be obtained without resolving the fine scales. A well-known case occurs in climate models, which rely on the effect of clouds for accurate forecasts, yet simulating clouds individually is computationally intractable. We propose a new software framework to extend generic physics models with data-driven neural networks (NNs) that represent the effect of small scales on large scales. The framework will use differentiable programming, which allows multi-scale models and NNs to be coupled and embedded together in a learning environment. We test our framework on turbulent fluid flows. In particular, we develop new differentiable wind-turbine wake models, to be used for optimal control of wind farms.
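The core idea can be sketched in a few lines under stated assumptions: a coarse model du/dt = f(u) + NN(u), where the NN stands in for the unresolved small scales, is rolled out by a time stepper and the loss is differentiated through the entire rollout. All names here (`closure`, `rollout`) are illustrative, not the framework's API.

```julia
using Lux, Zygote, Random

# Toy version of the embedded-learning idea: a coarse model
# du/dt = f(u) + NN(u), where the NN represents unresolved small scales,
# trained by differentiating through the time-stepping loop.
rng = Random.default_rng()
closure = Chain(Dense(2 => 16, tanh), Dense(16 => 2))
ps, st = Lux.setup(rng, closure)

f(u) = [-u[2], u[1]]                       # known large-scale physics (toy oscillator)

function rollout(u0, ps; dt=0.01f0, nsteps=100)
    u = u0
    for _ in 1:nsteps
        u = u .+ dt .* (f(u) .+ first(closure(u, ps, st)))   # explicit Euler step
    end
    return u
end

u0 = Float32[1.0, 0.0]
u_ref = Float32[0.5, 0.8]                  # stand-in for fine-scale reference data
loss(ps) = sum(abs2, rollout(u0, ps) .- u_ref)
grads = Zygote.gradient(loss, ps)          # gradients flow through the solver
```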
Democratizing multi-physics simulations with high-productivity high-performance finite element software
BestieTemplate.jl is a template focused on best practices for package development in Julia.
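A minimal usage sketch, assuming the `generate` entry point from the package's documented workflow (the exact call should be checked against the current docs):

```julia
using BestieTemplate

# Create a new package from the template (assumes the documented
# `generate` entry point; the path is a placeholder).
BestieTemplate.generate("full/path/to/YourNewPackage.jl")
```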
Julia package implementing Convolutional Neural Operators.
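Schematically, a CNO block applies its pointwise nonlinearity on a refined grid to limit aliasing. The sketch below is an illustrative approximation in plain Lux, not this package's API; nearest-neighbour upsampling and mean pooling are crude stand-ins for the band-limited interpolation and filtering of the original CNO formulation.

```julia
using Lux, Random

# Schematic of one CNO-style block: convolve, apply the nonlinearity on a
# refined grid to limit aliasing, then return to the original resolution.
cno_block(ch) = Chain(
    Conv((3, 3), ch => ch; pad=1),
    Upsample(2),                        # refine the grid before the nonlinearity
    WrappedFunction(x -> tanh.(x)),     # pointwise activation at high resolution
    MeanPool((2, 2)),                   # back to the original resolution
)

rng = Random.default_rng()
model = cno_block(4)
ps, st = Lux.setup(rng, model)
x = randn(rng, Float32, 32, 32, 4, 1)   # (width, height, channels, batch)
y, _ = model(x, ps, st)                 # output keeps the 32×32×4×1 shape
```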
CoupledNODE.jl is a SciML repository that extends NODEs (Neural Ordinary Differential Equations) to C-NODEs (Coupled Neural ODEs), providing a data-driven approach to modelling multiscale systems when exact solutions are not feasible.
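A hedged sketch of the C-NODE idea: state components with known coarse dynamics plus a learned coupling term, integrated with a standard ODE solver. Names are illustrative, not CoupledNODE.jl's API; training would differentiate through `solve` (e.g. via SciMLSensitivity), which is omitted here.

```julia
using Lux, OrdinaryDiffEq, ComponentArrays, Random

# Toy coupled neural ODE: known coarse dynamics for each state component
# plus a learned coupling term shared between them.
rng = Random.default_rng()
coupling = Chain(Dense(2 => 8, tanh), Dense(8 => 2))
ps, st = Lux.setup(rng, coupling)
p0 = ComponentArray(ps)                  # flat parameters for the ODE solver

function rhs!(du, u, p, t)
    c = first(coupling(u, p, st))        # learned inter-component coupling
    du[1] = -u[1] + c[1]
    du[2] = -0.5f0 * u[2] + c[2]
    return nothing
end

prob = ODEProblem(rhs!, Float32[1.0, -1.0], (0.0f0, 1.0f0), p0)
sol = solve(prob, Tsit5(); saveat=0.1f0)
```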
This package implements energy-conserving solvers for the incompressible Navier-Stokes equations on a staggered Cartesian grid. It is based on the Matlab package INS2D/INS3D. Simulations can run on single- or multi-threaded CPUs or on Nvidia GPUs.
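The CPU/GPU portability rests on Julia's backend-agnostic array programming; the toy update below illustrates that general pattern only and is not this package's actual API.

```julia
using CUDA

# Generic illustration: the same broadcasted update runs on CPU Arrays
# and CUDA CuArrays alike.
euler_step!(u, f, dt) = (@. u += dt * f; u)

u = rand(Float32, 64, 64)
euler_step!(u, rand(Float32, 64, 64), 0.01f0)

if CUDA.functional()                     # only if an Nvidia GPU is available
    ug = CUDA.rand(Float32, 64, 64)
    euler_step!(ug, CUDA.rand(Float32, 64, 64), 0.01f0)
end
```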