ConvolutionalNeuralOperators.jl
Julia package to implement Convolutional Neural Operators
This package implements Convolutional Neural Operators (CNOs), following the original CNO formulation. The CNOs can then be used as custom Lux models, and they are compatible with closure modeling.
using Pkg
# Install directly from the GitHub repository (the SSH URL requires GitHub SSH access):
Pkg.add(url="git@github.com:DEEPDIP-project/ConvolutionalNeuralOperator.jl.git")
You probably want to use the cno function to create a closure model, which can then be used in CoupledNODE or as a Lux model:
using ConvolutionalNeuralOperators

closure, θ_start, st = cno(
    T = T,                      # floating-point type (e.g. Float32)
    N = N,                      # grid size
    D = D,                      # spatial dimension
    cutoff = cutoff,            # filter cutoff
    ch_sizes = ch_,             # channel sizes
    activations = act,          # activation functions
    down_factors = df,          # downsampling factors
    k_radii = k_rad,            # kernel radii
    bottleneck_depths = bd,     # bottleneck block depths
    rng = rng,                  # random number generator
    use_cuda = false,           # run on CPU
)
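Once created, the closure follows the standard Lux model interface. The sketch below is only an illustration: the input array u, its 64×64 size, its two channels, and the batch dimension are assumptions for a 2-D setup, not values prescribed by the package.
# Illustrative sketch (assumed input layout): apply the closure with the
# usual Lux call convention `y, st = model(x, θ, st)`.
u = rand(Float32, 64, 64, 2, 1)     # hypothetical 64×64 field, 2 channels, batch of 1
pred, st = closure(u, θ_start, st)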
Alternatively, the closure can be trained directly with CoupledNODE:
# `dataloader`, `loss`, and the initial `trainstate` are assumed to be set up beforehand
l, trainstate = CoupledNODE.train(
    closure,
    θ_start,
    st,
    dataloader,
    loss;
    tstate = trainstate,
    nepochs = 2,
    alg = Adam(T(1.0e-3)),      # optimizer (e.g. from Optimisers.jl)
)
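After training, you can reuse the closure with the optimized parameters. The sketch below assumes the returned trainstate exposes parameters and states fields, as a Lux.Training.TrainState does; this is an assumption, not something guaranteed by CoupledNODE.
# Hypothetical post-training sketch; the field names assume a
# Lux.Training.TrainState-like object.
θ_trained = trainstate.parameters
st_trained = trainstate.states
u_new = rand(Float32, 64, 64, 2, 1)   # same assumed input layout as above
pred, _ = closure(u_new, θ_trained, st_trained)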
Look in test/ for more detailed examples of how to use the package, or have a look at the documentation.
If you use ConvolutionalNeuralOperators.jl in your work, please cite it using the reference given in CITATION.cff.
If you want to make contributions of any kind, please first take a look at our contributing guide directly on GitHub or at the contributing page on the website.