AttentionLayer.jl

Implements the Attention mechanism in Julia as a modular Lux layer

What AttentionLayer.jl can do for you

This package implements the attention mechanism as a Lux layer. It can then be used for closure modeling.
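
For reference, such a layer computes the standard scaled dot-product attention, softmax(Q Kᵀ / sqrt(d_k)) V. The plain-Julia sketch below illustrates that formula only; it is not the package's internal Lux-layer implementation.

# Scaled dot-product attention: softmax(Q Kᵀ / sqrt(d_k)) V.
# Illustration of the mechanism only, not AttentionLayer.jl's own code.
function scaled_dot_product_attention(Q, K, V)
    dk = size(K, 2)                        # key dimension d_k
    scores = (Q * K') ./ sqrt(dk)          # pairwise similarity scores
    weights = exp.(scores .- maximum(scores; dims = 2))
    weights ./= sum(weights; dims = 2)     # row-wise softmax
    return weights * V                     # attention-weighted values
end

Q, K, V = rand(4, 8), rand(6, 8), rand(6, 8)
scaled_dot_product_attention(Q, K, V)      # 4 × 8 output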

Install

using Pkg
Pkg.add(url="https://github.com/DEEPDIP-project/AttentionLayer.jl.git")

Usage

Look in test/ for examples of how to use the package.
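
Any Lux layer follows the same setup/apply workflow, so the attention layer provided here plugs into the pattern sketched below. The Dense stand-in is an assumption for illustration only; take the attention layer's actual constructor name and arguments from the examples in test/.

using Lux, Random

rng = Random.default_rng()

# Stand-in layer, assumed for illustration: replace with the attention
# layer exported by AttentionLayer.jl (see test/ for its constructor).
model = Dense(8 => 4)

ps, st = Lux.setup(rng, model)         # initialize parameters and state
x = rand(Float32, 8, 16)               # 8 input features, batch of 16
y, st = Lux.apply(model, x, ps, st)    # forward pass through the layer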

How to Cite

If you use AttentionLayer.jl in your work, please cite using the reference given in CITATION.cff.

Contributing

If you want to make contributions of any kind, please first take a look at our contributing guide on GitHub or the contributing page on the website.


Participating organisations

Netherlands eScience Center (Natural Sciences & Engineering)
CWI (Natural Sciences & Engineering)

Related projects

DEEPDIP

Discovering deep physics models with differentiable programming

In progress

Related software

ConvolutionalNeuralOperators.jl

Julia package to implement Convolutional Neural Operators

CoupledNODE

CoupledNODE.jl is a SciML repository that extends NODEs (Neural Ordinary Differential Equations) to C-NODEs (Coupled Neural ODEs), providing a data-driven approach to modelling solutions for multiscale systems when exact solutions are not feasible.
