Package: attention 0.4.0

attention: Self-Attention Algorithm

Helper functions for the self-attention algorithm, together with demonstration vignettes of increasing depth on how to construct self-attention from scratch. Based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN: 978-0131873216) "Speech and Language Processing (3rd ed.)" <https://web.stanford.edu/~jurafsky/slp3/>, and Alex Graves (2020) "Attention and Memory in Deep Learning" <https://www.youtube.com/watch?v=AIiwuClvH6k>.
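
A minimal sketch of the scaled dot-product self-attention step described above, written in base R with made-up toy data and projection matrices (illustrative only; this is not the package's own code):

# Toy self-attention (scaled dot-product) in base R:
set.seed(1)
X   <- matrix(rnorm(12), nrow = 3, ncol = 4)  # 3 tokens, 4-dimensional embeddings
W_Q <- matrix(rnorm(16), nrow = 4)            # hypothetical query projection
W_K <- matrix(rnorm(16), nrow = 4)            # hypothetical key projection
W_V <- matrix(rnorm(16), nrow = 4)            # hypothetical value projection
Q <- X %*% W_Q
K <- X %*% W_K
V <- X %*% W_V
scores  <- Q %*% t(K) / sqrt(ncol(K))         # scaled dot-product scores
softmax <- function(s) exp(s - max(s)) / sum(exp(s - max(s)))
weights <- t(apply(scores, 1, softmax))       # row-wise softmax gives attention weights
output  <- weights %*% V                      # weighted sum of the value vectors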

Authors: Bastiaan Quast [aut, cre]

attention_0.4.0.tar.gz
attention_0.4.0.zip (r-4.5) | attention_0.4.0.zip (r-4.4) | attention_0.4.0.zip (r-4.3)
attention_0.4.0.tgz (r-4.4-any) | attention_0.4.0.tgz (r-4.3-any)
attention_0.4.0.tar.gz (r-4.5-noble) | attention_0.4.0.tar.gz (r-4.4-noble)
attention_0.4.0.tgz (r-4.4-emscripten) | attention_0.4.0.tgz (r-4.3-emscripten)
attention.pdf | attention.html
attention/json (API)
NEWS

# Install 'attention' in R:
install.packages('attention', repos = c('https://bquast.r-universe.dev', 'https://cloud.r-project.org'))

Bug tracker: https://github.com/bquast/attention/issues

4 exports · 1 star · 1.88 score · 0 dependencies · 5 dependents · 6 scripts · 447 downloads

Last updated 10 months ago from: 64e8f97e0f. Checks: OK: 7. Indexed: yes.

Target           Result  Date
Doc / Vignettes  OK      Sep 09 2024
R-4.5-win        OK      Sep 09 2024
R-4.5-linux      OK      Sep 09 2024
R-4.4-win        OK      Sep 09 2024
R-4.4-mac        OK      Sep 09 2024
R-4.3-win        OK      Sep 09 2024
R-4.3-mac        OK      Sep 09 2024

Exports: attention, ComputeWeights, RowMax, SoftMax

Dependencies: none
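
After installation, the exported helpers can be inspected directly from an R session; the calls below use only base R and do not assume any particular function signatures for the package's exports:

# Load the package and list its exports:
library(attention)
ls("package:attention")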

Complete Self-Attention from Scratch

Rendered from complete_attention.Rmd using knitr::rmarkdown on Sep 09 2024.

Last update: 2022-06-23
Started: 2022-06-23

Simple Self-Attention from Scratch

Rendered from simple_attention.Rmd using knitr::rmarkdown on Sep 09 2024.

Last update: 2022-06-24
Started: 2022-06-22
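
Once the package is installed, the vignettes can be opened from R; the vignette names below are assumed to match the .Rmd filenames listed above:

# Open the vignettes:
vignette("simple_attention", package = "attention")
vignette("complete_attention", package = "attention")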