Package: attention 0.4.0

attention: Self-Attention Algorithm

Helper functions for the Self-Attention algorithm, together with demonstration vignettes of increasing depth on how to construct the Self-Attention algorithm from scratch. Based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) "Speech and Language Processing (3rd ed.)" <https://web.stanford.edu/~jurafsky/slp3/>, and Alex Graves (2020) "Attention and Memory in Deep Learning" <https://www.youtube.com/watch?v=AIiwuClvH6k>.
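
The package is built around the scaled dot-product attention defined in the Vaswani et al. (2017) paper cited above: Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V. The short base-R sketch below illustrates that formula directly; the function and variable names are illustrative only, not the package's exported API.

# Illustrative base-R sketch of scaled dot-product attention (not the package's own code)
self_attention <- function(Q, K, V) {
  scores  <- Q %*% t(K) / sqrt(ncol(K))          # scaled dot-product scores
  weights <- exp(scores) / rowSums(exp(scores))  # row-wise softmax: each row sums to 1
  weights %*% V                                  # weighted sum of the value vectors
}

# Toy example: 3 tokens with 4-dimensional embeddings, using X as Q, K and V
set.seed(1)
X <- matrix(rnorm(12), nrow = 3)
self_attention(Q = X, K = X, V = X)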

Authors: Bastiaan Quast [aut, cre]

attention_0.4.0.tar.gz
attention_0.4.0.zip (r-4.5) | attention_0.4.0.zip (r-4.4) | attention_0.4.0.zip (r-4.3)
attention_0.4.0.tgz (r-4.4-any) | attention_0.4.0.tgz (r-4.3-any)
attention_0.4.0.tar.gz (r-4.5-noble) | attention_0.4.0.tar.gz (r-4.4-noble)
attention_0.4.0.tgz (r-4.4-emscripten) | attention_0.4.0.tgz (r-4.3-emscripten)
attention.pdf | attention.html
attention/json (API)
NEWS

# Install 'attention' in R:
install.packages('attention', repos = c('https://bquast.r-universe.dev', 'https://cloud.r-project.org'))
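
# After installation the package is loaded in the usual way; the second call
# below uses only base R to list the exported objects.
library(attention)
ls("package:attention")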

Bug tracker: https://github.com/bquast/attention/issues

5.18 score | 1 star | 5 packages | 6 scripts | 495 downloads | 4 exports | 0 dependencies

Last updated 1 year ago from: 64e8f97e0f. Checks: OK: 7. Indexed: yes.

Target           Result  Date
Doc / Vignettes  OK      Nov 08 2024
R-4.5-win        OK      Nov 08 2024
R-4.5-linux      OK      Nov 08 2024
R-4.4-win        OK      Nov 08 2024
R-4.4-mac        OK      Nov 08 2024
R-4.3-win        OK      Nov 08 2024
R-4.3-mac        OK      Nov 08 2024

Exports: attention, ComputeWeights, RowMax, SoftMax
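
The exported names suggest the usual numerically stable softmax used when forming attention weights: subtract each row's maximum (the role a RowMax helper would play) before exponentiating and normalising (SoftMax). The stand-in below is a hypothetical illustration of that trick, not the package's actual function signatures.

# Hypothetical stand-in for a row-max-stabilised softmax (illustration only)
SoftMaxStable <- function(x) {
  x <- sweep(x, 1, apply(x, 1, max))       # subtract each row's maximum (RowMax step)
  sweep(exp(x), 1, rowSums(exp(x)), "/")   # exponentiate and normalise each row
}

# Large scores would overflow a naive exp(); the stabilised version does not
SoftMaxStable(matrix(c(1, 2, 3, 1000, 1001, 1002), nrow = 2, byrow = TRUE))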

Dependencies: none

Complete Self-Attention from Scratch

Rendered from complete_attention.Rmd using knitr::rmarkdown on Nov 08 2024.

Last update: 2022-06-23
Started: 2022-06-23

Simple Self-Attention from Scratch

Rendered from simple_attention.Rmd using knitr::rmarkdown on Nov 08 2024.

Last update: 2022-06-24
Started: 2022-06-22